Hi, all,
In train_partseg.py, an LRScheduler is applied. However, the step it receives only reaches a value around 14,000 (for the ShapeNet Part dataset), which is passed to this line:
lr_decay = self.lr_decay ** int(step / self.decay_step)
Since self.decay_step is initialized to 15,000, int(step / self.decay_step) is always 0, so lr_decay is always self.lr_decay ** 0 == 1.
Therefore, self.optimizer.lr is never changed by this line:
self.optimizer.lr = lr_decay * self.basic_lr
Is something wrong with LRScheduler in utils.py?
Please correct me if I have misunderstood the code.
Thanks~
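A minimal sketch of the decay logic quoted above, to make the issue reproducible (class and attribute names are taken from the quoted snippet; the actual LRScheduler in utils.py may differ):

```python
class LRScheduler:
    """Stripped-down stand-in for the scheduler described above."""

    def __init__(self, basic_lr=0.001, lr_decay=0.5, decay_step=15000):
        self.basic_lr = basic_lr
        self.lr_decay = lr_decay
        self.decay_step = decay_step

    def lr_at(self, step):
        # int(step / self.decay_step) is 0 for every step < decay_step,
        # so lr_decay ** 0 == 1 and the learning rate never decays.
        lr_decay = self.lr_decay ** int(step / self.decay_step)
        return lr_decay * self.basic_lr

sched = LRScheduler()
print(sched.lr_at(0))       # 0.001
print(sched.lr_at(14000))   # 0.001 -- still the base LR at the end of training
print(sched.lr_at(30000))   # decay would only kick in past step 15,000
```

With training ending around step 14,000, the decay exponent stays at 0 the whole run, which is exactly the behavior reported above.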