The parameters in optimizer #28

@lzyhha

Description

Hello, I noticed that the order of the positional arguments (params, lr, weight_decay) in PolyOptimizer differs from the official SGD signature (params, lr, momentum, ...). So I think the value of weight_decay will actually be assigned to momentum. Is that so?

```python
class PolyOptimizer(torch.optim.SGD):

    def __init__(self, params, lr, weight_decay, max_step, momentum=0.9):
        super().__init__(params, lr, weight_decay)
```
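For what it's worth, `torch.optim.SGD`'s signature is `SGD(params, lr, momentum=0, dampening=0, weight_decay=0, nesterov=False)`, so a third positional argument does bind to `momentum`. A minimal self-contained sketch of the pitfall, using a stand-in class that mirrors SGD's parameter order (the stand-in is hypothetical, only the argument order is taken from PyTorch):

```python
# Stand-in mirroring torch.optim.SGD's positional parameter order:
# SGD(params, lr, momentum=0, dampening=0, weight_decay=0, nesterov=False)
class MockSGD:
    def __init__(self, params, lr, momentum=0, dampening=0,
                 weight_decay=0, nesterov=False):
        self.lr = lr
        self.momentum = momentum
        self.weight_decay = weight_decay

# Pattern from the issue: weight_decay passed positionally
# lands in the third slot, which is momentum.
buggy = MockSGD([], 0.01, 5e-4)
assert buggy.momentum == 5e-4
assert buggy.weight_decay == 0  # weight_decay silently stays at its default

# Likely intended call: keyword arguments put each value in the right slot.
fixed = MockSGD([], 0.01, momentum=0.9, weight_decay=5e-4)
assert fixed.momentum == 0.9
assert fixed.weight_decay == 5e-4
```

So if the intent was a weight decay of 5e-4, the fix would be `super().__init__(params, lr, momentum=momentum, weight_decay=weight_decay)` instead of the positional call.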
