Save and load optimizer state in bilby #5

@ihh

Description

Currently, bilby auto-saves parameters during a training run and loads them back in on restart if available. However, it should also save the state of the optimizer. Adam maintains per-parameter momentum and variance estimates accumulated from past gradient steps, and this state is currently lost when a training run is interrupted. It should be possible to serialize and deserialize the optax Adam optimizer state that holds these estimates. This state should then be autoloaded when bilby is started, if it exists. There should also be a command-line option to disable this behavior, or to specify the path to the serialized Adam state.
