
Include freezing, adding layers and adjusting the learning rate in transfer learning models #8

@JulianZabbarov

Description


In transfer learning, it is common to freeze pretrained layers and add new ones when training on the target dataset. A subsequent fine-tuning step with unfrozen layers and an adjusted (typically reduced) learning rate is also a standard way to adapt the model to the final dataset. SimbaML should support this.
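As a sketch of what such support could look like, the workflow below shows the three requested steps — freezing, adding layers, and fine-tuning with a lower learning rate — in plain PyTorch. The model and layer names here are illustrative assumptions, not SimbaML's actual API:

```python
import torch
import torch.nn as nn

# Hypothetical "pretrained" backbone standing in for a model trained on
# the source dataset (illustrative only, not SimbaML code).
backbone = nn.Sequential(
    nn.Linear(10, 32),
    nn.ReLU(),
    nn.Linear(32, 32),
    nn.ReLU(),
)

# Step 1: freeze the backbone so its weights are not updated at first.
for p in backbone.parameters():
    p.requires_grad = False

# Step 2: add a fresh head for the target dataset and train only it.
new_head = nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 1))
model = nn.Sequential(backbone, new_head)
optimizer = torch.optim.Adam(new_head.parameters(), lr=1e-3)

# ... train the head on the target dataset here ...

# Step 3: fine-tune — unfreeze everything and continue with a smaller
# learning rate for the pretrained backbone than for the new head.
for p in backbone.parameters():
    p.requires_grad = True
optimizer = torch.optim.Adam(
    [
        {"params": backbone.parameters(), "lr": 1e-5},
        {"params": new_head.parameters(), "lr": 1e-4},
    ]
)
```

Exposing these three knobs (which layers to freeze, which layers to append, and per-stage learning rates) as configuration options would cover the use case described above.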
