Training loss stuck around 0.4 #22

@zzzyzh

Description

I'm using your code for training, but I noticed that the loss doesn't decrease further after a certain point. During training, the loss initially drops to around 0.4, but then it just fluctuates around that value and doesn't go lower. Is this expected behavior? Or could it indicate a potential issue with the training setup?

Any advice would be appreciated. Thanks in advance!
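One quick way to tell whether a plateau like this is genuine or just minibatch noise is to compare moving averages of the logged loss across consecutive windows. The sketch below is generic Python, not tied to this repository's code; the window size, tolerance, and the synthetic loss curve are all arbitrary choices for illustration:

```python
import math

def is_plateaued(losses, window=50, tol=1e-3):
    """True if the mean loss over the last `window` steps improved by
    less than `tol` compared with the window immediately before it."""
    if len(losses) < 2 * window:
        return False  # not enough history to compare two windows
    prev = sum(losses[-2 * window:-window]) / window
    last = sum(losses[-window:]) / window
    return (prev - last) < tol

# Synthetic curve: steady descent, then fluctuation around 0.4
curve = [1.0 - 0.006 * i for i in range(100)]
curve += [0.4 + 0.02 * math.sin(i) for i in range(100)]

print(is_plateaued(curve[:100]))  # False: still clearly decreasing
print(is_plateaued(curve))        # True: fluctuating around 0.4
```

If the check fires while the learning rate is still at its initial value, lowering it (manually or with a scheduler) is a common first thing to try; if the loss is flat even at small learning rates, the ~0.4 floor may instead reflect irreducible noise in the data or the loss function's minimum for this task.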
