
A question about autograd, thanks #11

@yebangyu

Description


Dear Edward,

From pages 21 to 23, where autograd is discussed,

we test whether the condition ||prev - cur|| < epsilon is satisfied to check whether we have reached the minimum.

My question is: why not just test whether the gradient at the current point is zero (or close to zero)?

That is to say, can

while torch.linalg.norm(x_cur - x_prev) > epsilon:

be replaced by

epsilon = 1e-12  # a sufficiently small value

while abs(x_cur.grad) > epsilon:

?
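For concreteness, here is a minimal sketch of the loop I have in mind; the objective f(x) = (x - 2)**2, the starting point, and the step size eta are my own illustrative choices, not the book's example:

```python
import torch

# Illustrative objective, not the book's example.
def f(x):
    return (x - 2.0) ** 2

eta = 0.1        # step size
epsilon = 1e-12  # a sufficiently small value

# Double precision so the gradient can actually shrink below epsilon.
x_cur = torch.tensor(-3.5, dtype=torch.float64, requires_grad=True)
f(x_cur).backward()  # populate x_cur.grad before the first test

# Proposed stopping criterion: stop once the gradient is (near) zero.
while abs(x_cur.grad) > epsilon:
    with torch.no_grad():
        x_cur -= eta * x_cur.grad  # gradient-descent step
    x_cur.grad.zero_()             # discard the stale gradient
    f(x_cur).backward()            # recompute the gradient at the new point

print(x_cur.item())  # approximately 2.0
```

For a vector-valued x, I would replace abs with torch.linalg.norm(x_cur.grad), matching the norm used in the book's test.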

Thanks a lot!
