
Treat default values in function signature the same way as config kwargs #12

Open
JIy3AHKO wants to merge 2 commits into Positronic-Robotics:main from JIy3AHKO:fetch-defaults-from-spec

Conversation

@JIy3AHKO (Contributor) commented Aug 5, 2025

As mentioned in #10, cfn.Config objects in a function's default values are not resolved during instantiate.

After this PR, it becomes possible to declare every default value in the function signature, e.g.:

@cfn.config()
def train(model=cfg.models.vit, optimizer=cfg.optimizers.adam, dataset=cfg.datasets.imagenet1k):
    ...

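The mechanism the PR describes, fetching defaults from the function's signature and treating them like config kwargs, can be sketched as below. The `Config` class and `instantiate` helper here are hypothetical stand-ins for illustration, not the real cfn API:

```python
import inspect

class Config:
    """Hypothetical stand-in for a cfn.Config node (not the real cfn API)."""
    def __init__(self, value):
        self.value = value
    def resolve(self):
        return self.value

def instantiate(fn, **overrides):
    """Fetch defaults from fn's signature and treat them like config kwargs."""
    sig = inspect.signature(fn)
    kwargs = {
        name: p.default
        for name, p in sig.parameters.items()
        if p.default is not inspect.Parameter.empty
    }
    kwargs.update(overrides)
    # Resolve Config-valued defaults before calling, mirroring what the PR
    # does for kwargs passed to the decorator.
    kwargs = {k: (v.resolve() if isinstance(v, Config) else v)
              for k, v in kwargs.items()}
    return fn(**kwargs)

def train(model=Config("vit"), dataset="imagenet1k"):
    return model, dataset
```

With this sketch, `instantiate(train)` resolves the `Config` default to its value, while explicit overrides still win over signature defaults.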
@JIy3AHKO JIy3AHKO requested a review from vertix August 5, 2025 15:01
@JIy3AHKO JIy3AHKO force-pushed the fetch-defaults-from-spec branch from 9ad40a1 to cdaa6c1 on August 5, 2025 15:02
@JIy3AHKO JIy3AHKO linked an issue Aug 5, 2025 that may be closed by this pull request
def return1():
    return 1

@cfn.config()
Contributor commented:

Does this PR mean that there's no longer a need for cfn.config to be a decorator factory, and we can make it a plain decorator?

    return 1

@cfn.config()
def add1(a=return1):
Contributor commented:

Can we imagine non-config-decorated functions that take configs as arguments?

What I want to say is that I don't like passing configs as defaults to functions. I would propose passing configs as defaults only through config decorators. Otherwise the reader can really be confused, since the real default is not the config but its instantiation.
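The confusion being flagged can be illustrated with a small self-contained example (hypothetical names, standing in for a config node and its instantiation):

```python
def make_adam():
    # Stands in for a config node whose instantiation produces an optimizer.
    return "Adam(lr=0.001)"

# The signature suggests the default for `optimizer` is make_adam itself,
# but the value the body actually works with is make_adam() -- the
# instantiation, not the config. That mismatch is what can mislead a reader.
def train(optimizer=make_adam):
    if callable(optimizer):
        optimizer = optimizer()
    return optimizer
```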

@cfn.config(batch_size=32, lr=0.001)
def create_optimizer(batch_size: int, lr: float):
    return torch.optim.Adam(lr=lr)

@cfn.config(lr=1e-3, weight_decay=0)
Contributor commented:

I think we need to expand the other sections (particularly resolution and relative resolution) to document this new functionality.



Development

Successfully merging this pull request may close these issues:

Support resolution of configs defined in function defaults

2 participants