
refactor: refinements in utils and associated functions #121

Closed

neuralsorcerer wants to merge 1 commit into facebookresearch:main from neuralsorcerer:refactor-util

Conversation

@neuralsorcerer (Contributor)

Changes (illustrative sketches follow this list):

  • Replaced the activation and loss dictionaries with Enum classes ActivationType and LossType for type safety.
  • Replaced the old initialization helper with a clearer xavier_init_weights function for linear layers.
  • Neural linear regression now selects its output activation via ActivationType for clarity.
  • Rewrote the normalized Softplus activation to avoid redundant computation and added a default Softmax dimension to the activation map.
  • Fixed the convolution block to apply batch normalization with the correct output channel size.
  • Neural bandit classes now accept LossType parameters and call loss_type.function() during training.
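
A minimal sketch of what the enum pattern might look like. The member names here are assumptions; only `ActivationType`, `LossType`, and the `function()` call are taken from the description above:

```python
from enum import Enum

import torch
import torch.nn as nn


class ActivationType(Enum):
    # Hypothetical members; the actual set in the PR may differ.
    RELU = nn.ReLU
    TANH = nn.Tanh
    SOFTMAX = nn.Softmax

    def function(self, **kwargs) -> nn.Module:
        # Instantiate the activation module bound to this member.
        return self.value(**kwargs)


class LossType(Enum):
    MSE = nn.MSELoss
    CROSS_ENTROPY = nn.CrossEntropyLoss

    def function(self) -> nn.Module:
        # Instantiate the loss module bound to this member.
        return self.value()


# e.g. a neural bandit training step might build its criterion as:
criterion = LossType.MSE.function()
loss = criterion(torch.zeros(4), torch.ones(4))

# ...and a regression head might pick its output activation similarly:
output_activation = ActivationType.SOFTMAX.function(dim=-1)
```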
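
A plausible shape for the initialization helper; only the name xavier_init_weights comes from the PR, and the exact signature is an assumption:

```python
import torch.nn as nn


def xavier_init_weights(module: nn.Module) -> None:
    # Xavier-initialize weights of linear layers only; other
    # module types pass through untouched.
    if isinstance(module, nn.Linear):
        nn.init.xavier_uniform_(module.weight)
        if module.bias is not None:
            nn.init.zeros_(module.bias)


# Typical usage: apply recursively over a network's submodules.
net = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
net.apply(xavier_init_weights)
```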
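
For the normalized Softplus, the point about redundant computation suggests evaluating the softplus once and reusing it in the denominator; the normalization dimension here is a guess:

```python
import torch
import torch.nn.functional as F


def normalized_softplus(x: torch.Tensor, dim: int = -1) -> torch.Tensor:
    # Evaluate softplus once and reuse it for the normalizing sum,
    # instead of recomputing it for the denominator.
    sp = F.softplus(x)
    return sp / sp.sum(dim=dim, keepdim=True)
```

Relatedly, registering `nn.Softmax(dim=-1)` rather than a bare `nn.Softmax()` in the activation map sidesteps PyTorch's implicit-dimension warning, which is presumably the motivation for the default Softmax dimension.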
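
The batch-norm fix likely amounts to sizing `BatchNorm2d` by the convolution's output channels rather than its input channels, roughly:

```python
import torch.nn as nn


def conv_block(in_channels: int, out_channels: int) -> nn.Sequential:
    # Illustrative only; the real block in the repo may differ.
    return nn.Sequential(
        nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1),
        # The fix: BatchNorm2d must match the conv's *output* channels,
        # not its input channels.
        nn.BatchNorm2d(out_channels),
        nn.ReLU(),
    )
```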

Why these changes?

  • Most of these changes were flagged as TODOs in utils.py (namely the Enums and xavier_init_weights), so I refactored accordingly.
  • MLP construction now instantiates activations through the enums, reducing string lookups (see the sketch below).
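
A hypothetical illustration of that MLP construction; the builder name and signature are assumptions, not the repository's actual API:

```python
from enum import Enum

import torch.nn as nn


class ActivationType(Enum):  # minimal stand-in, as sketched earlier
    RELU = nn.ReLU
    TANH = nn.Tanh

    def function(self) -> nn.Module:
        return self.value()


def build_mlp(
    dims: list[int],
    activation: ActivationType = ActivationType.RELU,
) -> nn.Sequential:
    # Instantiate activations from the enum member directly,
    # with no string-keyed dictionary lookup in the hot path.
    layers: list[nn.Module] = []
    for in_dim, out_dim in zip(dims[:-1], dims[1:]):
        layers.append(nn.Linear(in_dim, out_dim))
        layers.append(activation.function())
    return nn.Sequential(*layers)


mlp = build_mlp([8, 32, 32, 1])
```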

@meta-cla bot added the CLA Signed label on Jul 18, 2025.
@facebook-github-bot (Contributor)

@rodrigodesalvobraz has imported this pull request. If you are a Meta employee, you can view this in D78998902.

@facebook-github-bot (Contributor)

@ayushj240 merged this pull request in 9de0995.
