Conversation

@JanTeichertKluge (Member) commented Nov 10, 2025

This pull request introduces Optuna-based hyperparameter tuning.

It adds the new tune_ml_models() method, which allows hyperparameters to be tuned flexibly for most implemented models. For more details, take a look at the documentation.

Generally, we are moving away from fold-specific hyperparameter tuning, both because hyperparameters should be valid/good across all splits and to simplify the implementation.
The previous tune() method will be deprecated in the future.

PR Checklist

Please fill out this PR checklist (see our contributing guidelines for details).

  • The title of the pull request summarizes the changes made.
  • The PR contains a detailed description of all changes and additions.
  • References to related issues or PRs are added.
  • The code passes all (unit) tests.
  • Enhancements or new features are equipped with unit tests.
  • The changes adhere to the PEP8 standards.

@SvenKlaassen (Member)

I just noticed: we should use ensure_all_finite instead of force_all_finite in all tuning operations (see https://scikit-learn.org/stable/modules/generated/sklearn.utils.check_X_y.html).


Labels: new feature