1 change: 1 addition & 0 deletions doc/api/utility.rst
@@ -14,6 +14,7 @@ Utility Classes

utils.DMLDummyRegressor
utils.DMLDummyClassifier
utils.DMLOptunaResult
utils.DoubleMLBLP
utils.DoubleMLPolicyTree
utils.GlobalRegressor
3 changes: 2 additions & 1 deletion doc/examples/index.rst
@@ -24,7 +24,8 @@ General Examples
py_double_ml_irm_vs_apo.ipynb
py_double_ml_lplr.ipynb
py_double_ml_ssm.ipynb
py_double_ml_learner.ipynb
learners/py_optuna.ipynb
learners/py_learner.ipynb
py_double_ml_firststage.ipynb
py_double_ml_multiway_cluster.ipynb
py_double_ml_sensitivity_booking.ipynb
@@ -7,9 +7,10 @@
"source": [
"# Python: Choice of learners\n",
"\n",
"This notebooks contains some practical recommendations to choose the right learner and evaluate different learners for the corresponding nuisance components.\n",
"This notebook contains some practical recommendations to choose the right learner and evaluate different learners for the corresponding nuisance components.\n",
"This notebook mainly highlights the differences in using different learners, i.e. linear or tree-based methods. Generally, we recommend to tune hyperparameters for the chosen learners, see [Example Gallery](https://docs.doubleml.org/stable/examples/index.html).\n",
"\n",
"For the example, we will work with a IRM, but all of the important components are directly usable for all other models too.\n",
"For the example, we will work with a IRM, but all of the important components are directly usable for all other models, too.\n",
"\n",
"To be able to compare the properties of different learners, we will start by setting the true treatment parameter to zero, fix some other parameters of the data generating process and generate several datasets \n",
"to obtain some information about the distribution of the estimators."
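As a rough illustration of the setup described in this notebook cell, the following sketch (not part of the diff; it assumes the doubleml and scikit-learn packages and a single simulated dataset rather than the repeated simulations the notebook uses to study the estimator's distribution) fits an IRM model with a linear and a tree-based learner pair, with the true treatment parameter set to zero:

# Hypothetical sketch -- assumes the doubleml and scikit-learn packages are installed.
import numpy as np
from sklearn.linear_model import LassoCV, LogisticRegressionCV
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

import doubleml as dml
from doubleml.datasets import make_irm_data

np.random.seed(42)
# True treatment parameter set to zero, as in the notebook's simulation setup.
data = make_irm_data(theta=0.0, n_obs=500, dim_x=20, return_type='DoubleMLData')

# Two candidate learner pairs for the nuisance components
# (ml_g for the outcome regression, ml_m for the propensity score).
learner_sets = {
    "linear": (LassoCV(), LogisticRegressionCV()),
    "tree-based": (RandomForestRegressor(n_estimators=200),
                   RandomForestClassifier(n_estimators=200)),
}

for name, (ml_g, ml_m) in learner_sets.items():
    dml_irm = dml.DoubleMLIRM(data, ml_g=ml_g, ml_m=ml_m, n_folds=5)
    dml_irm.fit()
    print(f"{name}: coef = {dml_irm.coef[0]:.4f}, se = {dml_irm.se[0]:.4f}")

Repeating this over many generated datasets, as the notebook does, yields the distribution of the estimates under each learner choice.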