52 changes: 20 additions & 32 deletions docs/source/algorithms.md
@@ -3920,48 +3920,36 @@ addition to optimagic when using an NLOPT algorithm. To install nlopt run

optimagic supports the [IMINUIT MIGRAD Optimizer](https://iminuit.readthedocs.io/). To
use MIGRAD, you need to have
[the iminuit package](https://github.com/scikit-hep/iminuit) installed
(`pip install iminuit`).

```{eval-rst}
.. dropdown:: iminuit_migrad

.. code-block::

"iminuit_migrad"

`MIGRAD <https://iminuit.readthedocs.io/en/stable/reference.html#iminuit.Minuit.migrad>`_ is
the workhorse algorithm of the MINUIT optimization suite, which has been widely used in the
high-energy physics community since 1975. The iminuit package is a Python interface to the
Minuit2 C++ library developed by CERN.

Migrad uses a quasi-Newton method, updating the Hessian matrix iteratively
to guide the optimization. The algorithm adapts dynamically to challenging landscapes
using several key techniques:

- **Quasi-Newton updates**: The Hessian is updated iteratively rather than recalculated at
  each step, improving efficiency.
- **Steepest descent fallback**: When the Hessian update fails, Migrad falls back to steepest
  descent with line search.
- **Box constraints handling**: Parameters with bounds are transformed internally to ensure
  they remain within allowed limits.
- **Heuristics for numerical stability**: Special cases such as flat gradients or singular
  Hessians are managed using pre-defined heuristics.
- **Stopping criterion based on the Estimated Distance to Minimum (EDM)**: The optimization halts
  when the predicted improvement becomes sufficiently small.

For details see :cite:`JAMES1975343`. A schematic sketch of these steps is shown below the
dropdown.

**How to use this algorithm:**

.. code-block::

import optimagic as om
om.minimize(
    ...,
    algorithm=om.algos.iminuit_migrad(stopping_maxfun=10_000, ...),
)

or

.. code-block::

om.minimize(
    ...,
    algorithm="iminuit_migrad",
    algo_options={"stopping_maxfun": 10_000, ...},
)

**Description and available options:**

.. autoclass:: optimagic.optimizers.iminuit_migrad.IminuitMigrad
```
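The dropdown above describes MIGRAD's quasi-Newton logic in words. The following schematic numpy sketch is not optimagic or iminuit code; the function names, tolerances, and line-search rule are illustrative assumptions. It only shows how a DFP-style inverse-Hessian update combined with an EDM-like stopping rule drives such an iteration:

```python
# Schematic illustration only: a bare-bones DFP quasi-Newton loop with an
# EDM-style stopping rule. MIGRAD itself adds many safeguards not shown here.
import numpy as np


def dfp_minimize(f, grad, x0, edm_tol=1e-8, max_iter=500):
    x = np.asarray(x0, dtype=float)
    inv_hess = np.eye(x.size)  # approximate inverse Hessian ("error matrix")
    g = grad(x)
    for _ in range(max_iter):
        edm = 0.5 * g @ inv_hess @ g  # estimated distance to minimum (quadratic approx.)
        if edm < edm_tol:
            break
        step = -inv_hess @ g  # Newton-like step from the current approximation
        # crude backtracking line search as a stand-in for MIGRAD's fallbacks
        t = 1.0
        while f(x + t * step) >= f(x) and t > 1e-12:
            t *= 0.5
        x_new = x + t * step
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if s @ y > 1e-12:  # DFP update of the inverse Hessian
            inv_hess = (
                inv_hess
                + np.outer(s, s) / (s @ y)
                - inv_hess @ np.outer(y, y) @ inv_hess / (y @ inv_hess @ y)
            )
        x, g = x_new, g_new
    return x


# usage: minimize the Rosenbrock function from a standard starting point
def rosenbrock(z):
    return (1 - z[0]) ** 2 + 100 * (z[1] - z[0] ** 2) ** 2


def rosenbrock_grad(z):
    return np.array(
        [
            -2 * (1 - z[0]) - 400 * z[0] * (z[1] - z[0] ** 2),
            200 * (z[1] - z[0] ** 2),
        ]
    )


print(dfp_minimize(rosenbrock, rosenbrock_grad, np.array([-1.2, 1.0])))
```

Real MIGRAD adds the bound-handling transforms, stability heuristics, and restart logic described above on top of this core loop.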

## Nevergrad Optimizers
36 changes: 36 additions & 0 deletions src/optimagic/optimizers/iminuit_migrad.py
@@ -42,8 +42,44 @@
)
@dataclass(frozen=True)
class IminuitMigrad(Algorithm):
"""Minimize a scalar differentiable function using the MIGRAD algorithm from
iminuit.

This optimizer wraps the MIGRAD algorithm from the iminuit package, which provides a
Python interface to the Minuit2 C++ library developed and maintained by CERN.

MIGRAD is a local optimization method in the quasi-Newton family. It iteratively
builds an approximation of the inverse Hessian matrix using the DFP variable-metric
method to efficiently navigate optimization landscapes.

At each iteration, the algorithm attempts a Newton step, using gradient and Hessian
approximations to move toward the function’s minimum. If this step fails to reduce
the objective function, MIGRAD conducts a line search along the gradient direction
to maintain progress. This continues until the convergence criteria are met, that
is, until quantities such as the Estimated Distance to Minimum (EDM) fall below
preset thresholds.

MIGRAD is designed for statistical optimization problems where accurate parameter
uncertainty estimates are essential. It excels at maximum-likelihood and least-
squares fits common in scientific computing, and is best suited for smooth,
differentiable cost functions.

For best performance, supply analytical gradients. The convergence behavior and the
solution found depend on your starting values. Bound constraints (limits) are
supported.

"""

stopping_maxfun: int = STOPPING_MAXFUN
"""Maximum number of function evaluations.

If this limit is reached, the optimization stops, but this is not counted as
successful convergence. Function evaluations used for numerical gradient
calculations do not count toward this limit. Default is 1,000,000.

"""

n_restarts: int = N_RESTARTS
"""Number of times to restart the optimizer if convergence is not reached.

A value of 1 (the default) indicates that the optimizer will only run once,
disabling the restart feature. Values greater than 1 specify the maximum number of
restart attempts.

"""

def _solve_internal_problem(
self, problem: InternalOptimizationProblem, params: NDArray[np.float64]
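For context on what `IminuitMigrad` wraps, here is a minimal sketch of driving MIGRAD through the iminuit package directly, outside optimagic; the cost function, starting values, and limits are made up for illustration:

```python
# Illustration of the underlying iminuit API that IminuitMigrad wraps.
# The cost function and parameter values below are made up for this example.
from iminuit import Minuit


def cost(a, b):
    # simple smooth objective; in practice this would be a chi-square or
    # negative log-likelihood
    return (a - 1.0) ** 2 + (b - 2.0) ** 2


m = Minuit(cost, a=0.0, b=0.0)
m.errordef = Minuit.LEAST_SQUARES  # errordef=1 for chi-square-like costs
m.limits["a"] = (0, None)  # bound constraint, analogous to optimagic bounds
m.migrad()  # run the MIGRAD minimizer

print(m.values)  # fitted parameter values
print(m.errors)  # parameter uncertainties from the approximate Hessian
print(m.fmin.edm)  # estimated distance to minimum at convergence
```

The optimagic wrapper exposes the same minimizer through `om.minimize`, handling the translation of parameters, bounds, and derivatives shown above.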