A lightweight Python project that visualizes how different optimization algorithms behave when minimizing a function, using a custom-built `Polynomial` class as the loss function.
This repository evolved from a single experimental script into a modular mini-library (miniopt) for reusable and extensible optimizer simulations.
An N-dimensional version is available here: https://github.com/youngpark1516/miniopt-nd
The goal of Optimizer Simulations is to demonstrate the intuition behind various gradient-based optimization algorithms such as Gradient Descent, AdaGrad, RMSProp, AdaDelta, and Adam.
The project creates a polynomial function to act as a loss landscape and visualizes how each optimizer updates its guesses to minimize the function value over iterations.
```
Optimizer_Simulations/
│
├── miniopt/                  # Core library (importable as 'miniopt')
│   ├── __init__.py
│   ├── polynomial.py         # Custom polynomial class with differentiation
│   ├── optimizers.py         # GD, AdaGrad, RMSProp, Adam implementations
│   ├── functions.py          # Optional test functions
│   ├── runner.py             # Optimization engine
│   └── viz.py                # Visualization utilities
│
├── examples/                 # Demonstration scripts
│   ├── demo_polynomial.py
│   ├── demo_quartic.py
│   └── compare_optimizers.py
│
├── legacy/                   # Original single-file implementation
│   ├── simulation.py
│   └── simulation_improved.py
│
├── README.md
├── requirements.txt
└── .gitignore
```
This project uses three main Python libraries:

**NumPy**
A fundamental library for numerical computation, used for:
- Handling the coefficients of the polynomial
- Efficiently evaluating and differentiating polynomial functions

**Matplotlib**
A visualization library used for:
- Plotting polynomial curves
- Animating optimizer trajectories
- Displaying convergence over iterations

**Random**
Used for generating random starting points and randomized polynomial coefficients.
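Assuming the random utilities come from Python's built-in `random` module, requirements.txt would only need the two third-party packages. This is a guess at the file's contents, not a copy of it:

```text
numpy
matplotlib
```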
Install dependencies using:

```bash
pip install -r requirements.txt
```

The `Polynomial` class defines a simple mathematical function that acts as the loss function for optimization.
`self.coefficients`: a NumPy array of coefficients, ordered from the constant term `x^0` up to `x^n`.
You can initialize a polynomial in two ways:

- From coefficients:

  ```python
  p = Polynomial([1, -2, 3])  # 1 - 2x + 3x^2
  ```

- Randomized by degree:

  ```python
  p = Polynomial(degree=3)  # random coefficients for a degree-3 polynomial
  ```
Internally, the class supports the following (see the sketch after this list):

- `__call__`: evaluate `f(x)`
- `derivative()`: return a new `Polynomial` representing `f'(x)`
- `domain()`: generate arrays for plotting
- Proper vectorized evaluation for arrays of x-values
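A minimal sketch of how these pieces could fit together, assuming random coefficients in [-1, 1] for the degree-based constructor (illustrative only; the actual `polynomial.py` may differ, and `domain()` is omitted):

```python
import numpy as np

class Polynomial:
    """Coefficients ordered from x^0 upward, mirroring the API above (sketch)."""

    def __init__(self, coefficients=None, degree=None):
        if coefficients is None:
            # Randomized-by-degree constructor: degree + 1 random coefficients
            coefficients = np.random.uniform(-1, 1, degree + 1)
        self.coefficients = np.asarray(coefficients, dtype=float)

    def __call__(self, x):
        # Vectorized evaluation: works for scalar x or arrays of x-values
        x = np.asarray(x, dtype=float)
        powers = np.arange(len(self.coefficients))
        return (self.coefficients * np.power.outer(x, powers)).sum(axis=-1)

    def derivative(self):
        # d/dx sum(c_k x^k) = sum(k c_k x^(k-1)): drop c_0, scale by k
        k = np.arange(1, len(self.coefficients))
        return Polynomial(self.coefficients[1:] * k)
```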
Each optimizer updates the parameter `x` using its own update strategy:
| Optimizer | Description |
|---|---|
| Gradient Descent (GD) | Standard first-order method that updates weights opposite to the gradient direction. |
| AdaGrad | Adapts learning rate per parameter, scaling inversely with accumulated gradient magnitude. |
| RMSProp | Maintains a moving average of squared gradients for smoother adaptive learning rates. |
| AdaDelta | An extension of RMSProp that eliminates the need for a manually selected learning rate. |
| Adam | Combines momentum (first moment) and RMSProp (second moment) into a highly efficient optimizer. |
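For reference, these are the standard one-dimensional update rules behind the table above, written with gradient $g_t = f'(x_t)$, learning rate $\eta$, and decay constants $\rho, \beta_1, \beta_2$. These are textbook formulations, so miniopt's defaults may differ; note that AdaDelta reuses the RMSProp accumulator $v_t$:

```math
\begin{aligned}
\text{GD:} \quad & x_{t+1} = x_t - \eta\, g_t \\
\text{AdaGrad:} \quad & x_{t+1} = x_t - \frac{\eta}{\sqrt{\textstyle\sum_{i \le t} g_i^2} + \epsilon}\, g_t \\
\text{RMSProp:} \quad & v_t = \rho\, v_{t-1} + (1 - \rho)\, g_t^2, \qquad x_{t+1} = x_t - \frac{\eta}{\sqrt{v_t} + \epsilon}\, g_t \\
\text{AdaDelta:} \quad & x_{t+1} = x_t - \frac{\sqrt{u_{t-1} + \epsilon}}{\sqrt{v_t + \epsilon}}\, g_t, \qquad u_t = \rho\, u_{t-1} + (1 - \rho)\,(x_{t+1} - x_t)^2 \\
\text{Adam:} \quad & m_t = \beta_1 m_{t-1} + (1 - \beta_1)\, g_t, \quad v_t = \beta_2 v_{t-1} + (1 - \beta_2)\, g_t^2, \\
& x_{t+1} = x_t - \eta\, \frac{m_t / (1 - \beta_1^t)}{\sqrt{v_t / (1 - \beta_2^t)} + \epsilon}
\end{aligned}
```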
Each optimizer class implements a common interface:

```python
class OptimizerName:
    def step(self, x, grad):
        ...
```
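To make that concrete, here is what two such classes might look like in one dimension. This is a sketch consistent with the interface above, not necessarily the code in `optimizers.py`, and the default hyperparameters are assumptions:

```python
import numpy as np

class GD:
    """Plain gradient descent: move opposite the gradient."""
    def __init__(self, lr=0.01):
        self.lr = lr

    def step(self, x, grad):
        return x - self.lr * grad

class Adam:
    """Adam: bias-corrected first and second moment estimates."""
    def __init__(self, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
        self.lr, self.beta1, self.beta2, self.eps = lr, beta1, beta2, eps
        self.m = 0.0   # first moment (running mean of gradients)
        self.v = 0.0   # second moment (running mean of squared gradients)
        self.t = 0     # step counter for bias correction

    def step(self, x, grad):
        self.t += 1
        self.m = self.beta1 * self.m + (1 - self.beta1) * grad
        self.v = self.beta2 * self.v + (1 - self.beta2) * grad ** 2
        m_hat = self.m / (1 - self.beta1 ** self.t)
        v_hat = self.v / (1 - self.beta2 ** self.t)
        return x - self.lr * m_hat / (np.sqrt(v_hat) + self.eps)
```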
The simulation follows these steps:

1. Define a function, e.g., a quadratic or cubic polynomial.
2. Compute its derivative, used as the gradient function.
3. Choose an optimizer, e.g., `Adam(lr=0.05)`.
4. Run the optimization using `run_1d()` in `miniopt/runner.py` (sketched below).
5. Visualize the results via `miniopt.viz`.
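The engine itself can be only a few lines. Below is a minimal sketch of a `run_1d()` loop matching the call signature used in the example that follows; the actual `runner.py`, including its return format, may differ:

```python
def run_1d(f, f_prime, optimizer, x0, steps=100):
    """Iterate optimizer.step() from x0 and record the trajectory."""
    x = x0
    history = [(x, f(x))]          # (position, loss) at each iteration
    for _ in range(steps):
        grad = f_prime(x)          # gradient of the loss at the current point
        x = optimizer.step(x, grad)
        history.append((x, f(x)))
    return history
```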
Example (`examples/demo_polynomial.py`):

```python
from miniopt import Polynomial, Adam, run_1d, viz

p = Polynomial([0, 0, 3])   # f(x) = 3x²
dp = p.derivative()         # f'(x) = 6x
opt = Adam(lr=0.1)

history = run_1d(f=p, f_prime=dp, optimizer=opt, x0=2.5, steps=50)
viz.plot_path_1d(p, history, x_min=-3, x_max=3, title="Adam on 3x²")
```

`miniopt.viz` provides:
- Function plot: visualize the shape of the polynomial
- Path plot: show optimizer trajectory step-by-step
- Convergence plot: plot `f(x_t)` vs. iteration (see the snippet below)
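As an illustration, a convergence plot can be produced from the recorded history with a few lines of Matplotlib. This is a stand-alone snippet, not the actual `viz.py` code:

```python
import matplotlib.pyplot as plt

def plot_convergence(history, title="Convergence"):
    # history: list of (x_t, f(x_t)) pairs, as in the run_1d sketch above
    losses = [fx for _, fx in history]
    plt.plot(range(len(losses)), losses, marker="o", markersize=3)
    plt.xlabel("Iteration")
    plt.ylabel("f(x)")
    plt.title(title)
    plt.show()
```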
Run examples from the project root:

```bash
python -m examples.demo_polynomial
```

Or compare optimizers:

```bash
python -m examples.compare_optimizers
```

Planned improvements:

- Extend to multi-dimensional optimization
- Add momentum, NAdam, and LAMB optimizers
- Integrate interactive visualizations with Plotly
- Add unit tests for optimizer behavior and convergence
**License:** Man, just take it.

**Author:** Chanyoung Park