This repository holds the official code for the manuscript "Online Bilevel Optimization: Regret Analysis of Online Alternating Gradient Methods".
This paper introduces an online bilevel optimization setting in which a sequence of time-varying bilevel problems is revealed one after the other. We extend the known regret bounds for single-level online algorithms to the bilevel setting. Specifically, we introduce new notions of bilevel regret, develop an online alternating time-averaged gradient method that can leverage smoothness, and prove regret bounds in terms of the path-length of the inner and outer minimizer sequences.
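To give a feel for the setup, here is a minimal, illustrative sketch of an online alternating gradient loop on a toy bilevel sequence. It is not the paper's exact OAGD algorithm: the step sizes, the number of inner steps, and the use of a plain partial outer gradient (rather than a hypergradient or time-averaged correction) are simplifying assumptions.

```python
def online_alternating_gd(rounds, grad_f_x, grad_g_y, x0=0.0, y0=0.0,
                          alpha=0.1, beta=0.5, inner_steps=5):
    """Illustrative online alternating gradient loop (not the paper's OAGD).

    At each round t the learner observes an outer loss f_t(x, y) and an
    inner loss g_t(x, y), runs a few inner gradient steps on y, then takes
    one outer gradient step on x using the partial gradient of f_t in x.
    """
    x, y = x0, y0
    history = []
    for t in range(rounds):
        for _ in range(inner_steps):        # inner (lower-level) updates
            y -= beta * grad_g_y(t, x, y)
        x -= alpha * grad_f_x(t, x, y)      # outer (upper-level) update
        history.append((x, y))
    return history

# Toy time-invariant instance (hypothetical, for illustration only):
# f_t(x, y) = (x - 3)^2 + y^2 and g_t(x, y) = (y - x)^2, so the inner
# minimizer is y*(x) = x, and the partial-gradient outer update drives
# x toward 3 while y tracks x.
hist = online_alternating_gd(
    rounds=200,
    grad_f_x=lambda t, x, y: 2 * (x - 3),
    grad_g_y=lambda t, x, y: 2 * (y - x),
)
```

In a genuinely online instance, `grad_f_x` and `grad_g_y` would change with the round index `t`, and the regret bounds in the paper are stated in terms of how far the per-round minimizers drift (their path-length).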
Before running the code, set up the environment. We recommend using conda to create the environment and then installing the required packages:

```
conda create -n OAGD python=3.9
conda activate OAGD
pip install -r requirements.txt
```

To run the code for either hyperparameter optimization or meta-learning, use

```
python main.py
```

with the appropriate arguments in each folder.
Our hyperparameter-optimization code builds on AutoBalance:
https://github.com/ucr-optml/AutoBalance.
Our meta-learning code builds on the iMAML example from hypertorch:
https://github.com/prolearner/hypertorch/blob/master/examples/iMAML.py.