
Online Bilevel Optimization: Regret Analysis of Online Alternating Gradient Methods

This repository holds the official code for the manuscript "Online Bilevel Optimization: Regret Analysis of Online Alternating Gradient Methods".

πŸ¦Έβ€ Abstract

This paper introduces an online bilevel optimization setting in which a sequence of time-varying bilevel problems are revealed one after the other. We extend the known regret bounds for single-level online algorithms to the bilevel setting. Specifically, we provide new notions of bilevel regret, develop an online alternating time-averaged gradient method that is capable of leveraging smoothness, and give regret bounds in terms of the path-length of the inner and outer minimizer sequences.
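The abstract's idea can be illustrated with a toy sketch. This is our own minimal interpretation, not the paper's exact algorithm: at each round a new bilevel pair (outer loss f_t, inner loss g_t) arrives, we take one inner gradient step, then update the outer variable with a running time-average of the outer gradients. The quadratic losses, step sizes, and averaging scheme here are illustrative assumptions.

```python
import numpy as np

# Toy online alternating gradient sketch (illustrative assumption, not the
# official OAGD implementation). At round t a bilevel pair arrives:
#   inner: g_t(x, y) = 0.5 * ||y - x||^2
#   outer: f_t(x, y) = 0.5 * ||x - c_t||^2 + 0.5 * ||x - y||^2
rng = np.random.default_rng(0)
x = np.zeros(2)          # outer variable
y = np.zeros(2)          # inner variable
avg_grad = np.zeros(2)   # time-averaged outer gradient
alpha, beta = 0.1, 0.1   # outer / inner step sizes

for t in range(1, 51):
    c = rng.normal(size=2)               # round-t data (drifting target)
    y -= beta * (y - x)                  # one inner gradient step on g_t
    grad_x = (x - c) + (x - y)           # outer gradient of f_t at (x, y)
    avg_grad += (grad_x - avg_grad) / t  # running time-average of gradients
    x -= alpha * avg_grad                # outer step along averaged gradient

print(x, y)
```

With stationary losses the iterates would converge; here the target c drifts each round, which is exactly the setting where path-length regret bounds are informative.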

πŸ“ Requirements

Before running the code, we need to deploy the environment. A recommended way is to use conda to create the environment and install the related packages shown as follows.

conda create -n OAGD python=3.9
pip install -r requirements.txt
conda activate OAGD 

🔨 Usage

To run the code for either hyperparameter optimization or meta-learning, run

python main.py

from the corresponding folder, supplying that folder's specific arguments.

📖 Reference

Our hyperparameter-optimization code builds on AutoBalance:

https://github.com/ucr-optml/AutoBalance

Our meta-learning code builds on iMAML:

https://github.com/prolearner/hypertorch/blob/master/examples/iMAML.py

📭 Maintainers

Bojian Hou
