buaa-colalab/AEOSBench

Constellation

This repository is the official implementation of "Towards Realistic Earth-Observation Constellation Scheduling: Benchmark and Methodology".

NeurIPS 2025 · arXiv

Installation

```sh
sudo apt install ffmpeg libpq-dev
bash setup.sh
```

Data

To download the full dataset and reproduce the results in our paper:

```sh
git clone git@hf.co:datasets/MessianX/AEOS-dataset ./data
find ./data -type f -name '*.tar' -print0 | xargs -0 -n1 -I{} sh -c 'tar -xf "$1" -C "$(dirname "$1")"' _ {}
```

Alternatively, if you only want to evaluate your own model, download just the val_seen/val_unseen/test splits of the trajectories from our Hugging Face repo and extract them:

```sh
# TODO: urls
# Assuming you have downloaded the requested data:
find ./data -type f -name '*.tar' -print0 | xargs -0 -n1 -I{} sh -c 'tar -xf "$1" -C "$(dirname "$1")"' _ {}
```
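The one-liner above unpacks each archive in place, next to the `.tar` file it came from. As a self-contained illustration of the same `find | xargs | tar` pattern (using a throwaway tarball in a temp directory, not the real dataset):

```sh
#!/bin/sh
# Demonstrate the extraction pattern on a synthetic archive in a temp dir.
set -eu
root=$(mktemp -d)
mkdir -p "$root/trajectories.1"
printf 'hello\n' > "$root/trajectories.1/sample.txt"

# Pack the file, then remove the original so extraction is observable.
tar -cf "$root/trajectories.1/sample.tar" -C "$root/trajectories.1" sample.txt
rm "$root/trajectories.1/sample.txt"

# Same pipeline as above: extract each .tar into its own directory.
find "$root" -type f -name '*.tar' -print0 | \
  xargs -0 -n1 -I{} sh -c 'tar -xf "$1" -C "$(dirname "$1")"' _ {}

extracted=$(cat "$root/trajectories.1/sample.txt")
echo "$extracted"   # -> hello
rm -rf "$root"
```

Each archive is extracted into `$(dirname "$1")`, i.e. the directory holding it, which is why the dataset's `.tar` files unpack next to themselves.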

Steps

1. Confirm the data

The correct file tree should look like this:

```
data/
├── trajectories.1/
│   ├── test/
│   ├── train/
│   │   ├── 00/         # contains pth and json files
│   │   ├── 01/
│   │   ├── ...
│   ├── val_seen/
│   └── val_unseen/
├── trajectories.2/
├── trajectories.3/
├── annotations/
│   ├── test.json
│   ├── train.json
│   ├── val_seen.json
│   └── val_unseen.json
├── constellations/
│   ├── test/
│   ├── train/
│   ├── val_seen/
│   └── val_unseen/
├── orbits/
├── satellites/
└── tasksets/
```
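Before training, it can help to sanity-check that the expected top-level entries exist. A minimal sketch — the `check_data_layout` helper is hypothetical, not part of this repository — demonstrated here against a mock tree rather than your real `./data`:

```sh
#!/bin/sh
# Hypothetical helper: verify the top-level entries from the tree above.
check_data_layout() {
  root=$1
  missing=0
  for d in trajectories.1 annotations constellations orbits satellites tasksets; do
    if [ ! -d "$root/$d" ]; then
      echo "missing: $root/$d"
      missing=1
    fi
  done
  [ "$missing" -eq 0 ] && echo "data layout looks OK"
}

# Demonstrate on a mock tree; run `check_data_layout ./data` on real data.
tmp=$(mktemp -d)
for d in trajectories.1 annotations constellations orbits satellites tasksets; do
  mkdir -p "$tmp/$d"
done
result=$(check_data_layout "$tmp")
echo "$result"    # -> data layout looks OK
rm -rf "$tmp"
```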

2. Train the model

Use the command below to train our model:

```sh
CUDA_VISIBLE_DEVICES=0 PYTHONPATH=:${PYTHONPATH} auto_torchrun -m constellation.new_transformers.train test constellation/new_transformers/config.py
```

Training runs for 200,000 iterations.

3. Evaluate the model

Use the command below to evaluate the model:

```sh
CUDA_VISIBLE_DEVICES=0 WORLD_SIZE=1 RANK=0 python -m constellation.rl.eval_all \
    work_dir_name \
    constellation/rl/config_eval.py \
    --load-model-from 'work_dirs/test/checkpoints/iter_100000/model.pth'
```
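To compare several checkpoints, the eval command can be templated over the `work_dirs/<name>/checkpoints/iter_<N>/model.pth` layout implied above. A dry-run sketch that only prints the commands — the iteration list is an illustrative assumption, and `work_dir` should match your training run:

```sh
#!/bin/sh
# Print one eval command per checkpoint (dry run: echo instead of execute).
# The checkpoint path pattern follows the example above; the iteration
# values are illustrative assumptions, not prescribed by the repo.
work_dir=test
for iter in 50000 100000 150000 200000; do
  ckpt="work_dirs/$work_dir/checkpoints/iter_$iter/model.pth"
  echo "CUDA_VISIBLE_DEVICES=0 WORLD_SIZE=1 RANK=0 python -m constellation.rl.eval_all $work_dir constellation/rl/config_eval.py --load-model-from $ckpt"
done
```

Pipe the output through `sh` (after checking it) to actually run the evaluations.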
