Official Code for "Beyond Single-Task: Robust Multi-Task Length Generalization for LLMs"

Welcome to the official repository for our NeurIPS 2025 paper! 🌟 This work introduces Meta-RFFT, a novel approach for enhancing length generalization capabilities in large language models through robust multi-task training.

This is an initial release of our data generation pipeline for Meta-RFFT. We are actively developing the project and will update the code to include the full training pipeline soon; stay tuned for more updates!

Method Overview

Figure 1: The comprehensive pipeline of our Meta-RFFT framework.

Performance Highlights

Figure 2: Meta-RFFT consistently outperforms baseline methods on Qwen-2.5-7B across various length generalization tasks.

Repo Introduction

  • We provide the data generation code for our rule-following dataset in this repository. An example data sample is generated in example.ipynb.
  • In addition, the generation script for the synthetic data used in the in-context learning experiments is included in synthetic_data/int_list.py; see the sketch after this list.
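
Below is a minimal sketch of how synthetic integer-list data of this kind can be generated. The list-reversal task, the length ranges, and the JSONL schema are illustrative assumptions for this README, not necessarily the exact format produced by synthetic_data/int_list.py.

# A minimal sketch of synthetic integer-list sample generation (hypothetical;
# the real logic lives in synthetic_data/int_list.py and may differ).
import json
import random

def make_sample(min_len=2, max_len=10):
    """Generate one (input, output) pair for an illustrative list-reversal task."""
    n = random.randint(min_len, max_len)
    xs = [random.randint(0, 99) for _ in range(n)]
    return {
        "task": "reverse",                  # hypothetical task label
        "input": " ".join(map(str, xs)),    # serialized integer list
        "output": " ".join(map(str, reversed(xs))),
        "length": n,                        # kept for length-generalization splits
    }

if __name__ == "__main__":
    random.seed(0)
    # Short lists for training; longer held-out lists probe length generalization.
    train = [make_sample(2, 10) for _ in range(1000)]
    test = [make_sample(11, 30) for _ in range(200)]
    for path, samples in [("train.jsonl", train), ("test.jsonl", test)]:
        with open(path, "w") as f:
            f.writelines(json.dumps(s) + "\n" for s in samples)

Recording the length of each sample makes it straightforward to evaluate models separately on in-distribution and longer, held-out input lengths.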

Citation

If you find our work helpful in your research, please consider citing our paper:

@misc{hu2025singletaskrobustmultitasklength,
      title={Beyond Single-Task: Robust Multi-Task Length Generalization for LLMs}, 
      author={Yi Hu and Shijia Kang and Haotong Yang and Haotian Xu and Muhan Zhang},
      year={2025},
      eprint={2502.11525},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2502.11525}, 
}
