Efficient encoder-decoder architecture for small language models (≤1B parameters) with cross-architecture knowledge distillation and vision-language capabilities
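As a rough illustration of the objective such a distillation project typically optimizes, here is a minimal PyTorch sketch of knowledge distillation, assuming the teacher and student share a vocabulary and emit logits of the same shape (function and parameter names are illustrative, not taken from the repo):

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend soft-target KL distillation with hard-label cross-entropy."""
    # Soft targets: KL divergence between temperature-scaled distributions,
    # scaled by T^2 to keep gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```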
Time series prediction using a decoder-only Transformer, including SwiGLU and RoPE (Rotary Positional Embedding)
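For readers unfamiliar with the two components named above, a minimal PyTorch sketch of a SwiGLU feed-forward block and an interleaved RoPE rotation; shapes and names are assumptions, not the repo's actual code:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SwiGLU(nn.Module):
    """Feed-forward block with a SiLU-gated linear unit (SwiGLU)."""
    def __init__(self, d_model: int, d_ff: int):
        super().__init__()
        self.w_gate = nn.Linear(d_model, d_ff, bias=False)  # gating branch
        self.w_up = nn.Linear(d_model, d_ff, bias=False)    # value branch
        self.w_down = nn.Linear(d_ff, d_model, bias=False)  # project back down

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.w_down(F.silu(self.w_gate(x)) * self.w_up(x))

def apply_rope(x: torch.Tensor, base: float = 10000.0) -> torch.Tensor:
    """Rotate query/key vectors by position (interleaved RoPE).

    x: (batch, seq, heads, head_dim), head_dim must be even.
    """
    _, seq, _, dim = x.shape
    pos = torch.arange(seq, dtype=x.dtype, device=x.device)
    inv_freq = 1.0 / base ** (torch.arange(0, dim, 2, dtype=x.dtype, device=x.device) / dim)
    angles = torch.outer(pos, inv_freq)            # (seq, head_dim / 2)
    cos = angles.cos()[None, :, None, :]
    sin = angles.sin()[None, :, None, :]
    x1, x2 = x[..., 0::2], x[..., 1::2]            # even / odd channels
    out = torch.empty_like(x)
    out[..., 0::2] = x1 * cos - x2 * sin
    out[..., 1::2] = x1 * sin + x2 * cos
    return out
```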
🔍 Multilingual Evaluation of English-Centric LLMs via Cross-Lingual Alignment
Code for paper "Modality Plug-and-Play: Elastic Modality Adaptation in Multimodal LLMs for Embodied AI"
ViAG: A Novel Framework for Fine-tuning Answer Generation models utilizing Encoder-Decoder and Decoder-only Transformer architectures
A mini version of GPT trained on Shakespeare using BPE tokenization
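A toy sketch of the BPE merge-learning loop such a tokenizer is built on, assuming a word-frequency dictionary as input (the repo's actual tokenizer may differ):

```python
from collections import Counter

def bpe_merges(words, num_merges):
    """Learn BPE merge rules from a word-frequency dict, e.g. {"low": 5, "lower": 2}."""
    vocab = {tuple(w): c for w, c in words.items()}  # words as tuples of symbols
    merges = []
    for _ in range(num_merges):
        # Count every adjacent symbol pair, weighted by word frequency.
        pairs = Counter()
        for symbols, count in vocab.items():
            for pair in zip(symbols, symbols[1:]):
                pairs[pair] += count
        if not pairs:
            break
        best = max(pairs, key=pairs.get)  # most frequent pair becomes a merge rule
        merges.append(best)
        merged = best[0] + best[1]
        # Rewrite the vocabulary with the new merged symbol.
        new_vocab = {}
        for symbols, count in vocab.items():
            out, i = [], 0
            while i < len(symbols):
                if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == best:
                    out.append(merged)
                    i += 2
                else:
                    out.append(symbols[i])
                    i += 1
            new_vocab[tuple(out)] = count
        vocab = new_vocab
    return merges

# Example: merges = bpe_merges({"low": 5, "lower": 2, "newest": 6}, num_merges=10)
```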
🧸 A fully custom GPT-style language model built from scratch in PyTorch and trained on Winnie-the-Pooh! Explores the core mechanics of self-attention, autoregressive text generation, and modular model training, without relying on any external libraries beyond PyTorch itself.
in dev ...
This repository contains the implementation and experiments for comparing gradual growth methods, specifically the G_stack approach, with naive models trained from scratch. The project focuses on addressing catastrophic forgetting and improving model performance in continual learning scenarios.
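The exact G_stack initialization is defined in the paper; as a generic sketch of the underlying idea only, growing a decoder's depth by repeating already-trained blocks, assuming the model exposes them as an nn.ModuleList:

```python
import copy
import torch.nn as nn

def grow_by_stacking(blocks: nn.ModuleList, factor: int = 2) -> nn.ModuleList:
    """Initialize a deeper model by repeating trained decoder blocks."""
    grown = []
    for _ in range(factor):
        for block in blocks:
            # Deep-copy so each repeated block gets independent weights,
            # initialized from the already-trained shallow model.
            grown.append(copy.deepcopy(block))
    return nn.ModuleList(grown)
```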
This study examines the effectiveness of transformer-based models for financial time series forecasting, specifically focusing on log returns derived from daily closing prices of the DAX40 index. We propose a decoder-only transformer model designed for immediate-term financial time series forecasting: The PatternDecoder.
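A minimal sketch of the preprocessing step described above, computing log returns from daily closing prices (function name is illustrative, not from the paper):

```python
import numpy as np

def log_returns(closes):
    """Log returns r_t = ln(P_t / P_{t-1}) from a series of daily closes."""
    closes = np.asarray(closes, dtype=float)
    return np.log(closes[1:] / closes[:-1])
```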
A decoder-only approach to image reconstruction, inspired by adversarial machine learning, implemented in Keras/TensorFlow 2
Autoregressive text generation application using a decoder transformer
Decoder-only transformer with the simplest character-level tokenization, training, and text generation.
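A minimal sketch of the character-level tokenization and autoregressive sampling loop the two entries above describe, assuming a model that returns per-position logits of shape (batch, seq, vocab):

```python
import torch

text = "to be or not to be"
chars = sorted(set(text))
stoi = {ch: i for i, ch in enumerate(chars)}   # char -> integer id
itos = {i: ch for ch, i in stoi.items()}       # integer id -> char

def encode(s):
    return [stoi[c] for c in s]

def decode(ids):
    return "".join(itos[i] for i in ids)

@torch.no_grad()
def generate(model, ids, max_new_tokens):
    """Autoregressive sampling: predict the next token, append, repeat."""
    for _ in range(max_new_tokens):
        logits = model(ids)[:, -1, :]                      # last position only
        probs = torch.softmax(logits, dim=-1)
        next_id = torch.multinomial(probs, num_samples=1)  # sample one token
        ids = torch.cat([ids, next_id], dim=1)             # extend the context
    return ids
```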
Decoder-only transformer model for answering short questions using causal self-attention.
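A minimal sketch of causal self-attention, masking out future positions with a lower-triangular mask; this is a generic implementation, not the repo's code:

```python
import torch
import torch.nn.functional as F

def causal_self_attention(q, k, v):
    """Scaled dot-product attention that blocks attention to future positions.

    q, k, v: (batch, heads, seq, head_dim)
    """
    scores = q @ k.transpose(-2, -1) / q.size(-1) ** 0.5   # (B, H, T, T)
    t = scores.size(-1)
    mask = torch.tril(torch.ones(t, t, dtype=torch.bool, device=q.device))
    scores = scores.masked_fill(~mask, float("-inf"))      # hide the future
    return F.softmax(scores, dim=-1) @ v
```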