JXRepo/Transformer-Machine-Translation

Transformer Machine Translation

This project implements a simple encoder–decoder Transformer in PyTorch for sequence-to-sequence language translation.

Features

• Encoder–decoder Transformer with multi-head attention.

• Positional embeddings and masking (padding + autoregressive).

• Synthetic toy dataset built from digits and letters.

• Training loop and greedy-decoding inference.
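The two masks listed above serve different purposes: the padding mask hides filler tokens in variable-length batches, while the autoregressive (causal) mask stops each decoder position from attending to future tokens. A minimal sketch of both, using the boolean-mask convention of torch.nn.Transformer (True = position blocked); the PAD id of 0 is an assumption, not necessarily what main.py uses:

```python
import torch

PAD = 0  # assumed padding token id; main.py may use a different value

def padding_mask(seq: torch.Tensor) -> torch.Tensor:
    # True where the position is padding and should be ignored by attention
    return seq == PAD

def causal_mask(size: int) -> torch.Tensor:
    # Strictly upper-triangular True entries block attention to future
    # positions, so position i can only see positions 0..i
    return torch.triu(torch.ones(size, size, dtype=torch.bool), diagonal=1)

# Example: a length-4 sequence padded with two PAD tokens
print(padding_mask(torch.tensor([5, 7, 0, 0])))  # → [False, False, True, True]
print(causal_mask(3))
```

In PyTorch these would typically be passed to the Transformer as `src_key_padding_mask` and `tgt_mask` respectively.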

Usage

  1. Install dependencies: pip install torch numpy

  2. Train the model: python main.py

  3. Make predictions: inference runs in main.py after training completes.
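Greedy decoding, as used for prediction, repeatedly feeds the tokens generated so far back into the model and picks the single most likely next token. A minimal sketch, assuming a callable `model(src, tgt)` that returns unbatched logits of shape (tgt_len, vocab_size); the function and token-id names here are illustrative, not the exact interface of main.py:

```python
import torch

def greedy_decode(model, src, bos_id, eos_id, max_len=20):
    """Greedily decode one source sequence with an encoder-decoder model.

    `model(src, tgt)` is assumed to return logits of shape
    (tgt_len, vocab_size) for a single, unbatched sequence pair.
    """
    tgt = torch.tensor([bos_id])            # start from the BOS token
    for _ in range(max_len):
        logits = model(src, tgt)            # (tgt_len, vocab_size)
        next_id = int(logits[-1].argmax())  # most likely next token
        tgt = torch.cat([tgt, torch.tensor([next_id])])
        if next_id == eos_id:               # stop once EOS is emitted
            break
    return tgt.tolist()
```

Because it always takes the argmax, greedy decoding is deterministic and fast, but it can miss higher-probability sequences that beam search would find.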

Future Improvements

• Support larger vocabularies and sequences.

• Experiment with more Transformer layers and heads.

• Integrate with real datasets for practical tasks.
