prasbb/Transformer-PyTorch-Implementation

🧠 Transformer Architecture from Scratch (PyTorch)

This project is a clean, educational implementation of the Transformer architecture (as introduced in *Attention Is All You Need*), built entirely in PyTorch.
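The core operation of the paper is scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√dₖ)V. The repository's own code is not reproduced here, but a minimal sketch of that operation (function name and shapes are illustrative, not taken from the repo) looks like:

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    # scores: (batch, seq_q, seq_k), scaled by sqrt(d_k) for stable gradients
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5
    if mask is not None:
        # positions where mask == 0 get -inf so softmax assigns them zero weight
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)
    return weights @ v

q = k = v = torch.randn(2, 5, 64)  # (batch, seq_len, d_k)
out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # torch.Size([2, 5, 64])
```

The same function serves the decoder's masked self-attention: passing a lower-triangular mask prevents each position from attending to later positions.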


Architecture

The implementation follows the classic Transformer design:

  • Input Embedding + Positional Encoding
  • N × Encoder Layers
    • Multi-head Self-Attention
    • Feed-Forward Network
    • Layer Normalization + Residuals
  • N × Decoder Layers
    • Masked Multi-head Self-Attention
    • Encoder-Decoder Attention
    • Feed-Forward Network
  • Final Linear + Softmax Layer
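One encoder layer from the list above (self-attention, feed-forward network, residual connections with layer normalization) can be sketched as follows. This is an illustrative stand-in, not the repo's code: it uses PyTorch's built-in `nn.MultiheadAttention` rather than a from-scratch attention module, and the hyperparameter defaults (`d_model=512`, `n_heads=8`, `d_ff=2048`) follow the base model in the paper.

```python
import torch
import torch.nn as nn

class EncoderLayer(nn.Module):
    """One Transformer encoder block: self-attention -> add & norm -> FFN -> add & norm."""
    def __init__(self, d_model=512, n_heads=8, d_ff=2048, dropout=0.1):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads,
                                          dropout=dropout, batch_first=True)
        self.ffn = nn.Sequential(
            nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model))
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.drop = nn.Dropout(dropout)

    def forward(self, x):
        a, _ = self.attn(x, x, x)            # multi-head self-attention
        x = self.norm1(x + self.drop(a))     # residual + layer norm
        x = self.norm2(x + self.drop(self.ffn(x)))
        return x

x = torch.randn(2, 10, 512)  # (batch, seq_len, d_model)
print(EncoderLayer()(x).shape)  # torch.Size([2, 10, 512])
```

A decoder layer adds a masked self-attention sublayer and an encoder-decoder (cross) attention sublayer with the same residual-plus-norm wiring, and the full model stacks N copies of each.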

🛠️ Tech Stack

  • Language: Python
  • Framework: PyTorch
