An LLM cookbook for building your own model from scratch, all the way from gathering data to training
This repository features a custom-built decoder-only language model (LLM) with a total of 37 million parameters 🔥, trained to ask questions from a given context
Experimental project for AI and NLP based on the Transformer architecture
Implementation of the GPT-3 paper: Language Models are Few-Shot Learners
Transformers Intuition
Generate captions for images using a CNN encoder-LSTM decoder structure
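The captioning entry above pairs a convolutional image encoder with a recurrent text decoder. Below is a minimal, hypothetical PyTorch sketch of that structure; the class name, dimensions, and the choice of ResNet-18 as the backbone are illustrative assumptions, not taken from the repository.

```python
# Hypothetical minimal sketch of a CNN-encoder / LSTM-decoder captioner.
import torch
import torch.nn as nn
import torchvision.models as models


class CaptionModel(nn.Module):
    def __init__(self, vocab_size, embed_dim=256, hidden_dim=512):
        super().__init__()
        # Pretrained CNN backbone as the image encoder; the final fc layer is
        # dropped so the network emits pooled features instead of class logits.
        resnet = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        self.encoder = nn.Sequential(*list(resnet.children())[:-1])
        self.img_proj = nn.Linear(resnet.fc.in_features, embed_dim)
        # LSTM decoder conditioned on the image feature as the first "token".
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, images, captions):
        # images: (B, 3, H, W); captions: (B, T) token ids
        feats = self.encoder(images).flatten(1)      # (B, 512) pooled features
        feats = self.img_proj(feats).unsqueeze(1)    # (B, 1, E)
        tokens = self.embed(captions)                # (B, T, E)
        seq = torch.cat([feats, tokens], dim=1)      # prepend image feature
        hidden, _ = self.lstm(seq)
        return self.out(hidden)                      # (B, T+1, vocab) logits
```

At inference time the same decoder would be run autoregressively: feed the image feature, sample a token, feed it back in, and repeat until an end-of-sequence token.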
Generative AI fine-tuning and inference for sequence classification tasks
An explainable and simplified version of the OLMo model
DNA sequence generation and classification using Transformers
This project aims to simplify texts from research papers using advanced natural language processing (NLP) techniques, making them more accessible to a broader audience
An LLM-based tool for generating cheese advertisements
Using LLMs from Hugging Face for sentiment analysis, translation, summarization, and extractive question answering
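As a quick illustration of the workflow in the entry above, here is a minimal sketch using the Hugging Face transformers `pipeline` API for those four tasks. The example strings are placeholders, and each pipeline downloads a default model on first use.

```python
from transformers import pipeline

# Sentiment analysis: returns a label (POSITIVE/NEGATIVE) and a score.
sentiment = pipeline("sentiment-analysis")
print(sentiment("Decoder-only models are remarkably capable."))

# Summarization: condenses a long passage into a shorter one.
summarizer = pipeline("summarization")
print(summarizer("Long article text goes here ...", max_length=60, min_length=10))

# Translation (English to French with the default model).
translator = pipeline("translation_en_to_fr")
print(translator("The decoder predicts the next token."))

# Extractive question answering: the answer is a span of the given context.
qa = pipeline("question-answering")
print(qa(question="What does the decoder predict?",
         context="The decoder predicts the next token given the context."))
```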
Coding a Decoder-Only Transformer Like ChatGPT from Scratch
A multimodal vision model that takes in an image and a prompt query, and outputs the answer
A miniGPT inspired by Andrej Karpathy's original nanoGPT. This is a notebook that walks through the decoder part of the Transformer architecture, with the details outlined.
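The two "from scratch" decoder notebooks above both revolve around the same core mechanism: causal (masked) self-attention, in which each position may attend only to earlier positions. A minimal PyTorch sketch of that layer, with arbitrary illustrative hyperparameters, looks roughly like this:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CausalSelfAttention(nn.Module):
    def __init__(self, embed_dim=128, n_heads=4, max_len=256):
        super().__init__()
        assert embed_dim % n_heads == 0
        self.n_heads = n_heads
        self.qkv = nn.Linear(embed_dim, 3 * embed_dim)
        self.proj = nn.Linear(embed_dim, embed_dim)
        # Lower-triangular mask: position t may attend only to positions <= t.
        mask = torch.tril(torch.ones(max_len, max_len)).view(1, 1, max_len, max_len)
        self.register_buffer("mask", mask)

    def forward(self, x):
        B, T, C = x.shape
        q, k, v = self.qkv(x).split(C, dim=2)
        # Reshape into (B, heads, T, head_dim) for multi-head attention.
        q = q.view(B, T, self.n_heads, C // self.n_heads).transpose(1, 2)
        k = k.view(B, T, self.n_heads, C // self.n_heads).transpose(1, 2)
        v = v.view(B, T, self.n_heads, C // self.n_heads).transpose(1, 2)
        # Scaled dot-product attention with the future masked out.
        att = (q @ k.transpose(-2, -1)) / (k.size(-1) ** 0.5)
        att = att.masked_fill(self.mask[:, :, :T, :T] == 0, float("-inf"))
        att = F.softmax(att, dim=-1)
        y = (att @ v).transpose(1, 2).contiguous().view(B, T, C)
        return self.proj(y)
```

A full decoder block would wrap this layer with layer normalization, a feed-forward sublayer, and residual connections, then stack several such blocks under a token-plus-position embedding.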
Decoder model for language modelling
Custom decoder Transformer that treats a patient's medical journey like a story told through diagnosis codes instead of words.
On the Design and Performance of Machine Learning Based Error Correcting Decoders
Text Generation using RNN, LSTM, and Transformer