A repository of low-level AI models, including a language model, classic machine learning algorithms, and neural networks.
This repository contains several AI projects implemented from scratch to explore their fundamental principles.
This project is a Python-based implementation of a classic N-Gram language model. It is designed to predict the next word in a sequence by calculating the probability of a word given the previous n-1 words.
The model builds Unigram, Bigram, and Trigram probability matrices from a text corpus and uses these probabilities to generate new text. It also includes functions to evaluate model performance using perplexity. The project was built with Python and NumPy to demonstrate the fundamentals of probabilistic language modeling.
For more details, see the project's directory: ngram_language_models/.
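The core idea behind the model can be sketched in a few lines. The following is a minimal, self-contained illustration (not the repository's actual API; all function names are hypothetical) of training bigram counts, estimating smoothed conditional probabilities, and computing perplexity:

```python
# Minimal bigram language model sketch (hypothetical names, add-k smoothing).
from collections import Counter
import math

def train_bigram(corpus):
    """Count unigrams and bigrams over tokenized sentences with boundary markers."""
    unigrams, bigrams = Counter(), Counter()
    for sent in corpus:
        tokens = ["<s>"] + sent + ["</s>"]
        unigrams.update(tokens)
        bigrams.update(zip(tokens[:-1], tokens[1:]))
    return unigrams, bigrams

def bigram_prob(w_prev, w, unigrams, bigrams, k=1.0):
    """P(w | w_prev) with add-k smoothing so unseen bigrams get nonzero mass."""
    vocab_size = len(unigrams)
    return (bigrams[(w_prev, w)] + k) / (unigrams[w_prev] + k * vocab_size)

def perplexity(sentence, unigrams, bigrams, k=1.0):
    """Perplexity = exp of the negative mean log-probability of the bigrams."""
    tokens = ["<s>"] + sentence + ["</s>"]
    log_p = sum(math.log(bigram_prob(a, b, unigrams, bigrams, k))
                for a, b in zip(tokens[:-1], tokens[1:]))
    return math.exp(-log_p / (len(tokens) - 1))

corpus = [["the", "cat", "sat"], ["the", "dog", "sat"]]
uni, bi = train_bigram(corpus)
```

Lower perplexity on held-out text indicates a better model; the trigram case is identical except that counts are keyed on the previous two words.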
(Work in progress. This section will contain classic machine learning models.)
(Work in progress. This section will feature implementations of various neural network architectures.)