
From scratch NumPy implementation of a 2-layer neural network trained on the MNIST handwritten digit dataset, with manual forward/backward propagation and gradient descent.


MNIST-Neural-Network

Building a neural network using only NumPy to better understand the fundamentals of machine learning.

🚀 Features

Uses the MNIST dataset for both training and testing.
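Before training, MNIST images are typically flattened to 784-element vectors (matching the input layer below) and pixel values scaled to [0, 1]. A minimal sketch of that preparation, using dummy arrays in place of the real dataset (the repo's actual loading code may differ):

```python
import numpy as np

# Dummy stand-ins for MNIST images/labels (real data would be loaded from disk)
images = np.random.randint(0, 256, size=(100, 28, 28))
labels = np.random.randint(0, 10, size=100)

# Flatten each 28x28 image to a 784-vector and scale pixels to [0, 1]
X = images.reshape(images.shape[0], -1).astype(np.float64) / 255.0

# One-hot encode labels for the 10-class softmax output
Y = np.eye(10)[labels]

print(X.shape, Y.shape)  # (100, 784) (100, 10)
```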

Implements:

- Weight and bias initialization
- ReLU activation
- Softmax output
- Forward propagation
- Backward propagation (manual gradient computation)
- Gradient descent optimizer
- Accuracy calculation
- Training on MNIST and testing on a separate dataset
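The pieces above fit together roughly as follows. This is a minimal sketch of one gradient descent step on dummy data, not the repo's exact code; the initialization scale, learning rate, and data layout (samples as columns) are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Weight and bias initialization (small random weights; scheme assumed) ---
W1 = rng.standard_normal((10, 784)) * 0.01  # hidden layer: 784 -> 10
b1 = np.zeros((10, 1))
W2 = rng.standard_normal((10, 10)) * 0.01   # output layer: 10 -> 10
b2 = np.zeros((10, 1))

def relu(z):
    return np.maximum(z, 0)

def softmax(z):
    e = np.exp(z - z.max(axis=0, keepdims=True))  # shift for numerical stability
    return e / e.sum(axis=0, keepdims=True)

def forward(X):
    Z1 = W1 @ X + b1
    A1 = relu(Z1)
    Z2 = W2 @ A1 + b2
    A2 = softmax(Z2)
    return Z1, A1, A2

def backward(X, Y, Z1, A1, A2):
    m = X.shape[1]
    dZ2 = A2 - Y                       # softmax + cross-entropy gradient
    dW2 = dZ2 @ A1.T / m
    db2 = dZ2.sum(axis=1, keepdims=True) / m
    dZ1 = (W2.T @ dZ2) * (Z1 > 0)      # ReLU derivative is the 0/1 mask
    dW1 = dZ1 @ X.T / m
    db1 = dZ1.sum(axis=1, keepdims=True) / m
    return dW1, db1, dW2, db2

# --- One gradient descent step on dummy data (columns are samples) ---
X = rng.random((784, 32))
Y = np.eye(10)[rng.integers(0, 10, 32)].T
Z1, A1, A2 = forward(X)
dW1, db1, dW2, db2 = backward(X, Y, Z1, A1, A2)
lr = 0.1  # learning rate (assumed)
W1 -= lr * dW1; b1 -= lr * db1
W2 -= lr * dW2; b2 -= lr * db2

# Accuracy: predicted class is the argmax of the softmax output
preds = np.argmax(A2, axis=0)
acc = np.mean(preds == np.argmax(Y, axis=0))
```

In a full training loop, the forward/backward/update steps repeat over many iterations on the real MNIST arrays instead of a single pass on random data.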

📊 Network Architecture

- Input layer: 784 neurons (28x28 pixels, flattened)
- Hidden layer: 10 neurons, ReLU activation
- Output layer: 10 neurons, softmax activation
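This 784 → 10 → 10 architecture fixes the shapes (and total count) of the trainable parameters. A quick check of what those shapes imply:

```python
import numpy as np

# Layer sizes from the architecture above
layers = [784, 10, 10]

# Weight/bias shapes implied by 784 -> 10 -> 10
shapes = {
    "W1": (layers[1], layers[0]), "b1": (layers[1], 1),
    "W2": (layers[2], layers[1]), "b2": (layers[2], 1),
}

# Total trainable parameters: 7840 + 10 + 100 + 10
total = sum(int(np.prod(s)) for s in shapes.values())
print(total)  # 7960
```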
