Building a neural network using only NumPy to better understand the fundamentals of machine learning.
## 🚀 Features

- Uses the MNIST dataset for training and testing
- Weight and bias initialization
- ReLU activation
- Softmax output
- Forward propagation
- Backward propagation (manual gradient computation)
- Gradient descent optimizer
- Accuracy calculation
- Trains on MNIST and tests on a separate dataset
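The core training loop can be sketched as follows. This is a minimal, illustrative example of manual gradient computation and a gradient descent update for a single softmax layer with cross-entropy loss; the names (`train_step`, `one_hot`, `lr`) are assumptions for illustration, not necessarily the repo's own identifiers.

```python
import numpy as np

def one_hot(y, n_classes=10):
    # Convert integer labels to a (n_classes, m) one-hot matrix.
    Y = np.zeros((n_classes, y.size))
    Y[y, np.arange(y.size)] = 1.0
    return Y

def softmax(z):
    # Shift by the column max for numerical stability before exponentiating.
    e = np.exp(z - z.max(axis=0, keepdims=True))
    return e / e.sum(axis=0, keepdims=True)

def train_step(W, b, X, Y, lr=0.1):
    m = X.shape[1]
    A = softmax(W @ X + b)              # forward pass
    dZ = A - Y                          # gradient of softmax + cross-entropy
    dW = dZ @ X.T / m                   # manual gradient computation
    db = dZ.sum(axis=1, keepdims=True) / m
    W -= lr * dW                        # gradient descent update
    b -= lr * db
    return W, b

def accuracy(W, b, X, y):
    # Fraction of samples whose argmax prediction matches the label.
    preds = np.argmax(softmax(W @ X + b), axis=0)
    return (preds == y).mean()
```

In practice the same pattern extends to the hidden layer: the gradient flowing out of `dZ` is propagated backward through the ReLU mask before updating the earlier weights.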
## 📊 Network Architecture
- Input layer: 784 neurons (28×28 pixels, flattened)
- Hidden layer: 10 neurons, ReLU activation
- Output layer: 10 neurons, Softmax activation
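The architecture above can be expressed as a short NumPy sketch. This is a hypothetical initialization and forward pass, assuming column-vector inputs of shape `(784, m)`; the function names and the `0.01` initialization scale are illustrative choices, not confirmed details of the repo.

```python
import numpy as np

def init_params(rng=np.random.default_rng(0)):
    # Small random weights, zero biases, for a 784 -> 10 -> 10 network.
    W1 = rng.standard_normal((10, 784)) * 0.01  # hidden layer weights
    b1 = np.zeros((10, 1))
    W2 = rng.standard_normal((10, 10)) * 0.01   # output layer weights
    b2 = np.zeros((10, 1))
    return W1, b1, W2, b2

def relu(z):
    return np.maximum(0, z)

def softmax(z):
    # Subtract the column max before exp() for numerical stability.
    e = np.exp(z - z.max(axis=0, keepdims=True))
    return e / e.sum(axis=0, keepdims=True)

def forward(X, W1, b1, W2, b2):
    Z1 = W1 @ X + b1       # hidden pre-activation, shape (10, m)
    A1 = relu(Z1)          # ReLU activation
    Z2 = W2 @ A1 + b2      # output pre-activation, shape (10, m)
    A2 = softmax(Z2)       # per-column class probabilities
    return Z1, A1, Z2, A2
```

Each column of `A2` is a probability distribution over the ten digit classes, so predictions are simply `np.argmax(A2, axis=0)`.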