This repository contains 10 core Artificial Neural Network (ANN) experiments performed as part of the 3rd-year AIDS (Artificial Intelligence & Data Science) curriculum. The goal is to provide a structured, beginner-friendly reference for students who want to understand and implement foundational ANN concepts using Python and relevant libraries.
Each experiment folder contains:
- Source code (with comments)
- Sample output screenshots (if available)
- Brief explanations and instructions to run the code
The experiments cover a range of ANN topics, including perceptrons, backpropagation, activation functions, and practical model training on real datasets:
- Perceptron Algorithm Implementation
- Backpropagation Algorithm
- McCulloch-Pitts Neuron Model
- AND, OR, XOR Gate using Neural Networks
- Activation Functions (Sigmoid, Tanh, ReLU)
- Gradient Descent Implementation
- Feedforward Neural Network
- ANN for Classification (Iris Dataset)
- ANN for Regression (Custom Dataset)
- MNIST Digit Recognition using ANN
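As a taste of the first experiment, the classic perceptron learning rule can be sketched in a few lines of NumPy. This is an illustrative minimal version (function and variable names are our own, not taken from the repository's notebooks), trained here on the linearly separable AND gate:

```python
import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=20):
    """Train a single-layer perceptron with the classic update rule:
    w <- w + lr * (target - prediction) * x."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):
            pred = 1 if np.dot(w, xi) + b > 0 else 0  # step activation
            err = target - pred
            w += lr * err * xi
            b += lr * err
    return w, b

# AND gate: linearly separable, so the perceptron converges
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, y)
preds = [1 if np.dot(w, xi) + b > 0 else 0 for xi in X]
print(preds)  # [0, 0, 0, 1]
```

The same loop fails to converge on XOR, which is not linearly separable; that is exactly the motivation for the multi-layer networks covered in the later experiments.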
- Clone the repository:
  `git clone https://github.com/Akshint0407/ANN-Lab-Experiments.git`
- Make sure Jupyter Notebook is installed. If not, install it with:
  `pip install notebook`
- Launch Jupyter Notebook:
  `jupyter notebook`
- Navigate to the experiment folder and open the `.ipynb` file of your choice.
Ensure that the required libraries (NumPy, Pandas, Matplotlib, Scikit-learn, and TensorFlow/Keras) are installed in your environment.
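If any of these are missing, they can all be installed in one step (assuming a pip-based environment; the repository does not appear to ship a requirements.txt, so the package list below is inferred from the libraries named above):

```shell
# Installs all libraries used across the experiments;
# the tensorflow package bundles Keras as tf.keras.
pip install numpy pandas matplotlib scikit-learn tensorflow
```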
- 3rd Year AIDS Students
- Beginners in Neural Networks
- Anyone exploring foundational AI/ML concepts
If you have improvements, bug fixes, or additional experiment ideas, feel free to fork this repo and open a pull request!
This project is licensed under the MIT License.
Feel free to share with your classmates and juniors. Happy Learning!