A compilation of code and resources I used in CS 449: Computational Learning.
PerceptronBasic.ipynb: Perceptron algorithm implemented from scratch on linearly separable data, using a margin of error instead of a learning rate and accuracy limit. Outputs the weight vector, margin, and update errors for each weight adjustment.
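A minimal sketch of a margin-based perceptron update, assuming labels in {-1, +1}; the function name, `gamma`, and epoch cap are illustrative, not taken from the notebook:

```python
import numpy as np

# Margin-based perceptron: update whenever a point is misclassified
# or lies inside the margin gamma (illustrative defaults).
def train_perceptron_margin(X, y, gamma=0.1, max_epochs=100):
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(max_epochs):
        updated = False
        for xi, yi in zip(X, y):          # yi assumed in {-1, +1}
            if yi * (xi @ w + b) <= gamma:
                w += yi * xi
                b += yi
                updated = True
        if not updated:                   # every point now clears the margin
            break
    return w, b
```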
PerceptronLR.ipynb: Perceptron algorithm implemented from scratch on non-linear data, using a learning rate, forward inference, and backpropagation.
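A minimal sketch of the learning-rate variant, with a forward pass followed by a gradient (backward) step; the sigmoid activation and squared-error loss here are assumptions for illustration, not necessarily what the notebook uses:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Single-unit "perceptron" trained with a learning rate and gradient steps
# (illustrative sketch; hyperparameters are placeholders).
def train_perceptron_lr(X, y, lr=0.1, epochs=500, seed=0):
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.1, size=X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):                       # yi assumed in {0, 1}
            y_hat = sigmoid(xi @ w + b)                # forward inference
            grad = (y_hat - yi) * y_hat * (1 - y_hat)  # backward pass, squared error
            w -= lr * grad * xi
            b -= lr * grad
    return w, b
```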
NN_with_tanh().ipynb: A neural network boolean classifier with one hidden layer, implemented from scratch, reaching 95% accuracy on the training data with 22 misclassifications out of 100 test examples.
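A minimal sketch of a one-hidden-layer tanh network trained by backpropagation; the XOR data, layer width, learning rate, and iteration count are illustrative assumptions rather than the notebook's settings:

```python
import numpy as np

# From-scratch 1-hidden-layer tanh network on XOR (illustrative setup).
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(scale=1.0, size=(2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(scale=1.0, size=(4, 1)); b2 = np.zeros((1, 1))
lr = 0.5

for _ in range(10000):
    # Forward pass: tanh hidden layer, sigmoid output.
    h = np.tanh(X @ W1 + b1)
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

    # Backward pass: cross-entropy + sigmoid gives gradient (out - y) at the output.
    d_out = (out - y) / len(X)
    d_h = (d_out @ W2.T) * (1.0 - h ** 2)        # tanh'(z) = 1 - tanh(z)^2

    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0, keepdims=True)

print((out > 0.5).astype(int).ravel())           # typically learns XOR: [0 1 1 0]
```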
ApplyingAdaboost.ipynb: Trained three weak learners (decision tree, logistic regression, and perceptron) on the MNIST digits dataset with AdaBoost and compared training error across boosting rounds.
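For comparison, a minimal sketch using scikit-learn's AdaBoostClassifier with a decision-stump weak learner on the scikit-learn digits dataset; this shows only one of the three weak learners, the split and round counts are illustrative, and the `estimator` keyword is named `base_estimator` in scikit-learn versions before 1.2:

```python
# AdaBoost with a decision-stump weak learner on the digits dataset (sketch).
from sklearn.datasets import load_digits
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

boost = AdaBoostClassifier(
    estimator=DecisionTreeClassifier(max_depth=1),  # weak learner: decision stump
    n_estimators=50,                                # boosting rounds
)
boost.fit(X_tr, y_tr)
print("training error:", 1 - boost.score(X_tr, y_tr))
print("test error:    ", 1 - boost.score(X_te, y_te))
```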
Collaborated with Sophia Culver (culversa@clarkson.edu)