
Implementing Backpropagation in My Deep Learning Framework

Built my own deep learning framework (in the spirit of TensorFlow/PyTorch) to learn the basics better.

Implemented backpropagation and calculated the gradients by hand. After training, the results are comparable to PyTorch and TensorFlow.

  • Supports ReLU and sigmoid activation functions
  • Supports binary cross-entropy and mean squared error loss functions
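To illustrate the kind of hand-derived gradients involved, here is a minimal NumPy sketch of manual backpropagation through a one-hidden-layer network with ReLU and sigmoid activations and a mean squared error loss. All names and shapes are illustrative assumptions, not code from this repository:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))             # batch of 8 samples, 3 features
y = rng.uniform(size=(8, 1))            # targets in [0, 1]

W1 = rng.normal(scale=0.5, size=(3, 4))
b1 = np.zeros((1, 4))
W2 = rng.normal(scale=0.5, size=(4, 1))
b2 = np.zeros((1, 1))

# Forward pass
z1 = X @ W1 + b1
a1 = relu(z1)
z2 = a1 @ W2 + b2
y_hat = sigmoid(z2)

# Mean squared error loss
loss = np.mean((y_hat - y) ** 2)

# Backward pass: apply the chain rule layer by layer
n = X.shape[0]
dy_hat = 2.0 * (y_hat - y) / n          # dL/dy_hat
dz2 = dy_hat * y_hat * (1.0 - y_hat)    # sigmoid'(z2) = y_hat * (1 - y_hat)
dW2 = a1.T @ dz2
db2 = dz2.sum(axis=0, keepdims=True)
da1 = dz2 @ W2.T
dz1 = da1 * (z1 > 0)                    # ReLU'(z1) is 1 where z1 > 0, else 0
dW1 = X.T @ dz1
db1 = dz1.sum(axis=0, keepdims=True)

# One plain SGD step
lr = 0.05
W1 -= lr * dW1; b1 -= lr * db1
W2 -= lr * dW2; b2 -= lr * db2
```

Frameworks automate exactly this chain-rule bookkeeping; doing it by hand once makes the automated version much easier to reason about.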

Bonus

  • Implemented dynamic learning rate adjustment
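The README does not say which adjustment scheme is used, so as one common possibility, here is a small sketch of reduce-on-plateau scheduling: the learning rate is halved whenever the loss fails to improve for a few epochs. The class and parameter names are hypothetical:

```python
class PlateauScheduler:
    """Halve the learning rate when the loss stops improving (illustrative sketch)."""

    def __init__(self, lr=0.1, factor=0.5, patience=3, min_lr=1e-5):
        self.lr = lr
        self.factor = factor        # multiplier applied on a plateau
        self.patience = patience    # epochs without improvement before reducing
        self.min_lr = min_lr        # floor for the learning rate
        self.best = float("inf")
        self.bad_epochs = 0

    def step(self, loss):
        """Record the latest epoch loss and return the (possibly reduced) lr."""
        if loss < self.best:
            self.best = loss
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
            if self.bad_epochs >= self.patience:
                self.lr = max(self.lr * self.factor, self.min_lr)
                self.bad_epochs = 0
        return self.lr

sched = PlateauScheduler(lr=0.1, patience=2)
losses = [1.0, 0.8, 0.8, 0.8, 0.7]
lrs = [sched.step(l) for l in losses]   # → [0.1, 0.1, 0.1, 0.05, 0.05]
```

Other equally common choices are step decay (multiply the rate every fixed number of epochs) or cosine annealing; the plateau version has the nice property of reacting to the training curve itself.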