We are going to build a neural network framework called LiFT. LiFT stands for LIsp-Flavoured Tensor, following the naming trend set by LFE. As Phil Karlton pointed out, there are only two hard things in Computer Science: cache invalidation and naming things. LFE has made naming a lot easier: if you are working on something inspired by X and Y and cannot think of a good name, just call it X-Flavoured Y, problem solved.
Throughout this tutorial we show that, by adopting the notion of function rank from J and piggybacking on an integer set library, a neural network framework can be built with much less effort.
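To give a feel for what "rank" buys us, here is a minimal sketch of the J-style rank idea in plain Python with NumPy: a function that expects rank-k cells is mapped over the leading "frame" axes of a higher-dimensional array. The name `apply_rank` is purely illustrative and is not LiFT's actual API.

```python
# A minimal sketch of J-style "rank", assuming NumPy is available.
# `apply_rank` is a hypothetical name for illustration, not LiFT's API.
import numpy as np

def apply_rank(f, rank):
    """Lift f, which expects rank-`rank` cells, over any leading frame axes."""
    def wrapped(x):
        x = np.asarray(x)
        frame = x.shape[:x.ndim - rank]              # leading "frame" axes
        cells = x.reshape((-1,) + x.shape[x.ndim - rank:])
        out = [np.asarray(f(c)) for c in cells]      # apply f cell by cell
        return np.stack(out).reshape(frame + out[0].shape)
    return wrapped

# Applying `sum` at rank 1 to a 2x3 array sums each row independently:
row_sum = apply_rank(np.sum, 1)
print(row_sum(np.arange(6).reshape(2, 3)))           # -> [ 3 12]
```

In J this dispatch is built into every verb; sketching it explicitly shows why a single rank annotation can replace a lot of hand-written looping and broadcasting code.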
LiFT can serve as a good starting point for your own full-fledged neural network framework. And LiFT is not limited to neural networks: it can also be a good choice whenever you need tensors and gradient-based optimization. This tutorial covers:
- a simple example with autograd
- tensor and rank
- automatic differentiation
- shape checking
- backpropagation
- compilation with an integer set library
- array contraction
- convolution
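As a preview of the autograd and backpropagation topics above, here is a back-of-the-envelope reverse-mode automatic differentiation sketch for scalars. The `Var` class and its methods are illustrative stand-ins, not LiFT's actual interface.

```python
# A minimal scalar reverse-mode autograd sketch; class and method names
# are hypothetical previews of the ideas, not LiFT's actual API.
class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents            # pairs of (parent Var, local gradient)
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, ((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        return Var(self.value * other.value,
                   ((self, other.value), (other, self.value)))

    def backward(self, seed=1.0):
        # Accumulate the incoming gradient, then push it to parents
        # scaled by each edge's local derivative (the chain rule).
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

x = Var(2.0)
y = Var(3.0)
z = x * y + x                             # z = x*y + x
z.backward()
print(x.grad, y.grad)                     # dz/dx = y + 1 = 4.0, dz/dy = x = 2.0
```

The tensor version built later in the tutorial follows the same chain-rule bookkeeping, just with rank-aware operations instead of scalar arithmetic.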
If you have any questions, find any mistakes, or have any suggestions, feel free to open an issue or send a pull request.