eander2 edited this page Dec 27, 2017 · 2 revisions

Simple neural network implementation for testing missing data embedding

This code is intended to be a test platform for algorithms that can naturally handle missing data in neural networks. Currently only fully connected layers are supported; since initial testing is promising, convolutional layers will be added as well. All linear algebra is implemented with NumPy so that modifications to standard back-propagation remain clear.
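As a rough illustration of the kind of fully connected layer described above, here is a minimal NumPy sketch with explicit forward and backward passes; the class and method names are assumptions for illustration, not the repository's actual API:

```python
import numpy as np

class Dense:
    """Illustrative fully connected layer (not the repo's actual class).

    Keeps the linear algebra explicit so the back-propagation
    steps are easy to follow and modify.
    """
    def __init__(self, n_in, n_out, rng=None):
        rng = rng or np.random.default_rng(0)
        self.W = rng.normal(0.0, 1.0 / np.sqrt(n_in), (n_out, n_in))
        self.b = np.zeros(n_out)

    def forward(self, x):
        self.x = x                      # cache the input for backprop
        return self.W @ x + self.b

    def backward(self, grad_out):
        # Gradients of y = W x + b for a single example.
        self.dW = np.outer(grad_out, self.x)
        self.db = grad_out
        return self.W.T @ grad_out      # gradient w.r.t. the input

layer = Dense(3, 2)
y = layer.forward(np.ones(3))           # shape (2,)
g_in = layer.backward(np.ones(2))       # shape (3,)
```

Because everything is plain NumPy, swapping in a modified backward rule (e.g. one that accounts for missing inputs) only requires editing a few lines.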

The simple SGD optimizer implemented here is not vectorized, so keep that in mind when checking tensor sizes and how batches are aggregated during backprop. The missing-data handling that the input layer implements is not trivial to vectorize, so that will be implemented later.
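The non-vectorized batch aggregation described above can be sketched as follows; the function and parameter names here are hypothetical, chosen only to illustrate per-example gradients being averaged before a single SGD update:

```python
import numpy as np

def sgd_step(params, grads_per_example, lr=0.1):
    """Hypothetical non-vectorized SGD step.

    Each element of grads_per_example holds the gradients for one
    training example; they are averaged into a batch gradient before
    the parameters are updated in place.
    """
    for name in params:
        batch_grad = np.mean([g[name] for g in grads_per_example], axis=0)
        params[name] -= lr * batch_grad
    return params

params = {"W": np.ones((2, 3))}
grads = [{"W": np.full((2, 3), 1.0)},   # gradient from example 1
         {"W": np.full((2, 3), 3.0)}]   # gradient from example 2
sgd_step(params, grads, lr=0.5)         # mean grad 2.0: W -> 1 - 0.5*2 = 0
```

Looping over examples like this is slower than a vectorized update, but it makes the per-example tensor shapes explicit, which is the point of the test platform.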
