
SobolevPytorch

An implementation of a neural network training routine in PyTorch that uses available derivative information.

Original paper:

Czarnecki, W. M., Osindero, S., Jaderberg, M., Swirszcz, G., & Pascanu, R. (2017). Sobolev training for neural networks. In Advances in Neural Information Processing Systems (pp. 4278-4287).

With Sobolev training, the overall loss decreases more efficiently and the network gives better approximations of the target function's derivatives with respect to the inputs.
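The idea can be sketched in plain PyTorch as follows. This is a minimal illustrative example, not the repository's exact code: the network architecture, optimizer, and weighting factor lam are assumptions.

```python
import torch
import torch.nn as nn

# Small fully connected network; architecture and hyperparameters are assumed for illustration.
model = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
mse = nn.MSELoss()

def sobolev_loss(x, y_true, dy_true, lam=1.0):
    """Value loss plus lam times derivative loss (lam is an assumed weighting)."""
    x = x.clone().requires_grad_(True)  # track gradients with respect to the inputs
    y_pred = model(x)
    # dy_pred[i, j] = d y_pred[i] / d x[i, j]; create_graph=True so the
    # derivative mismatch itself can be backpropagated through.
    dy_pred, = torch.autograd.grad(
        y_pred, x, grad_outputs=torch.ones_like(y_pred), create_graph=True
    )
    return mse(y_pred, y_true) + lam * mse(dy_pred, dy_true)

# One optimization step on a batch (x, y_true, dy_true):
#   optimizer.zero_grad()
#   sobolev_loss(x, y_true, dy_true).backward()
#   optimizer.step()
```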

Tested on

  • Python 3.8
  • NumPy 1.19.4
  • PyTorch 1.7.0
  • Matplotlib 3.1.2

Example

Test on Franke's function
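Franke's function is a standard bivariate benchmark. Below is a minimal sketch of how value and derivative targets for Sobolev training could be generated from it with autograd; the 10 x 10 grid over the unit square is an assumption made for illustration.

```python
import torch

def franke(xy):
    """Standard bivariate Franke function evaluated at points xy of shape (N, 2)."""
    x, y = xy[:, 0], xy[:, 1]
    return (0.75 * torch.exp(-((9 * x - 2) ** 2 + (9 * y - 2) ** 2) / 4)
            + 0.75 * torch.exp(-(9 * x + 1) ** 2 / 49 - (9 * y + 1) / 10)
            + 0.5 * torch.exp(-((9 * x - 7) ** 2 + (9 * y - 3) ** 2) / 4)
            - 0.2 * torch.exp(-(9 * x - 4) ** 2 - (9 * y - 7) ** 2))

# 100 equidistant training points, assumed here to be a 10 x 10 grid over [0, 1]^2.
grid = torch.linspace(0.0, 1.0, 10)
xy_train = torch.cartesian_prod(grid, grid).requires_grad_(True)    # (100, 2)

values = franke(xy_train)                                           # (100,) value targets
derivatives, = torch.autograd.grad(values.sum(), xy_train)          # (100, 2) derivative targets
```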

Training on 100 equidistant points between 0 and 1 yields the following convergence behavior:

[Figure: normalized convergence plot]

Testing on 1600 points in the parameter space yields the following visualization, which is in good agreement with the target.

[Figure: predicted values on the test points compared with the target]
