hsiaocy/hw2
Lecture 11040A

Homework 2

Given a 1-M-1 multi-layer perceptron (MLP) whose input and output are denoted by x(t) and y(t), t = 1, 2, ..., N, and whose node functions are f1(x) = tanh(x) for the hidden layer and f2(x) = x for the output layer.

  • Suppose there is a teacher signal d(t), t = 1, 2, ..., N, corresponding to the input x(t). Please derive a backpropagation (BP) algorithm when the cost function is defined by:

    E = (1/2) * sum_{t=1}^{N} (d(t) - y(t))^2

  • Train the MLP using the BP algorithm in the case of M = 2, N = 1, x(1) = 0.8, d(1) = 0.72, and a learning rate of 0.3.
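A sketch of the BP derivation for this 1-M-1 network, assuming the sum-of-squared-errors cost E = (1/2) Σ_t (d(t) − y(t))² and writing w_j for the input-to-hidden weights and v_j for the hidden-to-output weights (these symbol names are assumptions, not from the assignment):

```latex
% h_j(t): hidden-unit output; y(t): network output (linear output node)
\begin{align*}
h_j(t) &= f_1\bigl(w_j x(t)\bigr) = \tanh\bigl(w_j x(t)\bigr), \\
y(t)   &= f_2\Bigl(\sum_{j=1}^{M} v_j h_j(t)\Bigr) = \sum_{j=1}^{M} v_j h_j(t), \\
% chain rule through the linear output node (f_2' = 1)
\frac{\partial E}{\partial v_j} &= -\sum_{t=1}^{N} \bigl(d(t)-y(t)\bigr)\, h_j(t), \\
% chain rule through the hidden node, using \tanh' = 1 - \tanh^2
\frac{\partial E}{\partial w_j} &= -\sum_{t=1}^{N} \bigl(d(t)-y(t)\bigr)\, v_j \bigl(1-h_j(t)^2\bigr)\, x(t), \\
% gradient-descent updates with learning rate \eta
v_j &\leftarrow v_j - \eta\,\frac{\partial E}{\partial v_j}, \qquad
w_j \leftarrow w_j - \eta\,\frac{\partial E}{\partial w_j}.
\end{align*}
```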
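The training task above can be sketched in code. This is a minimal sketch, assuming the squared-error cost E = (1/2)(d − y)², small random initial weights (the assignment does not specify initial values), and bias terms on each node; f1 = tanh in the hidden layer and f2 = identity at the output, as stated.

```python
import math
import random

random.seed(0)  # reproducible initial weights (an assumption, not specified)

M = 2              # hidden units
eta = 0.3          # learning rate
x, d = 0.8, 0.72   # the single training pattern (N = 1)

# small random initial weights and biases
w1 = [random.uniform(-0.5, 0.5) for _ in range(M)]  # input -> hidden weights
b1 = [random.uniform(-0.5, 0.5) for _ in range(M)]  # hidden biases
w2 = [random.uniform(-0.5, 0.5) for _ in range(M)]  # hidden -> output weights
b2 = random.uniform(-0.5, 0.5)                      # output bias

for epoch in range(1000):
    # forward pass: f1 = tanh, f2 = identity
    h = [math.tanh(w1[j] * x + b1[j]) for j in range(M)]
    y = sum(w2[j] * h[j] for j in range(M)) + b2

    # backward pass: output-node delta is y - d because f2'(net) = 1
    delta_out = y - d
    for j in range(M):
        # hidden-node delta uses tanh'(net) = 1 - tanh(net)^2
        delta_h = delta_out * w2[j] * (1.0 - h[j] ** 2)
        w2[j] -= eta * delta_out * h[j]
        w1[j] -= eta * delta_h * x
        b1[j] -= eta * delta_h
    b2 -= eta * delta_out

print(f"y(1) = {y:.4f}, target d(1) = {d}")
```

With a single pattern and this small network, gradient descent at eta = 0.3 drives y(1) to the target 0.72 within a few hundred epochs.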

About

Waseda IPS Lecture 11040A <Neural Networks> Homework 2
