
RSO #30

Open

MichielStock wants to merge 25 commits into MichielStock:master from HeesooSong:master

Conversation

@MichielStock
Owner

No description provided.

@MichielStock
Owner Author

I think your code is not very generic or clear. Don't be afraid to slim it down to the simplest non-trivial case; it is a research example, after all! I added a toy example of using a two-layer ANN to perform regression. Does it make sense to take it from here?
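For reference, the gradient-free idea behind RSO (sample a random perturbation for one weight at a time, and keep whichever of w, w+δ, w−δ gives the lowest loss) can be sketched on exactly that two-layer regression toy. The code below is a minimal illustration in NumPy, not the PR's actual implementation; all names (`rso_sweep`, the layer sizes, the step scale `sigma`) are invented here for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: fit y = sin(x).
X = np.linspace(-3, 3, 64).reshape(-1, 1)
y = np.sin(X)

# Two-layer ANN: 1 -> 16 -> 1 (tanh hidden layer).
W1 = rng.normal(0, 1, (1, 16))
b1 = np.zeros(16)
W2 = rng.normal(0, 1, (16, 1))
b2 = np.zeros(1)
params = [W1, b1, W2, b2]

def forward(params, X):
    W1, b1, W2, b2 = params
    h = np.tanh(X @ W1 + b1)
    return h @ W2 + b2

def mse(params):
    return np.mean((forward(params, X) - y) ** 2)

def rso_sweep(params, sigma=0.05):
    """One RSO pass: visit every weight once; for each, try w+dw and w-dw
    (dw drawn from a Gaussian) and keep the value with the lowest loss."""
    for p in params:
        flat = p.ravel()  # view into p, so writes update the parameter in place
        for i in range(flat.size):
            dw = rng.normal(0, sigma)
            best_loss, best_w = mse(params), flat[i]
            for cand in (flat[i] + dw, flat[i] - dw):
                flat[i] = cand
                loss = mse(params)
                if loss < best_loss:
                    best_loss, best_w = loss, cand
            flat[i] = best_w  # revert to the best of {w, w+dw, w-dw}
    return mse(params)

loss0 = mse(params)
for epoch in range(5):
    loss = rso_sweep(params)
```

Because the current weight is always among the candidates, the loss is non-increasing after every sweep; no gradients are computed anywhere, which is the whole point of the method.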

@jstaut

jstaut commented Jan 31, 2022

Review suggestions (Jasper Staut)

README
Why would gradient-free be better, and what are the arguments for choosing one approach over the other (pros and cons of updating one weight at a time)?

reproduce "the" RSO function with "a" [...] to "the" back propagation method

0 introduction
use "\leq" instead of "<="

reproduce "the" RSO function [...] accuracy to "the classical back propagation method". [...] One convolutional "layer"

1-1 Parameters
A weight in wid -> $w_{i_d}$

2-3 Result
Maybe state more explicitly that you ran those models yourself and that the table is not taken from the paper.

