Conversation
Bench low dim
Th0ught09 left a comment
A lot of comments, but I'm not sure if they're intentional or were left in by mistake. I'm also not too sure of the general purpose of the PR apart from adding some better output and formatting.
from smlp_py.train_caret import ModelCaret
from smlp_py.train_sklearn import ModelSklearn
from smlp_py.smlp_utils import str_to_bool
#from smlp_py.smlp_utils import str_to_bool, model_features_sanity_check
Best to delete this comment.
else:
    self._keras_logger.info('dense layer of size ' + str(size))
    model.add(keras.layers.Dense(units=size, activation=hid_activation))
    #model.add(keras.layers.Dropout(rate=0.2))
I see a lot of these lines added only to be commented out; is this a mistake or intentional?
If the PDF documentation says nothing about this kind of functionality and there are no comments explaining it, I'd remove this commented code.
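For example, rather than keeping the dropout line commented out, it could be gated behind an explicit option. A minimal sketch, not taken from this PR; the add_hidden_layer helper, the dropout_rate parameter, and its default are assumptions for illustration:

# Sketch: make dropout an explicit, configurable option instead of dead code.
from tensorflow import keras

def add_hidden_layer(model, size, hid_activation, dropout_rate=0.0, logger=None):
    # Log the layer size, mirroring the existing _keras_logger call in the diff.
    if logger is not None:
        logger.info('dense layer of size ' + str(size))
    model.add(keras.layers.Dense(units=size, activation=hid_activation))
    # Only add a Dropout layer when a positive rate is requested,
    # so no commented-out line has to be kept around "just in case".
    if dropout_rate > 0.0:
        model.add(keras.layers.Dropout(rate=dropout_rate))
    return model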
I'd done this in my previous changes (that have now gone into my separate branch), as there are hundreds of these lines. I'd prefer to leave them in here for now, and then once a style guideline is pushed I'll open a new PR with all the commented-out stuff deleted.
This PR is a full branch that Neel was working on (he was not making separate PRs, which is not good, but we have what we have). The branch should contain benchmarks that you can use (see the reports I emailed), but we should make sure that merging this branch makes sense.
Fair enough. Could I have a go at cleaning it up a bit, as I think the comments will lead to debt down the line?
Please go ahead!
I've done all the adjustments I think are worth the time; there's a lot more to do, but I'd like to do more towards my project. The diffs are fairly large as my vim config automatically runs a Python formatter.
@Th0ught09 thanks!
I suggest not mixing functional changes with formatting changes, especially when there is no agreed-upon format yet. Please try to keep purely syntactic changes out of PRs introducing semantic ones.
Forgot to mention, it runs well on my end!
I may not have been clear, sorry: the purpose of this was formatting changes (PEP 8 as a reference), as a lot of the code had dangling comment lines that I thought would wind up as technical debt. The IDE I use does this automatically when I save a file, so it's a bit hard for me to avoid, sorry!
It's pretty easy to disable a plugin or auto-formatting script in vim. As you know, we've not decided on a consistent style yet, so there's no point in everyone producing large diffs because of some random auto-formatting they've configured locally. It will also make merges of other PRs harder, which is why those large "cleanup" changes of entire source files should be done carefully, file by file, so as not to block changes from other people.
Fair enough, I'll try to rebase the PR :)
Force-pushed from 2fbaa8d to ec1cc2b
Rebased to before I touched it; I merged my previous changes into my separate branch.
Many thanks!
I believe the README was, initially in this PR, rewritten entirely to only document the changes of this PR. Ideally, there should be no lines deleted in the README. Could I ask you to take another look at it w.r.t. the formatting and possibly revert the changes to it unless they're necessary?
Missed these ones, sorry; all deleted lines should be restored :)
They aren't necessary, but I'm not sure about restoring some of the tabs that look identical; in the main page the