Implement SimState.wrap_positions using pbc_wrap_batched #350
Conversation
@thomasloux Could you please review this? I created a new branch without the test, since my previous commits got messy after other changes in my fork, so I opened a clean PR.

Could you link the other PR for context?

Previous PR: #345
torch_sim/state.py
Outdated
    """
    # TODO: implement a wrapping method
    return self.positions
    if not self.pbc:
`self.pbc` is now a boolean tensor of shape `(3,)`, so this expression won't work here. I don't think this piece of code needs a test; a static typing check would have caught that. Replace it with `if not self.pbc.any()`.

For reference on the use of static checks, see #339.
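A quick sketch of why the original expression fails, assuming `self.pbc` is a `torch` boolean tensor of shape `(3,)` as the comment describes (tensor values here are illustrative):

```python
import torch

# Per-axis periodicity flags, e.g. periodic along x and y but not z.
pbc = torch.tensor([True, True, False])

# Truthiness of a multi-element tensor is ambiguous and raises at runtime.
try:
    if not pbc:
        pass
except RuntimeError as err:
    print(f"raises: {err}")

# Reducing with .any() yields a single boolean, so the check is well defined.
if not pbc.any():
    print("no periodic directions: skip wrapping")
else:
    print("at least one periodic direction: wrap positions")
```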
I have addressed the blocking comment from the abandoned PR.
thomasloux
left a comment
Not sure the tests are super necessary, but good for me.
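The PR's `pbc_wrap_batched` implementation is not shown in this thread; as a rough illustration of the underlying math, here is a minimal standalone sketch of wrapping positions into a periodic cell via fractional coordinates, assuming a row-vector lattice convention (the function name and signature are hypothetical, not the project's API):

```python
import numpy as np

def wrap_positions(positions: np.ndarray, cell: np.ndarray, pbc: np.ndarray) -> np.ndarray:
    """Wrap Cartesian positions back into the periodic cell.

    positions: (n_atoms, 3) Cartesian coordinates.
    cell: (3, 3) lattice matrix, rows are lattice vectors.
    pbc: (3,) boolean flags, one per lattice direction.
    """
    if not pbc.any():
        return positions  # no periodic directions: nothing to wrap
    # Convert to fractional coordinates, wrap into [0, 1) along periodic axes only.
    frac = positions @ np.linalg.inv(cell)
    frac[:, pbc] -= np.floor(frac[:, pbc])
    return frac @ cell
```

For a unit cube with full periodicity, `wrap_positions(np.array([[1.5, -0.25, 0.0]]), np.eye(3), np.array([True, True, True]))` maps the point to `[[0.5, 0.75, 0.0]]`.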
Summary
Resolves #343