
[feat] OSS: Sync all attributes #67

Merged: 3 commits merged into master on Sep 8, 2020

Conversation

blefaudeux
Contributor

@blefaudeux blefaudeux commented Sep 5, 2020

Before submitting

  • Was this discussed/approved via a GitHub issue? (not needed for typos or doc improvements)
  • Did you read the contributor guideline?
  • Did you make sure to update the docs?
  • Did you write any new necessary tests? (lr scheduler test)

What does this PR do?

Fixes #66

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

@facebook-github-bot facebook-github-bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Sep 5, 2020
@blefaudeux
Contributor Author

cc @mannatsingh

@blefaudeux blefaudeux changed the title Oss sync all attributes [feat] OSS: Sync all attributes Sep 5, 2020
Contributor

@msbaines msbaines left a comment


Please add a test.

Does this fix a bug? The only attribute that is common to all optimizers is learning rate.

@blefaudeux
Contributor Author

Please add a test.

Does this fix a bug? The only attribute that is common to all optimizers is learning rate.

see for instance https://github.com/facebookresearch/ClassyVision/blob/master/classy_vision/optim/classy_optimizer.py#L162
In Classy the user is free to pass any callback they want to adjust any setting over time; it can be the LR, but it can also be something else, and the "lr" key is not special-cased, as you can see in the code snippet. Not syncing these attributes means a silent failure for the end user when using OSS, which is not very nice.

I'll add a non-LR unit test, good point. I was a bit lazy and thought that testing lr would be enough, but that's wrong.
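The idea above can be sketched in plain Python. This is illustrative only, not fairscale's actual implementation: the `ShardedOptimizer` class and `sync_param_groups` method are hypothetical names, and plain dicts stand in for torch.optim param_groups.

```python
# Hypothetical sketch of "sync all attributes": copy every param_group
# attribute except the parameter list itself from the wrapper's
# param_groups to the wrapped (sharded) optimizer's, so that changes made
# by any scheduler/callback (lr, momentum, weight decay, ...) propagate,
# not just "lr".

class ShardedOptimizer:
    def __init__(self, param_groups):
        # The wrapper exposes its own param_groups; the inner (sharded)
        # optimizer keeps a mirrored copy that must stay in sync.
        self.param_groups = [dict(g) for g in param_groups]
        self.inner_param_groups = [dict(g) for g in param_groups]

    def sync_param_groups(self):
        # Copy all keys except "params" (the parameter list is sharded
        # and must not be overwritten wholesale).
        for outer, inner in zip(self.param_groups, self.inner_param_groups):
            for key, value in outer.items():
                if key != "params":
                    inner[key] = value

# A scheduler-style callback mutates an arbitrary, non-lr attribute...
opt = ShardedOptimizer([{"params": [0, 1], "lr": 0.1, "momentum": 0.9}])
opt.param_groups[0]["momentum"] = 0.5
opt.sync_param_groups()
# ...and the change now reaches the inner optimizer as well.
```

If only the "lr" key were copied, the momentum update above would be silently dropped by the inner optimizer, which is exactly the failure mode discussed in this thread.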

@msbaines
Contributor

msbaines commented Sep 8, 2020

Please add a test.
Does this fix a bug? The only attribute that is common to all optimizers is learning rate.

see for instance https://github.com/facebookresearch/ClassyVision/blob/master/classy_vision/optim/classy_optimizer.py#L162
In Classy the user is free to pass any callback they want to adjust any setting over time; it can be the LR, but it can also be something else, and the "lr" key is not special-cased, as you can see in the code snippet. Not syncing these attributes means a silent failure for the end user when using OSS, which is not very nice.

I'll add a non-LR unit test, good point. I was a bit lazy and thought that testing lr would be enough, but that's wrong.

It would be nice to capture some of this discussion in the commit message or via a code comment.

@blefaudeux
Contributor Author

Please add a test.
Does this fix a bug? The only attribute that is common to all optimizers is learning rate.

see for instance https://github.com/facebookresearch/ClassyVision/blob/master/classy_vision/optim/classy_optimizer.py#L162
In Classy the user is free to pass any callback they want to adjust any setting over time; it can be the LR, but it can also be something else, and the "lr" key is not special-cased, as you can see in the code snippet. Not syncing these attributes means a silent failure for the end user when using OSS, which is not very nice.
I'll add a non-LR unit test, good point. I was a bit lazy and thought that testing lr would be enough, but that's wrong.

It would be nice to capture some of this discussion in the commit message or via a code comment.

Ah, the intent was to have this documented in the linked issue (#66), but if it's easier I can elaborate more in the commit messages or the PR? Alright, learning by doing; there are some hidden assumptions.

@blefaudeux blefaudeux requested a review from msbaines September 8, 2020 18:29
@msbaines
Contributor

msbaines commented Sep 8, 2020

Please add a test.
Does this fix a bug? The only attribute that is common to all optimizers is learning rate.

see for instance https://github.com/facebookresearch/ClassyVision/blob/master/classy_vision/optim/classy_optimizer.py#L162
In Classy the user is free to pass any callback they want to adjust any setting over time; it can be the LR, but it can also be something else, and the "lr" key is not special-cased, as you can see in the code snippet. Not syncing these attributes means a silent failure for the end user when using OSS, which is not very nice.
I'll add a non-LR unit test, good point. I was a bit lazy and thought that testing lr would be enough, but that's wrong.

It would be nice to capture some of this discussion in the commit message or via a code comment.

Ah, the intent was to have this documented in the linked issue (#66), but if it's easier I can elaborate more in the commit messages or the PR? Alright, learning by doing; there are some hidden assumptions.

The linked issue is two levels of indirection from the code, while the commit message is only one. It's nice to have a useful summary in the commit message: when maintaining code, it is common to run git log or git blame on a file, and being able to understand the rationale for a change from the code or the commit message alone is very handy. You have all the information available in the log, whereas reading the issue requires you to move your attention away from it.

https://chris.beams.io/posts/git-commit/#why-not-how

Contributor

@msbaines msbaines left a comment


Please add more context to the commit message before landing.

@blefaudeux blefaudeux merged commit 5a268b2 into master Sep 8, 2020
@blefaudeux blefaudeux deleted the oss_sync_all_attributes branch September 8, 2020 20:35
myleott pushed a commit that referenced this pull request Feb 22, 2021
Successfully merging this pull request may close these issues.

[feat] Sync OSS.param_groups and the sharded optimizer param_groups