
add None option for chamfer distance point reduction #1605

Closed
wants to merge 5 commits

Conversation

haritha-j (Contributor)

The chamfer_distance function currently allows "sum" or "mean" reduction, but does not support returning the unreduced (per-point) loss terms. Unreduced losses are useful when the user wants to inspect individual losses, or to apply additional transformations to the loss terms before reduction. One example would be implementing a robust kernel over the loss.

This PR adds a None option to the point_reduction parameter, similar to batch_reduction. In the case of bi-directional chamfer loss, both the forward and backward distances are returned (a tuple of Tensors of shape [D, N]). If normals are provided, the same logic applies to the normals as well.

I'm not sure if this is the best return format, or if it should be returned as a padded Tensor (shape [2, D, N]) instead. It would be great if this could be clarified. I didn't add any test cases since they would depend on the return format.

This PR addresses issue #622.
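To make the motivation concrete, here is a minimal plain-Python sketch of what "unreduced" chamfer terms look like and how a robust kernel could be applied to them before reduction. This is illustrative only, not pytorch3d's implementation; the names `chamfer_unreduced` and `huber` are hypothetical, and real usage would operate on batched Tensors rather than lists.

```python
def pairwise_sq_dist(p, q):
    # Squared Euclidean distance between two points.
    return sum((a - b) ** 2 for a, b in zip(p, q))

def chamfer_unreduced(x, y):
    """Per-point squared nearest-neighbour distances, with no reduction.

    Returns (fwd, bwd): fwd[i] is the loss term for point x[i]
    (distance to its nearest neighbour in y), and bwd[j] is the
    term for y[j] (distance to its nearest neighbour in x).
    """
    fwd = [min(pairwise_sq_dist(p, q) for q in y) for p in x]
    bwd = [min(pairwise_sq_dist(q, p) for p in x) for q in y]
    return fwd, bwd

def huber(t, delta=1.0):
    # Example robust kernel applied to a single unreduced loss term:
    # quadratic near zero, linear for large residuals (down-weights outliers).
    return 0.5 * t * t if abs(t) <= delta else delta * (abs(t) - 0.5 * delta)

if __name__ == "__main__":
    x = [(0.0, 0.0), (1.0, 0.0)]
    y = [(0.0, 0.0), (0.0, 2.0)]
    fwd, bwd = chamfer_unreduced(x, y)
    print(fwd)  # [0.0, 1.0]
    print(bwd)  # [0.0, 4.0]
    # With point_reduction="mean" these would instead collapse to:
    print(sum(fwd) / len(fwd) + sum(bwd) / len(bwd))  # 2.5
    # With the unreduced terms, a robust kernel can be applied first:
    print([huber(t) for t in bwd])  # [0.0, 3.5]
```

The point of the example: once the per-point terms are available, the outlier term 4.0 contributes 3.5 under the Huber kernel instead of 4.0, which is exactly the kind of modification a reduced loss makes impossible.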

@facebook-github-bot added the "CLA Signed" label (this label is managed by the Facebook bot; authors need to sign the CLA before a PR can be reviewed) on Aug 2, 2023.
bottler (Contributor) commented Aug 7, 2023

Thank you!

I know we had a None option which was not useful and was removed in 790eb8c. I think this is not the same as that, right?

Can you specify (in the docstring(s)) the shape of the output in the new case and add a test for the new case?

haritha-j (Contributor, Author) commented Aug 14, 2023

Yes, this is quite different. Before 790eb8c, setting point_reduction to None would still result in a point reduction anyway, because batch_reduction would apply at least one reduction, and a check prevented point_reduction and batch_reduction from both being None. That commit removed the option to prevent this confusion.

This PR allows both of these to be None, allowing the return of the individual loss terms for each point.

The return shape is specified in the doc string and tests have been added.

pytorch3d/loss/chamfer.py: review comments resolved
haritha-j and others added 2 commits August 14, 2023 13:34
Co-authored-by: Jeremy Reizenstein <[email protected]>
Co-authored-by: Jeremy Reizenstein <[email protected]>
@facebook-github-bot: @bottler has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.

@facebook-github-bot: @bottler merged this pull request in d84f274.

bottler (Contributor) commented Aug 16, 2023

Thank you!

Labels: CLA Signed, Merged

3 participants