
Add 'None' option for point_reduction in chamfer_distance function #622

Closed
nematoli opened this issue Apr 1, 2021 · 4 comments
Labels: enhancement (New feature or request)

nematoli commented Apr 1, 2021

🚀 Feature

Currently, the point_reduction argument of the chamfer_distance function must be either "mean" or "sum". It would be great to have the option of accessing the output without any reduction.

Motivation

Since the chamfer distance uses nearest neighbors for data association, it is usually quite noisy. In practice it would be helpful to enforce smoothing constraints (such as edge-aware smoothness) on the computed distances to reduce this noise, which requires access to the unreduced form of the distances.

Pitch

if point_reduction is not None:
    # Reduce over the points in each cloud.
    cham_x = cham_x.sum(1)  # (N,)
    cham_y = cham_y.sum(1)  # (N,)
    if return_normals:
        cham_norm_x = cham_norm_x.sum(1)  # (N,)
        cham_norm_y = cham_norm_y.sum(1)  # (N,)
    if point_reduction == "mean":
        # Divide by the number of points in each cloud.
        cham_x /= x_lengths
        cham_y /= y_lengths
        if return_normals:
            cham_norm_x /= x_lengths
            cham_norm_y /= y_lengths
    if batch_reduction is not None:
        # Sum over the batch first; for "mean", divide by the total weight below.
        cham_x = cham_x.sum()
        cham_y = cham_y.sum()
        if return_normals:
            cham_norm_x = cham_norm_x.sum()
            cham_norm_y = cham_norm_y.sum()
        if batch_reduction == "mean":
            div = weights.sum() if weights is not None else N
            cham_x /= div
            cham_y /= div
            if return_normals:
                cham_norm_x /= div
                cham_norm_y /= div

# NOTE: when point_reduction is None, cham_x is (N, P1) and cham_y is (N, P2),
# so this sum only works if the two clouds have the same number of points;
# see the discussion of returning a tuple below.
cham_dist = cham_x + cham_y
cham_normals = cham_norm_x + cham_norm_y if return_normals else None
return cham_dist, cham_normals
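
For illustration, here is a minimal sketch of the kind of post-processing the unreduced output would enable, assuming cham_x and cham_y are the per-point squared distances of shapes (N, P1) and (N, P2) that point_reduction=None would expose. The Huber kernel here is an illustrative choice, not part of the proposal:

import torch

def huber(d, delta=0.1):
    # Quadratic near zero, linear in the tail: down-weights outlier matches.
    quad = torch.clamp(d, max=delta)
    return 0.5 * quad ** 2 + delta * (d - quad)

N, P1, P2 = 2, 1000, 800
cham_x = torch.rand(N, P1)  # stand-in for unreduced x -> y distances
cham_y = torch.rand(N, P2)  # stand-in for unreduced y -> x distances

# Robustify each per-point term, then reproduce the "mean" point reduction.
per_cloud = huber(cham_x).mean(dim=1) + huber(cham_y).mean(dim=1)  # (N,)
loss = per_cloud.mean()  # "mean" batch reduction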
gkioxari self-assigned this Apr 3, 2021
gkioxari (Contributor) commented Apr 3, 2021

You can submit a PR and we are happy to review and accept it!

gkioxari added the enhancement label Apr 3, 2021

walsvid commented Aug 19, 2021

The two point clouds used to calculate the chamfer loss may have different numbers of points, so adding the two directional losses together without reduction would cause a shape mismatch. If a 'None' option is added for point_reduction in the chamfer_distance function, I think a better solution is to return the loss in the form of a tuple.
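
To make the shape concern concrete, a tiny sketch (the shapes are illustrative):

import torch

cham_x = torch.rand(2, 1000)  # x -> y per-point distances, (N, P1)
cham_y = torch.rand(2, 800)   # y -> x per-point distances, (N, P2)

# cham_x + cham_y  # would raise: (2, 1000) and (2, 800) do not broadcast
loss = (cham_x, cham_y)  # a tuple keeps both directions intact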


yuvalH9 commented Feb 24, 2022

@nematollahi, @gkioxari, @walsvid is there any news on this issue?

haritha-j (Contributor) commented:

I've found that I need this as well; I'll submit a PR.

facebook-github-bot pushed a commit that referenced this issue Aug 15, 2023
Summary:
The `chamfer_distance` function currently allows `"sum"` or `"mean"` reduction, but does not support returning unreduced (per-point) loss terms. Unreduced losses could be useful if the user wishes to inspect individual losses, or perform additional modifications to loss terms before reduction. One example would be implementing a robust kernel over the loss.

This PR adds a `None` option to the `point_reduction` parameter, similar to `batch_reduction`. In the case of bi-directional chamfer loss, both the forward and backward distances are returned as a tuple of Tensors of shape `[D, N]`. If normals are provided, the same logic applies to the normals as well.

This PR addresses issue #622.

Pull Request resolved: #1605

Reviewed By: jcjohnson

Differential Revision: D48313857

Pulled By: bottler

fbshipit-source-id: 35c824827a143649b04166c4817449e1341b7fd9
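
For reference, a minimal usage sketch of the merged option, assuming the return structure described in the summary above (a tuple of the two directional losses) and that `batch_reduction=None` is passed alongside `point_reduction=None`:

import torch
from pytorch3d.loss import chamfer_distance

x = torch.rand(4, 1000, 3)  # batch of clouds, (N, P1, 3)
y = torch.rand(4, 800, 3)   # batch of clouds, (N, P2, 3)

(cham_x, cham_y), _ = chamfer_distance(
    x, y, point_reduction=None, batch_reduction=None
)
# cham_x holds the unreduced x -> y distances, cham_y the y -> x distances.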
bottler closed this as completed Dec 5, 2023