This repository has been archived by the owner on May 27, 2020. It is now read-only.

Saturated pixels #20

Open
bsbc opened this issue Apr 15, 2016 · 4 comments

Comments

@bsbc

bsbc commented Apr 15, 2016

When using the VisionAssistant SW it is possible to see "saturated" pixels.

However, o3d3xx-ros does not seem to publish information about saturation.

We wanted to use the saturation info to distinguish between pixels that are either too close or too far away and do not report valid point-cloud data:

  • close pixels, in addition to having very low amplitude and confidence values, are typically also over-saturated
  • distant pixels also have low amplitude and confidence, but are not saturated.

How would it be possible to access the saturation info using o3d3xx-ros package?

Alternatively, would there be a different way to differentiate between "close" vs. "far away" objects/pixels that do not report valid depth data?

Thanks

@tpanzarella

> How would it be possible to access the saturation info using o3d3xx-ros package?

This saturation information is reported in the confidence image which o3d3xx-ros publishes on /o3d3xx/camera/confidence (by default). The image encoding is mono8. The saturation bit is bit 1 (0-based). It will be set high (i.e., 1) if the pixel is saturated.
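For anyone who lands here later, the bit test described above can be sketched in a few lines. This assumes the mono8 confidence image has already been converted to a numpy array (e.g. via cv_bridge); the function and variable names are mine, not part of o3d3xx-ros:

```python
import numpy as np

# Per the comment above: the o3d3xx confidence image is mono8 and bit 1
# (0-based) is the saturation flag, set high when the pixel is saturated.
SATURATION_BIT = 1

def saturated_mask(confidence):
    """Boolean mask of saturated pixels in a mono8 confidence image."""
    return ((confidence >> SATURATION_BIT) & 1).astype(bool)

# Tiny synthetic example: bit 1 set on two of four pixels.
conf = np.array([[0b10, 0b00],
                 [0b01, 0b11]], dtype=np.uint8)
print(saturated_mask(conf))
# [[ True False]
#  [False  True]]
```

The mask can then be applied to the registered depth or amplitude image to filter or flag the affected pixels.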

@bsbc

bsbc commented Apr 15, 2016

Thanks.

Is this (looking at the saturation info) the recommended way to determine whether a pixel that does not have valid depth data is too close or too far?

@tpanzarella

tpanzarella commented Apr 16, 2016

Looping in @graugans @cfreundl @GarrettPlace to take back to the team in Germany for an "official" response to this question. I cannot confidently advise you to make this inference one way or another.

My unofficial initial thought would be to try to make that inference by consulting a neighborhood of pixels surrounding the suspect one, as I am not aware of a piece of information from the PMD chip that gives that answer directly.

UPDATE:

Thinking about this more... I do not think I would make the "too close" / "too far" determination based solely on the pixel saturation bit. I believe pixel saturation will be more closely correlated to the scene you are observing and the configured integration times (i.e., longer integration times could lead to overly saturated pixels). The confidence image reserves two bits (4 and 5) to reveal which exposure time was used to compute the measurement for the pixel. If you are running in single exposure mode, this doesn't really give you any additional information; however, if you are running in double or triple exposure mode, these bits may be able to help you.
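A minimal sketch of pulling those exposure bits out, again assuming the confidence image is available as a numpy array (the function name is mine):

```python
import numpy as np

# Per the comment above: bits 4 and 5 (0-based) of the mono8 confidence
# image encode which exposure time produced each pixel's measurement.
def exposure_index(confidence):
    """Per-pixel exposure index (0-3) from a mono8 confidence image."""
    return (confidence >> 4) & 0b11

conf = np.array([0b000000, 0b010000, 0b100000, 0b110000], dtype=np.uint8)
print(exposure_index(conf))  # [0 1 2 3]
```

In double or triple exposure mode, pixels measured with the shortest exposure tend to be the brighter (often nearer or more reflective) ones, which is why these bits may carry some of the near/far signal.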

@graugans
Member

As @tpanzarella already suggested, check the pixels in the neighborhood of your pixel. The intensity of a single pixel depends on multiple factors:

  • Distance
  • Illumination (integration) time
  • Reflectivity in the IR spectrum
  • Whether the surface is glossy or retroreflective, like a cat's eye

Under IR light, black surfaces sometimes have better reflectivity than one might expect. The confidence information plus the amplitude information give some indication, but in my experience they may not be enough. But I am not an algorithm guy; I do the base software.
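To make the neighborhood idea concrete, here is a rough, unofficial heuristic that combines the saturation bit with the mean amplitude of the surrounding 3x3 window. The function name and amplitude threshold are hypothetical illustrations, not anything from the camera documentation, and the threshold would need tuning per scene:

```python
import numpy as np

def classify_invalid_pixel(r, c, confidence, amplitude, amp_thresh=100.0):
    """Guess whether an invalid pixel at (r, c) is too close or too far.

    confidence: mono8 confidence image (bit 1 = saturation, per this thread)
    amplitude:  amplitude image registered to the same pixel grid
    amp_thresh: hypothetical cutoff on mean neighborhood amplitude
    """
    if (confidence[r, c] >> 1) & 1:  # saturation bit set -> likely too close
        return "too close"
    # Clip the 3x3 window to the image bounds.
    r0, r1 = max(r - 1, 0), min(r + 2, amplitude.shape[0])
    c0, c1 = max(c - 1, 0), min(c + 2, amplitude.shape[1])
    # Strong neighboring returns suggest a near object; weak ones, a far one.
    if amplitude[r0:r1, c0:c1].mean() > amp_thresh:
        return "too close"
    return "too far"

conf = np.zeros((3, 3), dtype=np.uint8)
amp = np.full((3, 3), 10.0)
print(classify_invalid_pixel(1, 1, conf, amp))  # too far
```

As the comments in this thread note, amplitude also depends on integration time and surface reflectivity, so this is at best a weak prior rather than a reliable classifier.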
