This repository has been archived by the owner on May 27, 2020. It is now read-only.

Time stamp on images incorrect #38

Open
shaun-edwards opened this issue Jun 26, 2017 · 7 comments

Comments

@shaun-edwards
Contributor

The ROS time stamp that is applied to the images coming from the sensor uses ros::Time::now(), see here.

The true acquisition time is likely earlier than this, once the exposure time and the on-board processing are taken into account.

Is there a way to estimate this latency and apply it to the ROS time stamp?
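For illustration, here is a minimal sketch (plain C++, no ROS; the helper name and the split into two latency terms are assumptions, not part of the driver) of what applying such an estimate would look like: back-date the host stamp by the estimated exposure plus processing time.

```cpp
#include <chrono>

// Hypothetical sketch: rather than stamping with the host "now" (as
// ros::Time::now() does), subtract an estimated latency covering the
// exposure time and the on-board processing.
using Clock = std::chrono::steady_clock;

Clock::time_point estimated_acquisition_time(
    Clock::time_point host_now,
    std::chrono::microseconds exposure,
    std::chrono::microseconds processing)
{
  // The frame was acquired roughly (exposure + processing) before the
  // bytes reached the host.
  return host_now - exposure - processing;
}
```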

@shaun-edwards
Contributor Author

FYI @GarrettPlace.

@tpanzarella

How accurate and granular do you need this to be, Shaun? The camera sends back a time stamp (seconds + nanoseconds) in each image chunk header, so each individual image in your configured PCIC schema will have a stamp from the camera. However, the camera clock is not synchronized with the host clock. So, for ultimate accuracy and granularity, 1) a homebrew clock sync scheme would need to be implemented by the driver, and 2) a stamp for each image type in the image buffer would need to be kept. The latter is necessary to generalize to the case where you dynamically change the PCIC schema you are streaming back from the camera while reusing the same ImageBuffer.

You are correct in your statement that there is some latency between when the image was actually acquired and how we are stamping it in the ROS code. To get a sense of the calculations you need to contemplate to estimate the true latency, this comment may be helpful.

Realize that since the ROS interface is calling the underlying driver's WaitForFrame, the latency between when the bytes are received by the host and when they are stamped by the ROS code should be minimal -- albeit I have not measured it, so I cannot comment other than qualitatively at the moment.

@shaun-edwards
Contributor Author

I understand 1); does the sensor interface have a way to read the current sensor clock? 2) is a bit above my head since I haven't dug into the driver details, but I'll take your word for it.

Other camera drivers assume a one-frame latency or allow the user to set the latency (via a ROS parameter). Would you consider this solution, which I admit is a little bit hacky? The Windows software provides a good estimate of the latency (if you assume one frame).

My use case involves syncing an external 2D camera via the time stamp. I noticed that the 2D and 3D images didn't quite match up.

@tpanzarella

I think I'd be more apt to try to get the time stamp as close to correct as possible. My constraint right now will be finding the time to implement this.

In the meantime, if you wanted to fork and do the ROS parameter thing, I'd be happy to take a look at a diff to see if it is reasonable to mainline (i.e., we can default the behavior to work as it does today, but if you wanted this extra knob to turn for special cases like yours, it is available to you).

@graugans
Member

We'll have an NTP feature in the upcoming firmware releases. Maybe this can solve your issue, @shaun-edwards: with the upcoming firmware you'll be able to adjust the camera time via an XML-RPC call, which may need to be repeated at a regular interval due to the different drifts of the time sources in your system. I have no idea when or how this could be integrated into the ROS driver.

@tpanzarella

tpanzarella commented Jun 29, 2017 via email

@graugans
Member

@tpanzarella yes, it is intended to be consistent across all cameras. I have to check the details and give some examples, but I have to kick out a software release for the 2D cameras first.
