Lidar Data for Semantic Segmentation Test Set? #63
Hi @liupeng3425 Is this issue specific to the LIDAR data? Are the timestamps between the images and the semantic labels as expected?
Hi @magehrig However, when I iterate over the lidar data, I fail to find any lidar messages whose timestamps correspond to the semantic labels. My code is like below:

```python
from pathlib import Path

from rosbags.highlevel import AnyReader

# Get the timestamps of the test data (one integer timestamp per line).
path = 'test/zurich_city_15_a/zurich_city_15_a_semantic_timestamps.txt'
timestamp_data = set()
with open(path, 'r') as f:
    for line in f.read().splitlines():
        timestamp_data.add(int(line))

# Read the lidar data from the bag file.
lidar_path = 'lidar_imu/data/zurich_city_15/lidar_imu.bag'
with AnyReader([Path(lidar_path)]) as lidar_data:
    conn = [c for c in lidar_data.connections if c.topic == '/velodyne_points']
    for connection, timestamp, rawdata in lidar_data.messages(connections=conn):
        # Bag timestamps are nanoseconds; convert to microseconds.
        if timestamp // 1000 in timestamp_data:  # !! can't find expected lidar data
            msg = lidar_data.deserialize(rawdata, connection.msgtype)
            # process lidar data
```
It is unlikely that there is an exact match. Are you able to retrieve a lidar pointcloud close to these timestamps?
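For example, here is a minimal sketch of such a nearest-timestamp lookup with rosbags. It assumes the bag timestamps are nanoseconds and the semantic timestamps are microseconds (matching the `timestamp // 1000` conversion in the snippet above); `TOL_US` is a hypothetical 50 ms tolerance, and the matching logic is illustrative rather than part of the dataset tooling:

```python
import bisect
from pathlib import Path

from rosbags.highlevel import AnyReader

TOL_US = 50_000  # assumed matching tolerance of 50 ms; tune to the sensor rate

# Semantic timestamps (microseconds, one per line), kept sorted for bisect.
sem_path = 'test/zurich_city_15_a/zurich_city_15_a_semantic_timestamps.txt'
sem_ts = sorted(int(line) for line in Path(sem_path).read_text().splitlines())

# Best lidar match found so far per semantic timestamp:
# semantic timestamp -> (abs diff in us, rawdata, msgtype)
best = {}

lidar_path = 'lidar_imu/data/zurich_city_15/lidar_imu.bag'
with AnyReader([Path(lidar_path)]) as reader:
    conns = [c for c in reader.connections if c.topic == '/velodyne_points']
    for connection, timestamp, rawdata in reader.messages(connections=conns):
        ts_us = timestamp // 1000  # bag timestamps are in nanoseconds

        # Check the two semantic timestamps bracketing this lidar message.
        i = bisect.bisect_left(sem_ts, ts_us)
        for j in (i - 1, i):
            if 0 <= j < len(sem_ts):
                diff = abs(sem_ts[j] - ts_us)
                if diff <= TOL_US and (sem_ts[j] not in best
                                       or diff < best[sem_ts[j]][0]):
                    best[sem_ts[j]] = (diff, rawdata, connection.msgtype)

    # Deserialize the matched pointclouds while the reader is still open.
    for ts, (diff, rawdata, msgtype) in sorted(best.items()):
        msg = reader.deserialize(rawdata, msgtype)
        # ... process the pointcloud matched to semantic timestamp `ts`
```

Keeping only the closest message per semantic timestamp (rather than the first within tolerance) makes the result independent of message order in the bag.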
Dear authors,
I would like to express my gratitude for your work. I have been using the lidar data downloaded from your source for my experiments. However, I have encountered an issue where the timestamps of the lidar data and the semantic labels do not correspond to each other. I wanted to inquire whether the lidar data includes point cloud data corresponding to the semantic segmentation test set.
Thank you for your attention to this matter.