In chrono::sensor::ChLidarSensor, synthetic data is generated via GPU-based ray tracing, leveraging the hardware acceleration and headless rendering capabilities provided by the NVIDIA OptiX library. For each lidar beam, a group of rays is traced to sample that beam; the number of samples and the beam divergence angle are set by the user. The entire frame/scan of the lidar is processed in a single render step. To account for the time difference of rays across the scan, keyframes and motion-blur techniques are used: each beam in the scan traces the scene at a specific time, reproducing the motion of both the objects and the lidar. The intensity returned by the lidar beams is based on diffuse reflectance.
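As a rough illustration of the diffuse-reflectance model, the intensity of a single return can be thought of as the surface's diffuse reflectivity scaled by the cosine of the angle of incidence. The sketch below is a simplified Lambertian model for illustration only, not the shading code used inside the Chrono::Sensor ray-tracing programs.

    #include <algorithm>

    // Simplified Lambertian sketch of a diffuse lidar return (illustration only).
    // 'reflectivity' is the material's diffuse coefficient and 'cos_incidence' is the
    // cosine of the angle between the incoming ray and the surface normal.
    inline float DiffuseReturnIntensity(float reflectivity, float cos_incidence) {
        return reflectivity * std::max(cos_incidence, 0.0f);
    }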
Creating a Lidar
A lidar sensor is created by attaching it to a parent body and specifying its update rate, mounting pose, and scan-pattern parameters:

auto lidar = chrono_types::make_shared<ChLidarSensor>(
    parent_body,            // body to which the lidar is attached
    update_rate,            // scanning rate in Hz
    offset_pose,            // offset pose of the lidar relative to its parent body
    horizontal_samples,     // number of horizontal samples per scan
    vertical_channels,      // number of vertical channels (laser beams)
    horizontal_fov,         // horizontal field of view
    max_vert_angle,         // maximum vertical angle of the scan
    min_vert_angle,         // minimum vertical angle of the scan
    max_distance,           // maximum measurable distance
    beam_shape,             // shape of the lidar beam
    sample_radius,          // radius of samples used per beam
    vert_divergence_angle,  // vertical beam divergence angle
    hori_divergence_angle,  // horizontal beam divergence angle
    return_mode,            // return mode for multi-sample beams
    clip_near               // near clipping distance
);
lidar->SetName("Lidar Sensor");
lidar->SetLag(lag);                            // lag between the end of data collection and data availability
lidar->SetCollectionWindow(collection_time);   // time over which the scan is collected
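The identifiers used in the snippet above are user-defined variables. The values below are one possible choice, assumed purely for illustration (loosely patterned after a 32-channel spinning lidar) rather than defaults taken from the library; parent_body is a body already added to the simulation, offset_pose is a ChFrame<double> giving the mounting position and orientation, and beam_shape and return_mode are values of the LidarBeamShape and LidarReturnMode enums (e.g. LidarBeamShape::RECTANGULAR and LidarReturnMode::STRONGEST_RETURN, assuming the enum names used in recent Chrono::Sensor versions).

    float update_rate = 10.f;                   // full scans per second
    unsigned int horizontal_samples = 4500;     // samples per horizontal revolution
    unsigned int vertical_channels = 32;        // number of vertical laser channels
    float horizontal_fov = 6.28318f;            // 360 degree horizontal scan (radians)
    float max_vert_angle = 0.26f;               // ~ +15 degrees (radians)
    float min_vert_angle = -0.52f;              // ~ -30 degrees (radians)
    float max_distance = 100.f;                 // maximum range in meters
    unsigned int sample_radius = 2;             // oversampling radius per beam
    float vert_divergence_angle = 0.003f;       // ~3 mrad vertical beam divergence
    float hori_divergence_angle = 0.003f;       // ~3 mrad horizontal beam divergence
    float clip_near = 0.f;                      // near clipping distance
    float lag = 0.f;                            // lag before data becomes available
    float collection_time = 1.f / update_rate;  // collect data over one full scan period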
Lidar Filter Graph
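The raw data produced by the ray-tracing step is processed by a user-defined filter graph attached to the sensor. The filters below are one plausible configuration, shown as a sketch (filter names and constructor arguments follow recent Chrono::Sensor versions and should be checked against the installed release): host access to the raw depth/intensity data, conversion of that data into a point cloud, a host-side access filter for the point cloud, and an optional visualization.

    // Example filter graph (illustrative sketch)
    lidar->PushFilter(chrono_types::make_shared<ChFilterDIAccess>());     // host access to raw depth/intensity data
    lidar->PushFilter(chrono_types::make_shared<ChFilterPCfromDepth>());  // convert depth data to an XYZI point cloud
    lidar->PushFilter(chrono_types::make_shared<ChFilterXYZIAccess>());   // host access to the XYZI point cloud
    lidar->PushFilter(chrono_types::make_shared<ChFilterVisualizePointCloud>(640, 480, 1.0f, "Lidar Point Cloud"));  // optional visualization

Once the filter graph is configured, the lidar is registered with the sensor manager, which schedules and performs its updates: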
manager->AddSensor(lidar);
Lidar Data Access
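With a host-access filter such as ChFilterXYZIAccess in the graph, the most recent point cloud buffer can be retrieved from the lidar; the buffer's data pointer remains null until a complete scan has been processed: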
UserXYZIBufferPtr xyzi_ptr;
while (sys.GetChTime() < end_time) {
    // Update all sensors managed by the sensor manager, then advance the dynamics
    // (sys, end_time, and step_size are assumed to be defined by the surrounding program)
    manager->Update();
    sys.DoStepDynamics(step_size);

    // Grab the most recent point cloud; Buffer is null until data is available
    xyzi_ptr = lidar->GetMostRecentBuffer<UserXYZIBufferPtr>();
    if (xyzi_ptr->Buffer) {
        // Print the first point in the cloud (x, y, z, and intensity are floats)
        PixelXYZI first_point = xyzi_ptr->Buffer[0];
        std::cout << "First Point: [ " << first_point.x << ", " << first_point.y << ", "
                  << first_point.z << ", " << first_point.intensity << " ]" << std::endl;
    }
}
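The first point is only a spot check; the same buffer can be traversed in full. The sketch below assumes the buffer's Width and Height fields hold the horizontal sample count and the number of vertical channels, per the Chrono::Sensor buffer layout.

    // Traverse the last retrieved scan and count returns with non-zero intensity (illustrative sketch)
    if (xyzi_ptr && xyzi_ptr->Buffer) {
        unsigned int num_points = xyzi_ptr->Width * xyzi_ptr->Height;
        unsigned int valid_returns = 0;
        for (unsigned int i = 0; i < num_points; i++) {
            if (xyzi_ptr->Buffer[i].intensity > 0)
                valid_returns++;
        }
        std::cout << "Valid returns: " << valid_returns << " of " << num_points << std::endl;
    }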