TiM$10K team from Rochester Institute of Technology converts a 2D LiDAR sensor into a 3D LiDAR sensor

Many companies are interested in using 3D LiDAR technology, but not all of them can afford it. 2D LiDAR is a great alternative to 3D, but who says it has to remain 2D? For the second year, students from across the country submitted projects to SICK, Inc.’s annual TiM$10K Challenge. The third-place team hails from the Rochester Institute of Technology and developed a method of extending single-beam LiDAR to full-resolution depth.

The team from Rochester Institute of Technology (RIT) consisted of Yawen Lu, Yuxing Wang, and Devarth Parikh. Their advisor was Guoyu Lu, Assistant Professor at the Chester F. Carlson Center for Imaging Science at RIT.

So, what is the TiM$10K Challenge? For this challenge, SICK reached out to universities across the nation, looking to support innovation and student achievement in automation and technology. Participating teams were supplied with a SICK TiM 270° LiDAR sensor and accessories. They were challenged to solve a problem, create a solution, and bring a new application that utilizes the SICK scanner to any industry.

Background on the Project

LiDAR sensors have been used worldwide for decades. With recent advancements in high-resolution camera sensors and artificial intelligence, imaging technologies have received a tremendous boost. The cost of complementary metal-oxide-semiconductor (CMOS) camera sensors has dropped drastically, making cameras standard equipment in many commercial products, such as mobile phones.

LiDAR is a remote sensing technology that measures the distance to an object by illuminating the target with laser light and analyzing the reflection. From the time it takes the pulse to return, the sensor can calculate accurate distances to surrounding objects.
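The underlying distance calculation is a simple time-of-flight relationship: the pulse travels to the target and back at the speed of light, so the range is half the round-trip time multiplied by the speed of light. The following is a minimal sketch of that idea in Python, for illustration only (it is not SICK's or the team's code):

```python
# Illustrative sketch: time-of-flight range calculation used by pulsed LiDAR.
C = 299_792_458.0  # speed of light in m/s

def range_from_time_of_flight(round_trip_time_s: float) -> float:
    """Distance to the target: the pulse travels out and back, so divide by two."""
    return C * round_trip_time_s / 2.0

# Example: a pulse returning after ~66.7 nanoseconds corresponds to roughly 10 m.
print(range_from_time_of_flight(66.7e-9))  # ~10.0
```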

A 2D LiDAR sensor needs only one laser beam. 2D sensors typically spin that beam to collect data along the X and Y axes, which makes them well suited to detection and ranging tasks. A 3D LiDAR works on the same idea, but several laser beams spread along the vertical axis are fired to collect data along the X, Y, and Z axes. 3D LiDAR applications include terrestrial scanning and mapping.
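To make the difference concrete, the sketch below (our own illustration, not the team's code) converts one 2D scan of ranges over a 270° field of view into X/Y points; a 3D LiDAR would repeat this for several beams tilted at different vertical angles, which adds the Z component.

```python
import numpy as np

def scan_to_xy(ranges, fov_deg=270.0):
    """Convert a single 2D LiDAR sweep (equally spaced ranges) to Cartesian X/Y points."""
    angles = np.deg2rad(np.linspace(-fov_deg / 2, fov_deg / 2, len(ranges)))
    x = ranges * np.cos(angles)
    y = ranges * np.sin(angles)
    return np.stack([x, y], axis=1)

# A 3D LiDAR repeats this for beams at several elevation angles:
# z = r * sin(elevation), with the x/y components scaled by cos(elevation).
```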

In stark contrast, high-beam-count LiDAR relies on infrared detectors made from indium gallium arsenide (InGaAs), and such units can cost up to tens of thousands of dollars. Though 3D sensors are more complex and expensive than 2D sensors, they can estimate scene depth accurately.

How 3D LiDAR with a Camera Works

At its core, the team’s project is a training framework.

2D LiDAR-based distance measurements, coupled with a high-resolution optical image, can be extrapolated using a deep-learning-based single-image depth estimation approach to obtain information similar to that of a 3D LiDAR, but at a much-reduced cost. Their proposed solution takes a 2D LiDAR point array and stereo images (left image L and right image R) as training inputs.

After training, a single image and one LiDAR beam are enough to produce high-precision depth maps in real applications. In the offline stage, the single-beam LiDAR is calibrated with a common digital camera, which establishes correspondences between the LiDAR points and the image pixels. A single-image depth estimation network is also trained offline, built on a ResNet-18 backbone and supervised by a stereo camera setup.
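The article does not include the team's code, but a minimal sketch of this kind of ResNet-18 encoder-decoder depth network could look like the following, assuming PyTorch and torchvision (the class name DepthNet and the decoder layout are illustrative choices, not the team's released implementation):

```python
import torch.nn as nn
import torchvision.models as models

class DepthNet(nn.Module):
    """Sketch of a single-image depth network with a ResNet-18 encoder."""

    def __init__(self):
        super().__init__()
        resnet = models.resnet18(weights=None)
        # Encoder: ResNet-18 up to (but not including) global pooling; output is a
        # 512-channel feature map at 1/32 of the input resolution.
        self.encoder = nn.Sequential(*list(resnet.children())[:-2])
        # Decoder: five 2x upsampling stages back to full resolution, ending in a
        # per-pixel disparity prediction in (0, 1).
        self.decoder = nn.Sequential(
            nn.Conv2d(512, 256, 3, padding=1), nn.ReLU(inplace=True), nn.Upsample(scale_factor=2),
            nn.Conv2d(256, 128, 3, padding=1), nn.ReLU(inplace=True), nn.Upsample(scale_factor=2),
            nn.Conv2d(128, 64, 3, padding=1), nn.ReLU(inplace=True), nn.Upsample(scale_factor=2),
            nn.Conv2d(64, 32, 3, padding=1), nn.ReLU(inplace=True), nn.Upsample(scale_factor=2),
            nn.Conv2d(32, 16, 3, padding=1), nn.ReLU(inplace=True), nn.Upsample(scale_factor=2),
            nn.Conv2d(16, 1, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, image):
        return self.decoder(self.encoder(image))  # (N, 1, H, W) disparity map
```

In a typical stereo-supervised setup of this kind, the predicted disparity is used to warp one stereo image toward the other and a photometric loss is minimized, so no ground-truth depth is required during training; at inference time only a single image is needed.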

In the online process, a depth map is estimated by the single-image depth estimation system from an image captured by only one camera, and the LiDAR points are mapped onto the image using the LiDAR-camera calibration matrix. By fitting the depth of the image pixels along the single beam to the LiDAR points’ depth, a fine-tuned multi-layer perceptron (MLP) network estimates the correct scale adjustment for the depth values in the image, achieving the goal of extending single-beam LiDAR to full resolution.
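As a rough illustration of these two online steps, the sketch below projects LiDAR points into the image with a calibration matrix and fits a small MLP from the network's predicted depth to the metric LiDAR depth. The intrinsic matrix K, the 4×4 extrinsic transform T, and the use of scikit-learn's MLPRegressor are assumptions made for this example, not the team's actual implementation.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def project_lidar_to_image(points_lidar, K, T):
    """Map single-beam LiDAR points (N, 3) to pixel coordinates plus camera-frame depth.

    K: 3x3 camera intrinsics; T: 4x4 LiDAR-to-camera extrinsic transform (assumed shapes).
    """
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])  # homogeneous coords
    pts_cam = (T @ pts_h.T).T[:, :3]                                    # LiDAR -> camera frame
    uv = (K @ pts_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]                                         # perspective divide
    return uv, pts_cam[:, 2]                                            # pixel coords, metric depth

def refine_scale(pred_depth_at_pixels, lidar_depth):
    """Fit a small MLP mapping the network's relative depth to metric LiDAR depth."""
    mlp = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000)
    mlp.fit(pred_depth_at_pixels.reshape(-1, 1), lidar_depth)
    return mlp  # apply mlp.predict to every pixel of the estimated depth map
```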

Market Application of the Product

“Our system of a 2D LiDAR, coupled with a camera, provides not only an affordable alternative to the 3D LiDAR sensors, but it also provides a comparatively higher resolution at a much faster rate, with the ability to determine per-pixel depth,” said Lu.

The LiDAR market is rapidly evolving, with established sensor manufacturers as well as start-ups jumping into autonomous driving. One study found that autonomous vehicles alone are projected to increase demand for LiDAR sensors 20x over the next 10 years.

“Thus, our system opens up a potential market for all the applications which demand high-resolution information at a considerably lower cost,” explained Parikh. “High-resolution data obtained from LiDARs can be used to analyze soil, identify crop growth, and map terrain for agriculture.”

[Image: 2D vs. 3D LiDAR cost comparison]

Why 2D Sensors?

The team was quick to address concerns about using modified 2D LiDAR sensors instead of 3D ones for traditionally 3D applications. Their answers were simple but grounded in the thorough research they had conducted for this project.

  • 2D LiDAR sensors are more portable and efficient than 3D LiDAR sensors.
  • The system is capable of predicting depth for distant objects without substantial physical limitations.
  • Modern 3D LiDAR systems produce sparse point clouds, which can miss objects during a scan. The pixel-level depth from the team’s system gives users much higher resolution than a 3D LiDAR.
  • A standalone LiDAR’s accuracy decreases significantly in bad weather such as snow, rain, or dust. Fusing the two sensors adds robustness to the entire system.
  • The system continuously performs online adjustment to refine the output of the disparity estimation network, which allows it to be applied to real-world scenes of different types, even without additional training.

Register for the 2020-2021 TiM$10K Challenge

SICK is now accepting entries for the TiM$10K Challenge for the 2020-2021 school year! Student teams can register online by September 14, 2020. Student teams are encouraged to use their creativity and technical knowledge to incorporate the SICK LiDAR in any application in any industry. Advisors/professors are allowed to guide the student teams as required.

This contest was supported by the PMMI Foundation’s U Skills Fund. The PMMI Foundation works to grow awareness of careers in packaging and processing, providing assistance to schools and programs that develop students to excel in the industry.