While the focus in the automotive industry is often on aesthetics, one company has built a next-generation LiDAR system designed to help self-driving vehicles avoid crashes.
AEye, a California-based company that creates advanced vision hardware, software and algorithms for autonomous vehicles, has developed iDAR (Intelligent Detection and Ranging), a data collection system that fuses LiDAR, computer vision and artificial intelligence to deliver advancements in perception and motion planning.
Allan Steinhardt, Ph.D., chief scientist at AEye, explained in an interview with R&D Magazine how the new system improves on past generations of LiDAR.
“The way you get solid state is through means of micromotion, so you have a very compact, integrated system and you use very, very small devices that are microscale or nanoscale so that they can move very quickly without having a lot of inertia and also being very insensitive to shock and vibration from hitting potholes,” he said. “We said rather than just having a solid state system to emulate a legacy LiDAR, we want to use the flexibility we have.
“Our primary mission and our primary goal is for safety and it is to detect and range on objects,” Steinhardt added. “The real advantage is when we fuse the information with the sensors. We have a camera that is aligned with the LiDAR.”
The new system differs from first-generation LiDAR technology, whose siloed sensors, rigid asymmetrical data collection methods and post-processing lead to over- and under-sampling of information and to latency. iDAR instead optimizes data collection, transferring less data of higher quality and relevance for rapid perception and path planning.
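To make that idea concrete, here is a minimal Python sketch of relevance-based data reduction, keeping only the most useful returns within a fixed budget. The data structure, scoring field and function are hypothetical illustrations, not AEye's published internals.

```python
# Illustrative sketch only: AEye has not published iDAR's internals.
# The idea: rank LiDAR returns by relevance and forward only the best
# ones, so downstream perception gets less data of higher value.
from dataclasses import dataclass

@dataclass
class Return:
    x: float          # position of the return, in meters
    y: float
    z: float
    relevance: float  # hypothetical score, e.g. overlap with the ego path

def prioritize(returns: list[Return], budget: int) -> list[Return]:
    """Keep only the `budget` most relevant returns for perception."""
    return sorted(returns, key=lambda r: r.relevance, reverse=True)[:budget]
```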
iDAR combines what AEye describes as the world’s first agile MOEMS LiDAR, pre-fused with a low-light camera and embedded AI, to create software-definable and extensible hardware that can dynamically adapt to real-time demands. The company says the system can target and identify objects within a scene 10 to 20 times more effectively than LiDAR-only products.
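One way that kind of pre-fusion can work, sketched below under assumed geometry, is to convert a 2-D detection from the co-aligned camera into a region of interest that the agile scanner revisits at higher density. The names, field-of-view values and flat pinhole mapping are all assumptions for illustration.

```python
# Hypothetical sketch of camera-cued LiDAR targeting: a 2-D detection
# from the co-aligned camera becomes a region of interest that the agile
# scanner revisits at higher density. Geometry and names are assumptions.
from typing import NamedTuple

class BBox(NamedTuple):
    u_min: float; v_min: float; u_max: float; v_max: float  # pixels

class ScanROI(NamedTuple):
    az_min: float; az_max: float; el_min: float; el_max: float  # degrees
    revisit_hz: float

def bbox_to_roi(box: BBox, hfov: float = 60.0, vfov: float = 20.0,
                width: int = 1920, height: int = 1080,
                revisit_hz: float = 200.0) -> ScanROI:
    """Map a camera bounding box to scan angles (flat pinhole approximation)."""
    az = lambda u: (u / width - 0.5) * hfov    # pixel column -> azimuth
    el = lambda v: (0.5 - v / height) * vfov   # pixel row -> elevation
    return ScanROI(az(box.u_min), az(box.u_max),
                   el(box.v_max), el(box.v_min), revisit_hz)
```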
A 1550-nm laser allows iDAR to interrogate 100 percent of the scene, while typical fixed-pattern LiDAR systems can interrogate only about 6 percent of any scene because of the large vertical gaps inherent in those types of systems.
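A back-of-the-envelope calculation shows where a figure like 6 percent can come from: if each scan line illuminates only a thin slice of the vertical field of view and the lines are widely spaced, most of the scene is never sampled. The numbers below are assumptions chosen for illustration, not AEye's measurements.

```python
# Back-of-the-envelope: with a fixed raster pattern, only the fraction of
# the vertical field of view actually touched by a beam gets interrogated.
# Both numbers below are assumptions for illustration, not AEye's data.
beam_divergence_deg = 0.1  # vertical extent of one scan line (assumed)
line_spacing_deg = 1.6     # vertical gap between adjacent lines (assumed)

fill_factor = beam_divergence_deg / line_spacing_deg
print(f"Vertical fill factor: {fill_factor:.0%}")  # -> "Vertical fill factor: 6%"
```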
According to Steinhardt, the iDAR system mimics how a human’s visual cortex focuses on and evaluates potential driving hazards. The system uses a distributed architecture and at-the-edge processing to track targets and objects of interest, while also critically assessing the general surroundings.
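One simple way to picture that division of attention is a fixed per-frame shot budget split between tracked objects of interest and a coarse background sweep, as in the hypothetical sketch below; the split rule and numbers are assumptions, not AEye parameters.

```python
# Hypothetical "foveation" sketch: split a fixed per-frame shot budget
# between tracked objects of interest and a coarse background sweep.
# The 70/30 split and shot counts are assumptions, not AEye parameters.
def allocate_shots(num_tracks: int, total_shots: int,
                   foveal_share: float = 0.7) -> tuple[int, int]:
    """Return (shots per tracked object, shots left for the background)."""
    foveal = int(total_shots * foveal_share) if num_tracks else 0
    per_track = foveal // max(num_tracks, 1)
    background = total_shots - per_track * num_tracks
    return per_track, background

print(allocate_shots(num_tracks=2, total_shots=10_000))  # -> (3500, 3000)
```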
“If you are out in fog or you are out at night or you are out in snow, you actually can do a pretty good job of navigating, but you do it by changing where you focus your attention,” he said. “If you look at snow, rain, fog, we can’t change physics; you are going to get a lot more laser loss in virtually any system under bad weather conditions.
“What you can do is you can focus the energy on precisely where you need to focus to minimize danger of collision,” Steinhardt added. “Lasers can see through fog, clouds and snow. You have to process the data in a more sophisticated way.”
iDAR also includes path-planning software that enables customizable data collection in real time, allowing the system to adapt to the environment and dynamically change performance based on the user’s applications and needs.
The system can emulate classic legacy LiDAR, define regions of interest, focus on threat detection and run profiles tuned for different environments, such as highway or city driving.
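A rough way to picture that software-definable behavior is one scanner with several selectable scan profiles, as in the sketch below. The mode names and parameters are invented for illustration, not AEye's published API.

```python
# Hypothetical illustration of "software-definable" hardware: one scanner,
# several scan profiles selected by driving context. Mode names and
# parameters are invented for illustration, not AEye's published API.
from enum import Enum, auto

class DriveMode(Enum):
    LEGACY_EMULATION = auto()  # mimic a fixed-pattern legacy LiDAR
    HIGHWAY = auto()           # long range, narrow field, fast revisit ahead
    CITY = auto()              # wide field, dense near-range sampling

SCAN_PROFILES = {
    DriveMode.LEGACY_EMULATION: {"pattern": "raster",   "range_m": 120, "roi": False},
    DriveMode.HIGHWAY:          {"pattern": "foveated", "range_m": 300, "roi": True},
    DriveMode.CITY:             {"pattern": "foveated", "range_m": 80,  "roi": True},
}
```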
Steinhardt said the new system should be available this summer.
“We are pretty close; we have a demonstration unit that we will have available for evaluation by customers that are interested in coming forward,” he said.