On the podcast Recode Decode with Kara Swisher, Ford’s Chief Technology Officer Ken Washington recently said that by 2021 we would begin to see cars with no one in the driver’s seat. And on Monday, Tesla CEO Elon Musk presented the company’s self-driving technology alongside a few audacious claims, making it clear that the debate over which technologies make autonomous cars safe is far from settled.

“Self-driving car technology has made incredible advances over the last five years. Vastly improved vision technology combined with inputs from other sensors are getting us close to full autonomy,” said Bart Selman, professor of computer science at Cornell University. “However, we don’t yet know whether we can reach the level of safety of a human driver within the next three to five years.”

In Tesla’s case, significant reliance on computer vision introduces an extra level of difficulty. “Current computer vision systems can fail in quite unpredictable ways, and having multiple sensors, ideally including LiDAR, is critical,” said Selman in a press statement from Cornell University. “The challenge remains of how to resolve possibly conflicting information of multiple sensors, as well as the question of how to gracefully handle unexpected situations without needing human input.”

Last year, the California Department of Motor Vehicles compiled data on the miles driven by autonomous vehicles, ranking Google’s Waymo at the top for the fewest times a human had to take over from the car.

“Alert human drivers are surprisingly good at interpreting unexpected events and generally can take the necessary preventive steps to avoid accidents. However, because current autonomous driving systems lack a broader understanding of their environment, it is difficult for those systems to take similar preventive measures,” added Selman.

The safety of an autonomous vehicle ultimately comes down to its ability to perceive its surroundings the way a human does. This is something that AEye has been taking seriously. The startup has raised $60 million from Kleiner Perkins, Taiwania Capital, Intel Capital, Airbus Ventures, LG, Hella and Subaru for its Intelligent Detection and Ranging (iDAR) system, which replaces current passive LiDAR sensors.

In a nutshell, iDAR is a perception system that acts like a human brain: it intelligently learns about its surroundings rather than being overwhelmed by them. The company leverages artificial intelligence to help autonomous vehicles think like a robot but understand like a human.

“While driving, a human pays attention to certain things, like a child walking in front of a car, while filtering out others, like a bird in the sky. And we are creating this same type of filtering or prioritization for autonomous vehicles,” said Blair LaCorte, President of AEye.
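A minimal sketch of that kind of prioritisation might look like the following. The object classes, weights, distances and the “in path” flag are illustrative assumptions, not AEye’s actual interface; the point is only that a safety-critical object on the vehicle’s path outranks an irrelevant one far away.

```python
# Hypothetical sketch of filtering/prioritising detections, in the spirit of
# LaCorte's example: a child in front of the car outranks a bird in the sky.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # e.g. "pedestrian", "bird" (illustrative classes)
    distance_m: float   # range from the vehicle
    in_path: bool       # whether the object intersects the planned path

# Assumed relevance weights per object class (higher = more safety-critical).
CLASS_WEIGHT = {"pedestrian": 1.0, "cyclist": 0.9, "vehicle": 0.8, "bird": 0.05}

def priority(d: Detection) -> float:
    """Crude priority score: class weight, boosted for nearby, in-path objects."""
    weight = CLASS_WEIGHT.get(d.label, 0.3)
    proximity = 1.0 / max(d.distance_m, 1.0)
    return weight * (2.0 if d.in_path else 1.0) * proximity

def triage(detections, budget=3):
    """Keep only the highest-priority objects for downstream planning."""
    return sorted(detections, key=priority, reverse=True)[:budget]

if __name__ == "__main__":
    scene = [
        Detection("bird", distance_m=40.0, in_path=False),
        Detection("pedestrian", distance_m=12.0, in_path=True),
        Detection("vehicle", distance_m=30.0, in_path=True),
    ]
    for d in triage(scene):
        print(d.label, round(priority(d), 3))
```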

The company was started with a sheet of paper with a single question written on it: ‘What’s the best way to deliver artificial perception for robotic and autonomous vehicles?’ LaCorte said. Instead of basing its answer on a particular technology, the company chose a model to guide how it thought about the solution: the human visual cortex.

“Humans process some 70% of the spatial and environmental data they receive in the visual cortex, rather than sending the data to the brain. The human visual cortex filters out, in real time, information that the brain doesn’t need to make effective decisions,” said LaCorte. “The complexity of the data being processed in the human visual cortex includes multiple complex dimensions – colour, space, time, distance, vector, velocity. In effect, intelligence is pushed to the edge of the human perception network.”
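As a rough illustration of that edge-filtering idea, the hypothetical sketch below discards sensor returns a downstream planner would not need before they ever leave the sensor. The field layout, thresholds and filtering rule are assumptions made for illustration; they do not describe iDAR’s real data path.

```python
# Illustrative sketch of "pushing intelligence to the edge": filtering raw
# multi-dimensional returns (position, radial velocity, intensity) at the
# sensor before forwarding them to central perception.
import numpy as np

def edge_filter(points: np.ndarray, max_range_m=80.0, min_speed_mps=0.5):
    """points: N x 5 array of [x, y, z, radial_velocity, intensity].
    Keep returns that are in range and either moving or close to the vehicle;
    everything else is dropped at the edge."""
    x, y, z, v, _ = points.T
    dist = np.sqrt(x**2 + y**2 + z**2)
    keep = (dist < max_range_m) & ((np.abs(v) > min_speed_mps) | (dist < 20.0))
    return points[keep]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic point cloud standing in for one frame of raw sensor returns.
    cloud = rng.uniform([-100, -100, -2, -5, 0], [100, 100, 5, 5, 1], size=(10_000, 5))
    kept = edge_filter(cloud)
    print(f"forwarded {len(kept)} of {len(cloud)} returns "
          f"({100 * (1 - len(kept) / len(cloud)):.0f}% filtered at the edge)")
```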

The same concept was then applied to artificial perception, asking how the human visual cortex could be mimicked with technology. LaCorte explains that the visual cortex analyses the spatial and temporal aspects of a situation simultaneously, constantly re-interrogating the surroundings while identifying objects that are ‘of interest’ or are ‘potential threats’.

AEye introduced the term agile LiDAR when talking about its iDAR system. Agile LiDAR lets iDAR single out what the company calls regions of interest, or situational demands, within the driving scene, then deploy an array of pulses that instantly captures large amounts of data, including vector and velocity, allowing the system to triage data in real time.
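A toy sketch of such a scheduler is shown below: it splits a fixed per-frame pulse budget between a coarse background sweep and the declared regions of interest. The RegionOfInterest structure, the weights and the budget split are hypothetical, chosen only to show the idea of concentrating measurements where the scene demands them, and are not based on AEye’s actual implementation.

```python
# Hypothetical sketch of "agile" scanning: spend most of a fixed pulse budget
# on declared regions of interest, with only a sparse background sweep elsewhere.
from dataclasses import dataclass

@dataclass
class RegionOfInterest:
    name: str
    azimuth_deg: tuple   # (min, max) horizontal extent of the region
    weight: float        # relative importance assigned by perception

def schedule_pulses(rois, total_pulses=10_000, background_fraction=0.2):
    """Split the per-frame pulse budget: a fixed share for a coarse background
    sweep, the remainder divided among ROIs in proportion to their weight."""
    plan = {"background_sweep": int(total_pulses * background_fraction)}
    roi_budget = total_pulses - plan["background_sweep"]
    total_weight = sum(r.weight for r in rois) or 1.0
    for r in rois:
        plan[r.name] = int(roi_budget * r.weight / total_weight)
    return plan

if __name__ == "__main__":
    rois = [
        RegionOfInterest("crosswalk_ahead", (-10, 10), weight=3.0),
        RegionOfInterest("merging_vehicle_left", (30, 45), weight=1.5),
    ]
    print(schedule_pulses(rois))
```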

“We believe that AEye gives perception engineers their first tool kit to actively explore how to best interrogate different scenes a car encounters as it drives,” said LaCorte.