AEye Gives Driverless Cars a Human-Like View of the World

Luis Dussan thinks autonomous cars should see the way human beings do: the view through the windshield is shaped by the inner workings of the brain, which prioritizes detail at the very center of the scene while keeping enough attention on the periphery to spot dangers.

AEye, Dussan's startup, has built a new kind of hybrid sensor that seeks to make that idea a reality. The device combines a solid-state lidar, a low-light camera, and chips running embedded artificial-intelligence algorithms that can reprogram, on the fly, how the hardware is used. That lets the system prioritize where it is looking, giving vehicles a more refined view of the world.

Dussan, AEye's founder and CEO, initially planned to build AI to help vehicles drive themselves, but he soon found that sensors on the market could not provide the data he needed. That realization convinced him the company would have to build its own hardware, and it did.

Most autonomous cars use lidar sensors, which bounce laser beams off nearby objects to create accurate 3-D maps of their surroundings. The best commercially available versions, made by market leader Velodyne, are mechanical: they rapidly sweep as many as 128 stacked laser beams through a full 360 degrees around the vehicle.
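To make the 3-D-map idea concrete: each laser return is just a range measured along a known beam direction, and turning it into a point is basic trigonometry. The sketch below (plain Python with made-up angles and ranges, not any vendor's actual firmware) shows how a single sweeping beam becomes a ring of points.

```python
import math

def lidar_return_to_point(range_m, azimuth_deg, elevation_deg):
    """Convert one laser return (a range plus beam angles) into x, y, z.

    A mechanical lidar sweeps its beams through 360 degrees of azimuth,
    so its 3-D map is millions of these spherical-to-Cartesian points.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return (x, y, z)

# One full revolution of a single beam at -2 degrees elevation,
# sampled every 0.2 degrees of azimuth (hypothetical figures).
ring = [lidar_return_to_point(25.0, i * 0.2, -2.0) for i in range(1800)]
```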

As good as those sensors are, they have a couple of problems. They are expensive, and they offer little flexibility: because the lasers point out at predetermined angles, a car might capture a highly detailed view of the sky as it crests a hill, say, or look too far into the distance during low-speed city driving, and there is no way to change that.

The leading alternative is solid-state lidar, which uses electronics to rapidly steer a laser beam back and forth, achieving the same effect as a mechanical device. Several companies have seized on the technology because the sensors can be made cheaply, with some on offer for as little as $100. But those sensors scan a regular, unvarying rectangular grid and do not deliver the quality of data required for driving at highway speeds.

AEye looks to use solid-state devices a bit differently, programming them to fire laser beams at focused areas rather than across a regular grid. The firm has not yet revealed detailed specifications for how accurately it can steer the beam, but it says the sensor should be able to see as far as 300 meters with an angular resolution as fine as 0.1 degrees.
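AEye has not published how it schedules its beam, but the foveated idea itself is easy to sketch: spend most of a fixed per-frame point budget on a dense region of interest and sample the rest of the field of view coarsely. The helper below is purely illustrative; the field-of-view limits and the `roi` box are assumptions, not AEye specifications, though the 0.1-degree fine step matches the resolution the company quotes.

```python
def foveated_scan_pattern(h_fov=(-60.0, 60.0), v_fov=(-15.0, 15.0),
                          roi=(-10.0, 10.0, -5.0, 5.0),
                          coarse_step=1.0, fine_step=0.1):
    """Yield (azimuth, elevation) beam directions in degrees.

    Directions inside the region of interest (roi) are stepped at
    fine_step; the periphery gets a cheap coarse_step grid instead
    of the uniform grid a fixed solid-state scanner would produce.
    """
    def frange(lo, hi, step):
        return [lo + i * step for i in range(int(round((hi - lo) / step)) + 1)]

    az_lo, az_hi, el_lo, el_hi = roi
    for az in frange(h_fov[0], h_fov[1], coarse_step):
        for el in frange(v_fov[0], v_fov[1], coarse_step):
            if not (az_lo <= az <= az_hi and el_lo <= el <= el_hi):
                yield az, el        # sparse periphery
    for az in frange(az_lo, az_hi, fine_step):
        for el in frange(el_lo, el_hi, fine_step):
            yield az, el            # dense "fovea"

pattern = list(foveated_scan_pattern())
```

Swapping in a different `roi` between frames is all it takes to re-aim the dense region, which is exactly the flexibility a fixed scan grid lacks.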

The sensor's performance comes with a caveat: it cannot scan an entire scene at high detail all at once. At any moment it must either sweep the whole field of view at lower resolution or concentrate on smaller areas at higher resolution. Because the scan pattern is software-defined, though, it can be reprogrammed at will, trading resolution, scene revisit rate, and range against one another at any point in time.
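The arithmetic behind that trade is straightforward, even though AEye's actual scheduler is not public: a lidar produces a roughly fixed number of measurements per second, so resolution, coverage, and revisit rate all draw on the same budget. The figures below are invented for illustration.

```python
def revisit_rate_hz(points_per_second, h_fov_deg, v_fov_deg, step_deg):
    """Return how often a field of view can be rescanned at a given
    angular resolution, given a fixed measurement budget per second."""
    points_per_frame = (h_fov_deg / step_deg) * (v_fov_deg / step_deg)
    return points_per_second / points_per_frame

BUDGET = 1_000_000  # hypothetical measurements per second, not an AEye spec

print(revisit_rate_hz(BUDGET, 120, 30, 0.5))  # whole scene, coarse: ~69 Hz
print(revisit_rate_hz(BUDGET, 120, 30, 0.1))  # whole scene, fine:  ~2.8 Hz
print(revisit_rate_hz(BUDGET, 20, 10, 0.1))   # small region, fine: 50 Hz
```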

The device can also use data from its camera in some neat ways, such as adding color to the raw lidar imagery.
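AEye has not said exactly how its camera fusion works, but the textbook way to color a lidar point is to project it through the camera's intrinsic model and sample the pixel it lands on. Here is a minimal sketch assuming a distortion-free pinhole camera and points already expressed in the camera frame; a real pipeline would first apply the lidar-to-camera extrinsic calibration.

```python
import numpy as np

def colorize_points(points_xyz, image, fx, fy, cx, cy):
    """Attach an RGB color to each lidar point the camera can see.

    Assumes a pinhole camera (focal lengths fx, fy and principal point
    cx, cy) and points in the camera frame: x right, y down, z forward.
    Points behind the camera or outside the image are skipped.
    """
    h, w = image.shape[:2]
    colored = []
    for x, y, z in points_xyz:
        if z <= 0:
            continue                     # behind the camera
        u = int(round(fx * x / z + cx))  # pinhole projection
        v = int(round(fy * y / z + cy))
        if 0 <= u < w and 0 <= v < h:
            colored.append(((x, y, z), tuple(image[v, u])))
    return colored

# Hypothetical 640x480 camera and two points in front of it.
img = np.zeros((480, 640, 3), dtype=np.uint8)
pts = [(0.5, -0.2, 10.0), (-1.0, 0.1, 25.0)]
print(colorize_points(pts, img, fx=500.0, fy=500.0, cx=320.0, cy=240.0))
```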

Autonomous-car firms are unlikely to rely on this kind of imaging by itself; for safety's sake, they are expected to pair a device like AEye's with a suite of conventional sensors.
