Luminar’s lidar gives self-driving cars laser vision

Austin Russell, founder and CEO of lidar startup Luminar Technologies, is much taller in person than in pictures.

We met in a hotel lobby in Midtown Manhattan, where he and cofounder Jason Eichenholz were recuperating after a whirlwind New York City press tour last week. Luminar was named one of CNBC’s Disruptor 50 companies, and Russell had just given a televised interview. He was wrapping up a phone call as I walked up a flight of stairs and through the door of the second-floor lounge.

“It’s been busy, yeah,” he said, as I sat down.

Range is king

There’s a reason Russell is in high demand. Luminar, a nearly 6-year-old lidar developer that emerged from stealth with a $36 million funding round in 2017, has designed one of the world’s first sensors capable of detecting objects up to 250 meters away.

Like other lidar devices, the mechanical sensor measures the distance between itself and objects by timing how long a laser pulse takes to scatter off a reflective surface and return. But Luminar’s lidar distinguishes itself by operating at the 1550-nanometer wavelength, which the company claims can deliver 40 times more power and 50 times better resolution than competing devices. More important, it has a 120-degree field of view (two laser beams, each steered through a 60-degree arc by tiny mirrors) and can detect car tires, bicyclists in black sweatshirts, and other objects with reflectivity as low as 5 percent, all without sacrificing range.
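To make the time-of-flight principle concrete, here is a minimal sketch of the math involved; the numbers are illustrative, not Luminar specifications:

```python
# Minimal time-of-flight sketch: distance from a laser pulse's round trip.
# The 250 m figure echoes the range quoted above; nothing here is a
# Luminar specification.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """One-way distance to a target from a pulse's round-trip time.

    The pulse travels out and back, so the one-way distance is half
    the total path length.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A target at 250 meters returns a pulse after roughly 1.67 microseconds.
round_trip = 2 * 250.0 / SPEED_OF_LIGHT
print(f"{round_trip * 1e6:.2f} us")                     # ~1.67
print(f"{distance_from_round_trip(round_trip):.1f} m")  # 250.0
```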

“We built it from the ground up: our own receivers, scanning mechanisms, processing electronics all in-house,” Russell told me.

Wavelength is key

Luminar’s secret sauce is indium gallium arsenide, an alloy that’s amenable to the 1550 nanometer operating wavelength. Traditional photodetectors and application-specific integrated circuit (ASIC) designs, which use silicon, operate at a wavelength of about 900 nanometers.

“[1550 nanometers] is basically the fundamental operating wavelength that you want to be at if you want to be even theoretically capable of high range and resolution performance, because it’s an eye-safe wavelength,” Russell said. “There’s actually a huge challenge with current systems … The reason why they can’t see very far is because if you were to increase the laser power any more, you’d start damaging people’s eyes. With 1550 nanometers, because it’s [absorbed by the front of the eye] rather than focused to a point on the retina … you could effectively output 1 million times the pulse energy and still stay eye-safe, which is crazy.”

“It’s hard to argue with physics and eye safety,” Russell said.
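A back-of-the-envelope sketch of why that power headroom translates into range: assuming the returned signal from a diffuse target falls off roughly with the square of distance (a common lidar approximation, not a figure from Luminar), detectable range scales with the square root of transmit power.

```python
import math

# Assumption: returned signal ~ power / range^2 for a diffuse target,
# so maximum detectable range grows with sqrt(transmit power).
def range_gain(power_multiplier: float) -> float:
    return math.sqrt(power_multiplier)

print(f"{range_gain(40):.1f}x range")         # ~6.3x from the 40x power claim
print(f"{range_gain(1_000_000):.0f}x range")  # ~1000x, the theoretical headroom Russell cites
```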

Luminar’s receiver, built on indium phosphide semiconductor wafers, also produces a much higher-fidelity image than lower-wavelength lidar systems. The sensor can run as slow as 1 frame per second for the highest level of detail, or at up to 20 frames per second at the cost of some resolution.
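That trade-off follows from a scanning lidar measuring a roughly fixed number of points per second, so a slower frame rate packs more points into each frame. A toy illustration, with a made-up point rate rather than a Luminar figure:

```python
# Hypothetical measurement budget; real sensors vary widely.
POINTS_PER_SECOND = 2_000_000

def points_per_frame(frames_per_second: float) -> int:
    # A fixed per-second budget divided across frames.
    return int(POINTS_PER_SECOND / frames_per_second)

print(f"{points_per_frame(1):,} points/frame")   # 2,000,000: maximum detail
print(f"{points_per_frame(20):,} points/frame")  # 100,000: faster updates, coarser cloud
```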

During a zip around Madison Square Park in a van with a Luminar lidar strapped to the roof (and a large computer monitor mounted on the back of the passenger seat headrest), Russell showed me the raw sensor data: a colorful, undulating point cloud, rendered with a latency of mere milliseconds, that converged around vehicles, pedestrians, street signs, and even lane markings. Using the arrow keys on a keyboard, Russell zoomed in and panned around the scene from multiple vantage points.

He pointed to a headrest in a car a few lanes to the right. The lidar had detected it through the vehicle’s rear window.

“Once you’re working with data that is orders of magnitude better fidelity, it makes a lot of the problems that were seemingly impossible from a perception standpoint a heck of a lot easier,” Russell said. “Some of our customers have effectively gotten to the point where they’re able to reduce their compute for certain algorithms to where … they can run on something as small as a Raspberry Pi.”

Those algorithms can increase or decrease the density of the point cloud dynamically, Russell explained. They might focus on the horizon when the car’s on the freeway and home in on pedestrians during city driving.
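As a hypothetical sketch of that idea (the weighting scheme and names here are my illustration, not Luminar’s software), a scan pattern might bias its per-line point density toward a context-dependent region of interest:

```python
import numpy as np

def scanline_weights(num_lines: int, mode: str) -> np.ndarray:
    """Per-scanline density weights, normalized to sum to 1."""
    rows = np.linspace(-1.0, 1.0, num_lines)  # -1 = ground, +1 = sky
    if mode == "freeway":
        # Concentrate points in a narrow band around the horizon (rows near 0).
        weights = np.exp(-(rows / 0.15) ** 2)
    elif mode == "city":
        # Spread points across the lower field, where pedestrians appear.
        weights = np.exp(-((rows + 0.4) / 0.5) ** 2)
    else:
        weights = np.ones(num_lines)
    return weights / weights.sum()

freeway = scanline_weights(64, "freeway")
print(f"densest scan line: {freeway.argmax()} of 64")  # lands near the horizon band
```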

We had the good fortune of bright and sunny weather on the day of our joyride, but Russell said that the lidar’s fidelity is such that snow and rain have a minimal impact on range. “You basically have to over-spec your sensor for clear weather conditions, such that when you have inclement weather, you can still see objects really, really well,” he said.

Expanding operations

With the lidar design locked down, Luminar has focused most of its efforts over the past few months on mass production. Its 136,000-square-foot manufacturing facility in Orlando, Florida can produce a sensor every eight minutes, Russell told me. (The company’s target is 5,000 units per quarter.) These sensors are 30 percent lighter and more power-efficient than the prototypes the company’s optics engineers used to assemble by hand.
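Those two figures are roughly consistent. A quick check, with my own scheduling assumptions (a 13-week quarter) rather than the company’s:

```python
# One sensor every 8 minutes vs. a 5,000-unit quarterly target.
MINUTES_PER_SENSOR = 8
TARGET_PER_QUARTER = 5_000

line_hours = TARGET_PER_QUARTER * MINUTES_PER_SENSOR / 60
hours_per_week = line_hours / 13              # assuming a 13-week quarter
print(f"{line_hours:.0f} line-hours needed")  # ~667
print(f"{hours_per_week:.0f} hours/week")     # ~51, a bit over one 40-hour shift
```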

A recent acquisition helped Luminar scale up relatively quickly. In April, Luminar purchased Black Forest Engineering, a Colorado Springs-based company that specializes in working with the aforementioned indium gallium arsenide material. Bringing the team of 30 engineers in-house “reduced the cost of [our] lidar sensor from what originally would have been tens of thousands [of dollars] to just $3,” Russell said. “A 3-inch array [of indium gallium arsenide] used to cost $30,000, but we use very little — about the width of a human hair.”

Luminar’s technology was impressive enough to convince four automakers to come aboard as strategic partners, two of which Russell declined to name. In September, Luminar began collaborating with Toyota’s research arm, the Toyota Research Institute, on its autonomous car platform. The latest generation of that platform — Platform 3.0 — was announced at the Consumer Electronics Show in Las Vegas in January and features a Lexus LS 600hL equipped with four Luminar lidar sensors, camera arrays, and radar. Toyota calls it “one of the most perceptive automated driving test cars on the road.”

Despite Luminar’s early successes, however, it faces something of an uphill battle.

Velodyne, one of the largest lidar manufacturers in the world, supplies sensors to Ford (a Velodyne investor), Uber, Waymo, Otto, and others.

And Intel subsidiary Mobileye, which develops camera-based advanced driver-assistance systems (ADAS) with collision prevention and mitigation capabilities, counts BMW and Fiat Chrysler among its customers.

But Russell is confident that Luminar has the upper hand.

“You have to have a … fast frame rate, you have to have really great precision on each measurement, you have to have something that doesn’t interfere with other sensors of this type or interfere with sunlight, that can work in rain, fog, and snow reliably, all the while having a highly scalable solution that can [be built in the] millions with a secure supply chain, low assembly time, FDA Class 1 certification, EAR99 certification from the Department of Commerce, and still make an auto-grade system that can last … over huge temperature ranges and meet the shock environment requirements and still be able to produce at low cost,” Russell enthused.

“We basically have a monopoly on the optics-specific manufacturing talent — a collective millennium of experience building and scaling lidar systems … [and] we mapped out 2,000 ways not to build a lidar.”

Battling public perception

The elephant in the room whenever anyone talks about self-driving technologies is Uber.

After one of the company’s self-driving cars struck and killed a pedestrian in Tempe, Arizona in March, Uber suspended autonomous tests nationwide. In May, after Arizona Governor Doug Ducey suspended Uber’s ability to test autonomous cars on the state’s public roads, the company shut down its operations there.

Uber isn’t the first company to shake the public’s confidence in self-driving technology.

In 2016, a Tesla Model S equipped with Mobileye’s EyeQ3 technology failed to distinguish the white side of a truck crossing the highway from the brightly lit sky, contributing to the death of a 40-year-old Ohio man. (An investigation by the NTSB found that the driver’s hands had been on the wheel for only a few seconds in the minute before the crash.) More recently, a software engineer in California was killed when his Tesla Model X, with its autonomous driving features engaged, slammed into a concrete barrier.

And during a recent demonstration of Mobileye’s self-driving car platform, with a gaggle of reporters looking on, a test vehicle sped through a red light. (The company’s CEO blamed the mistake on electromagnetic interference from the wireless transmitters on television crew cameras.)

Unsurprisingly, consumers aren’t particularly bullish on autonomous cars. Two studies conducted in 2018, one by CarGurus.com and the other by AAA, found that roughly 73 percent of those surveyed don’t trust self-driving car technology in its current state.

Russell is all too aware of the challenges ahead.

“If you want to build a self-driving demo, you could go all vision, and it works 99 percent of the time,” Russell told me. “But you might hit one out of every hundred people. You miss the edge case scenarios.”

“I think there’s a lot of hype around urban ridesharing and other more complex environments, but there’s a massive disconnect between where the industry is at and where it needs to be to achieve those things with a [high] level of performance.”

“There’s no truly autonomous car on the road today,” Russell said. “Anyone who’s doing an autonomous demo without any drivers in it … is almost certainly not doing that or they’d be putting people’s lives at risk.”
