Robotics & Automation

Your Self-Driving Car's Eyes Can Burn Your Phone's Camera. Here's How LiDAR Actually Works.

The tech that lets autonomous vehicles see the world is more complex, more fragile, and more misunderstood than you think.

Liza Chan
AI & Emerging Tech Correspondent
December 6, 2025 · 5 min read
[Image: Close-up of an autonomous vehicle LiDAR sensor with transparent housing revealing internal components, overlaid with a 3D point cloud visualization of an urban street scene at dusk.]


LiDAR has become the defining technology of the autonomous vehicle revolution. Those spinning cylinders and sleek sensor boxes perched atop robotaxis aren't just cool-looking accessories. They're sophisticated systems that bounce lasers off everything around them to build real-time 3D maps of the world. But as Main Street Autonomy's comprehensive new breakdown reveals, there's a lot the industry would rather you not know, from sensors that can fry your smartphone camera to "solid state" claims that are mostly marketing fiction.

What LiDAR Actually Does

Unlike a camera that passively captures light, LiDAR is an active sensor. It fires laser pulses into the environment and measures how long they take to bounce back. Simple physics: light travels at a known speed, so timing the round trip gives you distance.
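The timing arithmetic really is that simple, as a rough sketch shows (illustrative only; real sensors resolve picoseconds and compensate for internal optical delays):

```python
# Time-of-flight ranging: distance is half the round trip at light speed.
C = 299_792_458.0  # speed of light in a vacuum, m/s

def tof_distance_m(round_trip_seconds: float) -> float:
    """Convert a measured round-trip time into a one-way distance."""
    return C * round_trip_seconds / 2.0

# A pulse that returns after roughly 334 nanoseconds hit something
# about 50 meters away.
print(tof_distance_m(333.6e-9))  # ≈ 50 m
```

Note the scale: at typical driving distances the round trip takes a few hundred nanoseconds, which is why LiDAR timing electronics are among the most demanding parts of the system.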

The result is a "point cloud," essentially thousands of precise 3D coordinates representing every surface the laser touched. Where cameras give you resolution and color, LiDAR gives you geometry. And for a self-driving car trying to determine whether that blob 50 meters ahead is a pedestrian or a trash can, geometry is everything.
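Each point in that cloud comes from combining one range measurement with the known direction of the beam that produced it. A minimal sketch of the geometry (simplified; real sensors also apply per-beam calibration corrections):

```python
import math

def beam_to_point(range_m: float, azimuth_rad: float, elevation_rad: float):
    """Convert one LiDAR return (range plus beam angles) to an (x, y, z) point."""
    horizontal = range_m * math.cos(elevation_rad)  # projection onto the ground plane
    x = horizontal * math.cos(azimuth_rad)
    y = horizontal * math.sin(azimuth_rad)
    z = range_m * math.sin(elevation_rad)
    return (x, y, z)

# A 50 m return from a beam pointed straight ahead lands at (50, 0, 0).
print(beam_to_point(50.0, 0.0, 0.0))
```

Repeat that conversion for hundreds of thousands of returns per second and you have the 3D map the vehicle actually reasons about.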

The Detector Wars: APD vs. SPAD

The magic happens in the detectors. Two technologies dominate the space, and understanding them explains a lot about why certain companies have succeeded while others have failed.

Avalanche Photodiodes (APDs) are the workhorses. They're reliable, operate in linear mode, and output analog signals. The downside? They require expensive analog-to-digital conversion and can't detect single photons.

Single-Photon Avalanche Diodes (SPADs) are the rising stars. They're so sensitive they can detect a single photon, essentially working like a Geiger counter for light. The game-changer: SPADs are compatible with standard CMOS manufacturing processes, the same ones used to make smartphone chips. This means companies like Ouster and Sony can build massive SPAD arrays on silicon wafers, dramatically cutting costs while boosting resolution.
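Because a single photon can also be a stray reflection of sunlight, SPAD-based systems typically fire many pulses and build a histogram of photon arrival times; the histogram peak reveals the true round trip. A toy sketch of that idea (bin width and timestamps are invented for illustration):

```python
import collections

def peak_round_trip_ns(arrival_times_ns, bin_ns=1.0):
    """Histogram single-photon arrival times; the fullest bin marks the echo."""
    bins = collections.Counter(round(t / bin_ns) for t in arrival_times_ns)
    peak_bin, _count = bins.most_common(1)[0]
    return peak_bin * bin_ns

# Noise photons arrive at random times; signal photons cluster near 333 ns.
arrivals = [50.2, 333.1, 333.4, 120.9, 333.2, 333.3, 401.7]
print(peak_round_trip_ns(arrivals))  # -> 333.0
```

This statistical trick is what lets a detector that fires on any photon, signal or not, still produce clean range measurements.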

The Wavelength Dilemma: Why 1550nm LiDAR Can Destroy Your Phone

Here's where things get interesting, and slightly terrifying.

Most LiDAR systems operate at 905 nanometers, in the near-infrared spectrum. There's a problem: your eye focuses this light directly onto your retina. Crank up the power, and you risk literally burning someone's eye. Regulations keep these systems strictly power-limited.

The workaround? Switch to 1550 nanometers. This wavelength gets absorbed by the fluid in your eye before reaching the retina, allowing systems to blast out up to a million times more pulse energy while remaining "eye safe."

But there's a catch. That massive power output isn't kind to camera sensors. Both AEye and Luminar's systems (the latter found on vehicles like the Volvo EX90) have been documented destroying smartphone and professional camera sensors that happened to be in the line of fire. Point your iPhone at the wrong robotaxi, and you might end up with a permanently damaged sensor.

The "Solid State" Marketing Myth

Perhaps the industry's most persistent fiction is the term "solid state LiDAR." Manufacturers love slapping this label on their products because it implies reliability. No moving parts means nothing to break, right?

The reality is messier. Many LiDARs marketed as "solid state" are anything but.

Livox's rectangular sensors look the part but use Risley prisms, a pair of rotating optical elements that steer the beam. Luminar's systems employ galvanometers and polygonal mirrors. These are mechanical scanning systems dressed up in sleek, non-spinning enclosures.

Even Velodyne once marketed its externally spinning HDL-64 as "Solid-State Hybrid," a claim that stretches the definition past its breaking point.

True solid state, where absolutely nothing moves inside the sensor, remains largely aspirational. Optical phased arrays could deliver this eventually, but commercial success has proven elusive. Startups like Quanergy and Oryx Vision tried and failed.

The Calibration Problems Nobody Talks About

Even the most famous datasets have dirty secrets. The widely used KITTI dataset, a benchmark for autonomous driving research, contains known calibration errors. Flat walls appear curved. Range offsets vary between laser beams. Researchers have had to manually correct these issues to achieve decent results.

Then there's "blooming," a phenomenon where highly reflective surfaces like road signs or retroreflectors overwhelm the detector, creating ghost objects where nothing exists. Imagine your self-driving car swerving to avoid a phantom obstacle because a highway sign was too shiny.

What's Next for LiDAR?

The industry continues pushing toward genuine solid-state solutions and better FMCW (Frequency Modulated Continuous Wave) systems that can measure not just distance but instantaneous velocity through the Doppler effect.
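The Doppler relationship behind FMCW is compact enough to sketch (simplified; real FMCW systems extract the shift from beat frequencies between chirped transmit and return signals):

```python
# Radial velocity from a measured Doppler shift: v = f_d * wavelength / 2.
# The factor of 2 appears because the light is shifted on the way out
# and again on the way back.
WAVELENGTH_M = 1550e-9  # a common FMCW operating wavelength

def radial_velocity_mps(doppler_shift_hz: float) -> float:
    """Positive shift means the target is closing; negative means receding."""
    return doppler_shift_hz * WAVELENGTH_M / 2.0

# A shift of about 12.9 MHz at 1550 nm corresponds to roughly 10 m/s
# of closing speed.
print(radial_velocity_mps(12.9e6))  # ≈ 10 m/s
```

Getting per-point velocity directly, rather than inferring it by comparing successive frames, is why FMCW is considered such a promising direction.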

Companies like Baraja are experimenting with wavelength-tuning approaches that eliminate mechanical scanning in one dimension entirely. Sony's latest SPAD sensors are achieving unprecedented dynamic range.

The technology is maturing, but it's worth remembering that these are precision instruments operating in hostile environments: rain, dust, temperature extremes, and the occasional curious pedestrian pointing a smartphone directly at them. The gap between marketing claims and engineering reality remains wider than many consumers realize.

For now, the next time you see a robotaxi roll by, know that those innocent-looking sensor boxes are firing invisible laser pulses powerful enough to damage electronics, building 3D maps hundreds of times per second, and probably still hiding a few moving parts inside, no matter what the brochure says.

Tags: LiDAR, autonomous vehicles, self-driving cars, sensors, robotics, automotive technology, Waymo, Ouster, Luminar, SPAD
Liza Chan

AI & Emerging Tech Correspondent

Liza covers the rapidly evolving world of artificial intelligence, from breakthroughs in research labs to real-world applications reshaping industries. With a background in computer science and journalism, she translates complex technical developments into accessible insights for curious readers.


