Well, here’s a rather disconcerting claim to surface at last week’s CES. Jit Ray Chowdhury, an autonomous vehicle engineer at Ridecell, claimed that his $1,998 Sony camera was damaged by a demo autonomous car outfitted with AEye’s LIDAR units. Chowdhury said that after he snapped photos of the car at CES, he noticed that subsequent images taken by the same camera had two bright purple spots with horizontal and vertical lines emanating from them.
Suspecting that the camera sensor was wrecked, he covered the camera with a lens cap to verify, and sure enough, the spots were still there. Chowdhury concluded that the laser from the LIDAR units must have burned into the sensor when he took photos of the autonomous car. Following the report, AEye CEO Luis Dussan reached out to Ars Technica, assuring readers that AEye’s LIDAR technology poses no danger to human eyes.
However, he did not dismiss the possibility that AEye’s LIDAR could damage camera sensors, as camera sensors are typically “1000x more sensitive to lasers than humans’ eyeballs,” and “occasionally, this can cause thermal damage to a camera’s focal plane array,” Dussan said. While we can breathe a sigh of relief that it does not affect human eyes, there is a real concern here.
You see, LIDAR is essential to enabling self-driving cars, and so are conventional cameras, and herein lies the problem: if LIDAR units damage regular camera sensors, they will also damage the conventional cameras fitted to self-driving cars, leaving those cars unable to do what they are designed to do, which is drive themselves. So, it could be rather dangerous if several self-driving cars share the roads, as they would be destroying each other’s cameras and possibly rendering them unable to sense accurately.
Needless to say, a car that cannot sense accurately, or as intended, is an accident waiting to happen. It is not known whether this camera-destroying phenomenon applies to all LIDAR technology or is isolated to AEye’s LIDAR. It is worth noting, though, that lasers are known to damage camera sensors, which means LIDAR, which also uses lasers, does indeed pose some risk to imaging sensors.
Dussan seems to suggest that this is a problem inherent to LIDAR technology, one he says it would take the “entire LIDAR and laser community” to address. However, one competitor, Ouster, begs to differ. Ouster’s CEO, Angus Pacala, proclaimed that Ouster’s sensors “are camera and eye safe.” Now, if that isn’t rubbing salt into the wound…
Then again, Pacala might have a point, because a number of other companies have been testing LIDAR-equipped cars on the road for over a year, and during this time, countless people have taken images of these self-driving cars, but so far, no one has come forward with a claim of a damaged camera sensor. So, suffice it to say, LIDAR is not a risk to cameras. Or is it, really?
Anyway, if you are concerned and interested in finding out more about this odd development, you can read the full report HERE. Before you go, you might also be interested in the video below of a laser damaging an imaging sensor, caught on camera.
Images: Jit Ray Chowdhury.
Source: Ars Technica.