MIT's new tech could help car drivers see people or objects hidden behind dense fog. REUTERS/Clodagh Kilcoyne

Driving in foggy conditions is a major cause of accidents during chilly seasons. Drivers cannot see more than a few meters ahead and can end up running into a person or another vehicle shrouded in mist.

The same goes for autonomous vehicles, which combine sensor data with live image and video feeds yet often fail to identify objects sitting ahead in fog.

However, researchers at the Massachusetts Institute of Technology (MIT) may have found a solution to the pressing problem: a new laser-based imaging technique that could work for conventional and self-driving cars alike.

The system fires short laser pulses and measures the time they take to return to the camera's sensor; that round-trip time determines how far away an object is. The principle sounds simple, but turning the raw measurements into an image involves a lot of processing.
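For illustration, here is a minimal Python sketch of the basic time-of-flight calculation described above: distance follows from a pulse's round-trip time and the speed of light. The timing value used is a hypothetical example, not a measurement from the MIT system.

```python
# A minimal sketch of the time-of-flight principle described above.
# The round-trip time below is a hypothetical example value.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_round_trip(round_trip_time_s: float) -> float:
    """The pulse travels to the object and back, so halve the total path."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# A pulse returning after roughly 3.8 nanoseconds came from about 57 cm away.
print(f"{distance_from_round_trip(3.8e-9) * 100:.0f} cm")
```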

Laser light behaves predictably on a clear day, bouncing off an object and returning along a direct path. In fog, however, the journey is disrupted by the water droplets suspended in the air: the light scatters in all directions and arrives back at the sensor at different times, undermining the basic idea of measuring distance by timing.

“We’re dealing with realistic fog, which is dense, dynamic, and heterogeneous,” Guy Satat, one of the researchers involved in the project, said in a statement. “It is constantly moving and changing, with patches of denser or less-dense fog.”

However, the group found that no matter how dense the fog is, the timing of the scattered light always adheres to a specific statistical distribution. With that insight, they developed a processing algorithm that fits this pattern to a histogram of the returning photons, counted in time slots about a trillionth of a second wide. Once the fog's contribution is filtered out, the photons reflected by the hidden object stand out and the object is revealed.
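To make the idea concrete, here is a rough Python sketch, not the authors' actual implementation: it assumes the fog-scattered arrival times follow a gamma-like distribution, builds a time histogram from hypothetical photon counts, subtracts the fitted fog contribution, and reads the hidden object's distance off the residual peak.

```python
# A rough sketch of the idea, not the authors' implementation. It assumes
# fog-scattered arrivals follow a gamma-like distribution; photon counts,
# timings, and the bin width are hypothetical.
import numpy as np
from scipy import stats

SPEED_OF_LIGHT = 299_792_458.0  # meters per second
BIN_WIDTH_S = 1e-12             # one time slot per trillionth of a second
rng = np.random.default_rng(0)

# Hypothetical arrival times (seconds) at one sensor pixel: many photons
# back-scattered by the fog plus a small burst reflected by a hidden object.
fog_photons = rng.gamma(shape=2.0, scale=4e-10, size=20_000)
object_photons = rng.normal(loc=3.8e-9, scale=3e-11, size=2_000)
arrivals = np.concatenate([fog_photons, object_photons])

# Build the arrival-time histogram with picosecond-wide bins.
bins = np.arange(0.0, 6e-9, BIN_WIDTH_S)
counts, edges = np.histogram(arrivals, bins=bins)
centers = (edges[:-1] + edges[1:]) / 2.0

# Fit a gamma model to all arrivals; the fog dominates, so the fit tracks it.
shape, loc, scale = stats.gamma.fit(arrivals, floc=0.0)
expected_fog = (stats.gamma.pdf(centers, shape, loc=loc, scale=scale)
                * arrivals.size * BIN_WIDTH_S)

# Subtracting the fog's expected contribution leaves the object's reflection.
residual = counts - expected_fog
peak_time = centers[np.argmax(residual)]
print(f"Estimated object distance: {SPEED_OF_LIGHT * peak_time / 2:.2f} m")
```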

“What’s nice about this is that it’s pretty simple,” Satat added. “If you look at the computation and the method, it’s surprisingly not complex. We also don’t need any prior knowledge about the fog and its density, which helps it to work in a wide range of fog conditions.”

When put to the test in a meter-long chamber filled with thick, artificially produced fog, the system was able to see objects that the naked eye could not. Specifically, human vision could not peer further than 36 centimeters into the fog, while the new technology produced images of objects sitting as far as 57 centimeters away.

Interestingly, real-world fog rarely gets that dense, typically leaving 30 to 50 meters of on-road visibility. This means that if the system were integrated into a conventional or self-driving vehicle, it could discern fog-shrouded objects from far enough away to give the driver, or the car itself, sufficient time to avoid a mishap.

The work on the novel system has been described in a paper set to be presented at the International Conference on Computational Photography in May.