Seeing in the dark with Headlight AI

Harnessing the power of data through lidar, Headlight AI is developing a technology that will help robots see and operate in particularly harsh environments.


Robots today are quite smart. They can move around autonomously, avoid obstacles, and make sense of complex visual or auditory patterns.

When it comes to adverse conditions, however, their sensors can be fooled quite easily. High pressure or heavy weather can render a robot’s calculations inaccurate, or completely unusable.

Headlight AI is working to solve this problem.

Founded by CEO Jameel Marafie and CTO Puneet Chhabra, the EF-backed and SHACK15 resident start-up is developing technologies that will allow robots to operate in GPS-denied locations and in poor visibility, and to see ‘beyond 3D’, distinguishing between different materials to provide greater insights.

In an exclusive interview with SHACK15 (Outside Insight), Jameel and Puneet speak about Headlight AI’s goals and the road so far.

Why Headlight AI?

“We were looking for something that was simple, easy to remember, but also relevant to the business”, says Jameel. The name came to him while waiting for a night bus in London. “It was dark and this bus’s headlights just flashed me”, he says, smiling, “I know it sounds kind of cheesy, but that’s how it came.”

Explaining how this connects to the company, Jameel says that “everywhere our technology is relevant, we’ll need some form of headlights.”

Since Headlight’s technology is mostly used in pitch-black underground locations, as well as in heavy weather and poor visibility, some form of light source is needed to visualise what is going on.

As for the ‘AI’ part, Jameel believes artificial intelligence is not yet being used intelligently enough when it comes to integration with light-based sensors.

“We are working to bring artificial intelligence towards light-based sensing, and that is why Headlight AI.”

The Technology

“What we are primarily doing is creating adaptive sensing and mapping software”, says Puneet. In other words, they are developing artificial intelligence software that makes sensors more efficient.

“Our primary focus is on environments these sensors haven’t encountered before. Sensors on vehicles operating in foggy, smoky or snowy conditions, for example.”

Puneet explains that existing sensors don’t work reliably in these harsh environments.

“We are creating AI which sits right at the ‘door level’. It talks to the sensors, allowing them to talk to each other and optimise depending on the environment they are in.

“There is a huge amount of data that is generated by the sensors, and we are able to handle it and process it in real time.”
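As a rough illustration of the kind of adaptation described here (not Headlight AI’s actual software), the sketch below assumes two hypothetical range sensors, a lidar and a radar, observing the same obstacle. A simple confidence heuristic based on lidar return density re-weights the fused estimate frame by frame, leaning on the radar when visibility degrades.

```python
# Illustrative sketch only -- not Headlight AI's actual code.
# Hypothetical lidar + radar pair; lidar confidence is estimated from
# how many returns survive (fog and smoke absorb laser pulses).
import numpy as np

def lidar_confidence(point_cloud: np.ndarray, expected_points: int = 1000) -> float:
    """Crude confidence score: a sparse cloud suggests the lidar is struggling."""
    return min(1.0, len(point_cloud) / expected_points)

def fuse_range(lidar_range: float, radar_range: float, lidar_conf: float) -> float:
    """Weighted average that shifts towards radar as lidar confidence drops."""
    w_lidar = lidar_conf
    w_radar = 1.0  # radar assumed largely unaffected by visibility
    return (w_lidar * lidar_range + w_radar * radar_range) / (w_lidar + w_radar)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Clear air: dense cloud, so lidar carries nearly equal weight.
    clear_cloud = rng.normal(10.0, 0.02, size=(950, 3))
    print(fuse_range(10.0, 10.3, lidar_confidence(clear_cloud)))
    # Fog: only a handful of returns, so the estimate moves towards radar.
    foggy_cloud = rng.normal(10.0, 0.5, size=(40, 3))
    print(fuse_range(7.1, 10.3, lidar_confidence(foggy_cloud)))
```

The point of the toy example is the per-frame feedback loop: the sensors’ own output drives how much each one is trusted, rather than a fixed fusion tuned for clear conditions.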

Commercial Applications

“We are completely platform- and sensor-agnostic”, Puneet explains, “that is where the real beauty of the AI comes in. It’s like an organism adapting to ever-new environments”.

The applications of their technology are varied, adds Jameel: robotics, infrastructure management and autonomous road vehicles are just some of them.

“We are developing sensor technology. Now, those sensors can be set on a fixed platform that is monitoring somewhere; it can be on a ground-based robot that is going around, it can be on a drone, on a bus, on a train.”

Jameel says they are currently conducting trials inside tunnels, creating 3D maps of pitch-black locations.
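To give a flavour of what building a 3D map without GPS or light involves, here is a minimal sketch, under the assumption of two simulated lidar scans of the same tunnel wall. It aligns one scan onto the other with a few iterations of point-to-point ICP, a classic building block for stitching scans into a map; it is not Headlight AI’s method.

```python
# Illustrative sketch: align two simulated lidar scans with point-to-point ICP.
import numpy as np
from scipy.spatial import cKDTree

def icp(source: np.ndarray, target: np.ndarray, iterations: int = 20) -> np.ndarray:
    """Return a 4x4 rigid transform aligning `source` (N,3) onto `target` (M,3)."""
    transform = np.eye(4)
    src = source.copy()
    tree = cKDTree(target)
    for _ in range(iterations):
        # Pair every source point with its nearest target point.
        _, idx = tree.query(src)
        matched = target[idx]
        # Best-fit rigid motion between the paired sets (Kabsch / SVD).
        src_c, tgt_c = src.mean(axis=0), matched.mean(axis=0)
        H = (src - src_c).T @ (matched - tgt_c)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:   # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = tgt_c - R @ src_c
        src = src @ R.T + t
        step = np.eye(4)
        step[:3, :3], step[:3, 3] = R, t
        transform = step @ transform
    return transform

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    wall = rng.uniform(0, 5, size=(2000, 3))               # fake tunnel-wall points
    true_shift = np.array([0.1, -0.04, 0.02])
    scan_b = wall + true_shift + rng.normal(0, 0.002, wall.shape)
    est = icp(wall, scan_b)
    print("estimated translation:", est[:3, 3])            # should land near true_shift
```

Real tunnel-mapping pipelines add loop closure, motion compensation and far more robust matching, but the registration step above is the part that works just as well in total darkness, since lidar brings its own light.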

“We are also partnering with some sensor companies. As well as end users, we’re looking at sensor partners.”

Competitors?

“Other companies are fusing the data coming from the sensors to create better maps”, says Puneet, “which works in optimal environmental conditions.” That technology, however, doesn’t adapt to unknown conditions, he adds.

To adapt to new conditions, the sensors need external information. In Headlight’s system this is provided by radar, fed through software that is completely adaptable.
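A hypothetical sketch of that idea: compare the lidar against an external radar reference and switch processing mode when they disagree, instead of trusting a fusion tuned only for clear conditions. The function name and threshold are illustrative, not part of Headlight’s software.

```python
# Hypothetical mode switch driven by lidar/radar disagreement.
def choose_mode(lidar_range_m: float, radar_range_m: float,
                tolerance_m: float = 0.5) -> str:
    """Pick a processing mode from the disagreement between the two sensors."""
    if abs(lidar_range_m - radar_range_m) <= tolerance_m:
        return "clear"      # sensors agree: use the normal, lidar-led pipeline
    return "degraded"       # likely fog, smoke or dust: fall back to radar-led mapping

print(choose_mode(10.1, 10.3))   # -> clear
print(choose_mode(6.2, 10.3))    # -> degraded
```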

“There are a lot of companies developing software for robotics in indoor environments. But having the sensors change their behaviour, and doing that through AI, has not yet been explored.

“And if you combine that with high-level mission planning, then you can have multiple vehicles, multiple robots. It’s a unique and interesting proposition that we are offering.”

What is your vision?

“Any device and platform that needs a combination of sensors should be using our software,” says Jameel, “because they’re going to have much greater insights than from any individual sensor taken on its own.

“We understand this is a very grand vision, because sensors are everywhere and there are going to be well over a hundred billion sensors in the next five years. They’re going to be everywhere in transport, infrastructure, construction, security. Pretty much all the major industries.”

Puneet adds that these sensors are going to become the backbone of the next generation of the internet of things (IoT) industry, and they are going to be increasingly complex. Because of this, software capable of managing multiple sensors at the same time will be key to future developments.

“If we can manage to embed our software into any system that needs to use lots of sensors, that’s going to be the realisation of our grand vision.”

What does the future hold?

At the moment, Jameel explains, Headlight AI is offering its software as a service, so the company is going to be looking for customers in that area.

“Within two years, we will have generated enough IP and patents to start licensing to different companies”, Jameel says.

As CTO and the person in charge of recruiting, Puneet says he would like to see the team expand with a culture that mixes academic research and a drive to bring technology to the world faster.

“We are looking for funding from investors and we are currently recruiting”, adds Jameel. “We want to be a team of around five by January, so we’re looking forward to hearing from anyone who would like to get involved!”
