
This Startup Gives Autonomous Vehicles Superhuman Vision


Autonomous vehicles perform well – until the environment turns against them.

Fog, dust, snow, or bright sunlight can throw traditional sensors off. LiDAR returns scatter in airborne particles. Cameras struggle with glare. Radar lacks the resolution to pick up small obstacles. When that happens, autonomy doesn’t just slow down. It fails.

That’s the harsh reality for mining fleets, military vehicles, and port operators, where tough conditions often force humans to take over.

The issue isn’t the AI making decisions, but the sensors feeding it. Without reliable vision in messy real-world environments, autonomy can’t scale.

A single vision system for every environment

Meet LEO, the all-weather sensor from Autonomous Knight.

Instead of using separate LiDAR, radar, and thermal units, LEO combines visible, infrared, and thermal imaging into one clean video stream.

It works in tough conditions (fog, smoke, dust, snow, or darkness) where most sensors struggle or fail. And because it runs fusion directly at the edge, it gives you sharp, real-time awareness without overloading your system.
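LEO’s actual fusion pipeline is proprietary, so the sketch below is purely illustrative: it shows the basic idea of multispectral fusion as a per-pixel weighted blend of co-registered visible, infrared, and thermal frames, assuming the channels are already spatially aligned and normalized to [0, 1]. The function name and weights are invented for the example.

```python
import numpy as np

def fuse_frames(visible, infrared, thermal, weights=(0.5, 0.3, 0.2)):
    """Blend co-registered single-channel frames into one image.

    Illustrative only: assumes all frames share the same shape, are
    already spatially aligned, and are scaled to [0, 1].
    """
    stack = np.stack([visible, infrared, thermal])        # (3, H, W)
    w = np.asarray(weights, dtype=float).reshape(3, 1, 1)
    fused = (stack * w).sum(axis=0) / w.sum()
    return np.clip(fused, 0.0, 1.0)

# Example: fog washes out the visible channel into near-uniform glare,
# but the thermal channel still carries contrast from a warm obstacle.
# The fused frame keeps some of that signal.
h, w = 4, 4
visible = np.full((h, w), 0.9)                 # glare everywhere
thermal = np.zeros((h, w))
thermal[1:3, 1:3] = 1.0                        # warm obstacle
infrared = thermal * 0.5
fused = fuse_frames(visible, infrared, thermal)
```

The obstacle region ends up brighter than the background in the fused frame even though the visible channel alone shows nothing, which is the core payoff of fusing bands that fail in different conditions.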


A complete developer stack for perception and control

LEO delivers more than hardware. It’s a complete perception platform.

Autonomous Knight pairs the sensor with a full software stack – a developer-friendly SDK, real-time sensor fusion software, and an AI simulation suite built to handle edge cases in messy environments.

It works with open APIs, connects easily to most autonomy stacks – from land vehicles to marine systems – and is hardware-agnostic, so teams can integrate it without overhauling existing setups.

From sensing to simulation, it’s everything needed to build autonomy that works in even the toughest real-world environments.
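Autonomous Knight has not published its SDK, so the class and method names below are hypothetical. The sketch only illustrates what a “hardware-agnostic, API-ready” integration typically looks like: an autonomy stack subscribes to a stream of fused frames and never needs to know which sensors produced them.

```python
# Hypothetical integration sketch: PerceptionClient, FusedFrame, and
# the endpoint URL are invented for illustration and do not come from
# Autonomous Knight's real SDK.
from dataclasses import dataclass, field

@dataclass
class FusedFrame:
    timestamp_ns: int
    image: list            # fused multispectral frame (placeholder)
    confidence: float      # per-frame fusion confidence

class PerceptionClient:
    """Stand-in for a vendor SDK that streams fused frames."""
    def __init__(self, endpoint: str):
        self.endpoint = endpoint
        self._subscribers = []

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def _emit(self, frame: FusedFrame):
        # In a real SDK this would be driven by the sensor; here we
        # call it by hand to show the data flow.
        for cb in self._subscribers:
            cb(frame)

# The consuming autonomy stack just registers a callback -- that is
# the "hardware-agnostic" promise in miniature.
received = []
client = PerceptionClient("tcp://leo.local:5555")
client.subscribe(lambda f: received.append(f))
client._emit(FusedFrame(timestamp_ns=0, image=[], confidence=0.97))
```

The design point is that the autonomy stack depends only on the fused-frame interface, so swapping sensor hardware underneath does not touch the planning or control code.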

Why heavy industries are the first to use LEO

Autonomous Knight isn’t focused on robotaxis and consumer fleets (for now, at least). It’s building for the industries that need autonomy today.

Think mining, maritime, agriculture, and defense, all industries where weather, dust, and danger are part of daily operations.

In mines, haul trucks move through thick dust clouds that blind cameras and scatter LiDAR. On ships, fog, rain, and glare make visibility nearly impossible. And keep in mind, these are everyday challenges.

LEO handles them by combining radar, thermal imaging, and AI into a single, rugged perception system. It keeps machines running around the clock, even when other sensors can’t.

In fog, a ship’s cameras are blinded and its LiDAR returns scatter. Image source: Elena G., Unsplash

Autonomy veterans with global backing

Autonomous Knight isn’t your average startup. It’s led by veterans who know autonomy and backed by global names that know how to scale it.

  • Noël Jans has deep experience in mining automation and field operations.
  • Geoffrey Ejzenberg has led logistics projects across multiple continents.
  • Sacha Vrazic, formerly at Rimac, helped shape cutting-edge ADAS and self-driving systems.

On the partner side, big players are already involved. Volvo Group brought the company into its CampX accelerator. Support from EIT Urban Mobility, Plug and Play, and Berkeley’s SkyDeck adds global reach and technical depth.

With early EU innovation grants, photonics awards, and active pilots in the field, Autonomous Knight is already gaining notable traction.

Licensing smart vision to the autonomy world

The company isn’t selling another sensor or chasing consumer apps. It’s offering something more foundational – a smart vision layer that plugs into any autonomy system.

Here’s how the model works:

  • SaaS + SDK licensing – OEMs, fleet operators, and robotics developers can license Autonomous Knight’s software and tools to embed its multispectral perception directly into their own platforms.
  • Hardware + software bundle – Autonomous Knight ships its own integrated sensor, replacing bulky multi-sensor rigs with a single, compact module that sees across visible, infrared, and thermal bands.
  • No consumer play – This isn’t a dashboard app. It’s the behind-the-scenes tech that makes autonomy safer and more reliable.
  • Modular and open – Autonomous Knight’s system is hardware-agnostic and API-ready, so it can work with existing autonomy stacks, from small robots to industrial vehicles.

In short, Autonomous Knight isn’t selling sensors, but providing visibility that systems can trust, and the intelligence to act on it.

Why perception is the next big barrier

In autonomy, the biggest challenge isn’t maps or driving logic, but how well a machine can see.

Most systems still falter in tough conditions like fog, glare, snow, or deep shadows. And when sensors fail, so does safety. That’s why multispectral fusion that combines visual, thermal, and radar data is becoming a key advantage.

Autonomous Knight fills that gap. Its tech gives vehicles a clearer picture in real-world environments, helping them make smarter decisions, avoid mistakes, and stay operational when others can’t.

Whether it’s a delivery robot on icy streets or a safety-critical vehicle in low light, better perception is essential. And that’s exactly what Autonomous Knight is building.

Example of hazardous conditions in which traditional sensors can malfunction. Image source: Kyle Loftus, Unsplash

Scaling multispectral vision

Autonomous Knight is moving from testing to deployment.

With a seed round underway, the team is focused on scaling its multispectral vision system from proven prototypes to live pilots. Early rollouts are already happening in mining, autonomous shipping, and with potential OEM partners.

And the bigger goal? We can only guess right now, but we see a clear ambition to become the invisible engine behind autonomy, a plug-and-play vision layer that works in any robot, truck, or drone, no matter the conditions.

Filip Bubalo

Researcher & writer for Charging Stack. Marketing manager at PROTOTYP where I help mobility companies tell better stories. Writing about the shift to electric vehicles, micromobility, and how cities are changing — with a mix of data, storytelling, and curiosity. My goal? Cut through the hype, make things clearer, and spotlight what actually works.



Stay informed.
Subscribe to Charging Stack ⚡️