Rover Navigation

NGC is actively developing navigation software that increases the autonomy of planetary exploration rovers, a key factor in improving mission efficiency. This work covers the design of onboard state-estimation algorithms and image-processing software for autonomous mobile platforms.

NGC’s rover vision-based navigation system uses the following technologies:

  • Image processing for stereo vision
  • Visual odometry
  • Sensor fusion
  • State estimation
  • Sensor calibration
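To illustrate how two of these building blocks fit together, here is a minimal Python sketch (not NGC's flight software) of stereo depth recovery and the rigid-motion step at the core of visual odometry. The pinhole stereo formula and the SVD-based Kabsch/Horn alignment are standard techniques; the function names and parameters are illustrative assumptions.

```python
import numpy as np

def stereo_depth(disparity_px, focal_px, baseline_m):
    """Depth from stereo disparity (pinhole model): Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

def estimate_motion(pts_prev, pts_curr):
    """Estimate the rigid motion (R, t) between two matched sets of 3-D
    feature points using the SVD-based Kabsch/Horn method, a common core
    of stereo visual odometry. Points are Nx3 arrays, N >= 3."""
    c_prev = pts_prev.mean(axis=0)
    c_curr = pts_curr.mean(axis=0)
    # Cross-covariance of the centred point clouds
    H = (pts_prev - c_prev).T @ (pts_curr - c_curr)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_curr - R @ c_prev
    return R, t
```

In a full visual-odometry pipeline, matched image features would first be triangulated to 3-D points via `stereo_depth`, and `estimate_motion` would then be wrapped in an outlier-rejection scheme such as RANSAC before the increment is fused with other sensors.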

NGC’s Vision-Based Navigation system for rovers has been deployed onboard the Artemis Jr Moon exploration prototype rover. It also successfully supported the field deployment campaign for the NASA RESOLVE experiment, whose primary goal was to characterize water and other volatiles in the lunar regolith.

NGC also developed a complementary Lidar-based localization system that reconstructs the surrounding terrain topography and matches it to a reference Digital Elevation Model (DEM) of the area. This system was also field-demonstrated as part of the Artemis Jr development activities.
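The matching step can be pictured with a deliberately simple Python sketch (an assumption for illustration, not NGC's algorithm): a locally sensed elevation patch is slid over a reference DEM grid and scored with a zero-mean sum of squared differences, which makes the match insensitive to a constant altitude bias. Real systems use far more robust scan-matching techniques.

```python
import numpy as np

def locate_patch(dem, patch):
    """Brute-force localization of a sensed elevation patch against a
    reference DEM. Returns the (row, col) offset with the lowest
    zero-mean SSD score; the mean removal cancels any constant
    elevation bias between the sensor and the reference map."""
    ph, pw = patch.shape
    p0 = patch - patch.mean()
    best_score, best_rc = np.inf, (0, 0)
    for r in range(dem.shape[0] - ph + 1):
        for c in range(dem.shape[1] - pw + 1):
            window = dem[r:r + ph, c:c + pw]
            score = np.sum((window - window.mean() - p0) ** 2)
            if score < best_score:
                best_score, best_rc = score, (r, c)
    return best_rc
```

The grid offset returned here would then be converted to map coordinates and fused with the rover's onboard state estimate to provide an absolute position fix.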

NGC is currently part of the Canadensys CSA Lunar Rover Mission (LRM) team. The team has been mandated by the CSA to design a microrover concept that will explore a polar region of the Moon in 2026.


Rover Motion Detection from Image Processing

Rover Absolute Position Determination System


Test and Validation Environments


When you work with us, you’ll find we are very flexible when it comes to finding ways to integrate our technology into your project. We also provide in-depth training and the support of one of the most experienced Guidance, Navigation and Control (GNC) systems teams in the world.