
Localization

Mapping (JMI) and Localization (WAI) are key ingredients for autonomous vehicles performing any task in real-world operations. Creating a map of the environment is required to obtain a global position similar to what a GPS provides. The technology works without structural supports or changes to the operating area. Large areas can be mapped in a few steps, and the localization delivers stable, high-accuracy results even in challenging situations. Because localization depends on a well-calibrated sensor setup, and to meet client expectations for accuracy and speed, we have defined a minimum sensor configuration for specific tasks. The modular software supports a wide range of common sensors and sensor combinations.

Tasks: Indoor/Outdoor Localization in 2D (x, y, theta) or 3D (x, y, z, yaw, pitch, roll)

Sensors: LIDAR (Ouster, Velodyne, SICK, …), Depth Camera (Intel Realsense, Azure Kinect, …), Inertial Measurement Units
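The 2D and 3D pose conventions above can be sketched as simple data types. This is an illustrative sketch only, assuming metres for position and radians for angles; the class names are not part of the product's interface.

```python
# Illustrative pose types for the 2D (x, y, theta) and
# 3D (x, y, z, yaw, pitch, roll) localization outputs.
from dataclasses import dataclass

@dataclass
class Pose2D:
    """Planar pose: position in metres, heading in radians."""
    x: float
    y: float
    theta: float

@dataclass
class Pose3D:
    """Full 3D pose: position plus yaw/pitch/roll Euler angles."""
    x: float
    y: float
    z: float
    yaw: float
    pitch: float
    roll: float

    def to_2d(self) -> Pose2D:
        # Project onto the ground plane; yaw becomes the 2D heading.
        return Pose2D(self.x, self.y, self.yaw)
```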

JMI – Mapping

Mapping is typically done with a SLAM approach: the operating area is recorded once and a map is generated.

  • Suitable for large and complex areas
  • Very fast processing (faster than real time)
  • No post-processing required
  • Simple operation; no special skills required
  • Output is a point-cloud representation (x, y, z)
  • Web application interface
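The point-cloud map output listed above is a set of (x, y, z) samples of the environment. As a minimal sketch of handling such data, the helper below thins a cloud with a voxel grid; the voxel size and the helper itself are illustrative assumptions, not part of the product's interface.

```python
# Voxel-grid downsampling of an (x, y, z) point cloud: keep the
# first point seen in each cubic voxel of edge length `voxel` metres.
import math

def voxel_downsample(points, voxel):
    seen = {}
    for x, y, z in points:
        key = (math.floor(x / voxel), math.floor(y / voxel), math.floor(z / voxel))
        seen.setdefault(key, (x, y, z))  # first point per voxel wins
    return list(seen.values())

cloud = [(0.01, 0.02, 0.00),
         (0.02, 0.01, 0.00),   # falls into the same 5 cm voxel as above
         (1.00, 1.00, 0.00)]
print(len(voxel_downsample(cloud, 0.05)))  # 2
```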

WAI – Localization

The localization unit requires a previously mapped area and outputs its own position, or the position expressed in a desired vehicle coordinate frame.

  • Standard accuracy with a single LIDAR: < 2 cm in position and < 1° in orientation
  • High-precision accuracy with multiple sensors (on demand)
  • Performs in highly dynamic environments (more than 80 % of the environment may change)
  • Easy fusion of several sensors and types of sensors
  • Interface via ROS, TCP, Modbus, CAN, REST, VDA5050
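Converting the localization unit's pose into a vehicle coordinate frame, as mentioned above, is a 2D rigid transform. The sketch below assumes a fixed mounting offset of the vehicle origin expressed in the sensor frame; the function name and parameters are illustrative, not the product API.

```python
# Express the vehicle origin in the map frame, given the sensor's
# map-frame pose and the vehicle origin's offset in the sensor frame.
import math

def sensor_to_vehicle(sx, sy, stheta, off_x, off_y, off_theta):
    """(sx, sy, stheta): sensor pose in the map frame.
    (off_x, off_y, off_theta): vehicle origin in the sensor frame.
    Returns the vehicle pose (x, y, theta) in the map frame."""
    c, s = math.cos(stheta), math.sin(stheta)
    vx = sx + c * off_x - s * off_y
    vy = sy + s * off_x + c * off_y
    return vx, vy, stheta + off_theta

# Sensor at (1, 0) facing +y; vehicle origin 0.5 m ahead along the
# sensor's own x axis ends up at (1, 0.5) in the map frame.
print(sensor_to_vehicle(1.0, 0.0, math.pi / 2, 0.5, 0.0, 0.0))
```

Whatever interface is used (ROS, TCP, Modbus, CAN, REST, VDA5050), the published pose follows this kind of frame convention.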

