Mapping/SLAM (JMI) and Localization (WAI) are key building blocks that let autonomous vehicles perform any task in real-world operations. Creating a map of the environment (SLAM) provides a global position reference similar to what a GPS delivers outdoors. The technology works without any added infrastructure and without modifying the operating area. Large areas can be mapped in a few steps, and the localization delivers stable, highly accurate results even in challenging situations. Because localization depends on a well-calibrated sensor setup, we define a minimum configuration per task to meet client expectations for accuracy and speed. The modular software supports a wide range of standard sensors and sensor combinations.
Tasks: Indoor/Outdoor Localization in 2D (x, y, theta) or 3D (x, y, z, yaw, pitch, roll)
Sensors: LIDAR (Ouster, Velodyne, SICK, …), Depth Camera (Intel RealSense, Azure Kinect, …), Inertial Measurement Units (IMUs)
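To make the two pose conventions concrete, here is a minimal Python sketch of the corresponding data types. The class names, field order, and units (metres, radians) are illustrative assumptions, not part of the product interface.

```python
from dataclasses import dataclass

@dataclass
class Pose2D:
    """Planar pose for 2D indoor/outdoor localization."""
    x: float      # position, assumed in metres
    y: float
    theta: float  # heading, assumed in radians

@dataclass
class Pose3D:
    """Full 6-DoF pose for 3D localization."""
    x: float
    y: float
    z: float
    yaw: float    # rotation about z, assumed in radians
    pitch: float  # rotation about y
    roll: float   # rotation about x
```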
JMI – Mapping
Mapping is typically done with a SLAM approach: the operating area is recorded once and a map is generated from the recording.
- Suitable for large and complex areas
- Very fast processing (faster than real time)
- No post-processing required
- Simple operation, no special skills required
- Output is a point cloud representation (x, y, z); a reading sketch follows this list
- Web Application Interface
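Because the map output is a plain (x, y, z) point cloud, it can be inspected with a few lines of Python. This is a minimal sketch, assuming the map has been exported as a text file with one "x y z" point per line; the file name and format are assumptions, not JMI's documented export format.

```python
import numpy as np

# Assumption: the map was exported as plain text, one "x y z" point per line.
points = np.loadtxt("map_points.xyz")  # shape (N, 3): columns x, y, z

# Quick sanity checks on the exported map.
print(f"{points.shape[0]} points")
print("bounding box min:", points.min(axis=0))
print("bounding box max:", points.max(axis=0))
```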
WAI – Localization
The localization unit requires a previously mapped area and outputs its own position, either directly or transformed into a desired vehicle coordinate frame.
- Standard accuracy with one LIDAR: < 2 cm in position, < 1° in orientation
- High precision accuracy with multiple sensors (on demand)
- Performs in highly dynamic environments (even when more than 80% of the environment has changed)
- Easy fusion of several sensors and types of sensors
- Interface via ROS, TCP, Modbus, CAN, REST, VDA5050 (see the ROS sketch after this list)
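Of the listed interfaces, ROS is the quickest to script against. Below is a minimal sketch that subscribes to the localization output as a geometry_msgs/PoseStamped; the topic name /wai/pose and the message type are hypothetical assumptions for illustration, since the actual ROS interface is defined by the product.

```python
import rospy
from geometry_msgs.msg import PoseStamped

def on_pose(msg: PoseStamped) -> None:
    # Position in the map frame; orientation arrives as a quaternion.
    p = msg.pose.position
    rospy.loginfo("pose: x=%.3f y=%.3f z=%.3f", p.x, p.y, p.z)

if __name__ == "__main__":
    rospy.init_node("wai_pose_listener")
    # "/wai/pose" is a hypothetical topic name, used here for illustration only.
    rospy.Subscriber("/wai/pose", PoseStamped, on_pose)
    rospy.spin()
```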
GET A QUOTE