ROBOT POSITIONING AND TRACKING WITH VISUAL SLAM
Robot tracking and localization are now possible with Dragonfly
This Visual SLAM technology allows you to monitor the location of robots, AGVs, and drones using just the on-board camera. No hardware deployment required. No LiDAR.
Dragonfly, our Visual SLAM (vSLAM) technology, gives you the real-time 3D location of a robot.
Dragonfly leverages the on-board hardware and can work entirely offline. It works in both known and unknown environments and does not require the deployment of any dedicated hardware.
How does Dragonfly for Robot Positioning and Tracking work?
Dragonfly is a Visual SLAM technology that provides highly precise location data. Dragonfly calculates the location using Visual SLAM and just the on-board camera: no sensor fusion, no MEMS, no hardware to deploy on site, no QR codes. Dragonfly needs a 640×480 px video stream. The camera can be monocular (ideally wide-angle) or stereo, and it can point in any direction.
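As a rough illustration of the input requirement above, the sketch below downscales an arbitrary camera frame to the 640×480 stream Dragonfly expects. The frame layout (a list of pixel rows) and the nearest-neighbour resize are assumptions for illustration, not part of any Dragonfly SDK.

```python
def resize_nearest(frame, out_w=640, out_h=480):
    """Nearest-neighbour resize of a frame given as a list of pixel rows.

    This is only a stand-in for whatever resize the camera pipeline provides
    (e.g. cv2.resize); Dragonfly itself just needs the 640x480 result.
    """
    in_h, in_w = len(frame), len(frame[0])
    return [
        [frame[y * in_h // out_h][x * in_w // out_w] for x in range(out_w)]
        for y in range(out_h)
    ]

# Mock 1280x720 frame; each "pixel" is just its (row, col) coordinate.
frame = [[(y, x) for x in range(1280)] for y in range(720)]
small = resize_nearest(frame)
```

In practice the resize would be done by the camera driver or an image library; the point is only that the stream handed to Dragonfly is 640×480.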
An alternative to LiDAR
Dragonfly is the best alternative to LiDAR.
1. Price: a good LiDAR costs far more than Dragonfly.
2. Installation and calibration: installing a LiDAR is a highly technical procedure, while Dragonfly can be installed by anyone.
3. Damage: LiDARs are fragile and sensitive; everything works well until they hit something. It takes very little to break a LiDAR or knock it off axis. Dragonfly uses the on-board camera, and as long as the camera works, Dragonfly works.
- Dragonfly lets you monitor the real-time location of your robots from the Onit dashboard.
- The API can be used to integrate the location data into external applications.
- ROS Nodes can be developed by Onit’s team to ease the integration with ROS.
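For illustration, location data pulled through the API might be consumed as in the sketch below. The field names (`device`, `position`, `orientation`) and the payload shape are hypothetical; the actual Dragonfly API schema may differ.

```python
import json

# Hypothetical JSON payload for one tracked device (assumed schema, not
# the documented Dragonfly API format).
payload = (
    '{"device": "agv-01",'
    ' "position": [1.25, 0.40, 0.02],'
    ' "orientation": [0.0, 0.0, 0.7071, 0.7071]}'
)

def parse_pose(raw):
    """Parse a pose payload into position (metres) and quaternion."""
    data = json.loads(raw)
    x, y, z = data["position"]
    qx, qy, qz, qw = data["orientation"]
    return {"device": data["device"], "xyz": (x, y, z), "quat": (qx, qy, qz, qw)}

pose = parse_pose(payload)
```

The same structure maps naturally onto a ROS `PoseStamped`-style message if the integration goes through ROS nodes instead of the raw API.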
Are you ready to get started?
Real-time 3D position (6 DOF).
Less than 5 cm accuracy.
Venue and place recognition, with relocalization.
Maps shared among different devices.
Offline version available, no internet connection required.
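Since the reported pose is a full 6-DOF position and orientation, the orientation typically arrives as a quaternion. A minimal sketch for extracting the heading (yaw) from it, assuming a unit quaternion in (qx, qy, qz, qw) order:

```python
import math

def quat_to_yaw(qx, qy, qz, qw):
    """Yaw (rotation about the z axis) from a unit quaternion."""
    return math.atan2(2.0 * (qw * qz + qx * qy),
                      1.0 - 2.0 * (qy * qy + qz * qz))

# A 90-degree rotation about z: qz = sin(45 deg), qw = cos(45 deg).
yaw = quat_to_yaw(0.0, 0.0, math.sin(math.pi / 4), math.cos(math.pi / 4))
```

How the orientation is encoded in practice depends on the API or ROS message used; this conversion is the standard one for unit quaternions.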