WE MAKE SENSE
OUT OF SENSOR DATA

OUR ACTIVITY

We provide a suite of Intellectual Property, Algorithms and Engineering services to achieve mobile autonomy, advanced 3D perception and situation awareness.

Based in Paris, the company was created in 2015 by Raul Bravo and Olivier Garcia to extend to new application fields the results of 15 years of world-class know-how and field experience in GPS-denied, infrastructure-free navigation.

Dibotics’ technology is used as a core component by some of the most advanced products and companies in mobile robotics, operating in both indoor and outdoor environments.

Raul Bravo is a serial entrepreneur & start-up coach, with an extensive background in both bootstrapped & VC-backed start-up creation and growth.

A Telecommunications Engineer from UPC (Barcelona, Spain) with an MBA from College des Ingénieurs (Paris, France), he has received 12 different awards for his engineering and entrepreneurial career, among them the «MIT Technology Review – Innovators Under 35».

Olivier Garcia is one of the most internationally renowned mobile robotics experts in the field of Localization and Navigation.

An Industrial Engineer from UPC (Barcelona, Spain) with a Robotics Master’s Degree from Pierre et Marie Curie University (Paris, France), he has gained extensive expertise and developed technology in application fields such as 3D Mapping, Logistics, Defense & Security, Autonomous Cars and Automated Passenger Vehicles.

We’ve developed and field-validated a unique 6 Degrees of Freedom, Sensor-Agnostic Localization technology, relying exclusively on the data originating from a single sensor: no odometry, IMU or multi-sensor fusion is needed.

3D PERCEPTION

DATA ACQUISITION

Thanks to our Sensor-agnostic approach, we can provide a solution based on a wide range of sensors (Lidar, 3D Cameras, Sonar/Radar, etc.) depending on the application and customer needs.

LASER SCANNERS
2D LIDAR
(SICK, HOKUYO, PEPPERL FUCHS, OMRON…)

2D multi-layer laser scanner
(IBEO)

3D LIDAR
(VELODYNE, NEPTEC OPAL)

3D CAMERAS
DIRECT TIME OF FLIGHT IMAGERS
(ASC)

PHASE SHIFT TIME OF FLIGHT IMAGERS
(MESA IMAGING, IFM, FOTONIC, SOFTKINETIC)

STEREO CAMERAS
(STEREOLABS, VISLAB, MULTISENSE)

SONAR/RADAR
2D SONARS
(BLUEVIEW)

3D SONARS
(BLUEVIEW, CODA OCTOPUS ECHOSCOPE)


LOCALIZATION

Our unique advanced 3D SLAM (6 DoF) technology provides an accurate real-time position over long distances without drift. A single-sensor, infrastructure-free algorithm delivers cheaper and simpler solutions.
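
As a rough illustration of what single-sensor, 6 DoF scan matching involves, here is a minimal Python/NumPy sketch that aligns two consecutive point clouds with a basic point-to-point ICP and chains the resulting rigid transforms into a pose. It only sketches the general technique: Dibotics’ production SLAM is proprietary, and every function name, parameter and synthetic scan below is illustrative.

# Minimal sketch of single-sensor 6 DoF scan matching (point-to-point ICP).
# Illustrative only: no IMU or odometry is used, just two consecutive scans.
import numpy as np

def best_rigid_transform(src, dst):
    # Least-squares rotation R and translation t mapping src onto dst (Kabsch algorithm).
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, c_dst - R @ c_src

def match_scans(src, dst, iterations=30):
    # Align scan `src` (N x 3) to scan `dst` (M x 3); returns a 4x4 pose increment.
    T = np.eye(4)
    cur = src.copy()
    for _ in range(iterations):
        # Brute-force nearest-neighbour association, good enough for a sketch.
        dists = np.linalg.norm(cur[:, None, :] - dst[None, :, :], axis=2)
        nearest = dst[dists.argmin(axis=1)]
        R, t = best_rigid_transform(cur, nearest)
        cur = cur @ R.T + t
        step = np.eye(4)
        step[:3, :3], step[:3, 3] = R, t
        T = step @ T
    return T

# Usage: chain per-scan increments into a global 6 DoF pose (x, y, z, roll, pitch, yaw).
rng = np.random.default_rng(0)
scan_a = rng.uniform(-10.0, 10.0, (200, 3))             # synthetic previous scan
yaw = np.deg2rad(2.0)
R_true = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                   [np.sin(yaw),  np.cos(yaw), 0.0],
                   [0.0,          0.0,         1.0]])
scan_b = scan_a @ R_true.T + np.array([0.3, 0.0, 0.0])  # same scene after a small rigid motion
pose_increment = match_scans(scan_a, scan_b)
print(np.round(pose_increment, 3))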


PERCEPTION

Our perception algorithms integrate the data from the sensor over time, in order to build a rich 3D model of the environment that can then be analyzed in search of information of interest for the task at hand.
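
As a rough illustration of what integrating sensor data over time can look like, the following Python/NumPy sketch merges each registered scan, transformed by the pose produced by the localization step, into a voxel-subsampled world model that grows richer while staying bounded in size. The class name, voxel size and pose source are assumptions made for this example, not Dibotics’ implementation.

# Minimal sketch of time-integrated 3D perception: every scan, transformed by the
# pose estimated by the localization step, is merged into one voxel-subsampled model.
# Illustrative only; class name and voxel size are arbitrary choices for the example.
import numpy as np

class WorldModel:
    def __init__(self, voxel_size=0.2):
        self.voxel_size = voxel_size
        self.points = np.empty((0, 3))

    def integrate(self, scan_xyz, pose):
        # scan_xyz: N x 3 points in the sensor frame; pose: 4 x 4 sensor-to-world transform.
        world_pts = scan_xyz @ pose[:3, :3].T + pose[:3, 3]
        merged = np.vstack([self.points, world_pts])
        # Keep a single representative point per occupied voxel to bound memory.
        voxel_keys = np.floor(merged / self.voxel_size).astype(np.int64)
        _, keep = np.unique(voxel_keys, axis=0, return_index=True)
        self.points = merged[keep]

# Usage: feed each incoming scan together with the 6 DoF pose produced by the SLAM.
model = WorldModel(voxel_size=0.2)
pose = np.eye(4)                                   # would come from the localization step
scan = np.random.default_rng(1).uniform(-5.0, 5.0, (1000, 3))
model.integrate(scan, pose)
print(model.points.shape)                          # rich 3D model, queryable for analysis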


NAVIGATION

A field-proven navigation and control solution, validated in real-world Outdoor and Indoor environments across multiple application fields.

USE CASES

Over the last 15 years, we’ve applied our technology to a wide range of application fields in several industry segments all over the world, to achieve fully autonomous operation, perception-enriched tele-operation or 3D modeling.

SENSORS

3D Lidar

ALGORITHMS

Localization, Perception

Raw data from Lidar vs. Real-time 3D SLAM (i)

3D SLAM using a Velodyne Lidar; only the raw data from the ground is used for Localization and 3D mapping purposes.

SENSORS

3D Lidar

ALGORITHMS

Localization, Perception

Raw Data from Lidar vs. Real-time 3D SLAM (ii)

This video shows the difference between using the Raw data from a 3D Velodyne Lidar and slamming this information over time using the real-time Dibotics 3D SLAM.

SENSORS

3D Lidar

ALGORITHMS

Perception

Lidar Raw Data vs. DIBOTICS’ processed Data

Shows the difference between the perception obtained with a 3D Lidar alone and the results obtained in real time when adding, to the same sensor and the same data, the Dibotics SLAM Algorithm.

SENSORS

3D Lidar

ALGORITHMS

Localization, Perception

3D mapping

San Jose, California, 3D city mapping. The Velodyne Lidar sensor has been set up with a 40-degree inclination, allowing for a wider scan field. The 3D SLAM from Dibotics is able to work with this highly demanding setup.

SENSORS

3D Lidar

ALGORITHMS

Localization, Perception

Autonomous car on urban road

Another example of a self-driving car application in an urban environment using an HDL-32 Velodyne Lidar. In partnership with Cadden (cadden.fr)

SENSORS

2D multi-layer laser scanner, 3D Lidar

ALGORITHMS

Localization, Navigation, Perception

Autonomous Car (i)

High-speed Autonomous Car driving example, using a single Velodyne Lidar sensor.

SENSORS

2D multi-layer laser scanner

ALGORITHMS

Localization, Navigation, Perception

Autonomous Truck convoy

The leading vehicle creates a real-time map, shared wirelessly with its followers. Dibotics’ 3D SLAM algorithms allow the other vehicles to localize in the same reference frame and improve the map in real time. The trucks can follow the leader without direct line of sight. No GPS or IMU was used.

SENSORS

3D Lidar, Stereo camera

ALGORITHMS

Localization, Navigation, Perception

Localization in Agriculture (i)

3D SLAM in agricultural fields: Localization and Perception alongside trees, using a VLP-16 Velodyne Lidar mounted on top of an all-terrain vehicle.

SENSORS

3D Lidar

ALGORITHMS

Localization, Perception

Automatic Harbor monitoring

Surveillance and observation of ships and boats at port using advanced SLAM algorithms for perception, with a full panoramic view during night and day.

SENSORS

Direct time-of-flight 3D camera, 3D Lidar

ALGORITHMS

Localization, Perception

Airborne Mapping using Lidar

Autonomous UAV performing drift-free 3D Mapping in an urban environment, using a drone and a Velodyne Lidar with Dibotics 6 DoF SLAM algorithms. Thanks to a close collaboration between Dibotics and XactSense (www.xactSense.com).

SENSORS

3D Lidar, Stereo camera

ALGORITHMS

Localization, Perception

Drone Security Zone

Airborne security using a Drone and Lidar thanks to Localization and Perception with 6 DOF SLAM. Thanks to a close collaboration between Dibotics and XactSense (www.xactSense.com).

SENSORS

3D Lidar, Stereo camera

ALGORITHMS

Localization, Navigation, Perception

Localization in Agriculture (ii)

A second example of robust Outdoor Localization in Agricultural Fields using our 6 Degrees of Freedom SLAM algorithm and a Velodyne multi-layer Lidar, without any additional sensor.

SENSORS

Phase-shift time-of-flight 3D camera

ALGORITHMS

Localization

3D ToF Camera SLAM

This video shows a short indoor sequence from a FOTONIC E70 ToF camera slammed with 6 DoF. No other sensor was used.

SENSORS

2D Lidar

ALGORITHMS

Localization

Superresolution Filter

Another example of SLAM-based 3D superresolution with a FOTONIC camera. The algorithms, in this case, are used to localize the sensor position: thanks to the 6 DoF perception algorithms, the acquired data is integrated to provide a higher-definition 3D model.

SENSORS

Phase-shift time-of-flight 3D camera

ALGORITHMS

Localization

SLAM on warehouse ceiling

A FOTONIC ToF camera performing SLAM on the ceiling of a warehouse, with no additional sensor or odometry.

SENSORS

3D Lidar

ALGORITHMS

Localization

Driverless car, SLAM on fields

Autonomous Car using a single Lidar sensor; no IMU and no GPS were used. The 6 DoF algorithm performs SLAM using only the data acquired from the fields surrounding the car.

SENSORS

2D multi-layer laser scanner, 3D Lidar

ALGORITHMS

Localization, Navigation, Perception

Autonomous Shuttle

Autonomous Shuttle for outdoor passenger transportation, using a single multi-layer sensor.

SENSORS

3D Lidar

ALGORITHMS

Localization

Autonomous Car (ii)

A second example of a self-driving car project using our real-time 3D SLAM algorithm.

SENSORS

2D multi-layer laser scanner, 3D Lidar

ALGORITHMS

Localization, Perception

3D Real-Time Dock Scanning

3D Scanning of a Dock from a boat using our SLAM Localization and Perception algorithms. Dibotics’ unique algorithms are unaffected by the movement of the waves in the water.

SENSORS

Direct time-of-flight 3D camera, 3D Lidar

ALGORITHMS

Localization, Perception

Infrastructure Surveillance with Drone

Inspection of bridge infrastructure using a Drone and a Lidar. Thanks to a close collaboration between Dibotics and XactSense (www.xactSense.com).


PARTNER WITH DIBOTICS

  • DIBOTICS
  • 96 Bis Boulevard Raspail
  • 75006 Paris
  • France

We work exclusively with OEMs through long-term licensing and knowledge-transfer agreements, helping them to enhance their current tele-operation capabilities and to develop advancements in vehicle automation and 3D Perception.

Let us know how we can help:

First Name *

Last Name *

Company *

Email *

Phone *

Project details *

