“NavAbility’s platform was critical for us to rapidly go from robotics idea to proof-of-concept demonstration. A small engineering team was able to show customers a working solution in six months, saving us years of up-front development costs.”
– Friedl Swanepoel (CEO Industri4, CTO Ilima)
The Built Environment, although established, is ripe for automation. Any automation technology that supports project execution must be robust. Whether it’s a digital twin or the in-situ operation of robotic equipment, how do you ensure that the fundamental task of localization and mapping works — and works reliably in such a dynamic environment?
Going further, how could you discern and use the changes captured in raw data as actionable information?
Industri4 addressed this problem in real-world robotics and turned it into a strength. By making use of the natively multi-modal (i.e. non-Gaussian) solver in the NavAbility platform, Industri4 was able not only to compensate for changes in the environment, but also to use the data to identify discrepancies between the as-designed blueprints and the as-built model with an autonomous vehicle.
Furthermore, daily interactions with the NavAbility team accelerated the development of their product, and the teams produced a fully operational demonstration for a key customer in the span of a few weeks.
- Rapid development and faster time to market using the NavAbility platform
- Unprecedented robustness delivered using the natively multi-modal (non-Gaussian) core solver algorithm
- Flexible integration of disparate sources of information with the open APIs
- An online hosted solution for production implementations, online visualizations, and high assurance operations
In a construction environment, multiple data assets and opportunities are available. The Built Environment today relies on digital/virtual assets that should be useful for robotic mapping and manipulation, but the technical challenges can be considerable. In addition, projects are heavily reliant on human-operated, and increasingly robotic, platforms to collect and act on large volumes of data.
Recorded and prior data contain many features of temporary or permanent changes. Some of the data overlaps with previous recordings of the same environment, while some covers entirely new environments. The recorded data is captured by different sensor platforms and different sensing technologies, or by human input. Some data is based on laser ranging or structured light, while other data consists of passive camera images in the optical or thermal spectrum; further data may include magnetic or odometry distance measurements. Yet more data may come from ground-penetrating radar, gyroscopes, accelerometers, or inclinometers. Some situations may even include synthetic aperture radar or acoustic data.
The problem is how to leverage all this data in a digital format, either for developing a digital twin or for developing multiple ad-hoc digital map datasets of a construction site or situation. And then, how to use that same map to navigate multiple real-time robots in and around the site using a shared frame of reference.
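To make the shared-frame-of-reference idea concrete, consider the simplest special case: two independent sensors measuring the same quantity. Under Gaussian assumptions, fusion reduces to inverse-variance weighting. The sketch below is illustrative only (the sensor values and variances are invented for the example, and this is not the NavAbility API); the sections that follow describe how the NavAbility solver generalizes beyond this Gaussian special case.

```python
# Hypothetical example: a lidar and a camera-based pipeline both
# estimate the same distance (metres) in a shared frame of reference.
lidar_z,  lidar_var  = 4.98, 0.02 ** 2   # precise sensor
camera_z, camera_var = 5.10, 0.20 ** 2   # coarse sensor

# Inverse-variance (information-form) fusion: the Gaussian special
# case of combining independent measurements of one quantity.
info = 1.0 / lidar_var + 1.0 / camera_var
fused = (lidar_z / lidar_var + camera_z / camera_var) / info
fused_var = 1.0 / info

# The fused estimate is pulled toward the more certain sensor, and
# its variance is smaller than either measurement's alone.
```

This closed form only exists because both beliefs are Gaussian; the heterogeneous, multi-modal data described above is exactly where it breaks down, which motivates the non-Gaussian approach described next.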
The NavAbility localization and mapping approach focuses on the central problem of solving (doing inference on) heterogeneous factor graphs, and to the best of our knowledge is the first serious non-Gaussian solution. Our design philosophy considers both i) batch processing workloads, which may now or in the future feed directly into ii) real-time robotic platform navigation and mapping. Our approach also combines lessons from operations based around human-operated sensor platforms as well as more automated robotic equipment.
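Why does a non-Gaussian representation matter? The toy sketch below (illustrative only, not the NavAbility solver; the grid, measurement locations, and noise values are invented for the example) fuses a vague prior with an ambiguous measurement — one that supports two data-association hypotheses — on a discretized 1-D grid. The posterior mean lands between the hypotheses, at a location the data effectively rules out; a single-Gaussian solver would report exactly that misleading summary, whereas a multi-modal belief keeps both hypotheses alive until later data disambiguates them.

```python
import numpy as np

# Discretized 1-D belief over a landmark position (metres).
x = np.linspace(-2.0, 4.0, 1201)
dx = x[1] - x[0]

def gaussian(x, mu, sigma):
    """Unnormalized Gaussian density evaluated on the grid."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

# An ambiguous measurement (two plausible data associations) gives a
# mixture likelihood with hypotheses at 0.0 m and 2.0 m.
likelihood = 0.5 * gaussian(x, 0.0, 0.15) + 0.5 * gaussian(x, 2.0, 0.15)

# A weak unimodal prior centred between the two hypotheses.
prior = gaussian(x, 1.0, 2.0)

# Bayes fusion on the grid, then normalize to a proper density.
posterior = prior * likelihood
posterior /= posterior.sum() * dx

def density_at(v):
    """Posterior density at the grid point nearest to v."""
    return posterior[np.argmin(np.abs(x - v))]

# The posterior mean sits between the hypotheses, where the posterior
# density is vanishingly small -- the mean alone is a poor summary of
# this bi-modal belief.
mean = (x * posterior).sum() * dx
```

A production solver cannot afford a dense grid per variable, of course; the point of the sketch is only the shape of the belief, not the machinery used to compute it.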
Consider the selected non-Gaussian result shown in Figure 1. This is a simultaneous localization and mapping (SLAM) result where the posterior estimates of poses and landmarks can have multi-modal belief. NavAbility has developed Navigation-Affordances as virtual/digital representations (or assets) which allow the user to inject prior data or human knowledge and experience into the live navigation and mapping computation with minimal risk of “breaking” the SLAM solution when discrepancies or variations in the data occur. This fundamental duality is resolved using our unique multi-modal (non-Gaussian) factor graph formulation and navigation AI solver technology.
Figure 1: Solved (inferred) results in construction automation. The straight-edge cyan lines indicate a zoomed-in portion of an “as-designed” floor plan, while the straight-edge red lines show the “as-built” construction instead. The blue and red contour density lines indicate simultaneous belief state results on landmark variables present in the SLAM solution. The top illustration shows convergence to the user-provided Navigation-Affordance floor plan, while the bottom illustrations show convergence to both the as-designed and as-built structures. Convergence is good in both cases. The duality produced by the discrepancy can be clearly identified from the posterior belief estimate on the bottom right (bi-modal red contour densities).
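A bi-modal posterior like the one in Figure 1 is what makes the as-designed/as-built discrepancy actionable. As a rough illustration (not the NavAbility solver; the particle counts, offset, and kernel bandwidth are invented for the example), the sketch below takes a sampled bi-modal belief over a wall's position, recovers the two surviving hypotheses as modes of a kernel density estimate, and reports the as-built deviation:

```python
import numpy as np

rng = np.random.default_rng(7)

# Sampled (particle) belief over a wall's position along one axis
# (metres): half the evidence supports the as-designed plan at
# 0.00 m, half supports the as-built structure, offset by +0.30 m.
particles = np.concatenate([
    rng.normal(0.00, 0.03, 500),   # as-designed hypothesis
    rng.normal(0.30, 0.03, 500),   # as-built hypothesis
])

# Kernel density estimate of the belief on a grid.
grid = np.linspace(-0.2, 0.5, 701)
h = 0.02  # kernel bandwidth (an assumed tuning choice)
kde = np.exp(-0.5 * ((grid[:, None] - particles[None, :]) / h) ** 2).mean(axis=1)

# Significant local maxima of the density are the surviving hypotheses.
is_peak = (kde[1:-1] > kde[:-2]) & (kde[1:-1] > kde[2:])
modes = grid[1:-1][is_peak]
modes = modes[kde[1:-1][is_peak] > 0.2 * kde.max()]

# Distance between the modes estimates the as-built deviation.
offset = modes.max() - modes.min()
```

The same idea extends to the 2-D landmark densities in Figure 1: once the belief is allowed to stay multi-modal, the discrepancy is simply the separation between modes, rather than noise to be averaged away.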
Curious about technical details? Want to try the code yourself? Great!
** © 2021 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Copyright ©2021 NavAbility LLC.