Duke engineers empower robots to navigate tough terrain with human-like senses

Thanks to a breakthrough by Duke University engineers, robots are taking a major step forward in navigating the real world.

A new AI system, WildFusion, is helping robots tackle unpredictable outdoor environments by combining multiple sensory inputs, just as humans use their senses to walk through a forest or climb over rocky terrain.

Moving beyond sight

Until now, most robots have relied heavily on vision-based systems such as cameras or LiDAR to map and move through their surroundings. These technologies are excellent in clear, predictable environments but often struggle when paths are obstructed, visibility is low, or terrain is irregular.

WildFusion changes this by integrating vision, sound, touch, and movement data. It’s built into a four-legged robot equipped with an RGB camera, LiDAR, inertial sensors, contact microphones, and tactile sensors. This diverse mix of tools allows the robot to gather rich data from the environment around and beneath it.
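
To make the sensor mix concrete, here is a minimal Python sketch of how readings from each modality might be bundled and collapsed into a single fused descriptor. The data shapes, the SensorFrame container, and the simple encoders are illustrative assumptions for this article, not details published by the Duke team.

```python
# Hypothetical sketch: bundling multi-modal sensor readings into one feature vector.
# Shapes and encoders are illustrative assumptions, not the published WildFusion design.
from dataclasses import dataclass
import numpy as np

@dataclass
class SensorFrame:
    rgb: np.ndarray            # H x W x 3 camera image
    lidar: np.ndarray          # N x 3 point cloud
    imu: np.ndarray            # 6 values: angular velocity + linear acceleration
    contact_audio: np.ndarray  # raw waveform from a contact microphone
    foot_force: np.ndarray     # force reading per leg (4 legs)

def encode(frame: SensorFrame) -> np.ndarray:
    """Collapse each modality to a small feature vector and concatenate them."""
    rgb_feat = frame.rgb.mean(axis=(0, 1))                      # coarse colour statistics
    lidar_feat = frame.lidar.mean(axis=0)                       # centroid of nearby geometry
    audio_feat = np.abs(np.fft.rfft(frame.contact_audio))[:8]   # low-band vibration spectrum
    return np.concatenate([rgb_feat, lidar_feat, frame.imu, audio_feat, frame.foot_force])

frame = SensorFrame(
    rgb=np.random.rand(64, 64, 3),
    lidar=np.random.rand(1024, 3),
    imu=np.random.rand(6),
    contact_audio=np.random.rand(2048),
    foot_force=np.random.rand(4),
)
print(encode(frame).shape)  # one fused descriptor of the robot's surroundings
```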

Understanding the terrain as humans do

As the robot walks, the contact microphones pick up vibrations from each step, like the crunch of leaves or the squish of mud. Tactile sensors monitor how much force each foot applies to the ground, giving the robot real-time feedback on whether it’s on a stable surface or slipping.
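
As a rough illustration of that idea, the sketch below turns a single footstep's contact-microphone signal and foot force into simple terrain cues. The spectral feature and the slip threshold are hypothetical choices made for this example, not values reported by the researchers.

```python
# Hypothetical sketch: turning one footstep's vibration and force readings into
# simple terrain cues. Features and thresholds are illustrative assumptions.
import numpy as np

def footstep_cues(contact_audio: np.ndarray, foot_force: float, sample_rate: int = 8000):
    spectrum = np.abs(np.fft.rfft(contact_audio))
    freqs = np.fft.rfftfreq(contact_audio.size, d=1.0 / sample_rate)
    # Crunchy surfaces (leaves, gravel) tend to put more vibration energy at higher
    # frequencies than soft surfaces (mud, moss) -- a crude proxy, used only for illustration.
    centroid = float((freqs * spectrum).sum() / (spectrum.sum() + 1e-9))
    slipping = foot_force < 5.0  # newtons; placeholder threshold for poor ground contact
    return {"spectral_centroid_hz": centroid, "possible_slip": slipping}

step = np.random.randn(4096)  # stand-in for one footstep's contact-microphone signal
print(footstep_cues(step, foot_force=12.3))
```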

The inertial sensors track the robot’s motion to detect if it’s tilting, shaking, or wobbling, similar to how humans use their sense of balance.
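
A similar intuition can be sketched for the balance sense: estimating tilt from the sensed gravity direction and wobble from recent angular-velocity fluctuations. The body_stability helper and its thresholds below are placeholders assumed for illustration.

```python
# Hypothetical sketch: estimating tilt and wobble from inertial readings.
# The stability threshold is a placeholder, not a value from the research.
import numpy as np

def body_stability(accel: np.ndarray, gyro_history: np.ndarray):
    """accel: 3-axis acceleration (m/s^2); gyro_history: recent angular velocities (T x 3)."""
    # Tilt: angle between the sensed gravity direction and the robot's vertical axis.
    tilt_rad = np.arccos(np.clip(abs(accel[2]) / np.linalg.norm(accel), 0.0, 1.0))
    # Wobble: how much the angular velocity has been fluctuating recently.
    wobble = float(gyro_history.std(axis=0).mean())
    return {"tilt_deg": float(np.degrees(tilt_rad)), "unstable": wobble > 0.5}

print(body_stability(np.array([0.3, 0.1, -9.7]), np.random.randn(50, 3) * 0.1))
```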

All this sensory input is processed using advanced deep-learning techniques. Instead of treating the terrain as a collection of isolated points, WildFusion builds a continuous environment model. This helps the robot “fill in the blanks” when parts of the scene are missing or unclear, allowing smarter, more adaptive movement through complex terrain.
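
One way to picture such a continuous model is an implicit function, for example a small neural network, that can be queried at any 3D coordinate for a traversability estimate. The toy network below uses random weights and is purely a conceptual sketch under that assumption, not the architecture WildFusion actually uses.

```python
# Conceptual sketch of a continuous (implicit) environment model: a tiny MLP that
# maps any 3D point to a traversability score in [0, 1]. Weights are random here;
# in a real system they would be learned from the fused sensor data.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(3, 64)), np.zeros(64)
W2, b2 = rng.normal(size=(64, 1)), np.zeros(1)

def traversability(point_xyz: np.ndarray) -> float:
    """Query the implicit model at an arbitrary 3D location."""
    h = np.tanh(point_xyz @ W1 + b1)
    z = float((h @ W2 + b2)[0])
    return 1.0 / (1.0 + np.exp(-z))  # sigmoid -> score in [0, 1]

# Because the model is a function rather than a grid of points, it can be evaluated
# anywhere -- including places the sensors never directly observed ("filling in the blanks").
print(traversability(np.array([0.4, -1.2, 0.0])))
```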

Real-world success

The WildFusion system was recently tested at Eno River State Park in North Carolina. The robot navigated dense forests, open grasslands, and rocky paths without needing a clear trail. It could predict which areas were safe to step on, even when visual information alone would have failed.

This kind of real-world performance marks a significant advancement in robotic navigation. It shows that robots can now operate more like humans, using a mix of sensory information to understand and respond to a constantly changing world.

A platform for the future

With its modular design, WildFusion can incorporate additional sensors in the future, such as thermal or humidity detectors. These additions could help robots perform a wider range of tasks, from disaster response and remote inspection to autonomous exploration in extreme or unpredictable environments.

The Duke team sees this as a critical step toward building robots that are not just technically advanced but truly capable of adapting to our messy, complex world. WildFusion brings us closer to that future by giving robots a richer, more human-like perception of their surroundings.
