Have you ever wondered how insects can travel far beyond their immediate surroundings and still find their way back home? The answer matters not only to biology but also to the development of AI for small, autonomous robots. Researchers at TU Delft drew inspiration from the remarkable visual navigation skills of ants, which recognize their surroundings visually and combine that information with step-counting to return safely to their nests. They used these findings to develop an insect-inspired autonomous navigation strategy for tiny, lightweight robots. The strategy is highly efficient, allowing robots to return home after long journeys while requiring very little computation and memory: a mere 0.65 kilobyte of data storage per 100 meters travelled. As miniaturized artificial intelligence matures, tiny autonomous robots could find a wide range of uses, from tracking inventory in warehouses to detecting fuel leaks on industrial sites. The researchers published their findings on July 17, 2024.
Weighing anywhere from tens of grams to a few hundred grams, tiny robots hold great promise for real-world applications. Because they are so light, they are very safe even if they accidentally bump into someone. Because they are small, they can manoeuvre through narrow spaces. And if they can be made cheaply, they can be deployed in large numbers to quickly cover a wide area, for instance to rapidly detect pests or diseases in greenhouses.
Making such tiny robots operate autonomously is difficult, however, since their resources are extremely limited compared to those of larger robots. A major obstacle is that they need to be able to navigate by themselves. Robots can get help from external infrastructure: they can use location estimates from GPS satellites outdoors or from wireless communication beacons indoors. However, relying on such infrastructure is often undesirable. GPS is unavailable indoors and can become highly inaccurate in cluttered environments such as urban canyons. And installing and maintaining beacons indoors is either prohibitively expensive or simply impossible, for example in search-and-rescue scenarios.
Most AI for autonomous navigation with only onboard resources has been developed with large robots in mind, such as self-driving cars. Some approaches rely on heavy, power-hungry sensors such as LiDAR laser rangefinders, which tiny robots cannot carry or power. Other approaches use vision, a power-efficient sensor that provides rich information on the environment. However, these approaches typically try to build highly detailed three-dimensional maps of the environment. Doing so requires large amounts of processing and memory that can only be provided by computers that are too large and power-hungry for tiny robots.
This is why researchers have turned to nature for inspiration. Insects are especially interesting, as they operate over distances that are relevant to many real-world applications while using very scarce sensing and computing resources. Biologists have an increasingly good understanding of the underlying strategies insects use. Specifically, insects combine keeping track of their own motion (termed "odometry") with visually guided behaviors based on their low-resolution but almost omnidirectional visual system (termed "view memory"). Whereas odometry is increasingly well understood, even down to the neuronal level, the precise mechanisms underlying view memory are still poorly understood. One of the earliest and most enduring theories proposes a "snapshot" model, in which an insect such as an ant occasionally takes snapshots of its surroundings. Later, when the insect arrives near a snapshot location, it compares its current visual percept to the stored snapshot and moves so as to minimize the differences. This allows the insect to navigate, or "home," directly to the snapshot location, removing any drift that builds up when relying on odometry alone.
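The snapshot model lends itself to a compact sketch. Below is a toy Python illustration of snapshot homing, my own construction rather than anything from the study: the "view" is simulated as distances to a handful of hypothetical landmarks, and the agent greedily steps in whichever direction makes its current view most similar to the stored snapshot.

```python
# Toy sketch of "snapshot" homing (illustrative only, not the TU Delft
# implementation): store a view at the home location, then home by moving
# so as to minimize the difference between current view and snapshot.
import numpy as np

rng = np.random.default_rng(0)
LANDMARKS = rng.uniform(-10, 10, size=(8, 2))  # hypothetical scene features

def view(pos):
    """Toy omnidirectional 'view': distances to the fixed landmarks."""
    return np.linalg.norm(LANDMARKS - pos, axis=1)

def home_to_snapshot(start, snapshot, step=0.2, max_iters=500):
    """Greedy descent on view difference -- the core of snapshot homing."""
    pos = np.array(start, dtype=float)
    moves = step * np.array([[1, 0], [-1, 0], [0, 1], [0, -1]])
    for _ in range(max_iters):
        cost_now = np.sum((view(pos) - snapshot) ** 2)
        costs = [np.sum((view(pos + m) - snapshot) ** 2) for m in moves]
        best = int(np.argmin(costs))
        if costs[best] >= cost_now:
            break  # no move improves the match: views agree, we are "home"
        pos += moves[best]
    return pos

home = np.array([0.0, 0.0])
snap = view(home)                            # take the snapshot at "home"
final = home_to_snapshot([3.0, -2.0], snap)  # start somewhere else
print(np.linalg.norm(final - home))          # residual distance; near zero
```

On a real robot the view would be a low-resolution omnidirectional image and the comparison a pixel-wise difference, but the homing loop has the same shape: compare, step, repeat until the views match.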
Snapshot-based navigation can be compared to how Hansel tried not to get lost in the fairy tale of Hansel and Gretel. When Hansel threw stones on the ground, he could find his way back home. However, when he threw bread crumbs that were eaten by the birds, Hansel and Gretel got lost. "In our case, the stones are the snapshots," says Tom van Dijk, first author of the study. He notes that, just as with a stone, for a snapshot to work the robot has to be close enough to the snapshot location. If the visual surroundings differ too much from those at the snapshot location, the robot may move in the wrong direction and never get back. Hence, one has to use enough snapshots, or, in Hansel's case, drop enough stones. On the other hand, dropping stones too close to each other would deplete Hansel's supply of stones too quickly. For a robot, using too many snapshots leads to large memory consumption. Previous works in this field typically placed the snapshots very close together, so that the robot could visually home from one snapshot to the next.
"The main insight underlying our strategy is that you can store snapshots much farther apart if the robot travels between them based on odometry," says Guido de Croon, Full Professor in bio-inspired drones and co-author of the article. "Homing will be effective as long as the robot ends up close enough to the snapshot location, i.e., as long as the robot's odometry drift stays within the snapshot's boundaries. This also enables the robot to travel much farther, since it flies significantly slower when homing in on a snapshot than when flying between snapshots based on odometry."
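The spacing argument in the quote can be made concrete with a toy simulation; the snapshot spacing, homing radius (called a catchment area here), and drift rate below are assumptions of mine for illustration, not numbers from the study. The robot dead-reckons from one snapshot location to the next, and whenever it ends up close enough to a snapshot, visual homing cancels the drift accumulated so far.

```python
# Toy simulation of "odometry between snapshots, visual homing at each one".
# Spacing, catchment radius, and drift rate are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
SNAPSHOT_SPACING = 10.0  # metres between stored snapshots (assumed)
CATCHMENT = 3.0          # radius within which visual homing succeeds (assumed)
DRIFT_PER_M = 0.05       # odometry drift, metres per metre flown (assumed)

def return_home(snapshots, use_homing=True):
    """Fly the snapshot chain in reverse, dead-reckoning between snapshots."""
    pos = snapshots[-1].copy()       # true position (starts at the route end)
    believed = snapshots[-1].copy()  # position according to odometry
    for target in snapshots[-2::-1]:
        command = target - believed                      # odometry-based leg
        noise = rng.normal(0, DRIFT_PER_M * np.linalg.norm(command), size=2)
        pos = pos + command + noise                      # true motion drifts
        believed = target.copy()                         # odometry says "arrived"
        if use_homing and np.linalg.norm(pos - target) <= CATCHMENT:
            pos = target.copy()                          # homing cancels drift
    return pos

# Snapshots every SNAPSHOT_SPACING metres along a 100 m outbound route.
route = np.array([[x, 0.0] for x in np.arange(0.0, 101.0, SNAPSHOT_SPACING)])
err_homing = np.linalg.norm(return_home(route) - route[0])
err_odo_only = np.linalg.norm(return_home(route, use_homing=False) - route[0])
print(err_homing, err_odo_only)  # homing keeps the return error near zero
```

Without homing, the odometry errors accumulate as a random walk over the whole route; with homing, drift only ever has to stay within one snapshot's catchment area, which is exactly why the snapshots can be spaced farther apart.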
The researchers used this insect-inspired navigation strategy to let a 56-gram "CrazyFlie" drone, equipped with an omnidirectional camera, cover distances of up to 100 meters with only 0.65 kilobyte of memory. All visual processing happened on a tiny microcontroller of the kind found in many cheap electronic devices.
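The 0.65-kilobyte figure implies a very tight per-snapshot budget. As a back-of-the-envelope check, only the 0.65 kB per 100 m total comes from the study; the 10-metre snapshot spacing below is an assumption of mine for illustration:

```python
# Back-of-the-envelope memory budget. Only the 0.65 kB / 100 m total is
# reported; the 10 m snapshot spacing is an assumed, illustrative value.
BYTES_PER_100_M = 0.65 * 1000        # reported storage budget, in bytes
SPACING_M = 10                       # assumed distance between snapshots
snapshots_per_100_m = 100 // SPACING_M
bytes_per_snapshot = BYTES_PER_100_M / snapshots_per_100_m
print(bytes_per_snapshot)            # 65.0 bytes available per snapshot
```

A budget of this order makes clear why the stored snapshots must be extremely low-resolution and heavily compressed, far from the detailed 3D maps used by larger robots.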
"The proposed insect-inspired navigation strategy is an important step on the way to applying tiny autonomous robots in the real world," says Guido de Croon. "Its functionality is more limited than that of state-of-the-art navigation methods." The strategy does not build a map and only allows the robot to return to the starting point. Still, for many applications this may be more than enough. For example, drones could fly out over warehouses to track stock, or over greenhouses to monitor crops, and then return to their base stations. They could store mission-relevant images on a small SD card for post-processing by a remote server, but they would not need those images for navigation itself.