At the University of Adelaide in Australia, researchers are developing a new target tracking system for robots, based on insect eyes. Tracking a small, moving object, such as a baseball or a dragonfly’s prey, can be extremely difficult, especially against varied backgrounds. The player or predator must keep sight of an object that is hard enough to see in the first place, judge where it is going, and work out how to intercept it.
All of this happens in a few seconds. And in the case of a creature like a dragonfly, it has to be processed by a brain the size of a grain of rice. For the researchers, that posed a puzzle worth solving: if they could replicate the trick, they could help develop much more accurate robots and AI programs.
What they ended up doing was programming a system that avoids background distractions by locking onto the background itself rather than the object of pursuit. By stabilizing the background, the system can keep the target roughly in the center of its vision. In effect, the roles are reversed: the target becomes the one thing that stands out, and the background is no longer a distraction.
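To make the idea concrete, here is a minimal toy sketch of background stabilization, not the Adelaide team's actual code. It uses a simplified 1-D scene and assumes, for illustration only, that the target is brighter than its surroundings. The system first estimates how the whole background has shifted between two frames, cancels that motion, and then treats whatever still changed as the target.

```python
# Hypothetical sketch (not the researchers' implementation): track a small
# target by estimating and cancelling BACKGROUND motion, so the target
# shows up as the residual that does not move with the scene.

def estimate_background_shift(prev, curr, max_shift=3):
    """Find the global shift that best aligns curr with prev,
    using a simple sum-of-absolute-differences match."""
    best_shift, best_cost = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        cost, n = 0, 0
        for i in range(len(prev)):
            j = i + s
            if 0 <= j < len(curr):
                cost += abs(prev[i] - curr[j])
                n += 1
        cost /= n
        if cost < best_cost:
            best_cost, best_shift = cost, s
    return best_shift

def find_target(prev, curr, shift):
    """After compensating background motion, the target is where new
    brightness appeared (assumes a target brighter than the background)."""
    best_j, best_r = None, float("-inf")
    for j in range(len(curr)):
        i = j - shift
        if 0 <= i < len(prev):
            r = curr[j] - prev[i]   # signed residual after alignment
            if r > best_r:
                best_r, best_j = r, j
    return best_j

# Toy 1-D scene: a textured background drifting right by 1 pixel per
# frame, plus a bright target moving left against it.
bg = [(i * 7) % 5 for i in range(20)]

def frame(t, target_pos):
    f = [bg[(i - t) % 20] for i in range(20)]  # background shifted by t
    f[target_pos] = 9                          # bright target pixel
    return f

prev = frame(0, 15)
curr = frame(1, 12)

shift = estimate_background_shift(prev, curr)  # recovers the drift: 1
pos = find_target(prev, curr, shift)           # target in curr: pixel 12
```

Note that the tracker never models the target's appearance at all: once the background's motion is cancelled, the one patch that moves differently reveals itself, which is the reversal of roles the researchers describe.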
So far, the team has tested the system in virtual worlds that reproduce the kind of complex, changing backgrounds found in the wild or at a ballpark. It has been quite effective, performing about as well as state-of-the-art targeting systems while running roughly 20% faster. It also requires less powerful processors, which will help when installing the system in robots, keeping them lighter and, likely, cheaper.
The next step is to install the system in actual hardware. The eventual recipient of this new software will be a “bio-inspired, autonomous robot.” The robot can then be tested in the real world, to see how the system holds up and what improvements are needed.