Can Insect Eyes Boost a Robot’s ‘Vision’?

Detecting and tracking small objects against complex backgrounds proves challenging.

Source: University of Adelaide

Researcher Zahra Bagheri and professor Benjamin Cazzolato with their robot. Source: University of Adelaide

Can insect-inspired ‘eyes’ boost a robot’s efficiency? That’s what a group of multidisciplinary researchers, led by a mechanical engineering PhD candidate, is trying to figure out. The team is currently building a device in the hopes of improving visual systems for robots.

“Detecting and tracking small objects against complex backgrounds is a highly challenging task,” says Zahra Bagheri, a mechanical engineering PhD student at the University of Adelaide. She gives the example of a professional baseball player who has only a few seconds to spot a ball, track it, and predict its path, all while running and coping with the distraction of spectators.

Taking inspiration from dragonflies

Bagheri is collaborating with University of Adelaide neuroscientist Dr. Steven Wiederman, who has found that dragonflies and other flying insects exhibit sophisticated visually guided behavior. They are capable of chasing down prey, for instance, while remaining unfazed by distractions such as swarms of other insects.

“They perform this task despite their low visual acuity and a tiny brain, around the size of a grain of rice. The dragonfly chases prey at speeds up to 60 km/h, capturing them with a success rate over 97 percent,” Bagheri says.  

Creating an algorithm 

To match an insect’s visual tracking capabilities, the team has created a unique algorithm. “Instead of just trying to keep the target perfectly centered on its field of view, our system locks on to the background and lets the target move against it,” Bagheri explains. “This reduces distractions from the background and gives time for underlying brain-like motion processing to work. It then makes small movements of its gaze and rotates towards the target to keep the target roughly frontal.”
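The idea Bagheri describes can be illustrated with a toy one-dimensional simulation. The sketch below is not the team's published algorithm; the function, its parameters (`saccade_threshold`, `gain`), and the 1-D setup are all hypothetical simplifications of the two-step loop she outlines: first stabilize the gaze against the background so the target's motion stands out, then make small gaze rotations toward the target to keep it roughly frontal.

```python
def track(background_motion, target_positions, saccade_threshold=0.3, gain=0.5):
    """Toy 1-D sketch of background-locked pursuit (illustrative only).

    background_motion : per-step drift of the background (e.g. self-motion)
    target_positions  : per-step absolute position of the target
    Returns the gaze direction at each step.
    """
    gaze = 0.0
    history = []
    for bg, target in zip(background_motion, target_positions):
        # 1. Lock onto the background: rotate with it so it appears
        #    stationary, letting the target move against it.
        gaze += bg
        # 2. With the background visually fixed, the target's offset
        #    from the gaze direction is easy to measure.
        offset = target - gaze
        # 3. Only when the target drifts far enough, make a small gaze
        #    rotation toward it to keep it roughly frontal.
        if abs(offset) > saccade_threshold:
            gaze += gain * offset
        history.append(gaze)
    return history


# Example: stationary background, target drifting steadily to one side.
gazes = track([0.0] * 5, [0.0, 0.2, 0.5, 0.8, 1.0])
```

Because the gaze only moves in discrete, partial corrections, it trails the target slightly rather than tracking it perfectly, which is the point: the background stays stable long enough for slower, brain-like motion processing to do its work.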

The group tested its bio-inspired “active vision” technology using virtual reality. “This type of performance can allow for real-time applications using quite simple processors,” says Wiederman. “We are currently transferring the algorithm to a hardware platform, a bio-inspired autonomous robot.” 

Wiederman and Bagheri recently published a paper in the Journal of the Royal Society Interface. For more information, visit the University of Adelaide’s website.