The "Green Brain" project folds a honeybee's vision and sense of smell into a robot that will act autonomously
A team from the universities of Sheffield and Sussex wants to create the first robot that can sense and act autonomously like a bee, rather than following a set of pre-programmed instructions. Such a device could carry out the mechanical pollination of crops or even assist in search-and-rescue missions.
The $1.6 million "Green Brain" project will see the researchers produce the first accurate model of a bee's brain, reproducing the systems that govern a honeybee's vision and sense of smell in order to build a robot that can find the source of particular odours or gases in the same way that a bee can identify particular flowers.
Sheffield University's Dr James Marshall, who is leading the EPSRC-funded project, told The Engineer: "The attraction of autonomous flying robots is getting into dangerous or inaccessible environments, and they just need the autonomy to fly without human intervention. The applications are a long way away, but you could imagine the robot following odour trails of sweat to locate survivors in a collapsed building."
The researchers want to model a bee’s brain because it is small but relatively sophisticated. Bees can learn how to associate rewards such as finding a source of nectar with following a particular odour trail and then reproduce that process with visual patterns.
"It’s very impressive that they can do it with just one million neurons and we’d like to know how," said Marshall.
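The reward-association learning Marshall describes can be illustrated with a simple prediction-error rule. The sketch below is purely hypothetical and not the project's actual model: a Rescorla-Wagner-style learner in which repeated pairings of an odour cue with a sugar reward strengthen the association between the two.

```python
# Illustrative sketch only (not the Green Brain model): a Rescorla-Wagner
# style learning rule, linking an odour cue to a sugar reward.

def train(trials, alpha=0.3, reward=1.0):
    """Strengthen an odour -> reward association over repeated pairings."""
    w = 0.0          # association strength between odour cue and reward
    history = []
    for _ in range(trials):
        prediction = w               # how strongly reward is expected
        error = reward - prediction  # prediction error drives learning
        w += alpha * error           # update association by a fraction of the error
        history.append(w)
    return history

weights = train(10)  # association grows towards the reward value over trials
```

The key design point is that learning is driven by the gap between expected and actual reward, so the association strengthens quickly at first and then plateaus, which is broadly how associative conditioning is modelled.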
The scientists will create the computer model based on information collected by partners at Paul Sabatier University in Toulouse, France, who will use a special camera to take images of the different neurons of a bee’s brain firing as it learns.
They will combine data on groups of neurons that activate as the bee senses a particular odour with information on the single neuron associated with receiving a sugar reward, piecing this together to create a picture of how the brain links the two.
In order to model the complex brain processes in real time, the researchers will use hardware provided by NVIDIA Corporation based on the graphics processing units (GPUs) that are used to produce 3D graphics for video games and also power some of the world's highest-performance supercomputers.
These accelerators provide an efficient way of performing the massive calculations needed to simulate a brain on a standard desktop PC rather than on a large, expensive supercomputing cluster.
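The efficiency comes from the fact that a brain simulation applies the same small update to every neuron at each timestep, which maps naturally onto a GPU's many parallel cores. As a rough sketch (an assumption for illustration, not the project's code), here is one vectorised Euler step of a leaky integrate-and-fire model, updating all membrane potentials at once; on a GPU each element of the array would be handled by its own thread.

```python
import numpy as np

# Hypothetical sketch of the data-parallel work a GPU accelerates:
# one Euler timestep of a leaky integrate-and-fire neuron model,
# applied to every neuron simultaneously via vectorised operations.

def step(v, i_in, dt=1.0, tau=10.0, v_thresh=1.0, v_reset=0.0):
    """Advance all membrane potentials one timestep; return (v, spikes)."""
    v = v + dt / tau * (-v + i_in)    # leak towards rest, driven by input
    spikes = v >= v_thresh            # neurons that crossed threshold
    v = np.where(spikes, v_reset, v)  # reset the neurons that fired
    return v, spikes

rng = np.random.default_rng(0)
v = np.zeros(1_000_000)               # roughly a bee brain's neuron count
for _ in range(20):
    v, spikes = step(v, i_in=rng.uniform(0.0, 2.0, size=v.size))
```

Because every neuron's update is independent within a timestep, the same loop runs on a desktop GPU with thousands of cores rather than needing a supercomputing cluster.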
"Using NVIDIA’s massively parallel GPU accelerators for brain models is an important goal of the project as they allow us to build faster models than ever before," said Dr Thomas Nowotny, leader of the Sussex team.
"We expect that in many areas of science this technology will eventually replace the classic supercomputers we use today."
The Green Brain team believes that modelling of the bee brain will also improve scientific knowledge of how a brain’s cognitive systems work, leading to advances in understanding animal and human cognition.
"Not only will this pave the way for many future advances in autonomous flying robots, but we also believe the computer modelling techniques we will be using will be widely useful to other brain modelling and computational neuroscience projects," said Nowotny.
The research could also lead to a greater understanding of honeybees, which are vital to many ecosystems because they pollinate plants but have seen massive population declines in recent years.
Scientists at Harvard University are pursuing a similar goal of building so-called "Robobees" in order to improve the design of compact high-energy power sources, ultra-low-power computing and electronic "smart" sensors, as well as harnessing a greater understanding of the way a hive of bees works together.