Bio-inspired sensor fusion

Research in this area is beginning to reveal several instances in which signals from different sense organs are combined to generate behavioural responses that are rapid, accurate and reliable. We believe that it is particularly useful to study biological principles of sensor fusion, because animals typically work with sensory signals and neural responses that are imprecise and noisy, and yet display behaviour that is surprisingly reliable. Our goal is to seek a better understanding of sensor fusion in animals (a) through the biological literature and (b) through experiments conducted in the ANU laboratories, and explore ways in which these principles could be used in designing better algorithms for sensor fusion in a variety of technical applications. The ANU laboratories, with their strong background in research in animal behaviour and neurophysiology, are ideally placed to interact with the engineering teams at the University of Melbourne.

Gaze control: In humans as well as many animals, the direction of gaze of the eyes is controlled by visual as well as auditory signals. A sharp or unexpected sound is just as effective in directing visual gaze as is a flash of light or the onset of movement in the environment. In mammals, the integration of visual and auditory information is believed to occur in the superior colliculus, which contains neurons that respond in a direction-specific way to visual as well as auditory stimuli. These neurons are believed to play a key role in the brain's representation of visual and auditory space. Their task is to combine the visual and the auditory information to generate command signals that will direct the gaze of the eyes toward novel objects in the environment. Information available in the literature will be used to develop a quantitative model of sensory integration that could be employed, say, in the design of humanoid robots, or in a variety of engineering applications that require the combination of information from two different sensory modalities.
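As a concrete illustration, the fusion of two such directional cues is often modelled as maximum-likelihood combination of independent noisy estimates, in which each cue is weighted by its inverse variance. The sketch below is a minimal illustration of that principle; the Gaussian noise assumption and all numerical values are ours, not taken from collicular data:

```python
def fuse_directions(visual_deg, visual_var, auditory_deg, auditory_var):
    """Combine two noisy direction estimates by inverse-variance weighting.

    This is the standard maximum-likelihood rule for independent Gaussian
    cues: the more reliable (lower-variance) cue receives the larger weight.
    """
    w_v = 1.0 / visual_var
    w_a = 1.0 / auditory_var
    fused = (w_v * visual_deg + w_a * auditory_deg) / (w_v + w_a)
    fused_var = 1.0 / (w_v + w_a)
    return fused, fused_var

# Example: vision localizes a source at 10 deg (variance 4),
# audition at 20 deg (variance 16); the fused estimate sits closer
# to the more reliable visual cue.
direction, variance = fuse_directions(10.0, 4.0, 20.0, 16.0)
```

Note that the fused variance is necessarily smaller than either input variance, so the combined estimate is more reliable than either cue alone.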

Visual and infrared-driven orientation: In animals such as pythons, which are nocturnal as well as diurnal, visual signals from the eyes are supplemented by infrared signals detected by the so-called "pit organs" along the sides of the mouth. The integration of these two types of signal enables sensing and tracking of warm-blooded prey, such as mice, during the day as well as at night. We will mine the published literature on this topic to develop a model that characterizes the way in which visual and infrared information are combined to drive target tracking, and captures the way in which the relative weighting of the visual and infrared information varies with the time of day, as changes in the ambient illumination produce changes in the relative reliabilities of the two kinds of signal. The results should provide useful insights into how biological systems deal with signals of varying reliability, and provide a useful comparison with the traditional engineering approach based on Kalman filtering.
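To make the comparison with Kalman filtering concrete, here is a minimal scalar Kalman-filter sketch in which the variance of the visual measurement grows as illumination falls, automatically shifting weight onto the infrared channel at night. The noise model and all parameters are illustrative assumptions, not fitted to pit-organ data:

```python
def kalman_update(x, p, z, r):
    """One scalar Kalman measurement update: blend the prediction (x, p)
    with a measurement z of variance r."""
    k = p / (p + r)          # Kalman gain: weight given to the new measurement
    x = x + k * (z - x)
    p = (1.0 - k) * p
    return x, p

def track_step(x, p, q, z_visual, z_infrared, illumination):
    """Fuse one visual and one infrared reading of target position.

    The visual noise variance is assumed to grow as ambient illumination
    falls, so the filter shifts weight toward the infrared channel in the
    dark. (This noise model is a placeholder, not a fitted one.)
    """
    p = p + q                                   # process noise: target may move
    r_visual = 1.0 / max(illumination, 1e-6)    # dim light -> unreliable vision
    r_infrared = 2.0                            # IR reliability assumed constant
    x, p = kalman_update(x, p, z_visual, r_visual)
    x, p = kalman_update(x, p, z_infrared, r_infrared)
    return x, p

# Daytime: bright illumination makes the visual reading dominate.
x_day, _ = track_step(0.0, 10.0, 0.0, z_visual=1.0, z_infrared=-1.0,
                      illumination=100.0)
# Night: vision is noisy, so the infrared reading dominates.
x_night, _ = track_step(0.0, 10.0, 0.0, z_visual=1.0, z_infrared=-1.0,
                        illumination=0.01)
```

The same two-update structure generalizes to vector states and to any number of sensors, which is why the Kalman filter is the natural engineering benchmark for this kind of time-varying reliability weighting.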

Insect flight control: In insects, stable flight and course control rely on the synergistic interaction of a number of different sensory signals:

  1. The so-called "optomotor" response uses visual information from the compound eyes to provide stabilization with respect to slow perturbations in yaw, roll and pitch. This is accomplished by using large-field neurons that are tuned to respond to the specific patterns of optic flow that are generated by yaw, roll and pitch.
  2. The ocelli, three additional light-sensitive organs that look forward, leftward and rightward, respectively, provide signals that are used for stabilization with respect to rapid perturbations in roll and pitch. Roll is stabilized by comparing the signals from the left and right ocelli, and pitch is stabilized by comparing the signal from the forward-looking ocellus with the mean of the signals from the left and the right ocelli.
  3. In flies, sensory organs called 'halteres' function as miniature gyroscopes to sense rapid perturbations in roll, yaw and pitch. Signals from these organs supplement those provided by the compound eyes and ocelli. They carry the advantage of being able to provide stabilizing signals even during flight in total darkness.
  4. The dorsally directed (upward-looking) regions of the compound eyes of many insects are equipped with specialized photoreceptors that are sensitive to the polarized light patterns that are created by the sun in the sky. These photoreceptors feed into polarization-sensitive interneurons that function as 'celestial compasses', informing the insect about the direction in which it is flying in relation to the sky's polarization pattern. The polarization-sensitive system is used by insects to establish and maintain the correct heading direction whilst navigating toward a distant goal.
  5. Recent studies indicate that some insects also possess a magnetic sense that informs them of their heading direction and helps them maintain it, as the polarization system does. In principle, this magnetic sense could also be used to stabilize roll and pitch, although this possibility remains to be explored.

Research in these areas of insect flight control and navigation is being actively pursued at the ANU. It will generate first-hand information on the strategies by which biological systems combine signals from a range of different sensory organs, and will also provide direct input into the design of autonomously navigating aerial vehicles.
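The ocellar comparison rule described in point 2 above can be written down directly. The following sketch is illustrative only: the signs, gains and the static treatment are our assumptions, and real ocellar pathways are tuned to rapid transients rather than steady light levels:

```python
def ocellar_attitude_errors(left, front, right):
    """Roll and pitch error signals from three ocellar light readings.

    Implements the comparison rule described above: roll error is the
    left/right imbalance, and pitch error is the forward signal compared
    with the mean of the left and right signals. Sign conventions and
    unity gains are arbitrary choices for illustration.
    """
    roll_error = left - right                    # left/right ocellar imbalance
    pitch_error = front - 0.5 * (left + right)   # front vs. mean of the sides
    return roll_error, pitch_error

# Level flight under a uniform sky gives zero error signals:
errors = ocellar_attitude_errors(1.0, 1.0, 1.0)  # (0.0, 0.0)
```

In a flight controller these two scalars would feed directly into fast roll and pitch corrections, complementing the slower, more accurate optomotor pathway.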

Path integration: Flying insects (e.g. bees) and walking arthropods (e.g. crabs, ants) are able to navigate by "dead reckoning", a process whereby information on heading direction (obtained from the celestial compass) is combined with information on distance traveled to estimate an animal's position relative to its nest, in terms of distance and direction. In walking arthropods, distance travelled is estimated by monitoring leg movements. In flying insects, travel distance is estimated by integrating optic flow, and possibly through the use of additional cues such as energy consumption and flight duration. Current research at the ANU is directed at determining precisely how animals measure travel distance, and combine this with information on instantaneous heading to compute a "homing vector" that informs the insect as to the distance and direction of its nest. The results of this research should provide important information on sensor integration in this context, and generate novel approaches to the design of algorithms for path integration for the autonomous navigation of terrestrial and airborne vehicles.
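The dead-reckoning computation described above can be sketched as follows, assuming an idealized, error-free compass and odometer (real animals accumulate error that must be corrected by other cues, such as landmarks):

```python
import math

def home_vector(steps):
    """Path-integration sketch: integrate (heading, distance) increments and
    return the distance and compass direction back to the starting point.

    `steps` is a list of (heading_deg, distance) pairs, with heading taken
    from a celestial-compass-style external reference. Names and the degree
    convention are our choices for illustration.
    """
    x = y = 0.0
    for heading_deg, distance in steps:
        x += distance * math.cos(math.radians(heading_deg))
        y += distance * math.sin(math.radians(heading_deg))
    home_distance = math.hypot(x, y)
    home_heading = math.degrees(math.atan2(-y, -x)) % 360.0  # bearing to origin
    return home_distance, home_heading

# Outbound trip: 3 units at 0 deg (east), then 4 units at 90 deg (north);
# the home vector is 5 units long, pointing roughly south-west (~233 deg).
d, h = home_vector([(0.0, 3.0), (90.0, 4.0)])
```

The biological question the ANU research addresses is how the `distance` term is measured in the first place (leg movements, integrated optic flow, energy, or duration); the vector summation itself is the easy part.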

Sensor fusion, pragmatic rules and the organization of behaviour: Fiddler crabs live in dense colonies on tropical mudflats and indulge in a rich social life during low tide. Each crab operates from its own burrow, which serves a variety of functions, including protection from predators, and thus is a valuable resource. Fiddler crabs home by path integration. Their behaviour is governed by a number of pragmatic rules which are shaped by the particular visual and social environment the crabs inhabit, and which allow them to categorize significant events in their world. The crabs are also surprisingly flexible in their behavioural options. They are able to change their mating system and their colour signals depending on their assessment of predation risk, and they integrate information from the path integration system and from their visual system in multiple and flexible ways to tackle a number of tasks, such as homing, detouring around obstacles, identifying their neighbours, and keeping their burrows under surveillance. Fiddler crabs are thus a powerful model system in which to study the flexible, robust and multi-sensory organization of complex behaviour.

Tracking, Fusion and Vision Systems
Future Directions


  1. Moving target detection by insects - figure/ground discrimination (dragonflies, hoverflies)
  2. Motion coding / adaptation (at least another 10 years of basic physiology remains in this area, even to complete present project directions): this work is the source of the algorithms that feed the following projects
  3. Computer modeling of insect vision algorithms
  4. Natural image coding/ natural time series analysis

VLSI Robotic sensor fusion

  1. Adaptive motion chips (based on both of the above): Collaboration with Tanner Research Inc
  2. Spatial imagers (low-pixel-count imagers for feedback control systems)

Extension of engineering principles

  1. Low-pixel-count imagers translate inherently to the non-visible EM spectrum: millimeter waves
  2. Passive IR detectors
  3. New concepts: 'Bio-robotics': hardware demonstration of the capabilities of low-pixel-count systems based on insect vision

Research Programs

  1. Electrophysiological recordings from insect neurons that detect and track moving targets and features
  2. Biomimetic modeling/algorithm development based on insect visual neurobiology.
  3. Neuromorphic analog VLSI.

Conventional vision systems based on mathematical algorithms tend to become very complicated, and their hardware implementations require powerful mainframe computers to run in real time. Biological models of the insect visual system, however, suggest simpler solutions for constrained tasks such as motion detection. Insects are a good model system because they display sophisticated flight control and yet are simple enough that, using physiological techniques, we have been able to deduce a great deal about the underlying neural circuitry used for such tasks. The insect vision group at the University of Adelaide uses a truly cross-disciplinary approach to transfer ideas derived from studying insect physiology and behaviour into robust models in software and hardware.

This program is a world-first in that it seeks to combine a number of important areas:

On the VLSI side we have developed novel circuits for early visual processing. These include a current-mode spatial averaging (CMSA) circuit, a multiplicative noise cancellation (MNC) circuit, a wide-dynamic-range current-mode fusing resistive circuit, and a very low (10^-11 Siemens) transconductance element based on the Early effect. In many of these cases we have exploited subthreshold circuits, mainly to use the exponential current-voltage relationship and also to reduce power dissipation. Our new patented approach uses an active impedance transformation to convert a voltage-controlled grounded resistor into a floating resistor.

A major focus of our present work is the task of motion detection, for the analysis of optic flow and the estimation of the speed of moving targets and features. We have designed and implemented a series of analog VLSI chips based on an insect-inspired motion detection algorithm, the template model. Since the first chip (Bugeye I in 1992), we have designed several others, namely Bugeye II, Bugeye III-1, Bugeye III-2, Bugeye IV, MNCSI (the first full implementation of shunting inhibition), and Bugeye V. We have used various processes from 2.0 µm to 0.8 µm for fabrication, all from MOSIS (simply because it is the best hassle-free fabrication service, though a bit more expensive than some other options). In collaboration with Tanner Research Inc. in the USA, we are now developing new implementations based on an adaptive elaborated Reichardt model for a correlation-based motion detector, using the silicon-on-sapphire (SOS) process.
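For readers unfamiliar with correlation-based motion detection, the sketch below shows a software version of a basic Reichardt-type elementary motion detector (delay-and-correlate with mirror subtraction). It is illustrative only: the Bugeye chips implement the template model, and the adaptive elaborated Reichardt model used in the Tanner collaboration includes adaptation stages not shown here. The filter constant and test signals are our own choices:

```python
def reichardt_emd(left_signal, right_signal, alpha=0.5):
    """Elementary motion detector: delay-and-correlate, mirror-subtracted.

    Each photoreceptor signal is low-pass filtered (a discrete-time stand-in
    for the neural delay), multiplied with the undelayed signal from the
    neighbouring receptor, and the two mirror-symmetric products are
    subtracted. Positive output indicates left-to-right motion.
    """
    out = []
    lp_left = lp_right = 0.0
    for l, r in zip(left_signal, right_signal):
        lp_left += alpha * (l - lp_left)     # delayed (low-passed) left channel
        lp_right += alpha * (r - lp_right)   # delayed (low-passed) right channel
        out.append(lp_left * r - lp_right * l)
    return out

# An edge that passes the left receptor one sample before the right
# receptor yields a net positive (left-to-right) response.
left = [0, 1, 1, 1, 1, 1]
right = [0, 0, 1, 1, 1, 1]
response = sum(reichardt_emd(left, right))
```

Because the computation is just two low-pass filters, two multiplies and a subtraction per receptor pair, it maps naturally onto compact analog circuitry, which is precisely what makes it attractive for low-power VLSI implementation.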
