ABSTRACT

This chapter explores alternative strategies for vision-based navigation that meet the constraints of ultra-light flying robots: limited computational resources, very simple sensors, and complex dynamics. A genetic algorithm is used to evolve artificial neural networks that map sensory signals into motor commands. A simple neural network model was developed that fits the limited processing power of our lightweight robots and ensures real-time operation. The same sensory modalities as in Chapter 6 were used, while the information-processing strategies and behaviours were developed automatically by means of artificial evolution. First tested on wheels with the Khepera, this approach resulted in successful vision-based navigation that did not rely on optic flow. Instead, the evolved controllers simply measured the image contrast rate to steer the robot. Building on this result, neuromorphic controllers were then evolved to steer the Blimp2b, producing efficient trajectories that maximised forward translation while avoiding contact with walls and coping with stuck situations.
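
To make the evolutionary approach concrete, the sketch below illustrates the general idea in miniature; it is not the controller, fitness function, or parameter set used in this work. A small feedforward network maps sensor readings to motor commands, and a basic generational genetic algorithm with elitism, uniform crossover, and Gaussian mutation evolves its weights against a toy fitness that rewards forward motion and penalises proximity to obstacles. All network sizes, GA parameters, and the simulated sensor model are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: e.g. a handful of contrast signals in, two motor commands out.
N_IN, N_HID, N_OUT = 8, 4, 2
N_W = N_IN * N_HID + N_HID * N_OUT    # total number of evolvable weights

def controller(weights, sensors):
    """Single-hidden-layer feedforward network: sensors -> motor commands in [-1, 1]."""
    w1 = weights[:N_IN * N_HID].reshape(N_IN, N_HID)
    w2 = weights[N_IN * N_HID:].reshape(N_HID, N_OUT)
    hidden = np.tanh(sensors @ w1)
    return np.tanh(hidden @ w2)

def fitness(weights, steps=200):
    """Toy stand-in for the real evaluation: reward forward motion,
    penalise turning when the (fake) sensors suggest an obstacle is close."""
    score = 0.0
    for _ in range(steps):
        sensors = rng.uniform(0.0, 1.0, N_IN)   # hypothetical contrast-rate readings
        forward, turn = controller(weights, sensors)
        proximity = sensors.mean()               # crude obstacle proxy
        score += forward * (1.0 - proximity) - 0.5 * abs(turn) * proximity
    return score / steps

# Simple generational GA: rank selection, uniform crossover, Gaussian mutation, elitism.
POP, GENS, ELITE, SIGMA = 30, 50, 4, 0.2
pop = rng.normal(0.0, 0.5, (POP, N_W))
for gen in range(GENS):
    scores = np.array([fitness(ind) for ind in pop])
    order = np.argsort(scores)[::-1]
    pop = pop[order]
    if gen % 10 == 0:
        print(f"gen {gen:3d}  best fitness {scores[order[0]]:+.3f}")
    parents = pop[:POP // 2]
    children = []
    for _ in range(POP - ELITE):
        a, b = parents[rng.integers(len(parents), size=2)]
        mask = rng.random(N_W) < 0.5              # uniform crossover
        children.append(np.where(mask, a, b) + rng.normal(0.0, SIGMA, N_W))
    pop = np.vstack([pop[:ELITE], children])
```

In the experiments summarised above, the evaluation was of course carried out on the robot (or its simulation) rather than on synthetic sensor samples, with fitness tied to forward translation and the avoidance of wall contacts.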