Development of Bio-Inspired Roving Swarm Test Platform

Rasika Kale, Embry-Riddle Aeronautical University
Noa Teed, Embry-Riddle Aeronautical University

Abstract

The first crucial step in testing a bio-inspired system modeled on the behavior of eusocial insect colonies is to identify the basic motion, sensing, and feedback abilities required. We have created a colony of 24 robots, a “Roving Swarm,” that mimics these insects with low computational requirements. Each robot can perform 3 DOF translational motion and 1 DOF rotational motion. This motion is controlled by an Arduino Nano 33 IoT, with power supplied by 1.2 V batteries. DC motors connected to the batteries drive the wheels, and a symmetrically placed ball bearing, combined with selective power output to the motors, provides rotation. The primary sensing ability is vision through Pixy2 cameras, which allow each robot to detect the light and color emitted by other robots and by external light sources. This mounted vision system enables real-time tracking of fellow swarm members, which will allow us to evaluate the swarm’s response to light intensity, color, and the motion of other robots in the swarm. The next significant step in this project is to build a 400-square-foot test bed with set boundaries, within which the bio-inspired system program will be implemented and tested.
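To make the sensing-and-feedback loop concrete, the sketch below shows one way a Pixy2 color-signature reading could drive the two DC motors from an Arduino Nano 33 IoT. It is a minimal illustration under stated assumptions, not the swarm’s actual firmware: the pin assignments, the use of a motor driver on PWM pins, the nominal duty cycle, and the proportional-steering gain are all placeholders chosen for the example.

```cpp
// Minimal example: steer toward the largest color signature seen by the Pixy2.
// Pin numbers and motor-driver wiring are assumptions for illustration only.
#include <Pixy2.h>

Pixy2 pixy;

const int LEFT_MOTOR_PIN  = 5;   // assumed motor-driver PWM input (hypothetical)
const int RIGHT_MOTOR_PIN = 6;   // assumed motor-driver PWM input (hypothetical)

void setup() {
  pinMode(LEFT_MOTOR_PIN, OUTPUT);
  pinMode(RIGHT_MOTOR_PIN, OUTPUT);
  pixy.init();                                  // start the Pixy2 camera
}

void loop() {
  pixy.ccc.getBlocks();                         // fetch color-connected-component blocks

  if (pixy.ccc.numBlocks > 0) {
    // Block 0 is the largest detected signature (e.g., another robot's light).
    int x     = pixy.ccc.blocks[0].m_x;         // horizontal position in the image
    int error = x - pixy.frameWidth / 2;        // offset from the image center

    // Simple proportional steering: slow the wheel on the side of the target.
    int base = 150;                             // nominal PWM duty (0-255), assumed
    int turn = constrain(error / 2, -100, 100); // assumed steering gain
    analogWrite(LEFT_MOTOR_PIN,  constrain(base + turn, 0, 255));
    analogWrite(RIGHT_MOTOR_PIN, constrain(base - turn, 0, 255));
  } else {
    // No signature in view: stop both motors.
    analogWrite(LEFT_MOTOR_PIN, 0);
    analogWrite(RIGHT_MOTOR_PIN, 0);
  }
}
```

Steering by the horizontal offset of the largest detected block is the simplest way to keep another robot’s light centered in the camera’s field of view; higher-level bio-inspired behaviors would be built on top of primitives of this kind.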

 
