Researchers from the Self-organizing Systems Research Group at the Harvard John A. Paulson School of Engineering and Applied Sciences and the Wyss Institute for Biologically Inspired Engineering have created a swarm of robotic fish that can behave and adapt just like real fish (or at least an approximation of real fish!).
Each fish, called a Bluebot, has an on-board Raspberry Pi Zero W and two camera modules, allowing stereoscopic (3D) vision. The Zero in each fish is attached to an Arducam Multi-camera adapter so that it can use both camera modules at the same time.
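For the curious, adapters like the Arducam multi-camera board are usually driven by selecting a channel over I2C and toggling a couple of GPIO pins before each capture, since the Pi only exposes one CSI camera interface. The sketch below shows that general pattern in Python; the pin numbers, I2C address and register values are placeholders rather than the Bluebot's actual configuration, so check the adapter's own documentation before trying it.

```python
# Rough sketch: grabbing a frame from each of two cameras attached to an
# Arducam-style multi-camera adapter on a Pi Zero W. The GPIO pins and I2C
# values below are placeholders and vary by adapter revision.
import subprocess
import RPi.GPIO as GPIO
from picamera import PiCamera

SELECT_PINS = (4, 17)       # assumed channel-select GPIO pins
I2C_MUX_ADDR = "0x70"       # assumed I2C address of the adapter

def select_camera(channel):
    """Route the Pi's single CSI interface to one of the attached cameras."""
    # Set the adapter's mux register over I2C (the value is adapter-specific).
    subprocess.run(["i2cset", "-y", "1", I2C_MUX_ADDR, "0x00",
                    "0x04" if channel == 0 else "0x05"], check=True)
    # Drive the selection GPIOs to match the chosen channel.
    GPIO.output(SELECT_PINS[0], GPIO.LOW if channel == 0 else GPIO.HIGH)
    GPIO.output(SELECT_PINS[1], GPIO.LOW)

GPIO.setmode(GPIO.BCM)
for pin in SELECT_PINS:
    GPIO.setup(pin, GPIO.OUT)

# Capture one frame per camera, switching the mux between shots.
for channel, filename in [(0, "cam0.jpg"), (1, "cam1.jpg")]:
    select_camera(channel)
    with PiCamera(resolution=(640, 480)) as camera:
        camera.capture(filename)
```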
Each Bluebot operates independently, using its cameras to spot LEDs on the other fish and coordinate its behaviour with its nearest neighbours. This leads to pseudo-emergent behaviour as the fish collaborate on a goal, which can be anything from following each other in a shoal, to spotting food dropped from above, to more complicated tasks such as search-and-rescue. Each Bluebot is armed with little more than the knowledge that the LEDs on the other robots are a specific distance apart. Very impressive stuff!
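That "known LED spacing" trick is what makes the vision so simple: under a basic pinhole-camera model, the further away a neighbour is, the closer together its two LEDs appear in the image, so the pixel separation alone gives a distance estimate. Here's a minimal illustration of that idea; the baseline and focal-length numbers are made up for the example, not taken from the paper.

```python
# Illustrative only (not the authors' code): estimating how far away a
# neighbour is from the apparent separation of its two LEDs in the image,
# using a pinhole-camera model: distance ~ focal_length_px * baseline / pixels.

LED_BASELINE_M = 0.086       # assumed physical spacing between a Bluebot's LEDs
FOCAL_LENGTH_PX = 1320.0     # assumed camera focal length, in pixels

def estimate_distance(led_a_px, led_b_px):
    """Distance to a neighbour, given its two LED centroids in pixel coordinates."""
    dx = led_a_px[0] - led_b_px[0]
    dy = led_a_px[1] - led_b_px[1]
    separation = (dx * dx + dy * dy) ** 0.5
    if separation == 0:
        return float("inf")          # LEDs unresolved: neighbour too far away
    return FOCAL_LENGTH_PX * LED_BASELINE_M / separation

# Example: two LED blobs detected 44 pixels apart -> roughly 2.6 m away
print(estimate_distance((310, 240), (354, 240)))
```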
“Each Bluebot implicitly reacts to its neighbours’ positions,” explains Florian Berlinger, a PhD candidate at SEAS and Wyss and first author of the research paper, per a press release. “So, if we want the robots to aggregate, then each Bluebot will calculate the position of each of its neighbours and move towards the centre. If we want the robots to disperse, the Bluebots do the opposite. If we want them to swim as a school in a circle, they are programmed to follow lights directly in front of them in a clockwise direction.”
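The aggregate/disperse rule Berlinger describes boils down to steering toward (or away from) the centre of the neighbours you can currently see. A minimal sketch of that rule is below, assuming each neighbour's position has already been estimated in the robot's own coordinate frame; it's a toy version of the idea, not the team's implementation.

```python
# Minimal sketch of the aggregate/disperse behaviour described in the quote,
# assuming neighbour positions (x, y, z) have already been estimated from
# their LEDs in the robot's own frame. Not the authors' implementation.

def swim_direction(neighbour_positions, mode="aggregate"):
    """Return a unit vector toward (or away from) the neighbours' centroid."""
    if not neighbour_positions:
        return (0.0, 0.0, 0.0)                      # no neighbours seen: hold position
    n = len(neighbour_positions)
    cx = sum(p[0] for p in neighbour_positions) / n
    cy = sum(p[1] for p in neighbour_positions) / n
    cz = sum(p[2] for p in neighbour_positions) / n
    norm = (cx * cx + cy * cy + cz * cz) ** 0.5 or 1.0
    sign = 1.0 if mode == "aggregate" else -1.0     # disperse = swim the opposite way
    return (sign * cx / norm, sign * cy / norm, sign * cz / norm)

# Example: three neighbours ahead and slightly to one side -> head toward their centre
print(swim_direction([(1.0, 0.5, 0.0), (2.0, -0.2, 0.1), (1.5, 0.8, -0.1)]))
```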
You can see a video presentation of the project below, and read a lot more about it over at Harvard, IEEE, Wired, Gizmodo and Arducam. You can also read the team’s full paper over at Science Robotics, though you’ll need to be a subscriber for that.
Warning: the video below contains flashing lights which, frankly, gave me a headache, so take extra care if you’re sensitive to that sort of thing!