ERF Robotics Hackathon: Autonomous Robot Navigation
Summary
At the European Robotics Forum 2022 "Battle of Institutions" Hackathon in Rotterdam, our UvA team programmed two Lely Juno robots to autonomously navigate a simulated barn environment. The Lely Juno is an automatic feed-pushing robot that, in real barns, pushes feed back within reach of the cows at the feeding fence.
The challenge comprised nine scored tasks, including wall following, narrow-passage coordination between two robots, obstacle avoidance, homing to a base station, and proximity-based light activation. We competed against seven other European university teams on challenges set by Lely, Franka Emika, and RoboHouse.
Implementation
- Dual navigation strategies. One Juno used a behaviour-based wall-following algorithm with three sonar distance sensors, switching actions based on right-wall and front distances. The other used a model-based approach with a LiDAR scanner for SLAM (Simultaneous Localization and Mapping), creating a cognitive map and following waypoints via a finite state machine.
- Multi-robot communication. The robots coordinated passage through narrow corridors over a ROS master-client architecture, exchanging boolean status messages. The master Juno had priority; the client would stop, or back up if already inside, whenever the master was in the passage.
- Color-based homing. A blob detection algorithm using an external webcam to locate the home base (an orange traffic cone). The center of the detected blob determined left/right steering adjustments. At the ERF, the detection color was switched to green to avoid confusion with the many red objects in Lely's arena.
- Vicinity lights. Blob detection was reused to estimate inter-robot distance. When one Juno detected the other's color within range, IoT-connected Shelly Plug-S smart plugs were triggered via MQTT to turn on lights.
- Developer monitoring. ROSboard provided a real-time web dashboard showing LiDAR scans, costmaps, sensor readings, and robot state.
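The behaviour-based wall follower can be sketched as a small decision function over the front and right sonar readings. This is an illustrative sketch, not the hackathon code: the function name, thresholds, and action labels are assumptions, and in the real system the chosen action would be translated into velocity commands on a ROS topic.

```python
def wall_follow_action(front_m, right_m,
                       front_stop=0.4, target=0.3, band=0.05):
    """Pick a steering action from front and right sonar distances (metres).

    Thresholds here are placeholders, not the values tuned at the hackathon.
    """
    if front_m < front_stop:
        return "turn_left"    # wall ahead: turn away from the right-hand wall
    if right_m > target + band:
        return "steer_right"  # drifted away from the wall: close back in
    if right_m < target - band:
        return "steer_left"   # too close to the wall: open out
    return "forward"          # inside the dead band: keep following
```

The dead band around the target distance prevents the robot from oscillating between left and right corrections on noisy sonar readings.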
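The client side of the narrow-passage protocol reduces to a few lines of priority logic. A hedged sketch, assuming the master publishes a boolean "in passage" flag; the class and method names are made up for illustration, and the real version would subscribe to a ROS topic rather than take a plain callback.

```python
class PassageClient:
    """Client-side priority logic for a shared narrow passage.

    The master robot always has right of way; the client yields whenever
    the master reports that it is inside the passage.
    """

    def __init__(self):
        self.master_in_passage = False

    def on_master_msg(self, in_passage):
        # Callback standing in for a ROS boolean-topic subscriber.
        self.master_in_passage = bool(in_passage)

    def next_action(self, in_passage):
        """Decide what the client does; in_passage: is *this* robot inside?"""
        if self.master_in_passage:
            return "back_up" if in_passage else "wait"
        return "proceed"
```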
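The colour-homing step, reduced to its essentials: find the horizontal centre of the detected blob and steer toward it. A minimal sketch on a toy binary mask; the actual system ran blob detection on webcam frames (e.g. via colour thresholding), and the function names and dead-band width here are illustrative.

```python
def blob_centroid_x(mask):
    """Mean column index of truthy pixels in a binary mask (rows of 0/1),
    or None when no blob pixels are present."""
    xs = [x for row in mask for x, v in enumerate(row) if v]
    return sum(xs) / len(xs) if xs else None

def steer_from_blob(cx, image_width, dead_band_px=20):
    """Map the blob centre's horizontal offset from the image centre
    to a coarse steering command."""
    offset = cx - image_width / 2
    if abs(offset) <= dead_band_px:
        return "forward"          # cone roughly centred: drive straight
    return "right" if offset > 0 else "left"
```

Swapping the target colour (orange cone in testing, green at the ERF) only changes the thresholding that produces the mask; the steering logic stays identical.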
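For the vicinity lights, the decision layer can be separated from the network call: use blob size as a proxy for inter-robot distance and emit the MQTT command the plug expects. Gen1 Shelly devices listen on `shellies/<id>/relay/0/command` with `on`/`off` payloads; the plug id and area threshold below are placeholders, and actually publishing the message (e.g. with paho-mqtt) is left out of this sketch.

```python
def light_command(blob_area_px, on_threshold=1500,
                  plug_id="shellyplug-s-juno"):
    """Return the (topic, payload) pair to publish for a Shelly Plug-S.

    A larger blob means the other robot appears closer, so the light
    switches on once the detected area crosses the threshold.
    """
    topic = f"shellies/{plug_id}/relay/0/command"
    return (topic, "on" if blob_area_px >= on_threshold else "off")
```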
Challenges
The LiDAR could not be center-mounted on the Juno due to its physical design, limiting the scan to less than 360 degrees and degrading the SLAM map quality. This forced a switch from model-based to behaviour-based navigation for one robot. The second Juno lacked sonar sensor mounts entirely, so it navigated purely via camera-based blob following. These hardware constraints required significant adaptation of our pre-tested TurtleBot3 solutions during the hackathon itself.