About IEEE SoutheastCon Competition
The IEEE SoutheastCon Hardware Competition is an annual event where student teams design, build, and program autonomous robots to complete specific tasks within a themed scenario. Each year introduces a unique challenge that reflects real-world engineering problems, encouraging participants to apply their technical knowledge and creativity.
2024-2025 SoutheastCon Competition
The scenario of this year's competition, called "Mining Mayhem," was set in the year 2047 and challenged teams to design autonomous robots capable of collecting two valuable materials from simulated asteroids: Geodinium (which contained magnets) and Nebulite (which was hollow inside). Within a three-minute time limit, the robots were tasked with retrieving these astral materials, placing them into designated Cosmic Shipping Containers, and delivering them to a specified Rendezvous Pad. For additional points, teams could also explore a cave-like feature on the game field that offered bonus opportunities.
Each team had to perform the given tasks within the three-minute time limit and was judged on performance against other universities' submissions.
Per the rules of the competition:
Maximum Robot Dimensions: 12" x 12" x 12".
Appendages may extend beyond this constraint once the match has begun.
Maximum Weight: 12 kg (26.5 lbs).
Start Button: If utilized, must incorporate a 5-second delay before activation (clearly marked).
Emergency Stop Button: Must be included and clearly marked.
No hydraulic systems; pneumatic parts must not exceed 100 psi.
The robot must operate at 30V or less.
Robot must operate autonomously.
Wireless communication is permitted between robot components, but not with external devices.
The robot must be capable of reading AprilTags.
Beacon Specifications:
Dimensions: Minimum 3" x 3" x 6", maximum 6" x 6" x 8".
Custom labeling is required.
Pole dimensions: 1" diameter, 1.25" long to fit beacon hole.
Time Limit: All functions must cease after 3 minutes.
Overview:
The journey to completing our robot, which we named Timmi (Technologically Integrated Material Management Inspector), was filled with challenges and growth. The project was assigned in October, and we spent most of the Design 1 semester gathering materials, sourcing hardware, and finalizing the design. In January, we began building the robot, and much of March was dedicated to testing and fine-tuning the components to ensure all competition requirements were met.
Our robot, Timmi, was designed and built to collect, identify, and sort astral materials within a limited time. The robot uses a combination of mechanical and vision-based systems to navigate the field, locate objects, and perform sorting tasks with precision and efficiency.
Equipped with a sweeping propeller, ramp, and magnetic conveyor belt, Timmi collects materials and delivers them to designated bins based on magnetic properties. Dual camera systems—the Pixy2 and ArduCam—enable object and AprilTag recognition, allowing for intelligent decision-making and accurate navigation.
Powered by an Arduino MEGA 2560 and a Raspberry Pi working in tandem, Timmi brings together hardware and smart vision processing. The entire system was carefully prototyped, 3D-printed, and optimized to stay within strict size, weight, and time constraints.
Motors:
In our initial design, we planned to use the L298N motor driver to control the wheels. However, we found that it delivered uneven voltage to the motors, causing the wheels to move at different speeds and affecting the robot’s accuracy. After exploring several solutions, we decided to switch to MDD10A motor drivers, which provided more consistent and reliable performance for wheel control. The L298N was still used to control the propeller motor, as it only needed to spin continuously in one direction throughout the game.
Hall effect sensors:
During testing, we found the A3144 Hall effect sensors ineffective—they only detected magnetic fields when in direct contact with Geodinium and in specific orientations. Further research revealed limitations: they only sensed one pole, required a 90 gauss change to trigger, and used a fixed internal comparator. We switched to Waveshare sensors with a 49E Hall sensor and LM393 comparator, which allowed analog output and bipolar detection. These provided much better sensitivity—detecting changes as small as 2 gauss—and worked reliably from a greater distance.
However, repeated tests caused interference from nearby components, raising the baseline voltage from 2.5V to 2.6V, which affected detection. We resolved this by shielding the sensors with an aluminum foil enclosure to block external electric fields. We also modified the code so that the signal to change direction would adjust based on how the sensor's sensitivity varied over time.
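The baseline-drift fix described above can be sketched in a few lines. This is a minimal illustration, not our firmware: it assumes a hypothetical analog reading in volts, a slowly tracked idle baseline (so drift from 2.5 V toward 2.6 V does not shift the trigger point), and a deviation threshold chosen for illustration only.

```python
def update_baseline(baseline, reading, alpha=0.01):
    """Slowly track the sensor's idle output so gradual drift
    (e.g. 2.5 V rising to 2.6 V) does not skew detection."""
    return (1 - alpha) * baseline + alpha * reading

def is_geodinium(reading, baseline, threshold_v=0.05):
    """Flag a magnetic core when the output deviates from the tracked
    baseline by more than the threshold, in either polarity
    (the 49E/LM393 module senses both poles)."""
    return abs(reading - baseline) > threshold_v
```

The key design choice is that the baseline only adapts slowly (small `alpha`), so a brief spike from a passing Geodinium does not get absorbed into the baseline, while slow interference-driven drift does.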
Processors:
The primary processor used in our robot was the Arduino MEGA 2560, which managed most core functions including motor control and image recognition using the Pixy2 camera. Its compatibility with the Pixy2 and low power consumption made it an ideal choice for these tasks, and the accessibility of the Arduino library helped streamline development.
The one function the MEGA did not handle was AprilTag recognition. For this, we integrated a Raspberry Pi paired with the ArduCam. The Raspberry Pi processed the AprilTags and sent relevant data to the MEGA, which then adjusted the robot’s behavior accordingly.
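One way to picture the Pi-to-MEGA handoff is a simple newline-terminated text message that the MEGA can parse with its serial library. The format below is hypothetical (we are not documenting our actual wire protocol); it packs a tag ID and the tag's horizontal offset in the frame, which is the kind of data the MEGA would need to adjust heading.

```python
def encode_tag_message(tag_id, center_x, frame_width=640):
    """Pack a detected AprilTag's ID and its normalized horizontal
    offset (-1.0 .. 1.0, negative = left of center) into one line."""
    offset = (center_x - frame_width / 2) / (frame_width / 2)
    return f"TAG,{tag_id},{offset:+.3f}\n"

def decode_tag_message(line):
    """Inverse of encode_tag_message, e.g. for testing the framing."""
    kind, tag_id, offset = line.strip().split(",")
    assert kind == "TAG"
    return int(tag_id), float(offset)
```

A line-oriented ASCII format like this is easy to debug with a serial monitor, at the cost of a few extra bytes per message.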
We initially planned to use only the MEGA, but after discovering its incompatibility with the ArduCam, we added the Raspberry Pi to ensure reliable AprilTag detection and maintain overall system functionality.
Cameras:
Our robot used two cameras, each with a specific role. The Pixy2 camera was responsible for scanning the field and locating astral materials. We trained it to detect objects based on the color purple, and once identified, the robot was programmed to approach them. The second camera, the ArduCam, was connected to the Raspberry Pi and used to read AprilTags. This allowed the robot to identify the correct drop-off locations for the bins.
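The "locate and approach" behavior can be sketched as two small functions: pick the largest detected color block, then steer proportionally toward it. This is an illustrative sketch in Python rather than our Arduino code; the Pixy2's block frame is assumed to be 316 pixels wide, and the gain `k` is a placeholder.

```python
def pick_target(blocks):
    """blocks: list of (x, y, w, h) tuples from color detection.
    The largest block by area is assumed to be the nearest material."""
    return max(blocks, key=lambda b: b[2] * b[3], default=None)

def steer_toward(block, frame_center_x=158, k=0.5):
    """Return a turn command proportional to the target's horizontal
    error (positive = turn right), assuming a 316-px-wide frame."""
    x = block[0]
    return k * (x - frame_center_x)
```

Picking the largest block is a cheap stand-in for "closest object," which works because nearer materials occupy more pixels.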
LDR Light Sensor Module
We chose the LDR light sensor module to detect the start LED because it provides both analog and digital outputs and has an onboard adjustable comparator.
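Start detection with a light sensor benefits from simple debouncing, so one noisy sample does not launch the robot early. The sketch below is a hypothetical illustration (the `read_ldr` callback stands in for reading the module's digital output pin): it waits until several consecutive samples see the LED.

```python
def wait_for_start(read_ldr, samples=5):
    """Block until `read_ldr()` (a stand-in for reading the LDR
    module's digital output) returns True for `samples` consecutive
    polls, rejecting single-sample noise."""
    streak = 0
    while streak < samples:
        streak = streak + 1 if read_ldr() else 0
```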
Gyroscope
To improve turning accuracy with our mecanum wheels, we added a gyroscope. The wheels’ rollers caused inconsistent movement, making manual angle adjustments unreliable. By using a gyroscope, we could define a fixed "true north" at the start of the match and command turns relative to that direction. After researching options, we chose the BNO055, an integrated gyroscope, accelerometer, and magnetometer that offers reliable performance without complex calibration. This allowed the robot to make precise turns and self-correct during movement.
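The "turn relative to true north" idea boils down to computing a signed shortest-path heading error with wraparound, so a turn from 350° to 10° is commanded as +20° rather than -340°. A minimal sketch of that calculation (not our actual control loop):

```python
def heading_error(target_deg, current_deg):
    """Signed shortest-path error in degrees, in [-180, 180).
    Positive means the robot should turn clockwise."""
    return (target_deg - current_deg + 180) % 360 - 180
```

Feeding this error into a proportional controller lets the robot both execute precise turns and self-correct when the mecanum rollers cause unintended rotation mid-drive.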
Early Design Iterations:
Innovation is a journey of trial and discovery, and our final design was shaped by several early ideas that helped us explore different approaches. One of our first concepts was a robotic arm that would travel, pick up astral materials one by one, identify if they were magnetic, and sort them accordingly. While precise, this method was far too slow to meet the competition’s time constraints.
We also considered a vacuum-based system for rapid collection. However, we quickly eliminated this option due to concerns about excessive power consumption for a single subsystem.
Several other collection and sorting ideas were explored, including a drawstring pulley system, a layered platform, and a magnetic conveyor belt. The drawstring design faced challenges with repeatable dispensing, while the layered platform—which used magnetic and non-magnetic surfaces to separate materials—was too complex and unreliable during the dumping process.
The magnetic conveyor belt idea stood out. It started low to the ground to collect materials, allowing Geodinium to stick while Nebulite fell off the end. Although we anticipated challenges in removing the Geodinium later in the process, the conveyor proved to be a valuable transport solution.
In the end, three ideas influenced our final design the most: the ramp, the non-magnetic conveyor belt, and a sweeping propeller. By combining and refining these concepts, we created an efficient and effective system tailored to the competition's needs.
With the ramp, conveyor belt, and sweeping propeller forming the foundation of our robot, we were able to meet the core design requirements for material collection. The propeller, assisted by the ramp, directed astral materials onto the conveyor belt for sorting.
To detect materials on the field, we integrated the Pixy2 camera, which uses shape and color recognition to identify astral materials. For field navigation and rendezvous point identification, the ArduCam was a better fit due to its ability to read AprilTags. Because the Pixy2 is compatible with the Arduino Mega and the ArduCam works with the Raspberry Pi, both microcontrollers were incorporated into the final build.
Sorting was achieved using Hall effect sensors placed along the conveyor belt to detect the magnetic core of Geodinium, the only distinguishing feature between it and Nebulite. Based on the sensor readings, a selector arm directed each material to the appropriate bin with minimal interference to the conveyor process.
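The sorting decision described above reduces to a small classification step. The sketch below assumes three Hall sensor readings already expressed as deviations from their baselines; the threshold and servo angles are hypothetical placeholders, not our calibrated values.

```python
def classify_material(hall_deviations_v, threshold_v=0.05):
    """Geodinium is the only material with a magnetic core, so any
    sensor deviating strongly from baseline (either polarity) flags it."""
    if max(abs(d) for d in hall_deviations_v) > threshold_v:
        return "geodinium"
    return "nebulite"

def selector_angle(material):
    """Hypothetical selector-arm servo angles: one bin per material."""
    return 45 if material == "geodinium" else 135
```

Using the maximum absolute deviation across all three sensors means a Geodinium is caught regardless of which part of the belt it rides on.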
Powering the robot required lightweight batteries capable of sustaining all components, including motors and sensors, for at least three minutes. Many hardware parts, such as motors and wheels, were generously donated by the 2024 IEEE Hardware Competition team.
During prototyping, we 3D-printed the ramp and propeller to test their dimensions and ensure efficient material transfer. The chassis combined PLA 3D-printed parts and an acrylic base to meet both weight and size constraints. Final versions of the ramp, propeller, and conveyor were optimized for slope and angle to maintain efficiency while staying within the one-cubic-foot size limit.
The equation Battery Life = Battery Capacity / System Consumption was used to calculate how long the battery would run based on the current draw of the components we used. For each component, the voltage and current were taken from its specification sheet. The voltage determined which power line the component would connect to (12 V or 5 V), while the amperage contributed to the total system consumption. Devices that drew negligible current (such as the MDD10A motor drivers) or that were powered from the processing boards were excluded from the total, since they draw from the boards rather than directly from the battery.
Currents on the 12 V line were:
Drive motors: 4 × 0.1 A = 0.4 A
Propeller motor & L298N H-bridge: 0.02647 A (modeled as two series resistances, 12 V / 0.1 A + 12 V / 0.036 A = 453.33 Ω, hand-calculated and then verified by simulation)
Arduino MEGA: 1 A
Subtotal: 1.43 A
Currents on the 5 V line were:
Raspberry Pi 4 Model B: 1.25 A
Conveyor belt servos: 2 × 1 A = 2 A
Sorting servo: 1.8 A
Arm servos: 2 × 2.1 A = 4.2 A
Hall effect sensors: 3 × 0.0095 A = 0.0285 A
Ultrasonic sensors: 2 × 0.015 A = 0.030 A
Subtotal: 9.31 A
Together, the total system current consumption was calculated to be at most 10.74 A. Using that value in the battery-life equation gives 4.4 Ah / 10.74 A = 0.41 hour, or about 24.5 minutes.
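The budget above is easy to reproduce programmatically, which also makes it simple to re-check after swapping a component. The sketch below just re-derives the same numbers from the per-line breakdowns (component names are labels, not code identifiers from our project).

```python
# Per-component current draw in amperes, grouped by supply rail.
LINE_12V = {
    "drive motors (4 x 0.1 A)": 0.4,
    "propeller motor + L298N": 0.02647,
    "Arduino MEGA": 1.0,
}
LINE_5V = {
    "Raspberry Pi 4 Model B": 1.25,
    "conveyor belt servos (2 x 1 A)": 2.0,
    "sorting servo": 1.8,
    "arm servos (2 x 2.1 A)": 4.2,
    "hall effect sensors (3 x 0.0095 A)": 0.0285,
    "ultrasonic sensors (2 x 0.015 A)": 0.030,
}

def battery_life_minutes(capacity_ah, *lines):
    """Battery life = capacity / total draw, converted to minutes."""
    total_a = sum(sum(line.values()) for line in lines)
    return 60 * capacity_ah / total_a
```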
Although individual systems performed well during testing and many integrated successfully, combining all components introduced increasing complexity and made troubleshooting significantly more challenging. As more systems were added, unexpected interactions and technical issues emerged, requiring quick thinking and adaptability from our team.
The night before the competition, during what was meant to be a final test run, a couple of unexpected issues arose: components that had worked reliably until then, such as the conveyor belt and the propeller motor, began to fail. Despite this setback, our team remained determined to find a solution; we spent the night troubleshooting those issues and had the robot ready to compete the next morning.
Despite the challenges encountered, our team demonstrated resilience and determination throughout the competition.
Out of 42 competitors, we secured the 14th position. Despite the challenges of the competition, we take pride in our ability to adapt and compete under pressure.
Following the competition, we conducted a thorough assessment of our robot's performance, identifying areas for improvement.
We are committed to learning from our mistakes and making necessary adjustments to enhance our robot's structure and functionality.