Team Six/Final Paper

Introduction

Our robot, named E.A.R.L (Electronic Assistant Research Laborador), is built to pick up balls from the field and deposit them over the yellow wall. E.A.R.L uses an omni-wheel drive for mobility and efficiency when traveling around the field. He is a beautiful beast that underwent many failures and a few successes.

Mechanical/Design

CAD model

Our mechanical design was decidedly simple. We originally planned on a complicated S-belt drive with a static belt and independent rollers, but we decided it was too complicated for what we wanted to do. Our next design was a simple belt at a 45-degree angle, designed to pick up balls on one side and drop them off on the other. The belt spanned most of the width of the front of the robot so we could pick up as many balls as possible at once. Two flanges on the front extended the intake to the full 14" of ball-collecting width.

For the drive train, we initially debated between two separate ideas. Fred had a lot of experience with two-wheel drive systems and wanted that, whereas Steven had experience with a three-wheel omni-wheel drive. Fred liked the simplicity of the two-wheel system, but Steven advocated for the enhanced mobility of the omni-wheels. Eventually, Fred agreed with him, and our general layout was decided.

Fred then spent the first week of the competition designing the robot. He used simple geometry combined with some power analysis to determine the size of the wheels and their locations with respect to the ramp. By the end of the first weekend, he had completed a SolidWorks model that gave us a pretty good idea of what the robot would be like. When designing, Fred kept manufacturing methods in mind so that we could make the robot easily and quickly. The finalized design consisted of 4 plates of laser-cut acrylic, 2 plates of water-jetted aluminum, and 3 bent water-jetted plates of steel. It took an afternoon to assemble, and aside from the 3 wheel adapters, no milling or turning was required.

A second version of the robot swapped out all of the acrylic pieces. One of them had been damaged in the manufacturing process, and two had to be changed to fix a battery terminal interference issue. The main face piece had to be redone as well to accommodate a new sensor array for detecting incoming balls.

A gate was placed at the top of the ramp to prevent balls from falling back out of the conveyor belt. The gate was powered by a single servo, and the belt by a window-gear motor. Eventually, flaps had to be added to the belt to ensure that the balls would be picked up and thrown out of the roller. The belt and flaps were made out of high-friction shelf liner and sewn together.

In total, there were 4 motors and one servo. 3 of the motors were used for the drive train, and the fourth drove the belt system. The servo actuated the front gate. Sensors were also pretty minimal. Two IR sensors on the front flanges, pointed out at 45 degrees from the front two corners, worked very well for wall avoidance. An array of 10 limit switches, carefully placed under the rollers, allowed the robot to detect how many balls it was picking up at any one time. Two extended limit switches on the back of the robot allowed for easy lining up of the scoring mechanism. Finally, the camera did most of the vision and tracking work.

Electronics

We have previously explained how EARL, the incarnation of humanity's hope and belief in a metal can, was created by a group of extraordinary wizards, sages and artisans. One night they assembled in a huge creepy house on a huge creepy hill on a huge creepy island in the middle of a huge creepy bowl of delicious boiling soup. This conclave was so secret that even the carrier pigeons that were supposed to deliver messages between the mighty kingdom of MIT and the island did not know what was going on. And they ALWAYS do.

Anyway, after performing a variety of non-logical, irrational, tiring, exciting and straight-up hilarious rituals, there appeared the mighty EARL, the guardian of the monolithic edifice of which all of the participating members were students. For generations EARL protected MIT from external robotics attacks, and although absolutely BOSS at his job, he was fired after a while for illegal transistor bootlegging.

Stripped of his transistors and his dignity, EARL descended into the abyss of the robot criminal world. He began working night shifts at a local strip club, and, apart from carnal robot pleasures, did not indulge himself in anything else.

One winter day a similar group of wizards, sages and artisans gathered together in the hopes of reliving the moments of joy and excitement when EARL came into existence. After much work they finally succeeded in teleporting EARL from the strip club to their magical commons. Poor EARL, having lived with the rustiest company for a long period of time, had lost all that defined the EARL within him, so the gentlemen had to rebuild him. At that point EARL became a dream, the perfect reality they craved to achieve, and the EARL they were building grew more and more like the imaginary EARL.

The "Nervous System"

Guts and glory

What took them a very long time was the nervous system of the robot, a.k.a. the "electronics". Thanks to the original PegBoard, which served as a dummy machine to experiment on, the team quickly got the hang of how the final version of the robot should work. EARL's electronics resembled those of the pegbot, but they were elegantly mounted in a dashboard with LEDs and labels to organize the switches. Moreover, the frame of EARL was grounded, adding convenience to the flow of electrons. All the wiring and soldering was done in the first couple of weeks of the competition by Harry, Steven and Will, and caused no major failures.

One of the problems we faced was the poor design of the motor controller circuitry: if the main power was supplied before the logic power, the controller would literally go up in smoke. That being said, BLUE SMOKE was a frequent visitor to our lab, and only after 4 burned motor controllers did we figure out the correct way to handle and power them.

One of the original design plans was the incorporation of extremely powerful lights for illuminating the field. Our idea was that with light generated by the robot itself, the camera code would have a much easier job. However, since the monstrously powerful headlights drew an unacceptable amount of current (enough to stop the motors from operating properly), this idea was quickly scrapped.

Just as the Maslab staff advised, we created three different threads of power branching off of the battery. The first, the sturdiest and most powerful one, went to the Projectors of Righteousness and drew as much as 10 amps continuously when the lights were on. The second thread, comparable to the first in sturdiness but much more branched, supplied power to the motors and the LED INDICATORS ON THE TOP. Being 5 V LEDs on the 12 V rail, they each required a resistor in series in order to work. Lastly, the third thread coming off of the battery went straight into the 12-to-5 V regulator, which supplied the logic side of the motor controllers and the Arduino. The three switches on top of the robot each correspond to one of these threads and can only be turned on in a certain sequence.
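As a sanity check on those series resistors, here is the standard Ohm's-law sizing; the 20 mA target current is an assumed value for illustration, not a measurement from our build:

```python
# Rough series-resistor sizing for the 5 V LED indicators on the 12 V
# motor rail.  The 20 mA operating current is an assumption.
V_SUPPLY = 12.0   # battery rail voltage (V)
V_LED = 5.0       # rated voltage of the LED modules (V)
I_LED = 0.020     # assumed operating current (A)

R = (V_SUPPLY - V_LED) / I_LED       # resistor drops the excess voltage
P = (V_SUPPLY - V_LED) ** 2 / R      # power it must dissipate

print("series resistor: %.0f ohm, dissipating %.2f W" % (R, P))
# -> series resistor: 350 ohm, dissipating 0.14 W
```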

General wiring diagram

It should be noted that Team 6: T9 stumbled upon some very strange problems along the way. Namely, the motors would only work when told to rotate backwards, which had to be taken into account in the final version of the code (see the sketch below). Also, the USB cable seemed to have died after continuous beating and stretching, and caused the team a lot of problems during the final competition.
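The backwards-motor quirk was handled in software with a simple sign flip on every motor command. A minimal sketch of the idea; `setSpeed` here is an illustrative stand-in, not the Arduino library's actual call:

```python
def drive(motor, speed):
    """Command a motor so that positive speed means forward, despite
    the hardware only responding to 'backwards' commands.
    (motor.setSpeed is a stand-in name, not the real API.)"""
    motor.setSpeed(-speed)
```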

Overall, the electronic EARL was almost flawless, ready to fight EVIL left and right. Although there were a couple of hiccups with the motor controllers, in the end the team managed to figure them out.

Timed ignition

Simple DelayON timer circuit

One of the ideas our team had while developing this robot was an automatic DelayON timer circuit that would turn on the main power to the board 1 second after the logic was triggered. While it seemed like a very practical and easy thing to put together, our experience with this type of circuit was very different. Since its reliability and maximum current left much to be desired, we actually ended up frying 2 motor controllers and spending 2.5 days trying to tweak this Arduino-controlled concoction into working, alas without any positive result.

In the end we settled for an airplane-switch type of "ignition": there is a switch for the logic, accompanied by an LED indicator that tells you whether the logic power is on. There is also a similar switch for the main power, wired in such a way that you cannot accidentally trip the main power before the logic. Simplicity does it, we found.

Programming

For our robot, we used Python with OpenCV and the given Arduino library to work with the sensors and motors. In the end, our main program was rather simple. It used a simple looping search algorithm that combined vision (and IR) to detect objects, processed the list of objects to make a decision, and set the motors to a certain state. Our program used four threads for multitasking: a timer, a music player, a ball counter (which returns the current number of balls in the container on the robot), and the main thread (everything else). However, our robot only used position-based tracking for ball/wall following, because implementing PI or PID control did not add any efficiency to our program. Furthermore, though mapping was discussed in the initial stages of our program design, it was not implemented.
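A minimal sketch of that four-thread layout; the function bodies and names (run_timer, play_music, count_balls) are illustrative stand-ins, not our actual modules:

```python
import threading
import time

def run_timer(state):        # stand-in: stops the robot when time is up
    time.sleep(180)
    state['done'] = True

def play_music(state):       # stand-in: background music player
    pass

def count_balls(state):      # stand-in: polls the ball limit switches
    while not state['done']:
        time.sleep(0.1)

state = {'done': False}
for target in (run_timer, play_music, count_balls):
    t = threading.Thread(target=target, args=(state,))
    t.daemon = True          # background threads die with the main one
    t.start()
# ...the main thread runs the sense/decide/act loop described below...
```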

Vision

Using the OpenCV (version 2.1) libraries, we were able to use color thresholds to locate objects. A small problem arose when dealing with the lighting conditions of different rooms and varying sunlight. Our first solution was to convert all RGB values into HSV values so that, theoretically, a wide brightness (V) threshold could compensate for different lighting. We attempted to create a "universal" HSV range that would work in most settings. However, as it turns out, different types of lights produce slightly different color values from the various map objects. Our team discussed using Hough transforms to detect features in the image, or Haar cascades, to create a more robust vision module. However, we found problems with both methods, including the fact that our vision programmer had never been exposed to either method of feature detection, and that he had a simpler solution.
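The core of the approach is just an HSV threshold. A sketch, written against the modern cv2 API for readability (our 2013 code used the older OpenCV 2.1 bindings, and the threshold numbers shown are illustrative, not our calibrated values):

```python
import cv2
import numpy as np

def color_mask(frame_bgr, hsv_lo, hsv_hi):
    """Return a binary mask of the pixels inside an HSV threshold."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    return cv2.inRange(hsv, np.array(hsv_lo), np.array(hsv_hi))

# Example: a generous red-ball threshold with a wide V range to absorb
# lighting changes (illustrative numbers only):
# mask = color_mask(frame, (0, 120, 40), (10, 255, 255))
```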

Instead of changing our vision code, we wrote a program to calibrate our vision using a simple GUI, storing the data to a file for future use. This code displays a live feed from the camera with a small circle in the center. The object is lined up within the circle, and the pixel values inside the circle are analyzed for their minimum and maximum values. These values are stored to a file and saved for future reference, or until they are updated again. This way, calibrating the camera takes very little time, and one does not even have to manually edit the HSV value files.
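A sketch of the calibration idea: sample the HSV pixels inside the centered circle, take the per-channel min/max, and write them out. The file format shown is illustrative; only the HSV_[foo].data naming comes from our actual code:

```python
import cv2
import numpy as np

def calibrate(frame_bgr, radius=20, path="HSV_ball.data"):
    """Sample the HSV pixels inside a circle at the image center and
    save the per-channel min/max as the threshold for that object."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    h, w = hsv.shape[:2]
    yy, xx = np.ogrid[:h, :w]
    inside = (xx - w // 2) ** 2 + (yy - h // 2) ** 2 <= radius ** 2
    samples = hsv[inside]                      # N x 3 array of H, S, V
    lo, hi = samples.min(axis=0), samples.max(axis=0)
    np.savetxt(path, np.vstack([lo, hi]), fmt="%d")
    return lo, hi
```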

This is what the camera sees and picks up

The main object detection program is called "rectangulate.py"; it takes an image, an HSV threshold, and a blur factor (Gaussian blur) to detect objects. It converts the image to a single-channel (grayscale) image, filtering out any colors outside the thresholds, then finds connected pixels and draws contour lines around those "blobs". Because of the uneven color of the ground, a single pixel or a small group of pixels could show up randomly throughout the image, so those contours were filtered out. Finally, the program calculated bounding rectangles around the largest contours and returned those objects. The calibration data is read from the "HSV_[foo].data" files for each object. This "rectangulate.py" program was used by other helper classes: ball.py for finding the largest ball or group of balls, wall.py for finding a yellow wall, and button.py for finding the teal button, though this last one was not used in the final program.
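A sketch of the rectangulate.py pipeline as described above, again written against the modern cv2 API rather than the 2.1 bindings we used (the min_area cutoff is an illustrative value, not our tuned one):

```python
import cv2
import numpy as np

def rectangulate(frame_bgr, hsv_lo, hsv_hi, blur=5, min_area=50):
    """Blur, threshold in HSV, find contours, drop single-pixel
    speckle, and return bounding rectangles of the largest blobs."""
    smoothed = cv2.GaussianBlur(frame_bgr, (blur, blur), 0)
    hsv = cv2.cvtColor(smoothed, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_lo), np.array(hsv_hi))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    big = [c for c in contours if cv2.contourArea(c) >= min_area]
    big.sort(key=cv2.contourArea, reverse=True)
    return [cv2.boundingRect(c) for c in big]   # (x, y, w, h) tuples
```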

Main Program

The main while loop contained 3 main options (if/elif statements) and an else option. In this program, there was an object for every sensor and actuator, in addition to a plethora of counters and booleans that were triggered (or reset) depending on the option:

  • The first if statement read data from the irSensor object to detect walls in front of the robot. If the robot had previously been chasing a wall (wallLastSeen was 5 or fewer camera frames ago) and had balls in the dispenser, it would call the lineUp() method and line up to the wall using the back "whisker" bumpers.
  • The second if statement detected whether there was a yellow wall in front of the robot and balls inside. If both conditions were satisfied, the robot would begin to chase the wall, and the wallLastSeen counter would reset to zero.
  • The third if statement detected red or green balls in front of the robot. If this condition was satisfied, the robot would begin to chase the ball.
  • The else clause incremented a counter each time none of the if statements was validated (and reset it when one was). Essentially, if the robot saw nothing for 3 frames, it would go forward, and after 3 rounds of going forward (9 frames) it would turn right 90 degrees.

This loop ran once per frame (camera processing was about 3 FPS) unless a subroutine within the while loop called time.sleep() or exited the loop temporarily (e.g. the lineUp() method that lines the robot up with the wall).
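Putting it together, a skeleton of that decision loop; the method names on `robot` are illustrative stand-ins for our actual sensor and actuator objects:

```python
import time

def main_loop(robot, fps=3.0):
    """Skeleton of the sense/decide/act loop described above."""
    frames_empty = 0
    wall_last_seen = 999
    while not robot.time_up():
        if robot.ir_sees_wall() and wall_last_seen <= 5 and robot.balls() > 0:
            robot.line_up()                  # back onto the whisker bumpers
        elif robot.sees_yellow_wall() and robot.balls() > 0:
            robot.chase_wall()
            wall_last_seen = 0
        elif robot.sees_ball():
            robot.chase_ball()
            frames_empty = 0
        else:
            frames_empty += 1
            if frames_empty % 3 == 0:        # nothing for 3 frames: creep forward
                robot.forward()
            if frames_empty >= 9:            # 3 forward moves with no luck
                robot.turn_right_90()
                frames_empty = 0
        wall_last_seen += 1
        time.sleep(1.0 / fps)                # camera processing ran at ~3 FPS
```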

Overall Performance

At the mock competitions and seeding match, our robot did pretty horribly (it was only able to pick up one ball), and we ran into multiple problems, such as the poor traction of the omni wheels and the lack of camera calibration. We fixed the traction before the final competition by adding a mixture of sand and black rubber to the wheels. Calibration was improved by expanding the H and V ranges slightly, improving the chance that the robot would see an object. The speed of the main loop was improved by decreasing the size of the image from the camera and decreasing the maximum value of the frame counter, so that the robot was almost constantly moving.
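The widened thresholds and smaller frames amounted to tweaks along these lines (the padding and scale values are illustrative, not our exact numbers):

```python
import cv2

def widen(lo, hi, h_pad=5, v_pad=25):
    """Pad the calibrated H and V ranges to tolerate lighting drift.
    OpenCV hue runs 0-179, so clamp to that range."""
    lo = (max(lo[0] - h_pad, 0), lo[1], max(lo[2] - v_pad, 0))
    hi = (min(hi[0] + h_pad, 179), hi[1], min(hi[2] + v_pad, 255))
    return lo, hi

def shrink(frame, scale=0.5):
    """Downsample the camera frame so each loop iteration runs faster."""
    return cv2.resize(frame, (0, 0), fx=scale, fy=scale)
```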

After making these adjustments to the programming and design, our robot performed quite well about 60% of the time at the competition. The other 40% was due to the fact that OpenCV would fail to disconnect from the camera, so the computer needed to be rebooted between runs. Fortunately, we were usually able to have at least one good run per pair, scoring more points than the other team did across the two rounds. We ended up in third place after winning the bronze match, with our robot working very well in both runs.

A further complication: we broke at least 4 USB cables during the final competition, most likely because of the manner in which we tied up the excess length to shorten the cable (so it would not drag). We would like to thank the other teams that donated their USB cables to our robot, only for them to be destroyed repeatedly.
