Team Eleven/Final Paper

From Maslab 2013

Who We Are

We are Team Eleven, otherwise known as Team Stevie the Cat. Our inspiration is drawn from Stevie the Cat, a feline so fierce and admirable that he came in third in EC's presidential election. He never received a car for his 16th birthday, so we decided to build him a robot as reparation; thus the Stevie Wagon came into being.

Stevie.jpg

Members

The acolytes of Stevie include Charles Franklin, Valentina Chamorro, and Yanni Coroneos.

  • Charles is the junior member of the team and the embedded programmer.
  • Valentina is one of two frosh on the team and the resident Mech E. Also, it was discovered during the course of Maslab that she may or may not have narcolepsy.
  • Yanni is the other frosh. He worked on both the programming and mechanical side of things.

Together, the three of us devoted our IAPs to our ruthless leader Stevie in hopes that we might bring him glory and mobility.

Valentina.jpg

[left to right: Valentina, Yanni, Charles]

Overall Strategy

Game Objective

The setup of this year’s Maslab competition was as follows: robots are placed in a field with an assigned color of balls. Points are earned by collecting balls or by scoring them in one of several locations. Scoring in the pyramid earns the most points (the higher up the ball is placed, the more points earned), while depositing balls over one of the yellow walls earns fewer. The competition requires robots to navigate the field completely autonomously using a camera and sensors.

Our Objective

We set out to keep our strategy as simple as possible. We figured that when one gets bogged down trying to accomplish “everything,” more things are likely to go wrong. Our strategy therefore consisted of collecting as many balls as possible and dumping them over the yellow scoring wall during the last ten seconds. The way we saw it, the pyramid was a tease: if our robot was predisposed to go for the yellow wall, we could effectively negate any points “scored” by the other team by returning their balls to their side of the field. This required a strong codebase and a dependable mechanical design. We used the checkoffs as a guide for progressing our design and software, and as a result most of our code was completed before construction of the actual robot began. Though a more in-tandem approach might have been more effective, this allowed us to dedicate solid working hours to construction. Due to time constraints and other commitments, however, by the end our strategy consisted only of collecting balls, which still proved a solid approach for racking up points. The rest would be left up to the actual execution of the program and physical product during testing and the competition.

Mechanical Design

Our team began sketching ideas for the design over winter break in an attempt to get ahead of the competition. Our designs, though rather different from our final product, centered around the same main idea: a square base to maximize area and a simple ball collection mechanism.

Design Components

  • Materials: Most of our robot was constructed out of acrylic, which was readily available in lab. Wanting to be precise and to avoid any chance of cracking it on a band saw, we decided to laser cut all our pieces. Unfortunately this led to a delay in our construction process, because none of us had access to a laser cutter and we had to rely on hall mates. Nevertheless, by the third week of IAP, Valentina was given access to a laser cutter and Yanni had access to the shop at CSAIL, which meant we could machine whatever we wanted. Additionally, we managed to get rid of most if not all of the duct tape used during the initial construction process. We went with Velcro, a much more legitimate adhesive (much to Yanni’s pleasure), and used it for the laptop and internal electronics.
  • Frame: Our robot was put together out of the laser-cut acrylic, with L-brackets as the fasteners of choice. Two adjacent walls were made of clear acrylic, letting us see easily into the robot, which was useful when making modifications. The top collection bin was movable and acted like the hood of a car (appropriate for the Stevie Wagon): rotating it upward gave us access to all of our electronics.
  • Rubber Band Roller: We decided that the best way to bring in the balls would be a roller extending along the majority of the front edge to maximize capture ability. We laser cut pieces of acrylic that were held together on an axle by collars we machined and by the tension of the rubber bands. As for the connection to the motor, we initially tried gears, but they would not mesh properly with each other. This led us to machine pulleys for the shafts and use a custom-sized urethane belt to connect the motor pulley to the roller pulley, which made our roller dependable.
  • Collection Bin: The bin was water-jetted out of an aluminum panel from a computer case (thanks Yanni) and bent so that it would serve as both a collection area and a stopper for the balls. The idea was that the slightly sloped bin would direct the balls, by gravity, to a lifting belt at the back of the robot. With the last-minute design changes, however, the bin was modified to be just a collection bin, forgoing the belt in the back. Here our choice of footprint served us well: the large surface area was optimal for collecting a large quantity of balls with little room for mechanical issues.
  • Lifting Belt: Though this never came to fruition, we did spend time thinking about and working on it. Our design used two PVC pipes as pulleys for a belt that would compress each ball against the back panel of the robot, with two curved pieces of aluminum directing its movement. We initially planned to use urethane for the lifting belt, but McMaster was too slow in shipping and we ended up using shelf liner (thanks T-9)…or at least that was the plan. When it came time to actually assemble the structure, we found complications in making everything fit properly: the orientation the pipes required in their acrylic holders, necessary for the pulleys to turn without friction, left too much room between the ball and the bottom plate. As a result there would not have been enough points of contact for the belt to have any effect on the ball. Unfortunately there wasn’t much time left to resolve this, so we had to reevaluate our strategy. The final decision, as aforementioned, was to focus only on ball collection, leaving the top collection bin (the car hood) to serve as a resting spot for the laptop.

Wagon1.jpg

Sensors

Sensors are an important part of any robot design, as they allow the robot to perceive its surrounding world. While most of the perception is done by the webcam, other sensors play a critical role in feedback systems and correcting imperfections from the world perceived by the webcam. We ourselves use feedback all the time while walking, reaching and grabbing, and performing other daily tasks. We use our eyes, sense of touch, as well as other input sources. Without feedback, performing an otherwise simple task could prove to be impossible. The sensors we used on our robot include IR distance sensors, limit switches, and an IMU. The IMU includes magnetometers, accelerometers, and gyroscopes.

  • IR Sensors: were used to accurately measure distances to the robot’s surroundings, something the webcam does poorly. The IR sensor outputs an analog voltage that is inversely proportional to the distance, with an offset. The Arduino converted this voltage to a digital number using an ADC and then took its scaled inverse to linearize the input and ultimately get the distance in inches. The input needed to be linearized over three different regions, as the response curve of the sensor was not a perfect inverse of the distance. These readings were then used to avoid oncoming obstacles. These sensors could also have been used, in conjunction with the IMU, to create a map. This was not done.
  • Limit Switches: were used in conjunction with the IR sensors to detect whether the robot had bumped into an obstacle. When depressed, a limit switch closes a circuit that shorts a digital pin on the Arduino to ground. The pin uses an internal pull-up resistor, which otherwise holds the input voltage high. When the voltage was read as low, the robot would attempt to avoid the obstacle.
  • Gyroscope: was used to track the angular position of the robot. Gyroscopes measure angular velocity; integrating this value over time produces the angular position. One negative property of gyroscopes is what is called drift: when not in motion, a gyroscope will read some nominal non-zero value, and this value, when integrated over time, makes the reported position appear to change. We compensated by measuring the drift for 5 seconds at startup and then subtracting it from the integral. This method reduced the drift to about a degree per minute, which is acceptable in a 3-minute competition. The accelerometer and magnetometer could have been combined with the gyroscope in a filter to produce almost zero drift.
  • Encoders: Although never used, encoders were mounted on the robot. They would have been useful for determining how far the robot had moved and could also have helped in building an accurate map.
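The drift compensation described above can be sketched in a few lines of Python. This is a minimal illustration, assuming a 60 Hz sampling rate and a constant bias; the function names are ours, not from the actual firmware (which ran in C++ on the Arduino):

```python
SAMPLE_RATE = 60.0                       # Hz, the rate the gyro was polled at
DT = 1.0 / SAMPLE_RATE

def estimate_drift(samples):
    """Average the angular-velocity readings taken while the robot sits still."""
    return sum(samples) / len(samples)

def integrate_heading(readings, drift):
    """Integrate drift-corrected angular velocity into a heading (degrees)."""
    heading = 0.0
    for omega in readings:
        heading += (omega - drift) * DT
    return heading

# A stationary gyro with a constant 0.5 deg/s bias:
drift = estimate_drift([0.5] * 300)                # 5 s of calibration at 60 Hz
heading = integrate_heading([0.5] * 3600, drift)   # 60 s standing still: ~0 deg
```

With a perfectly constant bias the corrected heading stays at zero; a real gyro's bias wanders slightly, which is where the roughly one degree per minute of residual drift comes from.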

Working together, the sensors helped to produce a perception of the world for our robot.

Software Design

Arduino

The Arduino firmware managed the low-level interactions with sensors, motors, and general input and output. C++ classes were used to manage the sensors and motors and to help avoid repeated code. Tasks on the Arduino were mostly limited to computationally simple things, such as linearizing the results from the IR sensors. The Arduino also handled periodic tasks such as keeping track of the angular position: the gyroscope was polled at a sampling rate of 60 Hz and the results integrated. The Arduino talked to the computer using its built-in serial-to-USB converter, over a simple protocol created for this project. Each poll carried a three-digit code representing the action the Arduino should take or the information it should return. The Arduino would run its tasks while waiting for commands from the computer.
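A minimal sketch of the computer side of this polling protocol, in Python. The particular three-digit codes and the FakeArduino stand-in below are hypothetical, for illustration only; the real link went over the Arduino's serial-to-USB converter:

```python
# Hypothetical three-digit codes -- the real protocol's codes are not documented here.
GET_IR   = "101"   # return the linearized IR distance
GET_GYRO = "102"   # return the integrated angular position

class FakeArduino:
    """Stand-in for the serial link, for illustration only."""
    def __init__(self):
        self._replies = {GET_IR: "12.5", GET_GYRO: "87.0"}

    def request(self, code):
        # A real implementation would write the code over the serial port
        # and read back the Arduino's textual reply.
        return self._replies.get(code, "")

def poll_sensors(link):
    """Poll the Arduino with three-digit codes and parse the replies."""
    return {
        "ir_inches": float(link.request(GET_IR)),
        "heading_deg": float(link.request(GET_GYRO)),
    }

readings = poll_sensors(FakeArduino())
```

Keeping the protocol this simple meant the Arduino could keep running its periodic tasks between polls.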

Python

The main controller was implemented as a simple state machine coded in Python. Before starting the main loop, the controller set up communications with the Arduino and the vision code server. Once everything was ready to go, it entered the main loop. On every loop it polled the Arduino for information on the robot’s surroundings, including readings from the IR sensors and IMU, and polled the vision code server for ball locations. A different function was written for each state; the information from the Arduino was wrapped into a dictionary and passed to the appropriate state function. All of the states fit into three categories: ball search, ball track, and evade. The ball search states randomly searched the field for balls and switched immediately to one of the ball track states once the vision code detected a ball. The ball track states used the information from the vision code to capture the ball. Both the search and track states called an obstacle detection function, which parsed the information dictionary to detect the position and proximity of potential obstacles. If an obstacle was found, this function overrode the next state with the appropriate evade state. The evade states backed the robot up and turned it away from obstacles; they ignored all input. Every state function sent the corresponding commands to the Arduino and returned the next state. To cut back on repeated code, helper functions that any state could call were written, mostly to send or receive information from the Arduino.
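The state machine skeleton might look roughly like this. The state names, the 6-inch obstacle threshold, and the dictionary keys are our assumptions for illustration, not the actual code (which had several states per category and sent motor commands from each state function):

```python
def search(info):
    """Random search until the vision code reports a ball."""
    return "track" if info["ball_visible"] else "search"

def track(info):
    """Drive toward the ball; fall back to searching if it is lost."""
    return "track" if info["ball_visible"] else "search"

def evade(info):
    """Back up and turn away from the obstacle, then search again."""
    return "search"

STATES = {"search": search, "track": track, "evade": evade}

def obstacle_override(info, next_state):
    """Override the next state with an evade state when an obstacle is close."""
    if info["ir_inches"] < 6.0:          # hypothetical proximity threshold
        return "evade"
    return next_state

def step(state, info):
    """One iteration of the main loop: run the state, then check obstacles."""
    next_state = STATES[state](info)
    if state != "evade":                 # evade states ignore all input
        next_state = obstacle_override(info, next_state)
    return next_state
```

For example, `step("search", {"ball_visible": False, "ir_inches": 3.0})` returns `"evade"` because the obstacle check overrides the search state's choice.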

Computer Vision

The computer vision was implemented entirely in C++ with OpenCV and the cvblob library. The process of identifying balls was actually quite straightforward: the camera would take a picture, which was then converted from the RGB colorspace to the HSV colorspace. The resulting image was run through a set of HSV color filters, one for each colored item we were looking for (balls and walls), yielding a set of binary images. These binary images were in turn run through a blob detection algorithm from the cvblob library that identified all “blobs” of connected pixels. Blobs with extremely small areas were discarded; the rest were our vision targets. We benchmarked this code and found that the netbook could run 3 or 4 filters in under a second. This was more than fast enough, especially considering that after optimizations no more than 2 filters were run at any one time. Why HSV instead of RGB? As it turns out, the hue of a color does not change significantly under different lighting conditions. As a result, using HSV requires no camera calibration and fewer color filters, which simplifies the code.
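A pure-Python sketch of the same pipeline, for illustration only (the real implementation used the OpenCV and cvblob C++ libraries; `hue_mask` and `blobs` below are simplified stand-ins, and the tiny 3x3 “image” ignores the hue wrap-around near red):

```python
import colorsys

def hue_of(rgb):
    """Hue of an (R, G, B) pixel in degrees, [0, 360)."""
    r, g, b = (c / 255.0 for c in rgb)
    return colorsys.rgb_to_hsv(r, g, b)[0] * 360.0

def hue_mask(image, lo, hi):
    """Binary mask of the pixels whose hue falls in [lo, hi] degrees."""
    return [[lo <= hue_of(px) <= hi for px in row] for row in image]

def blobs(mask, min_area=2):
    """4-connected components of a mask, discarding extremely small blobs."""
    h, w = len(mask), len(mask[0])
    seen, found = set(), []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and (y, x) not in seen:
                stack, blob = [(y, x)], []
                seen.add((y, x))
                while stack:                      # flood fill one component
                    cy, cx = stack.pop()
                    blob.append((cy, cx))
                    for ny, nx in ((cy+1, cx), (cy-1, cx), (cy, cx+1), (cy, cx-1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and (ny, nx) not in seen:
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                if len(blob) >= min_area:
                    found.append(blob)
    return found

# A bright and a dim red pixel share nearly the same hue, so one filter
# catches the ball under both lighting conditions:
image = [
    [(220, 30, 30), (220, 30, 30), (40, 40, 200)],
    [(110, 15, 15), (40, 200, 40), (40, 40, 200)],
    [(40, 200, 40), (40, 200, 40), (40, 40, 200)],
]
mask = hue_mask(image, 0, 10)     # pass only the red-ish pixels
targets = blobs(mask)             # one 3-pixel blob in the top-left corner
```

Note how the dim red pixel survives the same hue filter as the bright one; this is exactly the lighting robustness that made HSV preferable to RGB.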

Python – C++ Communication

Because half the code was in Python and the other half in C++, communication between the two parts was not so simple. We initially considered pipes but dropped them in favor of BSD-style sockets for their greater flexibility. The computer vision code acted as the server and the Python code connected to it as a client. The server was always running and identifying targets, but only sent the information back upon the client’s request. To support this, a non-blocking socket receive was implemented on the server side by threading the server portion of the computer vision code. An especially interesting implication of this communication mechanism is that the computer vision code can run on a completely remote machine!
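A minimal sketch of the threaded server/client arrangement over BSD sockets, here entirely in Python for illustration (the real server was the C++ vision code, and the "GET" request and detection string below are made up):

```python
import socket
import threading

HOST = "127.0.0.1"

def vision_server(ready, port_box):
    """Always-running vision server: replies with the latest detection on request."""
    latest = b"ball 120 80"                      # placeholder detection result
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((HOST, 0))                          # let the OS pick a free port
    srv.listen(1)
    port_box.append(srv.getsockname()[1])        # tell the client which port
    ready.set()
    conn, _ = srv.accept()
    with conn:
        if conn.recv(16) == b"GET":              # only answer when polled
            conn.sendall(latest)
    srv.close()

def request_detection(port):
    """Client side: the Python controller asks the vision server for targets."""
    with socket.create_connection((HOST, port)) as cli:
        cli.sendall(b"GET")
        return cli.recv(64).decode()

ready, port_box = threading.Event(), []
threading.Thread(target=vision_server, args=(ready, port_box), daemon=True).start()
ready.wait()
result = request_detection(port_box[0])
```

Because the client only supplies a host and port, pointing it at a remote machine instead of `127.0.0.1` is a one-line change, which is the flexibility that motivated sockets over pipes.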

Overall Performance

Despite the setbacks we encountered, we managed to produce a robot that could accomplish the basic task of the competition: picking up balls. The robot could not do so reliably, however, because our wires would not stay in place. In fact, in our first match the robot was unable to start due to a loose jumper wire. In our second match, the robot successfully executed its programming and picked up a ball. The overall outcome of the two matches was a tie, which was then decided by the number of balls displaced; by that measure we lost to the other team. In a single-elimination competition, that put us out for the count. Regardless, our robot can adequately perform what it is mechanically able to do, and Stevie would certainly be proud. Meow.

Conclusion

Maslab was definitely a good experience and a worthwhile part of our IAPs. It is work intensive for a small team, so plan accordingly and be organized and motivated. What each of us took from this experience is the importance of thoroughly planning out your design before starting serious work, and of doing so ahead of time so the process isn’t rushed. For Yanni and Charles, the programming was fun and enlightening in that it posed interesting problems. For Valentina, it offered the chance to get familiar with SolidWorks and to become more conscientious about the design process. And of course, Stevie got his long-awaited car.

Protips

  • Unless you have access to the shops, whether Edgerton, CSAIL, or a random UROP, the MechE side of production may become difficult, so make friends if you don’t have access yourself. This is especially true for freshmen.
  • Be familiar with OpenCV.
  • Don’t get stressed, it is IAP after all.
  • Put effort into understanding your team members’ points of view.
  • There are benefits to working in lab.
  • Have fun.