Team Six/Final Paper

From Maslab 2013

Revision as of 07:03, 4 February 2013

Introduction

foo

Mechanical/Design

Our mechanical design was decidedly simple. We originally planned on a complicated S-belt drive with a static belt and independent rollers, but decided it was too complicated for what we wanted to do. Our next design was a simple belt at a 45-degree angle, designed to pick up balls on one side and drop them off on the other. The belt spanned most of the width of the front of the robot to allow us to pick up as many balls as we could at once. Two flanges on the front extended the intake to the full 14" of width that could pick up balls.

For the drive train, we initially debated between two separate ideas. Fred had a lot of experience with 2 wheel drive systems and wanted that, whereas Steven had experience with a 3-wheel omni-wheel drive. Fred liked the simplicity of the system, but Steven advocated for the enhanced mobility of the omni-wheels. Eventually, Fred agreed with him, and our general layout was decided.

Fred then spent the first week of the competition designing the robot. He used simple geometries combined with some power analysis to determine the size of the wheels and their locations with respect to the ramp. By the end of the first weekend, he had completed a SolidWorks model that gave us a pretty good idea of what the robot would be like. When designing, Fred kept manufacturing methods in mind so that we could make the robot easily and quickly. The finalized design consisted of 4 plates of laser-cut acrylic, 2 plates of water-jetted aluminum, and 3 bent water-jetted plates of steel. It took an afternoon to assemble them, and aside from the 3 wheel adapters, no milling or turning was required.

A second version of the robot changed out all of the acrylic pieces. One of them had been damaged in the manufacturing process, and two had to be changed to fix a battery terminal interference issue. The main face piece had to be redone as well to accommodate a new sensor array for detecting incoming balls.

A gate was placed at the top of the ramp to prevent balls from coming out of the conveyor belt. The gate was powered by a single servo, and the belt by a window gear motor. Eventually, flaps had to be placed onto the belt to ensure that the balls would be picked up and thrown out of the roller. The belt and these flaps were made out of high-friction shelf liner, and sewn together.

In total, there were 4 drive motors and one servo. 3 of the motors were used for the drive train, and the fourth for the belt system. The servo drove the front gate. Sensors were also pretty minimal. Two IR sensors on the front flanges, pointed out at 45 degrees from the front two corners, worked very well for wall avoidance. An array of 10 limit switches carefully placed under the rollers allowed the robot to detect how many balls it was picking up at any one time. Two extended limit switches placed on the back of the robot allowed for easy lining up of the scoring mechanism. Finally, the camera did most of the vision and tracking work.

Electronics

We had a dashboard of switches and lights.

We grounded the frame of our robot.

Note that we had trouble with the USB connection during the final competition. We hypothesize this occurred either because the cable was damaged by stretching, or because the USB jack of the computer was damaged when the robot brushed against walls.

Programming

For our robot, we used Python with OpenCV and the given Arduino library to work with the sensors and motors. In the end, our main program was rather simple. It used a simple looping search algorithm that applied vision (and IR) to detect different objects, processed the list of objects to make a decision, and then set the motors to a certain state. Our program did use four threads for multi-tasking: a timer, a music player, a ball counter (returning the current number of balls in the container on the robot), and the main thread (everything else). However, our robot only used position-based tracking for ball/wall following, because implementing PI or PID did not add any efficiency to our program. Furthermore, though mapping was discussed in the initial stages of our program design, it was not implemented.
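
The thread layout above can be sketched as follows. This is an illustrative reconstruction, not the team's actual code: the `BallCounter` class and the `read_switches` callable are assumed names, and the polling interval is a guess.

```python
import threading
import time

class BallCounter(threading.Thread):
    """Daemon thread that continuously polls the limit-switch array under
    the rollers and exposes the current number of balls held, so the main
    decision loop can query it without blocking.  (Illustrative sketch.)"""

    def __init__(self, read_switches, poll_s=0.05):
        super().__init__(daemon=True)
        self.read_switches = read_switches  # callable -> list of bools, one per switch
        self.poll_s = poll_s
        self._count = 0
        self._lock = threading.Lock()

    def run(self):
        while True:
            # One ball per pressed switch at this instant.
            pressed = sum(self.read_switches())
            with self._lock:
                self._count = pressed
            time.sleep(self.poll_s)

    def current(self):
        with self._lock:
            return self._count
```

The main thread would simply call `counter.current()` once per loop iteration; the timer and music-player threads would follow the same daemon-thread pattern.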

Vision

Using the OpenCV (version 2.1) libraries, we were able to use color thresholds to locate objects. A small problem arose when dealing with the lighting conditions of different rooms and varying sunlight. Our first solution was to convert all RGB values into HSV values, so that theoretically it would be possible to use a wide brightness (V) threshold to compensate for different lighting. We attempted to create a "universal" HSV range that would work in most settings. However, as it turns out, different types of lights produce slightly different color values for the different map objects. Our team discussed using Hough transforms to detect features in the image, or Haar cascades, to create a more robust vision module. However, we found problems with both methods, including the fact that our vision programmer had never been exposed to either method of feature detection, and that a simpler solution was at hand.

Instead of changing our vision code, we made a program to calibrate our vision using a simple GUI, storing the data to a file for future use. This code displays a live feed from the camera with a small circle in the center. The object is lined up within the circle, and the values of the pixels inside the circle are analyzed for their maximum and minimum values. These values are stored to a file and saved for future reference, or until they are updated again. This way, calibrating the camera takes a very short amount of time, and one does not even have to manually edit the HSV value files.
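
The sampling step can be sketched as below. This is an assumed reconstruction: the function names and the "min max per channel" file layout are illustrative, not the team's actual `HSV_[foo].data` format.

```python
import numpy as np

def sample_circle(hsv_frame, radius=10):
    """Return (lo, hi): per-channel min and max of the HSV pixels inside a
    small circle at the center of the frame."""
    h, w = hsv_frame.shape[:2]
    cy, cx = h // 2, w // 2
    ys, xs = np.ogrid[:h, :w]
    # Boolean mask selecting pixels within `radius` of the center.
    mask = (ys - cy) ** 2 + (xs - cx) ** 2 <= radius ** 2
    samples = hsv_frame[mask]              # shape (n_pixels, 3)
    return samples.min(axis=0), samples.max(axis=0)

def save_range(path, lo, hi):
    """Store the calibrated range, one 'min max' pair per HSV channel."""
    with open(path, "w") as f:
        for a, b in zip(lo, hi):
            f.write(f"{int(a)} {int(b)}\n")
```

Point the camera at a ball, a wall, and the button in turn, and each call produces one threshold file, so no hand-editing of HSV values is needed.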

The main object detection program is called "rectangulate.py"; it takes an image, an HSV threshold, and a blur factor (for a Gaussian blur) and detects objects. It converts the image to a single-channel (grayscale) mask, filtering out any colors outside the thresholds, then finds connected pixels and draws contour lines around those "blobs". Because of the uneven color of the ground, a single pixel or a small group of pixels could show up randomly throughout the image, so those small contours were filtered out. Finally, the program calculated bounding rectangles around the largest contours and returned those objects. The calibration data is read from the "HSV_[foo].data" files for each object. This "rectangulate.py" program was used by other helper classes: ball.py for finding the largest ball or group of balls, wall.py for finding a yellow wall, and button.py for finding the teal button, though this last one was not used in the final program.

Main Program

The main while loop contained 3 main options (if/elif statements) and an else option. In this program, there were many objects (for every sensor and actuator) in addition to a plethora of counters and booleans to be triggered (or reset) depending on the option:

  • The first if statement read the data from the irSensor object to detect whether there were walls in front of the robot. Then, if the robot had recently been chasing a wall (wallLastSeen was 5 or fewer camera frames before) and had balls in the dispenser, the robot would call the lineUp() method and line up to the wall using the back "whisker" bumpers.
  • The second if statement detected whether there was a yellow wall in front of the robot and balls inside. If both conditions were satisfied, the robot would begin to chase the wall, and the wallLastSeen counter would reset to zero.
  • The third if statement detected red or green balls in front of the robot. If this statement was satisfied, the robot would begin to chase the ball.
  • The else clause contained a counter for each of the above conditions.
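
The decision loop above can be sketched as a single step function. This is a hedged reconstruction: the sensor objects, the return-value state names, and the behavior of the wall-ahead/no-balls case are stand-ins, since the write-up does not fully document them.

```python
WALL_MEMORY_FRAMES = 5  # wallLastSeen window from the first bullet

def step(ir, vision, balls_held, state):
    """One iteration of the main if/elif loop.  Returns the chosen action
    and updates the wallLastSeen counter in `state`."""
    if ir.wall_ahead():
        if state["wall_last_seen"] <= WALL_MEMORY_FRAMES and balls_held > 0:
            return "line_up"          # back onto the wall with the whiskers
        return "avoid_wall"           # assumed behavior when not scoring
    elif vision.sees_yellow_wall() and balls_held > 0:
        state["wall_last_seen"] = 0   # reset the counter while chasing
        return "chase_wall"
    elif vision.sees_ball():
        return "chase_ball"
    else:
        state["wall_last_seen"] += 1  # nothing seen: age the wall memory
        return "search"
```

Each returned action would then map to a motor state, with the loop running once per camera frame.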

Overall Performance

foo

Conclusions and Suggestions

foo
