Team Two/Final Paper

Strategy

Our strategy changed substantially over the course of the class. Initially, we planned to localize with the camera and rangefinders, store two stacks of blocks sorted by color, and park over the border of the neutralization zone to drop each stack on its own side of the line. Localization soon proved to be a thorny problem, so we scaled back to running it only at the beginning of the match to determine our initial heading, and eventually switched entirely to wall-following and scanning for blocks with the camera. Along the way, we also replaced the plan of parking on the home-base line with a servo-operated mechanism that opened the collection tubes to drop off stacks. In the end we disabled that functionality as well, since we were never able to collect more than our maximum number of blocks anyway.

Mechanical

Our robot was built to fulfill three tasks: rapidly collecting cubes, sorting and storing them, and releasing the resultant stacks. Although this breakdown suggests a modular design, from the start our robot was designed to be fairly minimal, integrating all three objectives into one relatively simple assembly rather than building standalone components to be combined later. As a result, the final product was compact and fairly clean. The disadvantage of this integrated approach was that it became difficult or impossible to solve individual mechanical issues without considering their effects on other parts of the assembly.

— collect — The first step in the block pickup-sort-store-drop sequence (hereafter referred to as the block-chain) was the collection stage. Two PVC drums with lathed grooves held a set of urethane round-belt loops. Blocks were held between the belts and a vertical metal plate, bent to roughly match the curve of the bottom drum, and lifted as the belts spun. The belt assembly was mounted on a hinged frame, allowing rubber bands to tension the belts toward the metal wall while still leaving enough freedom to avoid jamming on oblique cubes. As the cubes were conveyed up, curved guide walls channeled them to the center of the top of the conveyor. Short lengths of belt material ("noodly bits") were mounted to the lower drum and to a small motor midway up the conveyor wall to jostle cubes into the conveyor and through the narrowing channel without jams.

— sort — At the top of the conveyor the blocks entered the next phase of the block-chain, the sorting mechanism. Specially shaped walls and edges allowed a smooth transition out of the tension of the conveyor and into a pivoting "hand," triggering a mechanical limit switch in the process to signal to the controller that a block had reached the top. Once the color sensor determined the block's color, a servo motor, connected to the pivoting hand by a sliding linkage (to effectively extend the motor's angular range), moved the cube above one of two holes in the upper deck assembly, where it dropped into the stack storage tubes. This entire top assembly, along with the stack storage tubes, was designed to detach with the removal of two bolts, allowing access to the Edison and other electronics mounted inside the robot.
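
The sequencing of this step is simple enough to sketch in code. The following is a minimal illustration only, assuming the libmraa library commonly used on the Edison; the pin numbers, servo duty cycles, and delays are all placeholder assumptions rather than our actual values:

  #include "mraa.hpp"
  #include <unistd.h>

  int main() {
      mraa::Gpio limitSwitch(2);   // assumed pin: closes when a cube reaches the hand
      mraa::Gpio colorLine(3);     // assumed pin: Arduino drives it high for red, low for green
      limitSwitch.dir(mraa::DIR_IN);
      colorLine.dir(mraa::DIR_IN);

      mraa::Pwm servo(5);          // assumed PWM-capable pin driving the sorting servo
      servo.period_ms(20);         // standard 50 Hz hobby-servo frame
      servo.enable(true);

      while (true) {
          if (limitSwitch.read() == 1) {        // a cube has reached the top
              usleep(50000);                    // give the color sensor time to settle
              bool isRed = (colorLine.read() == 1);
              // Swing the hand over the red or green chute (duty cycles are guesses).
              servo.write(isRed ? 0.05f : 0.10f);
              usleep(500000);                   // wait for the cube to drop into its tube
          }
      }
  }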

— store and deploy — Last in the block-chain was the stack storage and dropping stage. This part was not fully implemented in the robot's final software, due to time constraints, but it was mechanically functional. The red and green cubes were sorted into two chutes made from 3" PVC pipe, each with an actuated bottom which could rotate clear to drop the whole stack to the floor. Half of each chute could hinge open to allow the robot to drive away from the stack after dropping it.

— all together — Wherever possible, we consolidated the robot's mechanical functions onto the same MDF parts rather than complicating the design with dedicated parts for each purpose. For example, a single pair of vertical side plates served as the main body of the robot; as mounting points for the drive motors, motor drivers, belt assembly, and adjustable-height aluminum conveyor wall; and as the side walls of the collection ramp. Likewise, the parallel sides of the belt assembly held the belt motor and drums, the camera mount, and the piece at the top of the conveyor carrying the limit switch and color sensor.

Sensors

Arduino/color

In our experience, the Edison is finicky with communication; this prevented us from using the Adafruit TCS34725 color sensor module directly with the Edison. The sensor could reliably distinguish between the two cube colors (by comparing the ratio of the red to green values against a threshold), but we could only communicate with it over I2C from an Arduino. Consequently, we used an Arduino for the sole purpose of outputting a digital high or low depending on whether a red or green cube was in front of the sensor; that output line was wired to an Edison GPIO.
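
A minimal version of such an Arduino sketch might look like the following, using Adafruit's TCS34725 library; the output pin and the ratio threshold of 1.2 are assumptions, not our tuned values:

  #include <Wire.h>
  #include "Adafruit_TCS34725.h"

  Adafruit_TCS34725 tcs(TCS34725_INTEGRATIONTIME_50MS, TCS34725_GAIN_4X);
  const int OUT_PIN = 8;   // assumed pin wired to an Edison GPIO

  void setup() {
      pinMode(OUT_PIN, OUTPUT);
      tcs.begin();         // I2C init; returns false if the sensor is not found
  }

  void loop() {
      uint16_t r, g, b, c;
      tcs.getRawData(&r, &g, &b, &c);
      // Drive the line high for a red cube, low for a green one, by comparing
      // the red/green ratio to a threshold (the 1.2 here is a placeholder).
      digitalWrite(OUT_PIN, (float)r / (float)g > 1.2 ? HIGH : LOW);
  }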

Ultrasonic/rangefinder

We used ultrasonic and infrared sensors as rangefinders. Each proved to be notably flawed: ultrasonic sensors do not give good data when pointed at a surface at an angle, and an infrared sensor's response is non-monotonic, so a single reading can correspond to two different distances. For these reasons, we integrated the two sensors into a 'rangefinder' class that retained information about past readings and compared the two sensors' values to produce a more reasonable estimate of the distance actually being measured.
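
The sketch below illustrates the fusion idea, though not our exact implementation: the IR voltage implies two candidate distances (one on each side of the sensor's response peak), and the ultrasonic reading picks between them before the result is smoothed. All weights and constants here are assumptions:

  #include <cmath>

  struct Rangefinder {
      double lastEstimate = 50.0;  // cm; arbitrary seed value

      // irNear/irFar: the two candidate distances implied by one IR voltage.
      // ultrasonic: the (angle-sensitive) ultrasonic distance, also in cm.
      double update(double irNear, double irFar, double ultrasonic) {
          // Pick whichever IR candidate agrees better with the ultrasonic reading.
          double ir = (std::fabs(irNear - ultrasonic) < std::fabs(irFar - ultrasonic))
                          ? irNear : irFar;
          // Blend the two sensors, then keep a little memory of the previous
          // estimate to smooth over single bad readings.
          double blended = 0.5 * ir + 0.5 * ultrasonic;
          lastEstimate = 0.7 * blended + 0.3 * lastEstimate;
          return lastEstimate;
      }
  };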

Gyro

The gyro has a bias, and the bias drifts. Taking fifty readings when the gyro is initialized to estimate the bias, and subtracting that bias from all subsequent readings, was sufficient to mitigate the problem. Along with a Kalman filter to reduce the effect of fluctuations in the angular velocity measurement, these measures were enough to keep our angle estimate within two degrees of the actual value throughout a match.
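
In outline, the handling looked something like the sketch below, where readGyroDps() is a hypothetical stand-in for the firmware call and the noise constants are assumptions:

  double readGyroDps();  // hypothetical: raw angular velocity, degrees per second

  struct GyroFilter {
      double bias = 0.0, rate = 0.0, p = 1.0, angle = 0.0;
      const double q = 0.01, r = 0.3;     // assumed process/measurement noise

      void calibrate() {                  // robot must be stationary at startup
          for (int i = 0; i < 50; ++i) bias += readGyroDps() / 50.0;
      }

      void step(double dt) {
          double z = readGyroDps() - bias;  // bias-corrected measurement
          p += q;                           // predict: uncertainty grows
          double k = p / (p + r);           // Kalman gain
          rate += k * (z - rate);           // update the filtered rate
          p *= (1.0 - k);
          angle += rate * dt;               // integrate rate to get heading
      }
  };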

Camera

For the camera we used OpenCV to interface with the hardware and receive images as a matrix (Mat) of colors (Vec3b); OpenCV wraps the more complicated v4l2 calls that deal directly with the camera driver. We used linear minimum thresholds on r, g, and b (e.g., a pixel is red if r > 1.5*g && r > 1.4*b) to identify the colors of interest: red and green for blocks, blue for walls, and purple for the neutralization zone. We tried HSV filtering, but in our experience it did not work as well.

From there, we detected blocks by finding adjacent red or green pixels above a threshold pixel count. The topmost point of a block was used to determine its distance from the camera, using the camera's known fields of view, its mounting direction, and trigonometry. The distance measurements proved close enough to be useful, but mounting the camera more securely than simply bolting it to the frame, and using a calibration matrix to mitigate the effects of lens distortion, could have improved their accuracy.

Walls were identified by finding five or more blue pixels stacked vertically. Where the wall condition was met, we blacked out all pixels above the identified wall to remove distractions. Our camera was mounted a bit high, which made this less than fail-safe when the robot was close to a wall; mounting the camera at about the height of the walls would have helped. A lower mounting point would also have improved sensing and approaching nearby cubes, which we could not do once the robot was closer than 13 inches to a cube. Inside that range we had to rely on estimated distance measurements, which were themselves a bit off.
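
A condensed sketch of the red threshold and the trigonometric distance estimate follows. It skips the adjacency/blob-size check described above, and the camera height, pitch, and field of view are placeholder assumptions:

  #include <opencv2/opencv.hpp>
  #include <cmath>

  int main() {
      cv::VideoCapture cap(0);
      cv::Mat frame;
      cap >> frame;

      // Find the topmost (smallest-y) pixel that passes the red threshold.
      int topY = -1;
      for (int y = 0; y < frame.rows && topY < 0; ++y)
          for (int x = 0; x < frame.cols; ++x) {
              cv::Vec3b px = frame.at<cv::Vec3b>(y, x);   // OpenCV stores pixels as BGR
              int b = px[0], g = px[1], r = px[2];
              if (r > 1.5 * g && r > 1.4 * b) { topY = y; break; }
          }
      if (topY < 0) return 0;   // no red pixels in this frame

      // Turn the pixel row into a ground distance (all constants are assumptions).
      const double camDrop  = 30.0;                    // cm from camera down to the cube's top
      const double camPitch = -20.0 * M_PI / 180.0;    // camera tilt below horizontal, radians
      const double vFov     = 45.0 * M_PI / 180.0;     // vertical field of view, radians
      double ray = camPitch + (frame.rows / 2.0 - topY) / frame.rows * vFov;
      double distance = camDrop / std::tan(-ray);      // horizontal distance to the cube
      (void)distance;
      return 0;
  }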

Software

Firmware

Our code included firmware classes for every sensor and actuator on the robot, along with slightly higher-level classes such as a drive-train class and a rangefinder class. Each of these classes was fairly simple, generally performing basic conversion of commands and requests into signals sent and received through pins.
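
As an illustration of what a class at this level looked like, here is a hedged sketch of a motor wrapper, again assuming the libmraa GPIO/PWM library on the Edison; the pin arguments and PWM period are placeholders, not our actual wiring:

  #include "mraa.hpp"

  class Motor {
      mraa::Pwm speedPin;
      mraa::Gpio dirPin;
  public:
      Motor(int pwmPin, int directionPin) : speedPin(pwmPin), dirPin(directionPin) {
          dirPin.dir(mraa::DIR_OUT);
          speedPin.period_ms(1);   // assumed PWM period
          speedPin.enable(true);
      }

      // Convert a signed command in [-1, 1] into a direction bit and a duty cycle.
      void setSpeed(float command) {
          dirPin.write(command >= 0 ? 1 : 0);
          speedPin.write(command >= 0 ? command : -command);
      }
  };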

Mid-level

Our robot was controlled via a set of classes that integrated multiple firmware classes. These included classes for wall-following, scanning for blocks, and picking up blocks, each of which controlled the drive train and took data from the sensors that it needed. These classes were controlled by the highest level of code.
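
As an example of the shape of these classes, here is a sketch of a proportional wall-follower built on the Motor sketch above, with the side distance supplied by something like the Rangefinder sketch; the setpoint and gain are assumptions:

  struct WallFollower {
      Motor& left;
      Motor& right;
      const double target = 20.0;   // cm from the wall (assumed setpoint)
      const double kp = 0.02;       // assumed proportional gain

      WallFollower(Motor& l, Motor& r) : left(l), right(r) {}

      // sideDistance: current distance to the wall from a side-mounted rangefinder.
      void step(double sideDistance) {
          double error = sideDistance - target;
          // Too far from the wall -> steer toward it; too close -> steer away.
          left.setSpeed(0.5f + (float)(kp * error));
          right.setSpeed(0.5f - (float)(kp * error));
      }
  };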

High-level

High-level code consisted of a single class which contained a loop and switched between mid-level classes as needed. For example, this class could order the robot to wall-follow, scan for blocks, etc.
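
In structure, that loop amounted to a small state machine, roughly like the sketch below; the stub structs stand in for the real mid-level classes, and the state names and transition tests are assumptions:

  struct WallFollow  { void step() {} bool sawBlock()   { return false; } };
  struct BlockScan   { void step() {} bool foundBlock() { return false; } };
  struct BlockPickup { void step() {} bool finished()   { return true;  } };

  enum class Mode { FOLLOW, SCAN, PICKUP };

  int main() {
      WallFollow follow; BlockScan scan; BlockPickup pickup;
      Mode mode = Mode::FOLLOW;
      for (;;) {                    // the real loop also checked the match clock
          switch (mode) {
          case Mode::FOLLOW:        // hug the wall until a block shows up
              follow.step();
              if (follow.sawBlock()) mode = Mode::SCAN;
              break;
          case Mode::SCAN:          // turn until the camera centers the block
              scan.step();
              if (scan.foundBlock()) mode = Mode::PICKUP;
              break;
          case Mode::PICKUP:        // drive over the block into the conveyor
              pickup.step();
              if (pickup.finished()) mode = Mode::FOLLOW;
              break;
          }
      }
  }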

Troubleshooting and things we fixed

One large problem we encountered was localization. Our initial localization code used a basic Markov chain Monte Carlo approach with three rangefinders. Multiple issues became apparent, including the low number of available pins and slow convergence. A week and a half in, we decided to abandon continuous localization and instead implemented a routine that could run for a few seconds when the robot was confident of its position and return an estimate of its heading. This worked better in some contexts, but did not give accurate angles when the robot was in a somewhat symmetrical area of the map. A day before the competition, we abandoned localization entirely and switched to a simple system of scanning for blocks with the camera and wall-following when no blocks were in the vicinity.

We also found that it was important to write time-outs into all parts of our code; for example, if there was no time limit on drive-train instructions, the robot would often get stuck in corners or tight spaces.
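
One generic way to express that pattern is a deadline wrapper like the sketch below (an illustration, not our actual code; runWithTimeout and its callable arguments are hypothetical):

  #include <chrono>

  // Repeatedly call step() until done() returns true or the deadline passes.
  template <typename Step, typename Done>
  bool runWithTimeout(Step step, Done done, double seconds) {
      auto start = std::chrono::steady_clock::now();
      while (!done()) {
          std::chrono::duration<double> elapsed =
              std::chrono::steady_clock::now() - start;
          if (elapsed.count() > seconds) return false;  // gave up: likely stuck
          step();
      }
      return true;  // finished normally
  }

A drive command could then be issued as something like runWithTimeout(driveStep, atGoal, 5.0), falling back to a recovery behavior whenever it returns false.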
