Team Nine/Journal

Team 9 Journal

Sensors
4 IR: 16
1 Drive Motor: 7
1 Stock Motor: 5
TOTAL: 28

Day 1: January 9

Yeaaaaa!!!! The first day of Maslab!!! One of our team members had to drop out because of an overload of work :( but we toughed that out, and had an early meeting at 10am (This is early for me...(Oh yea this is Isaac!)). We discussed kind of the general design of the robot, and what our strategy would be to capture balls. We decided to try to shoot balls over the wall and past the purple line for maximum points. We pretty much thought it would take too much time and effort to shoot accurately into the green box. Our robot design looked a little like R2-D2!! XD In general, we would run over the balls, and suck them up into our hopper (ball holder) inside our robot using rollers. Then the balls would be shot out from the rotating head at a high speed using rollers.

Then we went to lecture, and learned some stuff about what Maslab is, the rules, and a bunch of other Maslab things. Later, we went into lab, obtained our box, opened it, and stared in awe. For the longest time, we were trying to figure out how to use all the things that were in the box. Hans went off and went solder crazy. Staphany and I stared at the parts trying to use our ESP to make the parts build themselves into our robot. When we found out that didn't work, we stuck things together using logical methodology.


Day 2: January 10

After we had gotten the bare bones of our pegbot together, the next checkpoint was to have pegbot autonomously turn away from a wall; however, issues with delay on the arduino prevented us from making much progress. Today's lecture discussed vision and threading - we tried to incorporate threading into our wall-avoiding code, but the lagging arduino prevented us from being able to test the code. We used svn to facilitate code updating and sharing, which was exciting since I never knew such a thing existed (Staphany).
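For the record, the kind of structure we were going for looks roughly like the sketch below: a background thread keeps polling the arduino for IR readings over serial while the main loop decides when to turn away. The serial library, port name, baud rate, message format, and distance threshold are all placeholders, not our actual setup.

 import threading
 import time
 import serial   # pyserial; an assumption about how we'd talk to the arduino

 latest = {"front_ir": None}          # shared between the two threads
 lock = threading.Lock()

 def poll_arduino(port="/dev/ttyUSB0", baud=115200):
     """Background thread: keep reading IR distances sent by the arduino."""
     conn = serial.Serial(port, baud, timeout=1)
     while True:
         line = conn.readline().decode(errors="ignore").strip()
         try:
             value = float(line)      # assumes the arduino prints one distance per line
         except ValueError:
             continue
         with lock:
             latest["front_ir"] = value

 threading.Thread(target=poll_arduino, daemon=True).start()

 while True:                          # main decision loop never blocks on the serial port
     with lock:
         dist = latest["front_ir"]
     if dist is not None and dist < 20.0:   # "too close" threshold (cm) is a guess
         print("too close - turn away")     # a motor command would go here
     else:
         print("drive forward")
     time.sleep(0.05)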


Day 3: January 11

We had our first meeting with Sam and another team to review our schedules and mechanical/strategic design. After the departure of our MechE teammate, none of us knew how to use solidworks, so we showed our 'roller-full' design with pen and paper. Our mechanical design was meant to maximize ball-carrying capacity, speed, and throwing strength - a few of the characteristics we felt were most important toward our strategic goal of gaining 5 points per ball past the purple line. Continued problems with the arduino stopped us from completing yesterday's checkpoint.


Day 4: January 12

The arduino refuses to work; even with continued updates from Sam and other teams, we could not get it to operate the IR sensors. One issue was that sometimes the arduino would change its port number on its own. Yesterday's lecture talked about mapping and localization - topics covered extensively in 6.01 - but the team felt that mapping out the field would be impressive yet probably overkill. We began to explore openCV libraries to attempt today's checkpoint: turn and drive toward a ball. Since none of us has any experience with openCV, however, this task continues to elude us for the time being.


Day 5: January 13

It's friday, friday, arduinos don't work on friday.... Just beginning to research openCV, we realize that it has some of the worst documentation for python users. Even figuring out how to download the correct files is a hassle; Isaac couldn't get it to work on a Mac, but luckily Staphany is awesome and uses Ubuntu, so we were able to test some basic functions just to see that it works at all. We haven't even plugged in the webcam yet, so the next checkpoint of camera calibration is an amusing thought.


Day 6: January 14

Finally the weekend has arrived, but we've only just begun. Hans is figuring out our CAD model on solidworks, Isaac is working on the arduino, and Staphany is going through fields of poorly documented openCV python bindings. It looks as though our inexperience and small numbers are dragging us down, causing us to spend a good chunk of our time just familiarizing ourselves with basic tasks. But at least we've acquired some critical building skills, such as soldering!


Day 7: January 15

Everyone is continuing to work on yesterday's tasks. Solidworks is tedious as it crashes very often, especially when Hans attempts to rotate his model. The arduino is continuing to be ridiculous and Staphany is beginning to see just how evil openCV is in python. The willowgarage online documentation has different sites for C++ and for python, but there appear to be 2 versions for python. Unsure of what the differences are between the 2 versions, Staphany tries both and finds that importing the 'cv' library always works, and importing the 'cv2' library almost never works. So imagine how happy she is to find that cv2 is the more updated and comprehensive library. That was sarcasm right there. So much. Sarcasm.
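For reference, the import situation boils down to a little fallback like the one below (a sketch, not our actual code): try the newer cv2 bindings first, and fall back to the legacy cv module that actually imports on our machine.

 try:
     import cv2           # newer numpy-based bindings (the ones the docs push)
     HAVE_CV2 = True
 except ImportError:
     import cv            # the old interface the docs call "legacy"
     HAVE_CV2 = False

 print("using cv2" if HAVE_CV2 else "falling back to legacy cv")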


Day 8: January 16

A week has passed, and we are deep into the coding and CADing, but have not yet put out any parts for our final robot. Hans is nearly done with the CAD model, however, so Isaac will soon be able to start cutting our parts. Staphany has begun stringing together some openCV image processing functions to convert the RGB images captured by the webcam into HSV, in order to filter out all colors but red. The openCV online documentation flaunts many tempting functions for ball identification, and the internet reveals that red ball detection is a common project that has been completed successfully, though mostly in C++.
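Roughly, the pipeline being strung together looks like the sketch below (shown with the newer cv2 interface for brevity, even though we're stuck on the legacy cv bindings; the HSV bounds are placeholder guesses).

 import cv2
 import numpy as np

 cap = cv2.VideoCapture(0)                 # the webcam
 ok, frame = cap.read()                    # openCV hands the frame back in BGR order, not RGB
 if ok:
     hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
     # keep only pixels whose hue/saturation/value fall in a rough "red" band
     mask = cv2.inRange(hsv, np.array([0, 120, 70]), np.array([10, 255, 255]))
     red_only = cv2.bitwise_and(frame, frame, mask=mask)
     cv2.imwrite("red_only.png", red_only)
 cap.release()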


Day 9: January 17

Today we got PVC pipes and other materials for robot parts. Hans decided to create his own version of the arduino code, which appears to be working. Apparently something in our code was creating a lag... Attempting to take advantage of the useful openCV functions only available in the 'cv2' library was a fail, since Staphany could not figure out how to download/import 'cv2'... sticking with the legacy 'cv' functions it is then. :(


Day 10: January 18

Though undoubtedly behind schedule, we completed last week's checkpoint of turning away from a wall. Identifying a red ball feels like a hopeless dream, however, as none of openCV's feature detection functions appears to be working as expected. Coders who have posted their successes online usually use C++, which Staphany will learn next week in 6.S096, so translating C++ into python is quite confusing. Isaac compiled a database of sample images on which to test the vision code.


Day 11: January 19

Having finished the CAD model, Isaac and Hans are working on cutting out pieces of acrylic as fast as possible. Do ANY python functions work in openCV??? Just when you think you've got all the red pixels, you try another picture, and you've either detected too much or too little. For some reason, openCV's hue values range from 0 to 180, instead of 0 to 255. Warned about the need for 2 ranges of hue detection due to the wrap-around of the HSV color wheel, Staphany tried testing the upper range of hues, expecting some additional red pixels to be detected; nope.
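A sketch of the two-range idea is below (again in the cv2 interface for brevity): red sits at both ends of openCV's 0-179 hue scale, so you threshold twice and take the union. The bounds are rough placeholders and the filename is made up.

 import cv2
 import numpy as np

 frame = cv2.imread("sample_red_ball.png")          # one of the test images; filename is illustrative
 if frame is not None:
     hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
     # red wraps around the hue wheel, so check both ends of the 0-179 range
     low_reds = cv2.inRange(hsv, np.array([0, 100, 100]), np.array([10, 255, 255]))
     high_reds = cv2.inRange(hsv, np.array([170, 100, 100]), np.array([179, 255, 255]))
     red_mask = cv2.bitwise_or(low_reds, high_reds)
     print("red pixels found:", cv2.countNonZero(red_mask))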


Day 12: January 20

We decide to use mattress gripper material instead of strings of polyurethane around the rollers. Staphany's openCV vocabulary is increasing exponentially; however, so is her anger, as the new functions fail her again and again. Discussions of sensor strategy arise as we contemplate how many long-range and short-range sensors to use. We agree to use 4-5 sensors, spread out over the front half of the robot, somewhat like a spider.


Day 13: January 21

There is ONE function that accurately and precisely detects circles... MINENCLOSINGCIRCLEYESSSSSSS. But of course there is a problem with it - it only detects one circle per image. Another function - HoughCircles - seems very useful, but Staphany has yet to see it work reliably. With the base and sides of the robot cut out and the rollers fitted, our robot's skeleton begins to take shape. Hans enhances his CAD model to figure out how to attach the parts together.
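For reference, the two routes look roughly like the sketch below (cv2 interface; parameters and filename are guesses). HoughCircles can return many circles but is very touchy about its parameters, while minEnclosingCircle wraps one set of points in the smallest circle containing them, which is why it only ever gives one circle per image.

 import cv2
 import numpy as np

 frame = cv2.imread("sample_red_ball.png")          # illustrative filename
 if frame is not None:
     gray = cv2.medianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), 5)

     # HoughCircles: can find many circles, but is very sensitive to these parameters
     circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, 1, 40,
                                param1=100, param2=30, minRadius=5, maxRadius=100)
     print("Hough circles found:", 0 if circles is None else circles.shape[1])

     # minEnclosingCircle: one smallest circle around ALL the red pixels, hence one per image
     hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
     mask = cv2.inRange(hsv, np.array([0, 100, 100]), np.array([10, 255, 255]))
     points = cv2.findNonZero(mask)
     if points is not None:
         (x, y), r = cv2.minEnclosingCircle(points)
         print("single enclosing circle at (%.0f, %.0f), radius %.0f" % (x, y, r))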


Day 14: January 22

We decide to cut out pulleys using acrylic and cut out delrin to hold the PVC pipes in place. The red ball detection code is becoming more and more complicated as Staphany tries to string together various functions to more accurately measure balls. A MASLAB alumnus suggests writing the vision code using only python libraries rather than using openCV, a very disappointing suggestion for Staphany who has been slaving away in openCV every hour of her life for the past 2 weeks.


Day 15: January 23

We cut out 2 sheets of metal for the chute that will guide balls up the body of our robot. Staphany decides to take a break from the ball detection code by working on the yellow wall detection code. At this point, vision code testing is being done on live feed from the webcam, rather than on still images. The most problematic aspect of robot building is that it is difficult to drill screws into the delrin and metal brackets precisely enough that the center of the delrin is concentric with the PVC pipe; non-concentric alignment leaves the rollers wobbly and makes it hard for them to hold the gripper material in place.
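The live-feed testing loop is roughly the sketch below (cv2 interface; the window names, quit key, and placeholder yellow bounds are arbitrary, and detect() stands in for whichever routine is being tested).

 import cv2
 import numpy as np

 def detect(frame):
     """Stand-in for whatever vision routine is being tested (red ball, yellow wall, ...)."""
     hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
     return cv2.inRange(hsv, np.array([20, 100, 100]), np.array([30, 255, 255]))  # rough yellow band

 cap = cv2.VideoCapture(0)
 while True:
     ok, frame = cap.read()
     if not ok:
         break
     cv2.imshow("frame", frame)                # raw webcam feed
     cv2.imshow("mask", detect(frame))         # what the vision code thinks it sees
     if cv2.waitKey(1) & 0xFF == ord("q"):     # press q to stop
         break
 cap.release()
 cv2.destroyAllWindows()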


Day 16: January 24

After too many hours of buggy vision code, Staphany reached out to a TA, who suggested downloading the GIMP image editor to facilitate pixel analysis. Best suggestion ever. With the help of the GIMP 'pointer' window, Staphany was able to quickly determine the correct HSV values to filter the image pixels, and feature detection suddenly became much more satisfying. Choosing to stick with the minEnclosingCircle function, as that gave the most reliable circle detection, Staphany decided to use a 'divide and conquer' sort of idea to get over the limitation of only being able to detect one circle at a time.
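One way such a scheme can work - and the sketch below is only the general idea, not necessarily the exact divide-and-conquer code - is to split the red mask into its connected blobs and run minEnclosingCircle on each blob separately. (Note that GIMP typically reports hue on a 0-360 scale and saturation/value on 0-100, so readings from the pointer window get rescaled to openCV's 0-179 and 0-255 ranges.) Bounds, radii, and the filename below are placeholders.

 import cv2
 import numpy as np

 frame = cv2.imread("two_red_balls.png")            # illustrative filename
 if frame is not None:
     hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
     mask = cv2.bitwise_or(
         cv2.inRange(hsv, np.array([0, 100, 100]), np.array([10, 255, 255])),
         cv2.inRange(hsv, np.array([170, 100, 100]), np.array([179, 255, 255])))

     # split the mask into connected blobs; [-2] keeps this working across openCV versions
     contours = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
     for c in contours:
         (x, y), r = cv2.minEnclosingCircle(c)       # one circle per blob, not per image
         if r > 5:                                   # ignore tiny specks of noise
             cv2.circle(frame, (int(x), int(y)), int(r), (0, 255, 0), 2)
             print("ball candidate at (%.0f, %.0f), radius %.0f" % (x, y, r))
     cv2.imwrite("detected.png", frame)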


Day 17: January 25

We decide to cut out new wheels with less sharp edges to prevent the wheels from catching on the rug. The roller covers run into problems with slippage and general muddlement. Hans and Staphany begin incorporating the vision code into the full code. Concerns arise over whether the vision code will take too much processing power.


Day 18: January 26

Continued issues with the roller design lead us to consider different shapes and materials. Achieving a balance of friction and slide is difficult, as imperfections in the roller construction take their toll on the gripper material. Due to the lack of a robot body, we used pegbot and loads of tape in the mock contest.


Day 19: January 27

Today was a day of general troubleshooting and cutting out additional parts for our robot. Staphany tried simplifying the red ball code and was disturbed to find that the results were not vastly different from the complex code she had constructed over the past couple weeks. Hans continued to construct the full skeletal structure of the code.


Day 20: January 28

With only a week remaining, we decided to combine efforts instead of trying to complete smaller tasks individually. Staphany handed over the vision code to Hans, and with Isaac, focused on putting together the final robot. Although we had an idea of how to put together the majority of the robot, mounting the wheels was still an issue as we were unsure of what materials to use and how to balance the weight.


Day 21: January 29

Hans is able to devise a simplified yellow wall detection code, but the robot body is still under construction, so we are unable to test it in action. The rollers are becoming more and more of a problem; we decide to go with a simpler version of wrapping - a single large sheet - in the hopes that this new design will solve the slippage problem. Another mechanical problem is the placement of the bottom and middle chutes; with the wobbly rollers still unmounted, we are reluctant to finalize where the chutes go.


Day 22: January 30