Maslab 2011 wiki, new pages feed (MediaWiki 1.16.0, retrieved 2024-03-28)

Team Nine/Final Paper (last edited 2011-02-01 by Red Lion)
<hr />
<div>== Suggestions for Future Teams ==<br />
1. Start as early as you can. Getting stuff done over the summer means you can relax more and avoid pulling all-nighters during IAP.<br />
<br />
2. Read past years' wikis and the tutorials. They should point you in the right direction.<br />
<br />
3. Get used to CAD if you are a MechE, and get used to Java and the Ubuntu command line if you are a programmer. It helps a lot to be able to build straight from a CAD model, and to be able to manipulate your computer without a GUI to save processing power (which will be quite limited).<br />
<br />
4. Design a modular robot. If you have funky systems (like omniwheel drive) that need work from the programmers, build a chassis that can be taken off so they can test code on it while the MechEs continue working on the rest of the robot. We didn't do this until it was too late and therefore got no (literally zero) testing before the final competition.<br />
<br />
5. Lay out a schedule and STICK TO IT (we had one, but we kept pushing it back)<br />
<br />
6. Don't try to thread acrylic unless you never plan on removing the screw: either the acrylic will crack or the threads will strip after two uses. Go for nuts and bolts; they spread out the load better anyway.<br />
<br />
7. Keep your nuts and bolts in easy-to-reach places. Ours were hidden behind motors and rollers, which made disassembly tough despite our modularity.<br />
<br />
8. Don’t epoxy until you are SURE your design is FINAL<br />
<br />
9. Be careful with weight. If you use heavy materials like acrylic, sheet metal, and PVC, surprise: your robot will be heavy. Sacrificing smooth looks to cut weight is the smart move here.<br />
<br />
10. Use a mouse for odometry, or better yet two, so you can recover x, y, and theta.<br />
<br />
11. The gyros are pretty bad and the encoders are worse, but the mice work great, so compensate accordingly. Another workaround for the gyros is a compass, but we didn't experiment with those.<br />
<br />
12. Back up your code somewhere other than the repository. We lost our last week or so of code on competition day when it was all mysteriously deleted. Not that it really mattered, since we had so many other problems, but be careful.<br />
<br />
13. Finally, KEEP YOUR DESIGN SIMPLE. As a team with a majority of MechEs, we got a little carried away: our brainstorming sessions led to a huge, heavy, super-complicated robot. It was pretty cool, but the coders didn't have enough time to test, and the code was very complicated anyway.<br />
<br />
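The two-mouse odometry idea from tips 10 and 11 can be sketched as follows. This is a minimal sketch, not our actual code: it assumes two mice mounted a known baseline apart on the robot's centerline, each reporting per-frame displacements (dx, dy) in the robot frame, and all names are hypothetical.<br />

```java
/** Sketch of pose tracking with two optical mice (hypothetical API).
 *  Mice sit at (0, +B/2) and (0, -B/2) in the robot frame and report
 *  per-frame displacements (dx, dy) in that frame. */
public class TwoMouseOdometry {
    public final double baseline;          // distance between the mice (same units as dx/dy)
    public double x = 0, y = 0, theta = 0; // world-frame pose

    public TwoMouseOdometry(double baseline) { this.baseline = baseline; }

    /** Fold one pair of mouse readings into the pose estimate. */
    public void update(double dxLeft, double dyLeft, double dxRight, double dyRight) {
        // Rotation: for small motions the mice disagree in x by (dTheta * baseline).
        double dTheta = (dxRight - dxLeft) / baseline;
        // Translation in the robot frame: average of the two sensors.
        double dxR = 0.5 * (dxLeft + dxRight);
        double dyR = 0.5 * (dyLeft + dyRight);
        // Rotate into the world frame and accumulate.
        x += Math.cos(theta) * dxR - Math.sin(theta) * dyR;
        y += Math.sin(theta) * dxR + Math.cos(theta) * dyR;
        theta += dTheta;
    }
}
```

A pure rotation shows up as the two mice disagreeing in dx, while a pure translation gives identical readings; this is why a single mouse at the robot's center cannot separate heading change from drift.<br />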
== Overall Design ==<br />
<br />
Although we originally intended to build as simple a robot as possible, long nights of throwing ideas back and forth left us with quite a complicated design. From our chain-linked, vertically mounted drive train to our LIDAR-style mapping system (two IR sensors mounted on servos), our design quickly became more of a task than we could probably handle. Still, in the end we had a pretty freaking cool robot.<br />
<br />
'''Strategy'''<br />
Voltron is a robot that only goes for putting balls over the wall. Its basic principle is a rubber-band roller in the front to scoop up balls and a large roller in the back to lift balls up to the second level. In hindsight, one roller could have performed both tasks, but hey, hindsight is 20/20. When the balls get up to the second level they are held in a collection bay until the robot drives up to the yellow wall; then the gate lowers and the balls are released. The gate is long (6") to allow a factor of safety in aligning with the wall. Other features are two IR sensors mounted on servos that can map out 180 degrees around the robot without the robot itself having to turn; it is basically a poor man's LIDAR. We also have 4 omni wheels that allow our robot to move and turn in any direction, letting us both strafe and rotate. Finally, we have a gyro to align the data from the LIDAR, two bump sensors to determine when we have hit the yellow wall, and an optical mouse for odometry (we originally wanted two, but we didn't have time to install the second, never mind code for it).<br />
<br />
'''Materials'''<br />
<br />
Our robot was constructed entirely of acrylic and sheet metal, along with the other parts we ordered, including sprockets, chains, steel axles, and angle stock. We chose acrylic because it looks hella cool and we thought Voltron deserved to look good. We wanted to be able to see the ball over its entire journey through the bowels of our robot, so acrylic made sense. It's also very easy to laser-cut and drill through; the only problems were that it was kind of heavy, pretty brittle, and in short supply. We got lucky in that we found extra acrylic lying about, but we couldn't always count on being able to replace broken acrylic pieces. Given the extreme weight of our robot from the acrylic and sheet metal, we decided that the gears running the wheels and rollers should be metal instead of plastic so they wouldn't strip. This turned out to be quite unnecessary, as I don't believe any other team had stripping problems, and our robot became that much heavier (and more expensive). As you can imagine, we had a very heavy robot.<br />
<br />
'''Sensors'''<br />
<br />
Outside of the typical uOrc board, camera, IR sensors, bump sensors, servos, and 2 or 3 motors, our design required an optical mouse for odometry, 3 additional motors, and, with those, an Arduino. Each of these components had to be mounted, of course, and we did so by creating separate sheet-metal cases for the Arduino, eeePC, and uOrc board, and by mounting the servos and IR sensors, as well as the camera, on a sensor board located underneath the dock where we collected balls. The 4 drive motors were all located above their respective omni wheels and connected via chains. <br />
<br />
'''Vision'''<br />
<br />
In the end, we opted for a very simple approach with our vision code; frames were taken as quickly as possible, and each pixel was scanned and sorted by color. Neighboring pixels of like cluster were then sorted into blobs, and blobs were treated as viable objects: red and green ones were balls, blue ones were the lines atop the walls, and yellow were goals and scoring walls. This simple approach was relatively fast and could reliably pick up walls and balls; we accelerated it further by down-sampling the image (that is, checking only every other pixel, or every fourth). The biggest issue we had was localization using the camera; the accuracy we achieved using things like blob size and offset within the image was not high enough to be useful, varying by as much as a meter while the robot sat still.<br />
Our final vision code was dramatically simpler than our original intention. Although we initially implemented several advanced features, including Hough transforms and confidence calculations, they were all ultimately removed in the interests of speed and because they didn't offer any particular advantage in the context of the contest. Red or green meant ball, yellow meant goal, and blue meant the top of a wall – anything more complex was largely irrelevant. <br />
That was one of the major lessons learned during MASLAB: however cool or interesting a feature may be, if it doesn't contribute to performing in the contest, it probably isn't a good idea. There's nothing wrong with trying exciting new things, but the time constraints mean that you'll have to pick and choose which exciting things you do – and generally, the ones you want are the ones that will help you win.<br />
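The classify-then-blob approach described above can be sketched roughly as follows. This is an illustrative reconstruction, not our contest code: the color thresholds, the 4-connected flood fill, and the down-sampling step are all stand-ins.<br />

```java
import java.util.ArrayDeque;

/** Minimal sketch of classify-then-blob vision. Thresholds are placeholders. */
public class BlobFinder {
    // 0 = background, 1 = ball (red-ish), 2 = wall top (blue-ish), 3 = goal (yellow-ish)
    public static int classify(int rgb) {
        int r = (rgb >> 16) & 0xFF, g = (rgb >> 8) & 0xFF, b = rgb & 0xFF;
        if (r > 150 && g < 100 && b < 100) return 1;
        if (b > 150 && r < 100) return 2;
        if (r > 150 && g > 150 && b < 100) return 3;
        return 0;
    }

    /** Flood-fill 4-connected blobs on a down-sampled grid; returns the size of
     *  the largest blob of the wanted class. step=2 checks every other pixel. */
    public static int largestBlob(int[][] rgb, int step, int wanted) {
        int h = rgb.length, w = rgb[0].length, best = 0;
        boolean[][] seen = new boolean[h][w];
        for (int y = 0; y < h; y += step)
            for (int x = 0; x < w; x += step) {
                if (seen[y][x] || classify(rgb[y][x]) != wanted) continue;
                int size = 0;
                ArrayDeque<int[]> q = new ArrayDeque<>();
                q.add(new int[]{y, x}); seen[y][x] = true;
                while (!q.isEmpty()) {
                    int[] p = q.poll(); size++;
                    int[][] nb = {{p[0]-step,p[1]},{p[0]+step,p[1]},{p[0],p[1]-step},{p[0],p[1]+step}};
                    for (int[] n : nb)
                        if (n[0] >= 0 && n[0] < h && n[1] >= 0 && n[1] < w
                                && !seen[n[0]][n[1]] && classify(rgb[n[0]][n[1]]) == wanted) {
                            seen[n[0]][n[1]] = true; q.add(n);
                        }
                }
                best = Math.max(best, size);
            }
        return best;
    }
}
```

Down-sampling falls straight out of the step parameter: a step of 2 scans a quarter of the pixels, at the cost of a coarser blob-size estimate.<br />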
<br />
'''Omniwheel Drive'''<br />
<br />
The same lesson applies to our drive train. We chose to build a four-wheel omni-wheel system for enhanced mobility; four wheels were chosen because of the natural symmetry of the robot, and because of size constraints, the motors were mounted above the wheels and connected by chains. Although we were successful in building this complex drive train, it was plagued with issues – the weight of the robot meant the wheels needed to be mounted on steel shafts to avoid excessive flexing, the low wheel-base meant the robot occasionally dragged on the floor, and the number of motors meant we could not use the orc board to control the drive system. <br />
As it turns out, the omni-wheel system didn't even offer us much of an advantage; because of the nature of the contest, robots were almost forced to use 'turn and go' navigation, rather than executing the complex paths the omni-wheels would allow. Had we used a conventional two-motor drive train, we could have avoided a lot of work with minimal impact on practical mobility, and used the extra time to implement things that could have greatly improved our performance in the contest.<br />
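For reference, the omni-wheel mixing itself is not complicated. A minimal sketch, assuming the four wheels' rolling directions sit at 45°, 135°, 225°, and 315° around the robot (our actual geometry may have differed):<br />

```java
/** Velocity mixing for a four-wheel omni drive in an assumed X-configuration. */
public class OmniMixer {
    /** Assumed rolling direction of each wheel, in radians. */
    static final double[] DIRS = {Math.PI / 4, 3 * Math.PI / 4, 5 * Math.PI / 4, 7 * Math.PI / 4};

    /** vx, vy: desired body velocity; omega: desired spin term (already scaled
     *  by the wheel's distance from center). Returns one speed per wheel. */
    public static double[] mix(double vx, double vy, double omega) {
        double[] w = new double[4];
        for (int i = 0; i < 4; i++)
            // Project the body velocity onto each wheel's rolling direction.
            w[i] = Math.cos(DIRS[i]) * vx + Math.sin(DIRS[i]) * vy + omega;
        return w;
    }
}
```

Setting vx = vy = 0 spins the robot in place, and setting omega = 0 strafes in any direction without turning, which is exactly the mobility the contest ended up not rewarding.<br />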
<br />
'''Modularity'''<br />
<br />
The final point I would like to make about our robot is its modularity, which I would definitely recommend to future Maslab teams when designing. Originally our robot wasn't modular at all, and our first build took us over an hour to finish (check out a time lapse of our original build here: http://www.youtube.com/watch?v=8_epFGdMhL8). Eventually we cut our robot down and added more brackets and supports to make a modular design. Our robot then consisted of three main components that could be separated with ease for quick fixes and changes, cutting our construction time by over 50%. The three components were our left and right wings and our center console. The left and right wings were composed of the front and rear wheel assemblies, each of which consisted of a motor mount, a wheel mount, and a chain-and-sprocket system. These could be bolted onto the sides of the main compartment, which was 7" wide and flat on both sides. This helped us dramatically when working on our robot, and it ended up being one of the proudest points of our design.<br />
<br />
However, despite our modularity, it was too little, too late. An ideal module system would have the wheels on one module, the sensors on another, and the ball-collecting apparatus on a third. This way the coders could have been testing omniwheel code and sensor code while we finished the build. As it turns out, the coders didn't get the fully functioning robot until about 6 days before the final competition, which certainly was cutting it close. The problems with weight and with the robot dragging on the floor, and our failure to identify these problems, meant that the coders never finished their code, and our robot ended up driving straight and blind at the final competition. Moral of the story: make your designs modular so everyone can be working at once.<br />
<br />
== Final Outcome==<br />
<br />
Unfortunately our robot did not come together like we had intended; on the day of the final competition we simply did not have a robot ready for the task at hand. In the preliminary round we lost both 3-minute sub-rounds, scoring zero points. Come the second round we stepped our game up a bit, collecting one ball and displacing another. After dropping the one ball we collected, however, we still ended up with zero points, and we ultimately came in second to last place. Despite our limited success, my team and I can certainly agree that the experience was filled with unmatched learning opportunities. It is an entirely hands-on project that will force those who are not experienced in the field to learn new skills they will undoubtedly be able to apply in the future. I recommend it to anyone interested. Voltron force, out!<br />
<br />
== Photo Gallery ==<br />
[[Image:robot1.jpg |thumb|200px|CAD View 1]]<br />
[[Image:robot2.jpg |thumb|200px|CAD View 2]]<br />
[[Image:robot3.jpg |thumb|200px|CAD View 3]]<br />
[[Image:isometric.jpg |thumb|200px|Robot View 1]]<br />
[[Image:frontview.jpg |thumb|200px|Robot View 2]]<br />
[[Image:backview.jpg |thumb|200px|Robot View 3]]<br />
[[Image:lasercut.jpg |thumb|200px|Lasercutting dxf]]<br />
[[Image:construction.jpg |thumb|200px|Mid construction]]<br />
[[Image:table.jpg |thumb|200px|Our workspace]]<br />
[[Image:firstdrive.jpg |thumb|200px|Maiden voyage]]<br />
[[Image:prindle.jpg |thumb|200px|Swag Master Flex working hard]]</div> (by Green Lion)

FinalScores (last edited 2011-02-01 by Yichen)
<hr />
<div>The final competition is a double-elimination tournament seeded two days before the final competition. Teams that seed well are given byes in certain rounds. In order to make the final competition a reasonable length, several rounds are run a few hours before the final competition. The final runs are completed in front of an audience, and every team runs at least once.<br />
<br />
Read about each team in the<br />
[http://web.mit.edu/6.186/2011/Lectures/Maslab2011Program.pdf Competition Program]<br />
<br />
== Seeding Rank ==<br />
The seeding rank of the teams is listed below. Ties are broken first by number of balls displaced and then by robot weight (the lightest robot seeds higher).<br />
<br />
#Team 2 (75 points, 15 balls displaced)<br />
#Team 13 (23 points, 19 balls displaced)<br />
#Team 3 (12 points, 9 balls displaced)<br />
#Team 11 (11 points, 6 balls displaced)<br />
#Team 7A (5 points, 8 balls displaced)<br />
#Team 1 (2 points, 2 balls displaced)<br />
#Team 10 (2 points, 2 balls displaced)<br />
#Team 7B (1 point, 1 ball displaced)<br />
#Team 6 (0 points, 0 balls displaced)<br />
#Team 9 (0 points, 0 balls displaced)<br />
<br />
== Tournament Rounds ==<br />
The final tournament consisted of 23 rounds (some of which are byes) with a winners bracket and a losers bracket. <br><br />
[[File:TournamentRounds.png]]<br />
[[File:FinalRounds.png ]]<br />
<br />
== Scores ==<br />
The scores for each of the rounds above are listed below. For rounds with byes, the scores are marked with zeros. Round 22 started with a robot failure, so only one robot ran at first. When it became apparent that the other robot was not fixable in time, the working robot was advanced. <br><br />
[[File:Scores.png]]</div> (by Yichen)

Team Thirteen/Final Paper (last edited 2011-01-31 by Rhan)
<hr />
<div>'''overall strategy'''<br />
<br />
Two weeks into Maslab, January 14th, we decided to radically change our strategy and robot from a simpler design for goal-scoring to a complex and roller-intensive launching mechanism for scoring over walls.<br />
<br />
Given the nature of this year’s competition, there was a clear dichotomy in design plan: namely, a goal scoring mechanism versus a wall scoring mechanism. No team built a robot that could do both. Logically, if one could construct a sound design for scoring over walls, it was entirely advantageous to do so and entirely disadvantageous to try scoring in goals.<br />
<br />
We originally planned to only score goals, for easier mechanical design. However, we realized that while our exploring and ball-collecting abilities were on par with (if not better than) those of our competitors, we could only hope to earn 4 points per ball as opposed to their 6.<br />
<br />
'''mechanical design and sensors'''<br />
<br />
Review of previous literature emphasized a few key points in mechanical design. Robustness, a small (round) footprint, and as few complicated moving parts as possible, all figured prominently in the design of previous winners.<br />
<br />
Our first robot fit the criteria. Although it was more square than round, it was essentially two plates of acrylic with supporting walls. The motors directly drove the two wheels mounted on either side, and a roller mechanism in front sucked balls into the body of the robot where a ramp reliably funneled the balls into the back. A servo attached to a sheet of metal opened and closed to score balls into goals. The only flaw was that we were unable to score balls over walls.<br />
<br />
A.W.E.S.O.M-O was a far more complicated design. It was larger, elliptical, and had a launching mechanism consisting of home-cut gears turning rollers which brought one ball at a time up a ramp and, ideally, over the wall. Though perfectly sound in theory, the practical implementation of this design was limited by the unevenness of the gears, which consistently jammed. Eventually, we settled on a single belt that ran over the rollers (reducing the number of gears that had to synchronize and turn), which seemed to work slightly better at propelling the balls up the ramp.<br />
<br />
When deciding on the lift mechanism, we had to consider whether we wanted a ground-level or wall-level hopper; in retrospect, a wall-level hopper would have made scoring easier and more reliable. At the time, that was vetoed in favour of a LIDAR system (with long-range IR sensors) we had planned to implement.<br />
<br />
Ultimately we did not construct our LIDAR because mapping became less necessary given our reliable wall following and navigation. Our navigation system was remarkable, considering that we only used four short-range IR sensors. Two were mounted diagonally in the front left and front right, scanning for obstacles ahead in case the robot wanted to “turnLeft”, “hallFollow” or “wallEnd”. The other sensor was mounted on the right side, toward the back, and was used to maintain distance from the wall in our “hugRight” wall-following state and to detect a “wallEnd” scenario.<br />
<br />
The camera was only used for ball detection and goal detection. We originally considered implementing stereo vision with two cameras but the unreliable FPS output (ranging from 7 to 30 FPS for no apparent reason) was a deterrent. We also briefly toyed with quadphase encoders on our first robot, but the getDistance() method returned irrational and nonsensical readings. Although we had dead reckoning and velocity inputs on our first robot, and they were more convenient to use, we were ultimately satisfied with the quality of A.W.E.S.O.M-O’s navigation as it was.<br />
<br />
'''software design'''<br />
<br />
In terms of software our robot was quite simple and robust. Our robot, A.W.E.S.O.M-O, had a default state of wall following, but would break behavior and start to approach a ball or yellow wall if one was detected. Once an object of interest was detected (ball or wall), the robot would approach it using a simple PD control loop. After each collected ball or scored wall, the robot would enter a scan state that would break upon seeing a ball. With this behavior we hoped to successfully explore the entire field, and in practice we seemed to do so.<br />
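The “simple PD control loop” for approaching targets can be sketched like this; the gains and the error definition (horizontal offset of the target in the image) are illustrative, not our tuned values.<br />

```java
/** Minimal PD steering loop: drive the target's horizontal image offset to zero.
 *  Gains are placeholders, not the team's tuned values. */
public class PdSteer {
    final double kp, kd;
    double lastError = 0;

    public PdSteer(double kp, double kd) { this.kp = kp; this.kd = kd; }

    /** error: target x-offset from image center; returns a differential turn command. */
    public double update(double error) {
        double turn = kp * error + kd * (error - lastError); // P term plus damping D term
        lastError = error;
        return turn;
    }
}
```

The D term damps the oscillation a pure P controller tends to show when the robot overshoots the target and swings back.<br />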
<br />
To make programming the robot easier for all members of the team, we created HAL (the part of the robot that developed consciousness). HAL is short for hardware abstraction layer, and it served as a communication link between the state machine and the robot. HAL handled things like overriding the watchdog, velocity-to-PWM conversions, voltage-to-distance conversions for the rangefinders, and all other robot inputs and outputs.<br />
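As one example of what HAL hides, a voltage-to-distance conversion for a Sharp-style IR rangefinder is commonly modeled as d = k / (v - c). The constants below are placeholders that would come from calibration, not our measured values.<br />

```java
/** HAL-style conversion for a Sharp-type IR rangefinder.
 *  Model: distance = K / (volts - C), clamped to a maximum usable range.
 *  K and C are hypothetical calibration constants. */
public class IrRangefinder {
    public static final double K = 27.0;      // placeholder gain (cm * volts)
    public static final double C = 0.1;       // placeholder voltage offset
    public static final double MAX_CM = 80.0; // beyond this the sensor is unreliable

    public static double voltsToCm(double volts) {
        if (volts <= C) return MAX_CM;        // too far away or no return signal
        return Math.min(K / (volts - C), MAX_CM);
    }
}
```

Keeping this in one place means the state machine only ever sees centimeters, so swapping a sensor only requires recalibrating two constants.<br />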
<br />
Since our robot was heavily reliant on wall following, we decided to make our wall-follow state a state machine as well. It had the following 4 states, which handled any scenario possible in the Maslab world: “hugRight”, “turnLeft”, “hallFollow”, and “wallEnd”. In the end we did not implement “hallFollow”, as “hugRight” stayed close enough to the wall that we deemed it unnecessary.<br />
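A skeleton of that wall-follow sub-state machine might look like the following; the IR thresholds are illustrative, not our tuned values.<br />

```java
/** Skeleton of the wall-follow sub-state machine described above.
 *  IR readings are in cm; thresholds are placeholders. */
public class WallFollow {
    public enum State { HUG_RIGHT, TURN_LEFT, WALL_END }

    /** Pick the next sub-state from the three short-range IR readings. */
    public static State next(State s, double frontLeft, double frontRight, double rightBack) {
        final double FRONT_BLOCKED = 20.0; // something ahead: turn left along it
        final double WALL_LOST = 40.0;     // right wall out of range: wall has ended
        if (frontLeft < FRONT_BLOCKED || frontRight < FRONT_BLOCKED) return State.TURN_LEFT;
        if (rightBack > WALL_LOST) return State.WALL_END;
        return State.HUG_RIGHT;            // default: keep hugging the right wall
    }
}
```

Each sub-state would then set its own motor commands (for instance, HUG_RIGHT servoing on the right-rear IR reading), keeping the top-level FSM free of sensor details.<br />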
<br />
The goals presented a slight problem for our wall-following code. Although the edges were yellow, the goal itself was black, and the camera interpreted that as empty space; therefore, the robot tried to turn into the goal every time. Eventually we added a panic response: if the camera saw a goal 12 inches or closer, the robot would abruptly veer away from it. Impressively, if the robot had been wall following, it would veer back on course after panicking.<br />
<br />
We also had stall detection implemented in vision: if the frame did not differ enough over a certain time, the robot would assume it was stuck and run a freak-out function. This feature was a little buggy. When the robot traversed a large enough room, the subsequent frames would be similar enough to make the robot think it was stalled even though it was crossing the room as intended.<br />
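A minimal sketch of that stall detector, assuming grayscale frames and illustrative thresholds:<br />

```java
/** Vision-based stall detection: declare a stall when consecutive frames
 *  barely change for too long. Thresholds are placeholders. */
public class StallDetector {
    static final double DIFF_THRESHOLD = 5.0; // mean per-pixel change (0-255 grayscale)
    static final int STUCK_FRAMES = 30;       // e.g. ~3 seconds at 10 FPS
    private int quietFrames = 0;

    /** Feed one pair of same-sized grayscale frames; true when a stall is declared. */
    public boolean update(int[] prev, int[] cur) {
        long sum = 0;
        for (int i = 0; i < cur.length; i++) sum += Math.abs(cur[i] - prev[i]);
        double mean = (double) sum / cur.length;
        quietFrames = (mean < DIFF_THRESHOLD) ? quietFrames + 1 : 0; // any motion resets
        return quietFrames >= STUCK_FRAMES;
    }
}
```

The bug described above falls straight out of this scheme: a long drive across a featureless room also produces near-identical frames, so the detector fires even though the robot is moving.<br />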
<br />
'''overall performance'''<br />
<br />
During the competition our robot performed admirably: despite two burnt motors and a broken camera, we placed 4th. In terms of software, we benefited greatly from our freak-out function. Like other teams' robots, ours was sometimes confused by the field and stuck in a rut, but the freak-out function allowed us to essentially “reset” what the robot was thinking and resume activity. Our mechanical design was the weakest point of our robot, with the scoring mechanism jamming or malfunctioning almost every round.<br />
<br />
'''conclusions/suggestions for future teams'''<br />
<br />
* DO NOT CUT YOUR OWN GEARS!!! Despite being warned by multiple people we tried it anyway and paid the price by having our gear train jam at the worst possible times.<br />
<br />
* On a mechanical note, simpler is better. The more quickly and easily a robot can be put together / taken apart, the more likely one can fix unexpected last-minute issues.<br />
<br />
* Use SVN or some form of version control. Two weeks into Maslab our eeePC kernel panicked and we almost lost all of our code. Luckily we got it back but we still lost 2 days of work time.<br />
<br />
* Plan your state machine and software architecture. Because we had planned everything out we were able to write and test each state separately before putting everything together. This allowed us to find bugs faster, and helped us maintain clean, easily changeable code.<br />
<br />
* Gearing down the motors is not that beneficial: the motors provided are powerful enough to drive most robots, and speed becomes a critical factor toward the end of the competition. Faster robots collect more balls and explore more of the field.</div> (by Rhan)

Team One/Final Paper (last edited 2011-01-31 by Allanm)
<hr />
<div> '''<nowiki> Magnetometer, but I hardly</nowiki>'''<br />
<br />
<nowiki> The competitive design of MASLAB 2011 presented a unique set of challenges. Our team decided that a more interesting experience could be had by trying to shoot balls over the walls rather than into goals. Our initial plan was to lift balls up and shoot them over the wall from between two high-speed rollers. The robot would wander around the map following a right-hand rule, disengaging to pick up balls as it saw them. While ambitious, we were confident that we could synchronize each system of the robot. As time progressed, the scope of our mechanical and software design was simplified to better fit the map.<br />
At a basic level, our robot consisted of a conveyor belt attached to an elevated gate. Two large wheels provided drive and tank-style steering while the weight of the robot was balanced between four caster wheels. Our orc board and battery were mounted on top of the bot while the laptop was mounted in the middle. The bottom layer of the robot functioned as a collector: balls could be run over from the front side of the robot and were funneled back to a vertical conveyor belt. This belt sandwiched balls between a rolling cloth-and-rubber-band belt and a foam backing. Upon reaching the second story of the robot, balls would roll down into a gated area. After the robot butted up against a wall, the gate could be opened, releasing the balls over the wall.</nowiki><br />
<br />
<nowiki> The bulk of our robot was constructed from acrylic, with some wood support. Our team made liberal use of the laser cutter to build a frame, bumper buttons, and the second-story ball-release gate. Wood was inserted to brace our bumpers and to serve as axles for the ball-lifting rollers. The roller was powered by a stepper motor, which offered the torque necessary to lift the balls and withstand tension from the rubber bands. In order to overcome bumps on the map floor and the added torque loss from the larger wheels, our driving motors were heavily geared. The top ball-release gate was a simple servo which lifted and dropped a wooden pole.</nowiki><br />
<br />
<nowiki> Input relied on a combination of touch, IR, and visual sensors. A webcam was mounted on the front side of the lower level, below the laptop. On the ground level, two rounded shoulder buttons relayed data about physical contact with the world and provided a simple means of determining whether we were perpendicular to a wall. The robot also sported two infrared sensors to provide data to the wander code: one sensor was placed on the front right of the robot while a center-facing sensor was mounted to the left center of the robot. A break-beam sensor was implemented to count captured balls, but was later removed for simplicity. </nowiki> <br />
<br />
<nowiki><br />
Our code was designed from the top down: we wanted to parallelize our code design as much as possible (and this is recommended, as it keeps each programmer working on an encapsulated system), so we chose to implement a logic-level splitting system.</nowiki><br />
<br />
<nowiki>Essentially, sensor data is fed into a high-level analyzer (FSM) that collects both data and uncertainty to estimate the robot’s current state, and from this estimate generates a suitable low-level Behavior. These low-level Behaviors included Wander, Shoot, Evade, Gather, and Stop (with Stop being triggered by a global time-out). Each low-level Behavior is permitted to update motor values according to the desired behavior (i.e. run our roller motors when we want to collect balls, but not when we’re running away from a wall). We agreed that each individual low-level behavior call should not block if at all possible, because the main loop blocks until the low-level behaviors are complete: the faster the simple sub-routines execute, the faster we can process data.</nowiki><br />
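The selection step of that loop can be sketched as follows. The priority order shown (time-out, then bump, then goal, then ball) is an assumption for illustration, not our exact transition logic.<br />

```java
/** Sketch of the high-level analyzer choosing a low-level Behavior each tick.
 *  The priority order here is illustrative. */
public class BehaviorSelector {
    public enum Behavior { WANDER, SHOOT, EVADE, GATHER, STOP }

    /** One selection step per main-loop iteration; must return quickly,
     *  since the chosen Behavior's step is then run without blocking. */
    public static Behavior choose(long elapsedMs, boolean bumped,
                                  boolean seesBall, boolean seesGoal) {
        if (elapsedMs > 3 * 60 * 1000) return Behavior.STOP; // global time-out dominates
        if (bumped) return Behavior.EVADE;
        if (seesGoal) return Behavior.SHOOT;
        if (seesBall) return Behavior.GATHER;
        return Behavior.WANDER; // default: right-hand-rule exploration
    }
}
```

Keeping the selector pure (sensors in, one enum out) makes it trivial to unit-test without a robot attached.<br />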
<nowiki><br />
Data was acquired from the two touch sensors mounted to the front bumper, a front-facing IR sensor, a right-facing IR sensor, and the main camera. Processing a single camera frame at 160x120 pixels in Java proved to be unfeasibly slow (this was later determined to be an issue with BufferedImage performance on machines without graphics cards; more details can be found HERE: http://www.jhlabs.com/ip/managed_images.html ) so we went in search of image processing libraries. Ultimately, we decided on a JNI’d implementation of OpenCV, a well-known computer vision library written in C. Note to future teams: if you’re going to use JavaCV, the JNI wrapper to OpenCV, make sure that you compile OpenCV WITHOUT SSE instructions.</nowiki><br />
<br />
Our image processing pipeline acts on an image as follows:<br />
1) Blur the image with a Gaussian blur to remove obvious noise<br />
2) Convert the image to HSV, and apply individual band-pass filters to each channel to generate various color-masks (red/green, yellow, and blue). This was accomplished using thresholds generated from test pictures.<br />
3) Apply morphological erosion and dilation to the image with a 5x5 disc structuring element. This has the effect of removing the inevitable noise from various carpet fibers in the blue channel.<br />
4) Apply a blue-line filter to the image, as described in the vision tutorial.<br />
5) Find the contours in each channel to perform connected component labelling.<br />
6) In the red/green channel, report the discovery of a Ball if and only if we find a relatively circular contour above a minimum area.<br />
7) In the yellow channel, report the discovery of a wall if and only if the convex hull of the contour has area approximately equal to the area of the contour. Report the discovery of a goal if and only if the convex hull area is significantly larger than the internal area.<br />
8) If the centroid of any reported object is above the top third of the screen, ignore it. This deals with blue-line filtering failures, as well as other strange corner cases.<br />
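Step 2 of the pipeline can be reproduced in plain Java without OpenCV, using the standard library's RGB-to-HSB conversion. The threshold box below is illustrative; the real bands were generated from test pictures.<br />

```java
import java.awt.Color;

/** Per-pixel HSV band-pass, as in step 2 of the pipeline.
 *  Threshold values are illustrative placeholders. */
public class HsvMask {
    /** True if the pixel falls inside the given HSV box (all values in [0,1]). */
    public static boolean inBand(int rgb, float hLo, float hHi, float sLo, float vLo) {
        float[] hsv = Color.RGBtoHSB((rgb >> 16) & 0xFF, (rgb >> 8) & 0xFF, rgb & 0xFF, null);
        return hsv[0] >= hLo && hsv[0] <= hHi && hsv[1] >= sLo && hsv[2] >= vLo;
    }

    /** Example: a yellow-ish band for goal/scoring-wall pixels (assumed range). */
    public static boolean isYellow(int rgb) {
        return inBand(rgb, 0.10f, 0.22f, 0.4f, 0.4f);
    }
}
```

Running this over every pixel yields the binary color masks that steps 3 through 5 then clean up and group into contours.<br />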
<br />
The highlevellogic package contains the FSM class. Its transition table is summarized below (current state in the cells' left position, triggering event per row, next state after the arrow):<br />
* States: 0 = Wander, 1 = Ball Gathering, 2 = End, 3 = wallAway, 4 = yellowWallScore<br />
* t>3min: 0→2, 1→2, 2→2, 3→2, 4→2<br />
* touch: 0→3, 1→3, 2→2, 3→3, 4→4<br />
* SeesBall: 0→1, 1→1, 2→2, 3→1, 4→4<br />
* SeesGoal: 0→4, 1→4, 2→2, 3→4, 4→4<br />
<br />
<nowiki>There are a couple of exceptions. If we remain in the wander state for 40 states (about 5 seconds), we go into the wallAway state. If we have been in the Ball Gathering state for more than 5 states (about 0.5 seconds), then we will remain in Ball Gathering for another 5 states. Lastly, if we remain in the yellowWallScore state for longer than 40 states, then we will go into wallAway.</nowiki><br />
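The transition table above is small enough to write directly as data; a sketch, with state and event indices exactly as in the table (the timing exceptions would be layered on top):<br />

```java
/** The FSM transition table as data. States: 0=Wander, 1=Ball Gathering,
 *  2=End, 3=wallAway, 4=yellowWallScore. */
public class Fsm {
    // Rows are events: 0 = t>3min, 1 = touch, 2 = SeesBall, 3 = SeesGoal.
    // Columns are the current state 0..4; each cell is the next state.
    static final int[][] NEXT = {
        {2, 2, 2, 2, 2}, // t>3min: global timeout always ends the run
        {3, 3, 2, 3, 4}, // touch
        {1, 1, 2, 1, 4}, // SeesBall
        {4, 4, 2, 4, 4}, // SeesGoal
    };

    public static int next(int state, int event) { return NEXT[event][state]; }
}
```

Table-as-data keeps the transition logic in one glanceable place, which is exactly what made planning the state machine up front pay off for this team.<br />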
<br />
<nowiki> The lowlevellogic package contains the classes WanderingState, ShootingState, EvadingState and CapturingState. WanderingState uses the IR sensor data to right hand rule around the field. ShootingState uses a PI controller to turn towards a wall and then drive forward. EvadingState backs up and turns left. CapturingState uses a PI controller to turn towards a ball and then drive forward, if both touch sensors are depressed then we lift the gate to release stored balls.</nowiki><br />
<br />
<nowiki>The physicalObjects package contains the classes Ball, Goal and Wall. These classes are useful because they store the relevant information gathered from vision processing.</nowiki><br />
<br />
<nowiki> While most systems worked, it proved critical to ensure proper ball flow. While our code, drive mechanism, and ball gate worked well, the failure of the rollers prevented us from scoring. Our robot was effectively able to wander and possess balls. Another area which proved problematic was the touch sensor placement: on multiple occasions our robot’s touch sensors failed to engage, and once our robot got caught inside a goal. Furthermore, the decision to use two larger wheels made motion harder on an uneven field.</nowiki><br />
<br />
<nowiki> When structuring work, teams should not have the mechanical and coding sides work independently. When the two sides of development do not work hand in hand, one side can begin to outpace the other. The need to have a robot mechanically developed early on cannot be overstated. The whole team should attempt to work together to create the core of the robot before diverging into fine tuning and coding.</nowiki><br />
<br />
<nowiki> Furthermore, teams who try to implement a roller system should avoid rubber bands as a conveyor mechanism. Instead, try to get a large belt such that balls are unable to slip through and adequate tension exists with which to grab balls. It is recommended that priority be given to making the conveyor as short as possible in order to require the least work from the motor turning it.</nowiki><br />
<nowiki><br />
As always, teams should enjoy their work and try to learn as much as possible during the month.</nowiki></div> (by Allanm)

Team Seven/Final Paper (created 2011-01-31 by Rafacb)
<hr />
<div>== MASLAB 2011: Final Paper ==<br />
<br />
Roberto Meléndez, Christian X. Segura, Rafael Crespo, and Javier E. Ramos<br />
<br />
<br />
This paper serves as an introduction and technical description of our robot,<br />
the Mighty Duck. The paper discusses: overall strategy, mechanical design and<br />
sensors, software design, overall performance, and future suggestions. It is<br />
informal, but as complete as possible to provide guidance to future Maslab teams.<br />
<br />
<br />
== Overall Strategy ==<br />
<br />
In terms of our general, high-level approach to the challenge, our team decided<br />
to develop a simple yet very maneuverable robot, mainly because we had only<br />
one programmer and three MechEs. We decided to score by throwing the balls<br />
over the fence. Our approach was mechanically simple and designed to be easy<br />
to assemble.<br />
<br />
From a high-level design standpoint, our robot was driven by two geared motors<br />
with a caster wheel in the back. Our mechanical strategy was to collect balls by<br />
pinching them between sets of rubber bands placed in parallel. The robot was<br />
designed to place the balls over the fence separating the teams, to score the<br />
maximum number of points possible.<br />
<br />
<br />
== Mechanical Design ==<br />
<br />
As mentioned above, our robot employed a rubber-band basket mechanism to<br />
collect the balls. The basket was attached to rotational arms that raised and<br />
lowered the basket on top of the balls. As the basket came down, the balls got<br />
pinched between the rubber-bands placed on the underside of the basket. The<br />
basket would then rotate up and let the balls drop through the back of the robot<br />
using the computer as a ramp. The strategy was to collect the balls and then have<br />
the robot back up to the fence. After the robot was properly aligned, the robot<br />
would then deposit the balls.<br />
<br />
[[File:robot.jpg]]<br />
<br />
Also, we designed and built our own wheels, to benefit from improved traction over<br />
the stock ones. The robot employed a servo to rotate the arm and most of the<br />
weight was placed close to the ground for improved stability. The circular design of<br />
the base prevented the robot from getting caught in corners.<br />
<br />
It is important to note that our team built two robots. The first robot was built<br />
during the first 2-week period, and the second robot was built during the last week<br />
of the competition. We decided to build the second robot because we could not<br />
score the maximum number of points with the first one, which could only dribble<br />
balls and place them in the floor goals. Even so, the first robot served as a<br />
coding and prototyping base for the team. It is important to choose a strategy<br />
from the beginning and stick to it; starting a new robot was very costly for our<br />
team.<br />
<br />
<br />
== Code ==<br />
<br />
The code written by our team was focused on simplicity. The team had only one<br />
programmer, which meant that the time available had to be split between vision,<br />
navigation, and the overall logic of the code; this was probably the main reason<br />
the robots ended up underperforming.<br />
<br />
<br />
== State Machine ==<br />
<br />
The state machine used for both of our robots was similar. As the Mechanical<br />
Design shows, one of our robots could only go up to balls and pick them up<br />
while the other had the ability to grab them and then throw them over a<br />
wall.<br />
<br />
The first robot we built could only store balls and score in floor goals.<br />
Since the scoring mechanism was not very reliable, it would only try to score<br />
during the last 40 seconds of the match. Even though the robot did not<br />
score consistently, the navigation and ball-grabbing actions were consistent<br />
and incredibly efficient. Here is a rough version of the state machine:<br />
<br />
[[File:statemachine1.jpg]]<br />
<br />
The second robot was a bit more complex in its behavior. It was capable of<br />
scoring over walls with some success and this is why we decided to make a<br />
state machine that would prioritize ball grabbing only if the robot did not<br />
have any balls already stored and scoring otherwise. Since we did not have<br />
time to accurately tell whether the robot had in fact grabbed a ball, we<br />
decided to make the robot try to score if the time limit was approaching.<br />
Here is what the state machine looked like:<br />
<br />
[[File:statemachine2.jpg]]<br />
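As a rough illustration, the priority logic described above can be sketched in Java. This is a minimal sketch, not our actual code: the state names, the ball counter, and the 40-second cutoff placement are assumptions for illustration.<br />

```java
// Hypothetical sketch of the second robot's behavior priorities; names and
// thresholds are illustrative, not the team's actual source.
public class BehaviorFsm {
    enum State { FIND_BALL, GRAB_BALL, FIND_WALL, SCORE }

    State state = State.FIND_BALL;
    int ballsHeld = 0;

    // One decision step: grab while empty, score otherwise, and force a
    // scoring run when the match clock winds down.
    State next(boolean seesBall, boolean seesWall, double secondsLeft) {
        if (secondsLeft < 40 && ballsHeld > 0) return State.FIND_WALL;
        if (ballsHeld == 0) return seesBall ? State.GRAB_BALL : State.FIND_BALL;
        return seesWall ? State.SCORE : State.FIND_WALL;
    }
}
```

The key point is that scoring is only attempted when balls are on board or time is running out, matching the diagram above.<br />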
<br />
<br />
== Vision ==<br />
<br />
One of the team’s biggest flaws was the vision code. As suggested by the<br />
TAs, we tested and made the robot’s vision an early priority. Sadly, we<br />
never got it to work consistently or had enough time to test it in real game<br />
action. It is important to point out that even if the pictures are taken in 26-<br />
100, once you start testing your code while the robot moves, the results<br />
become less precise.<br />
<br />
The code looked for balls, walls, and goals. The robot could<br />
easily tell balls apart from the other elements in the picture, since<br />
they were either red or green, but when it tried to distinguish goals from<br />
walls it was inconsistent, since both were yellow. The differences<br />
between a goal and a wall were minor, so it is very important to set some<br />
time aside just to tackle that problem. It is also crucial that your vision code<br />
be efficient. We managed to process images with only one iteration over the<br />
picture’s pixels, which allowed our robot to make decisions at a rapid rate.<br />
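To show what a single-pass scan can look like, here is a minimal Java sketch that classifies pixels and accumulates a centroid for red pixels in one loop. The thresholds and the packed-RGB format are illustrative assumptions, not our actual vision code.<br />

```java
public class OnePassVision {
    // One pass over the image: classify each pixel and accumulate the
    // centroid of red-ish pixels. Thresholds are illustrative placeholders.
    // pixels is packed 0xRRGGBB, row-major.
    public static double[] redCentroid(int[] pixels, int width) {
        long sx = 0, sy = 0, n = 0;
        for (int i = 0; i < pixels.length; i++) {
            int r = (pixels[i] >> 16) & 0xFF;
            int g = (pixels[i] >> 8) & 0xFF;
            int b = pixels[i] & 0xFF;
            if (r > 150 && g < 80 && b < 80) {   // crude "red ball" test
                sx += i % width;                  // column
                sy += i / width;                  // row
                n++;
            }
        }
        return n == 0 ? null : new double[] { (double) sx / n, (double) sy / n };
    }
}
```

Because everything happens in one pass, the per-frame cost stays linear in the number of pixels, which is what keeps the decision rate high.<br />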
<br />
<br />
== Navigation ==<br />
<br />
For navigation, we did not use any fancy PID controller. We went<br />
for simplicity and efficiency, and that is what we got. We used three short-range<br />
IR sensors and a simple wall follower to navigate the map. It did<br />
remarkably well and explored most maps to completion. The only problem<br />
with this strategy is that if one of the sensors fails, the robot will not<br />
navigate correctly, which is what happened to us in the final competition.<br />
<br />
<br />
== Results and Recommendations ==<br />
<br />
It is highly recommended that you give your coder(s) time to do their work<br />
and that you have at least two coders. Our team lacked manpower in this<br />
department, and our performance suffered heavily because we did not<br />
have time to perfect either the vision or the navigation code. We would<br />
also recommend writing very simple navigation code: no PID controller is<br />
necessary unless you are looking for precise movements. The gyro is not<br />
very useful (open loop worked better than the gyro in our case…), so don’t<br />
use it! Finally, prioritize the vision code! Without good vision code, your<br />
robot will not work well!<br />
<br />
<br />
== Final Conclusion and Remarks ==<br />
<br />
The robot performed remarkably well during the tests, given the time constraints<br />
(1 week to build a new robot), but did not perform as expected during the final<br />
competition, which was a bit disappointing. The main suggestion we have for<br />
future teams is to build early to have plenty of time for testing and tweaking.<br />
Choose a strategy early, ideally before the competition, and implement it as fast as<br />
possible. Also, there are many things that can be done before the competition,<br />
such as writing code, making parts, and deciding on a navigation strategy.</div>Rafacbhttps://maslab.mit.edu/2011/wiki/Team_Eleven/Final_PaperTeam Eleven/Final Paper2011-01-31T02:28:29Z<p>Kranders: /* reFuses's Design */</p>
<hr />
<div>== '''MASLab Final Report''' ==<br />
<br />
== Workload Breakdown ==<br />
Kristen- Sensor/Controls/Behavior<br />
<br />
William- Vision/ Behavior<br />
<br />
Tim- Mechanical Design<br />
<br />
<br />
== Table of Contents ==<br />
<br />
Overall Strategy<br />
Mechanical Design<br />
Sensor/Actuators<br />
Software Architecture<br />
RobotMain<br />
Stop<br />
BallGrabber<br />
Navigator<br />
getState<br />
forward<br />
rightTurn<br />
leftTurn<br />
rightPID<br />
leftPID<br />
bumpLeft<br />
bumpRight<br />
WallScorer<br />
Data <br />
ImageProcessor<br />
Conclusion<br />
<br />
----<br />
<br />
== reFuses's Design ==<br />
<br />
'''Overall Strategy - WIN! ''' <br />
<br />
We decided to put the balls over the walls. We wanted to be able to hold a number of balls and to reliably capture and deposit them over the yellow wall. For the general behavior, we wanted the robot to circle every so often to look for balls and walls, collect the balls it found, and deposit them over the walls it found. If, however, neither balls nor walls were in the robot’s sight when it spun, the robot would attempt to wall-follow for a certain amount of time before spinning and looking again. In addition, the robot needed to be awesome and fast.<br />
<br />
----<br />
<br />
'''Mechanical Design'''<br />
In past years, the primary component of MASLAB has been around collecting balls and moving them to a designated scoring area. This year the competition changed slightly and allowed the option of scoring balls either by placing balls in a designated goal, or by placing the balls over a specified wall and onto the opponents playing field. This secondary scoring option is by design more complex, and so makes the design process a bit more drawn out. For our robot the additional points gained through placing balls over the wall was determined to outweigh additional complexity or design time. With this strategy we moved the design forward and broke it into two main components: the chassis and the ball handling system.<br />
<br />
Least Critical Module<br />
The least critical module, LCM, of our robot was the chassis. Its primary role was to act as the skeleton upon which the robot hardware would be attached. It was composed of laser-cut acrylic sheeting connected in a tongue-and-groove system. These sheets were held together by 4-40 screws and epoxy applied at the joints. The details and features of the chassis structure were not finalized initially; instead the design evolved through iterations. This let us independently modify the drive system, the ball collection system, and the required sensor mountings. As the final week approached we brought all those systems together and laser cut a brand new chassis incorporating all the design changes required by previous iterations.<br />
<br />
Most Critical Module<br />
The most critical module, MCM, of the mechanical structure was the ball handling system, which was borrowed from MASLAB robots of previous years. First, we utilized a revolving rubber-band hub collection mechanism used both by past competitors and by a few teams in this year's competition. This mechanism allows for rapid ball collection and moves the balls into a passive storage container. It seemed like an ideal solution to this year's competition, but because we were aiming to place the balls over a wall, we decided to modify the rolling rubber-band hub. We determined that by also using the collection mechanism to elevate the balls to a height of 8 inches, we could solve two problems with one solution. By increasing the diameter of the revolving rubber-band hub and placing it inside a static hub, we created an epicyclic (planetary) gear system: the rubber-band hub acts as the sun gear, the stationary hub as the annulus, and the balls being collected as the planet gears. Secondly, we borrowed the capture-and-release mechanism from other groups who used gravity to drop the balls over the game wall. The ball collection system already brings the balls to a height of 8 inches, which allowed for a sloped storage container. By attaching a servo-controlled door at the lowest point of the sloped storage chamber, we provided an outlet channel for the stored balls: on command, the servo lowered a trap door, letting the balls travel over the field wall and onto the opponent's game space.<br />
<br />
The Final Robot<br />
As a whole system, pictures 1 & 2 show the exact shape and form of our robot; both the front and back are shown for clarity.<br />
<br />
Pictures 1 & 2: Credit to Sam Range for his photography, (left) back view, (right)front view<br />
In the picture to the left you can see the back of the robot, with a large amount of tape used to hold the electrical components in place, along with the 12V battery, the rear bump sensors, and the servo-controlled door that was lowered to let the balls over the field wall.<br />
On the right you can see the front of the robot; pictured are the camera, the front bump sensors, the MCM ball collection mechanism, and the computer used to process all the data. In all, the robot came to about 10 lbs (depending on the number of balls) and covered about one square foot in area.<br />
<br />
----<br />
<br />
''Sensor/Actuators'' <br />
<br />
Our mechanical design required an extra drive motor and a servo as actuators in addition to the wheel motors. The drive motor was used to spin the roller and the servo to open the trapdoor, which cost 7 + 5 points. We wanted to follow the wall, for which we used the long-range IR sensors. These were chosen because of the speed at which the robot approached walls: at the longest distance the short-range IRs could read, the robot was already too close and did not have enough time to turn, so it consistently hit the wall. We needed the gyro to turn a set amount reliably while looking for balls, and to turn 180 degrees to score. The camera was obviously needed to identify the balls and yellow walls. The bump sensors were needed to safely line up with the yellow wall, as well as to avoid getting stuck. The front bump sensors, specifically, protected against the robot approaching a wall so closely that the front IR sensor went out of range and returned an incorrect value. In hindsight, it may have been useful to use an optical motion sensor to tell whether we were actually moving, as we tended to get caught without necessarily hitting a bump sensor.<br />
<br />
'''Sensors/Actuators Used and Corresponding Sensor Points''' <br />
<br />
0 pts: 4 bump sensors (two front, two back)<br />
12 pts: 3 long-range IR sensors (one left, one front, one right)<br />
0 pts: 1 camera<br />
0 pts: gyro<br />
7 pts: extra drive motor<br />
5 pts: servo<br />
24 total pts < 30 pt maximum<br />
<br />
'''Software Architecture''' <br />
<br />
The software had three main parts, the behavior (RobotMain), the orc interface (Data), and the image processing (ImageProcessor). <br />
<br />
'''''RobotMain''''' <br />
<br />
RobotMain was the main class that created all classes, initialized the gyro, and started the behavior finite state machine (FSM) when the power button was pushed. The behavior FSM consisted of a Stop, BallGrabber, WallScorer and Navigator state. Each of the states were passed into Data, which allowed them to access the orc board and thus all the sensors and actuators.<br />
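A minimal skeleton of this structure might look like the following Java sketch. The interface shape, the null-to-finish convention, and the class names other than those mentioned above are illustrative assumptions, not the actual source.<br />

```java
// Each behavior state runs one step against the shared Data object and
// returns its successor state (null here means "done").
interface RobotState { RobotState run(Data data); }

class Data { /* orc-board access, ball count, timers ... */ }

public class RobotMain {
    // Drive the behavior FSM; maxSteps is a safety bound for this sketch.
    public static void runFsm(RobotState initial, Data data, int maxSteps) {
        RobotState s = initial;
        for (int i = 0; s != null && i < maxSteps; i++) s = s.run(data);
    }
}
```

Passing Data into every state, as described above, is what lets each state reach the sensors and actuators without global wiring.<br />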
<br />
'''''Stop'''''<br />
<br />
The Stop state was the central state in the state machine and was used to transition between the other states. Upon entering this state, the robot would stop and spin in place a certain number of times, capturing and analyzing a picture after each turn to find balls and walls. It would then transition to the appropriate state according to what the picture revealed. Generally, if it found a ball, it would go into the BallGrabber state; if it found a wall and decided it wanted to score, it went into WallScorer; and if it didn’t find anything after the X number of turns, it would go into Navigator. This state also enabled us to completely decouple our strategy from the rest of the code, as we had a separate function to determine when to score: whenever we found a wall in a picture, we called that function, which returned true if we wanted to score and false otherwise.<br />
<br />
'''''BallGrabber''''' <br />
<br />
When the robot entered the BallGrabber state, it continuously took pictures and processed them with ImageProcessor. The software took the angle error calculated by the ImageProcessor and ran a PID controller, continuing in this loop until the camera no longer saw the ball. At that point, the robot continued forward for one second to ensure that the ball made it into the ball collector, and incremented a ball count kept in Data. If at any time the robot lost sight of the ball, it entered the Stop state again.<br />
<br />
As our strategy was to go as fast as possible, the PID controller worked so that when the robot wanted to move left, the left motor would slow while the right motor continued at its maximum, and vice versa for turning right. Since the camera's field of view is quite narrow, this worked fairly well because the error was never very large. Unfortunately, the controller was not fast enough for a couple of balls that were just at the edge of the camera's line of sight, but this limitation wasn't much of a hindrance. The motors were close enough in speed that using the same gains for both the left and right motors worked well. <br />
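The "slow one side, keep the other at maximum" idea can be sketched with a proportional-only controller (a simplification of a full PID; the gain, max speed, and sign convention here are made-up for illustration):<br />

```java
public class SteerController {
    // P-only sketch of the asymmetric steering described above: the motor on
    // the side we turn toward slows, the other stays at full speed.
    static final double KP = 2.0;   // illustrative gain
    static final int MAX = 255;     // illustrative full-speed command

    // angleError in degrees: negative means the target is to the left.
    // Returns { leftMotorSpeed, rightMotorSpeed }.
    public static int[] drive(double angleError) {
        int correction = (int) Math.min(MAX, Math.abs(angleError) * KP);
        if (angleError < 0) return new int[] { MAX - correction, MAX }; // veer left
        else                return new int[] { MAX, MAX - correction }; // veer right
    }
}
```

Because the camera angle is small, the correction term stays small, so the robot keeps nearly full forward speed while tracking, which is exactly why this scheme suited a speed-first strategy.<br />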
<br />
BallGrabber had a timeout that triggered a subroutine contained in Data; the same subroutine is used in Stop and Navigator when a bump sensor is hit, and essentially backs up and turns. After this subroutine finished, the FSM entered Navigator.<br />
<br />
As our robot’s mechanical design limited its ability to pick up balls next to walls, this state contained a rule: if the robot had attempted to pick up a ball three times and hit a bump sensor every time, the behavior FSM entered the Navigator state and went to look for other balls. It also subtracted three balls from the collected-ball count in Data.<br />
<br />
'''''Navigator ''''' <br />
<br />
The Navigator state is another FSM which contains 8 states, getState, forward, leftTurn, rightTurn, leftPID, rightPID, bumpLeft, and bumpRight.<br />
<br />
''getState'': When the behavior FSM enters the Navigator state, it polls all of the IR sensors. If the front sensor reads less than some minimum, the robot enters leftTurn or rightTurn depending upon which side sensor reads closer. If the left or right IR sensor is within a certain range, it enters leftPID or rightPID correspondingly. If all of the sensors read far away, it goes forward. If either of the bump sensors is pressed, it enters the corresponding state, bumpLeft or bumpRight.<br />
''forward'': This state makes the robot continue going forward until one of the sensors is within range, at which point it enters the appropriate state as described in getState.<br />
<br />
''leftTurn'': In this state the robot continues to turn until the front sensor reads greater than a certain number and the right sensor reads greater than a certain number, at which point it enters rightPID. If a bump sensor is hit, the Navigator FSM will also exit to the appropriate bump state.<br />
<br />
''rightTurn'': This state behaves in the same way as leftTurn only reversed.<br />
<br />
''leftPID'': In this state the robot wall-follows using the left IR sensor. When the computer enters this state, left_dist is set to the IR’s current reading, and the error is then calculated from the current IR reading and left_dist. This controller used similar logic to the BallGrabber, slowing down the appropriate motor. This technique gave quite a bit of leeway in the PID controller and tolerated the noise in the IR sensors. To exit this state, the front IR sensor must read less than a certain minimum, in which case the robot enters a right turn. It will also exit to the turn state if the right sensor returns less than the minimum, and it will always exit if a bump sensor is hit. <br />
<br />
''rightPID'': This state behaves in the same way as leftPID, only reversed.<br />
<br />
''bumpLeft'': This state calls a subroutine contained in Data. Originally the subroutine lived only in Navigator, but it proved useful to other states as well, so it was moved to Data where every main state could access it. It is the same one described in the timeout of BallGrabber: essentially the robot backs up and turns.<br />
<br />
'' bumpRight:'' This state behaves in the same way as bumpLeft, only reversed.<br />
<br />
<br />
All of the individual Navigator states contained a timeout. In the event of a timeout, the state switches to the forward state; if the forward state times out, it switches to a left or right turn. There is also an overall Navigator timeout, which moves the behavior state to Stop, where the robot stops and looks around. <br />
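The getState dispatch above can be condensed into a single Java function. The priority order (bumps first, then front, then sides) matches the description; the threshold values are illustrative placeholders.<br />

```java
public class Navigator {
    enum NavState { FORWARD, LEFT_TURN, RIGHT_TURN, LEFT_PID, RIGHT_PID,
                    BUMP_LEFT, BUMP_RIGHT }

    // Illustrative getState: distances in cm, thresholds are placeholders.
    public static NavState getState(double left, double front, double right,
                                    boolean bumpL, boolean bumpR) {
        if (bumpL) return NavState.BUMP_LEFT;       // bumps take priority
        if (bumpR) return NavState.BUMP_RIGHT;
        if (front < 25)                              // wall ahead: turn away
            return left < right ? NavState.RIGHT_TURN : NavState.LEFT_TURN;
        if (left < 60)  return NavState.LEFT_PID;    // wall in range: follow it
        if (right < 60) return NavState.RIGHT_PID;
        return NavState.FORWARD;                     // nothing near: drive on
    }
}
```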
<br />
'''WallScorer''' <br />
<br />
The WallScorer contains an approach PID controller, a 180 degree turn, a back up, and a stop and drop trapdoor.<br />
<br />
The PID controller is exactly the same as the BallGrabber PID. The error is returned from ImageProcessor in the same units as in BallGrabber, except measured to the center of the wall. This PID controller runs in a while loop until both front bump sensors are pressed. <br />
<br />
When both front bump sensors are pushed, the robot turns 180 degrees as measured by the gyro. There is no controller in this loop; instead, a simple while loop keeps turning as long as the gyro reads less than the value corresponding to 180 degrees. It worked well despite the lack of a controller.<br />
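In sketch form, the open-loop turn is just a busy-wait on the gyro heading. The Gyro and Drive interfaces below are stand-ins for the orc board calls, and the spin speed is a made-up constant.<br />

```java
public class GyroTurn {
    // Stand-in hardware interfaces for this sketch.
    interface Gyro  { double headingDegrees(); }
    interface Drive { void spin(int speed); void stop(); }

    // Open-loop 180-degree turn as described above: spin at constant speed
    // until the integrated gyro heading passes the target; no controller.
    public static void turn180(Gyro gyro, Drive drive) {
        double start = gyro.headingDegrees();
        drive.spin(150);                              // illustrative speed
        while (Math.abs(gyro.headingDegrees() - start) < 180) { /* busy-wait */ }
        drive.stop();
    }
}
```

The obvious weakness of open loop is overshoot proportional to the polling interval, which is why this only works when the loop runs fast relative to the turn rate.<br />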
<br />
After it has turned a sufficient amount, the robot backs up. It continues backing until it either times out, which it did most of the time, or both back sensors hit, which indicated perfect alignment. At this point the robot lowers its trapdoor and waits while the balls fall out. After this, the ball count in Data is set to zero and the main state changes to Stop, where the robot looks around for more balls.<br />
<br />
There are a couple of different timeout features in this state. First, if the robot times out before both of its front sensors are pushed, it assumes that it is stuck and backs up and turns (the subroutine in Data). If the robot thinks it has arrived at the wall (i.e. both bump sensors pressed), it checks whether the amount of yellow visible is large enough to be the wall; otherwise it assumes it is stuck and backs up and turns, using the same subroutine in Data. <br />
<br />
If WallScorer times out in any portion after the first two sensors hit the wall it continues on to the next step.<br />
<br />
It would have been slightly better if, when approaching the wall, the side whose bump sensor hit first spun its wheels slightly backwards while the other side's wheels went full force, squaring the robot against the wall.<br />
<br />
'''Data''' <br />
<br />
The Data class contains all interfacing to the orcboard. It allows all other states to interface with the orcboard through get, set, and update functions for all sensors and actuators. Data also contains global functions and variables that need to be known across all states. Most notable are the avoidBumpLeft() and avoidBumpRight() methods, which simply back up and turn 90 degrees. This turn was based off the gyro, but it didn’t need to be; we did it that way because we already had the capability, and the actual amount turned changed considerably when the battery was low. This class also kept track of time: TimeCompete() was checked in every while loop. This function allowed us to avoid threading. However, if it wasn’t checked in some while loop and, on the off chance, the robot got caught in that loop, it wouldn’t stop. In hindsight, a thread that only checked the time and stopped the main thread when time was up would have been more efficient and reliable. We were concerned about the processing power of the eePC and whether we would notice a change if we introduced another thread, but I do not think this would have been a problem.<br />
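The watchdog thread we wish we had used is only a few lines. This is a sketch of the idea, not code we ran: a daemon timer flips a shared flag once, and the main loops poll that flag instead of recomputing the time everywhere.<br />

```java
public class Watchdog {
    // Shared flag the behavior loops poll; set once when the match is over.
    public final java.util.concurrent.atomic.AtomicBoolean expired =
            new java.util.concurrent.atomic.AtomicBoolean(false);

    // Start a daemon thread that expires after the match duration.
    public void start(long matchMillis) {
        Thread t = new Thread(() -> {
            try { Thread.sleep(matchMillis); } catch (InterruptedException e) { return; }
            expired.set(true);
        });
        t.setDaemon(true);   // never keeps the JVM alive after main exits
        t.start();
    }
}
```

A single sleeping thread costs essentially nothing on an eePC-class machine, and unlike per-loop time checks it cannot be forgotten in one loop.<br />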
<br />
'''ImageProcessor'''<br />
<br />
Our objective was to obtain the maximum amount of information from each picture, even if that meant that it would take longer to analyze them. For example, we wanted to make sure that, if there was a ball in the picture, then it would be recognized, no matter how small it was. Similarly, we aimed at never confusing a yellow goal with a yellow wall, although that meant having a slower image analysis on average.<br />
<br />
The first step was to convert the image to HSV values to simplify deciding the color of each pixel. Being in HSV made it relatively easy, as we only needed upper and lower bounds on each of hue, saturation, and value for each color we considered. We then used one-pass connected component analysis to find the various objects in the images. This basically consisted of finding all the clusters of pixels of the same color, starting at the bottom of the picture and working our way up. Going in this direction allowed us to apply blue-line filtering: the tops of the walls have a blue line, which let us distinguish whether objects were inside the bounds of the field or not, so we ignored any pixels above that blue line. We then had multiple methods to accept or reject the remaining clusters as wanted objects.<br />
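The blue-line filtering step can be illustrated on its own: scan each column from the bottom of the image upward and stop at the first blue pixel. The label encoding below is an assumption for the sketch; our real code worked on HSV-classified pixels inside the connected-component pass.<br />

```java
public class BlueLineFilter {
    // Illustrative pixel labels; a real classifier would produce these
    // from HSV thresholds.
    public static final int OTHER = 0, BLUE = 1;

    // labels[row][col], row 0 = top of image. Returns a mask of pixels
    // below the blue wall-top line (i.e. inside the field) per column.
    public static boolean[][] keepMask(int[][] labels) {
        int h = labels.length, w = labels[0].length;
        boolean[][] keep = new boolean[h][w];
        for (int c = 0; c < w; c++) {
            for (int r = h - 1; r >= 0; r--) {       // bottom of picture upward
                if (labels[r][c] == BLUE) break;      // field boundary reached
                keep[r][c] = true;
            }
        }
        return keep;
    }
}
```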
<br />
For the balls, we first had to find a red or green cluster. To make sure that it was a ball and not simply noise, we could have simply required the cluster to be large. However, that would not have made it possible to see balls that were further away, a feature critical to our overall strategy. We therefore added a test which ensured that the cluster was roughly round by looking at the pixels on its perimeter.<br />
<br />
For the yellow walls, the main task was to distinguish them from goals. A goal has a black hole in the middle with yellow on the sides and above it, and a thin strip of white above the yellow. Therefore, we simply checked that the number of black pixels in the area delimited by the yellow was smaller than some threshold. However, in some images where the goal was at a large angle, the number of black pixels was very small, so we also added a test which checked whether there were white pixels on top of the yellow. With these two tests, we were able to distinguish the two perfectly.<br />
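The two checks combine into a one-line classifier; here is a hedged sketch where the black-fraction threshold is a placeholder, not our tuned value.<br />

```java
public class GoalWallTest {
    // A yellow cluster is a goal if it encloses enough black (the mousehole)
    // or, at steep viewing angles, if a white strip sits above the yellow.
    // The 0.15 threshold is illustrative.
    public static boolean isGoal(double blackFractionInside, boolean whiteAboveYellow) {
        return blackFractionInside > 0.15 || whiteAboveYellow;
    }
}
```

Everything not flagged as a goal is treated as a plain wall, which is the conservative choice when the robot intends to score over walls.<br />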
<br />
Ultimately, the image processor would calculate the angle needed to turn to get to the closest ball and wall in the picture.<br />
<br />
----<br />
<br />
'''Conclusion''' <br />
<br />
In conclusion, we did fairly well and placed third. We had a couple of problems with getting caught, which could have been improved upon. Our major failure mode was the orc board, which we managed to short five different times. One of those times was in competition, which prevented us from competing in the final round.</div>Krandershttps://maslab.mit.edu/2011/wiki/Team_Ten/Final_PaperTeam Ten/Final Paper2011-01-30T21:26:30Z<p>Wmusial: </p>
<hr />
<div>Members:<br />
<br />
Alex Teuffer<br />
Voitek Wojciech Musial<br />
Youyou Ma<br />
Arvin Shahbazi Moghaddam<br />
<br />
<font color="red">Future Maslab participants: scan to the bottom for tips</font><br />
<br />
<b>Overall Strategy/Gameplay Goals</b><br />
<br />
The strategy of our team was to keep the robot as simple as possible. We believe that a simple design, with the fewest things that could go wrong, is best. We based the design of our robot on these principles, but its overall shape was determined by our gameplay goals. <br><br />
Our goal for this competition was to score over the yellow walls and essentially ignore the possibility of scoring in the mouseholes. We decided to pursue this strategy because it let us concentrate on a single high-scoring opportunity: by building a robot adept at throwing balls over the opposing team's wall, we could keep the design relatively simple while still scoring high. <br />
<br />
<b>General Preliminary Timeline - none of it actually followed</b><br />
<br />
Week 1 - Finish preliminary robot design and prototype with some simple camera analysis<br><br />
Week 2 - Perfect ball collection mechanism and mapping algorithms <br><br />
Week 3 - Make it better and more reliable without adding too much complexity to any system. <br><br />
Week 4 - Fail week. Leaves time to resort to old working code and have something functional.<br />
<br />
<b>Design</b><br />
<br />
Our robot consists of an underside guiding system, a middle floor, and a ball 'basket' on the top floor. <br />
<br />
The underside guiding system is essentially two aluminum plates placed at an angle with respect to each other so that they funnel balls passing through a one-way gate, made of lightweight cardboard, into the screw lifting mechanism. This had to be planned with precision to make sure the two plates did not disturb the two wheel motors or the mice, which we used as a substitute for encoders. The gate was made solely of cardboard and aluminum wire; it was very simple, light, and sturdy, and could even capture balls against walls.<br />
<br />
Our robot had two-wheel drive and was further supported by two casters in the front. The motors were held parallel to the ground with zip ties, since the weight of the robot was enough to bend the wheel axes at an angle to the ground. <br />
<br />
The bottom floor, which held our battery, GPU, uorc board, and hard drive, and which also supported the motors, casters, front gate, underside guiding system, top floor, and screw mechanism, was made of simple pegboard. The pegboard was sturdy enough not to bend under the weight of all these parts and had the additional bonus of built-in holes through which we could pass wires.<br />
<br />
The mouse encoders were attached to the pegboard with a spring suspension, which held the mice in place with epoxy glue. <br />
<br />
The screw mechanism was held above the ground by two aluminum supports attached to the back end of the pegboard. This caused us many troubles, because we could not easily adjust the height of the mechanism, which was necessary for getting over bumps in the playing field. In the end, we raised the entire robot by wrapping double-sided tape around the wheels, thereby increasing their radius. <br />
<br />
The screw mechanism consisted of three polyester shafts, one of which held the screw. The two shafts not attached to the screw supported the ball as it was pushed up by the screw until it reached a spring wrapped around the middle shaft. The ball would be pushed through the spring by the screw and would then use the spring essentially as a slide into the top floor 'basket'.<br />
<br />
The top floor basket had an acrylic base on which two aluminum walls funneled the balls into a servo-operated gate, which opened and closed on a hinge made at Edgerton. <br />
<br />
<b>Machining</b><br />
<br />
Our robot was mostly put together in the Maslab 5th floor lab. The Archimedes screw was cut out of a PVC pipe using a saber saw in the Edgerton student machine shop. The lathe there was used to slim down the shaft that held the screw so that it could fit into the gears that turned it to pick up the balls. Besides this, the most intensive part of the machining was the assembly of the hinge that held the 2nd floor servo gate open, which was also done in the Edgerton student shop.<br />
<br />
<b>Software & Strategy</b><br />
<br />
We initially intended to pursue an (overly) ambitious software effort of stereoscopic vision. We used two cameras mounted vertically, one on top of the other. With this choice, the alignment of features across the two cameras becomes straightforward -- they have the same horizontal position. The video streams were captured in C using the OpenCV v4l2 driver. The following algorithms were run on the raw images to obtain a high-level description of the camera scene:<br><br />
- Gaussian blur (implemented as 4 separable passes)<br><br />
- RGB to HSL conversion<br><br />
- convolution with a Sobel kernel <br><br />
- hysteresis thresholding <br><br />
- separation of edges into wall & ball edges depending on edge pixel adjacency<br><br />
- aggregate image statistics (count of pixels falling within predefined hue/sat/lum regions of interest)<br><br />
- RANSAC ball fitting (run on ball edges)<br><br />
- line parametrization (run on blue-tape wall edges)<br />
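The separable-pass trick is what makes the blur cheap: a 2D Gaussian kernel factors into a horizontal and a vertical 1D pass, so an n-tap blur costs about 2n multiplies per pixel instead of n². A minimal Java sketch of the idea (not our CUDA implementation; the 3-tap kernel and clamped borders are illustrative):

```java
public class SeparableBlur {
    // 1D Gaussian kernel; a full 2D blur is this kernel applied
    // horizontally, then vertically. Weights sum to 1.
    static final float[] K = {0.25f, 0.5f, 0.25f};

    static float[][] blur(float[][] img) {
        int h = img.length, w = img[0].length;
        float[][] tmp = new float[h][w], out = new float[h][w];
        for (int y = 0; y < h; y++)            // horizontal pass
            for (int x = 0; x < w; x++)
                for (int k = -1; k <= 1; k++) {
                    int xx = Math.min(w - 1, Math.max(0, x + k)); // clamp at borders
                    tmp[y][x] += K[k + 1] * img[y][xx];
                }
        for (int y = 0; y < h; y++)            // vertical pass
            for (int x = 0; x < w; x++)
                for (int k = -1; k <= 1; k++) {
                    int yy = Math.min(h - 1, Math.max(0, y + k));
                    out[y][x] += K[k + 1] * tmp[yy][x];
                }
        return out;
    }
}
```

On the GPU the two passes also map naturally onto separate kernel launches, which is one reason separable filters are the standard choice in CUDA image pipelines.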
<br />
Because of speed considerations, the above algorithms were implemented using nVidia CUDA and run on a GeForce 9400M GPU. We managed to achieve 12 FPS overall (capture and processing from both cameras).<br />
<br />
Camera calibration proved to be the greatest challenge. The position and orientation of each camera is described by 6 parameters (3 spatial coordinates and 3 angles), so a total of 12 parameters must be accurately measured to reconstruct the absolute position of a feature in 3D. Assuming we only care about the position of 3D features relative to the robot, 6 parameters suffice to describe the position and orientation of one camera relative to the other. We attempted to take camera data of objects whose true 3D position we measured by hand, and used that ground truth to fit the parameters. This approach proved unsuccessful -- the fits did not converge. We then tried a more academic approach outlined here (http://www.peterhillman.org.uk/downloads/whitepapers/calibration.pdf), to no avail. Finally we gave up on accurate camera readings and eyeballed the parameters...<br />
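Once calibrated, the depth reconstruction itself is simple: with the cameras stacked vertically, a feature's vertical disparity d between the two images gives its distance as Z = f*b/d, where f is the focal length in pixels and b the baseline between the cameras. A sketch (all numbers and names illustrative):

```java
public class StereoDepth {
    // Depth from vertical disparity for a vertically stacked camera pair.
    // focalPx: focal length in pixels; baselineM: vertical separation of the
    // cameras in meters; disparityPx: difference in the feature's vertical
    // pixel coordinate between the two images.
    static double depth(double focalPx, double baselineM, double disparityPx) {
        if (disparityPx <= 0) return Double.POSITIVE_INFINITY; // no usable parallax
        return focalPx * baselineM / disparityPx;
    }
}
```

For example, an assumed 600 px focal length, 0.10 m baseline, and 20 px disparity would put a feature about 3 m away. The formula also shows why calibration errors hurt: depth error grows with distance, since disparity shrinks as 1/Z.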
<br />
Due to the limited accuracy of distance reconstruction, random noise, and artifacts of the line fitting algorithm, the features (balls and walls) reconstructed from a pair of camera frames (top and bottom camera) were not reliable enough to be used for robot navigation. I attempted to collect and average features across consecutive capture frames. This worked moderately well as long as the robot sat stationary -- the error in camera reconstruction was distance- and angle-dependent, so any change in robot position and orientation, even if measured accurately with the optical mice, would introduce error into the matching of features across consecutive frames and generally derail our efforts. A possible solution would involve: <br><br />
- analytical modelling of the camera error due to imprecise calibration<br><br />
- brute force error correction: measure error for a grid of points in 3D and correct the position of reconstructed features. <br />
<br />
Unfortunately we ran out of time to successfully pursue the 3D vision approach.<br />
<br />
Having stubbornly tried to make the stereo vision work, we realized 4 days before impounding that we needed a different strategy. We equipped the robot with bump sensors and made it bounce between walls, occasionally looking around for balls using the stereo vision code. We ran out of time to make the code more sophisticated and robust. <br />
<br />
<b>Odometry</b><br />
<br />
We initially attempted to build and use the proposed optical encoders. Much to our dissatisfaction, the circuit turned out to be unreliable and of very poor resolution. We then decided to use two optical mice mounted on spring suspension to ensure constant contact with the floor. The two mice measured the position as well as the angular orientation of the robot very accurately (random error of +-0.02 rad on the angle and +-4 cm in position over a meter of distance covered). The mice needed to be calibrated very accurately, though, and would often get de-calibrated by minute changes in their relative position. Also, in order to read raw mouse data we relied on custom udev rules, which in turn required that we re-plug the mice every time the computer booted. Forgetting about this caveat caused our robot to go out of control during one of the final competition rounds. <br />
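The pose update from two mice works roughly as follows: for a rigid body, the two mouse displacements differ only through the robot's rotation, so their difference gives the angle change and their average gives the motion of the center. A sketch, assuming the mice sit symmetrically about the robot center, separated by a known baseline along the left-right axis (mounting geometry and names are illustrative, not our actual calibration):

```java
public class MouseOdometry {
    // Two mice mounted symmetrically about the robot center, separated by
    // 'baseline' meters along the robot's left-right (x) axis, with both
    // sensors' axes aligned to the robot frame (y = forward).
    final double baseline;
    double x, y, theta; // accumulated robot pose in the world frame

    MouseOdometry(double baseline) { this.baseline = baseline; }

    // dxL,dyL and dxR,dyR: displacements reported by the left and right
    // mouse since the last update, in meters.
    void update(double dxL, double dyL, double dxR, double dyR) {
        // Rigid body: the forward displacements of the two mice differ only
        // through rotation about the center: dyR - dyL = baseline * dTheta.
        double dTheta = (dyR - dyL) / baseline;
        // The center's displacement is the average of the two mice.
        double dx = (dxL + dxR) / 2, dy = (dyL + dyR) / 2;
        // Rotate the body-frame displacement into the world frame.
        x += dx * Math.cos(theta) - dy * Math.sin(theta);
        y += dx * Math.sin(theta) + dy * Math.cos(theta);
        theta += dTheta;
    }
}
```

This also shows why calibration mattered so much: any error in the assumed baseline or mouse alignment scales every angle estimate, and the angle error then corrupts the world-frame integration of position.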
<br />
<br />
<b>Final Run and thoughts:</b><br />
<br />
So despite our robot's inability to score, during the final contest it somehow managed to attempt to score, though there were no balls in it yet. It also worked quite well despite our fears of what might happen. However, halfway through our last round, the batteries died and the computer turned off.<br />
<br />
So thoughts for the future:<br><br />
- don't overcomplicate your strategy. I (the coder) prioritized solving the problem of 3D vision, neglecting the very important aspect of robot behaviour until the very last days. Bad, bad, bad idea. <br><br />
- if your solution involves a custom motherboard, 12 cells of car engine batteries, 100% more cameras than allowed, and you're not using parts that most of the teams seem to use --- YOU'RE DOING SOMETHING WRONG, and you'll suffer hardware problems that no one will help you with. Either take it or die.<br />
- get backup batteries!!<br><br />
- have behavior code first!! Even if it's simple and you plan on doing something different -- have something working at all first, before you attempt a more sophisticated approach. Last thing you want is to write your robot behaviour code during the last week before the competition.<br><br />
- build a quick, sturdy robot in the first week so your coder has something to tinker with. (Try laser cutting; though we didn't do it, it seems quite efficient.)<br><br />
- focus on the main aspects of the robot, get it more or less working, and only then go for the minute details!<br><br />
- try not to take another class / full time job alongside MASLAB... your life will turn into a stream of misery. <br />
- never give up!<br></div>
Maslab10
https://maslab.mit.edu/2011/wiki/Team_Three/Final_Paper
Team Three/Final Paper
2011-01-30T03:38:59Z<p>Cloitre: </p>
<hr />
<div><h3>Overall Strategy</h3><br />
<br><br />
<br />
After analyzing the scoring methods and looking into previous contests, we decided to implement a simple but high-risk strategy: our robot should explore the maze, find and capture balls, then score them over the yellow wall. This means that if for some reason the yellow wall is not found, we have a high probability of losing. As a result, scoring in the goals is considered a backup plan, which means the mechanical design of the robot should be robust enough for both scoring methods. The robot should indiscriminately pick up balls of both colors and store them until it is time to score. <br />
<br />
<br><hr><br><br />
<h3>Mechanical Design and Sensors</h3><br />
<br><br />
<br />
Many interesting mechanical designs were discussed, including a catapult, elevator lift, fork lift, four-bar linkage, and spinning wheel. We wanted to make a robot that had not been made before in MASLAB, was simple to construct, and was fun to watch. Thus the idea of a waterwheel feeding a waterpark-style slide came into being. To further simplify the design, a rotating arm controlled by a servo replaced the waterwheel. <br />
<br />
The final design, sensor placement, and work flow of the robot are as follows:<br />
<br />
1) To accommodate all the necessary components, three horizontal layers are built. The bottom layer contains the battery, wheels and motors. The second layer is for the Eee PC. The top layer is used to mount the uOrc board and the slide. A circular front face connects all three layers and is used to capture/guide balls. <br />
<br />
2) The robot, with two-wheel drive in the middle, explores the contest area. Two caster wheels, one in the back and one in the front, provide additional balance. The caster wheels are of different heights to help the robot overcome bumps on the carpet.<br />
<br />
3) With a long-range IR sensor mounted on the front face of the robot and two short-range IR sensors mounted diagonally on the sides, the robot can perform functions such as wall following, getting out of a large room through a small door, etc.<br />
<br />
4) A belt of bump sensors is mounted on the bottom layer to help the robot avoid walls. <br />
<br />
5) When the robot sees a ball with the camera mounted on its front face, it drives toward it. The break-beam sensor near the opening on the front face lets the robot know when a ball has entered its mouth. The arm is then triggered to scoop the ball up and dump it into the slide. The ball rolls down the slide until it comes to a stop at the exit/drawbridge. <br />
<br />
6) The camera then finds the yellow wall, and the robot drives toward it. Once both bump sensors on the front face are triggered, a servo lets the drawbridge down, allowing the balls to roll out under gravity.<br />
<br />
7) The bumper in the front is made into a mustache, and the exit/drawbridge into a monocle. Finally with the addition of a black top hat, Monsieur Robot is complete.<br />
<br />
<br><hr><br><br />
<h3>Building the robot</h3><br />
<br><br />
<br />
In order to achieve all the objectives we set in our first brainstorming session, we CADed the robot to make sure every component would fit in Monsieur Robot. The software we used was SolidWorks. It allows you to create parts and assemble them. More importantly, it can produce .dwg files that are compatible with a laser cutter. Laser cutting acrylic sheets was then the obvious solution for its convenience.<br />
<br />
Pros and cons of the acrylic sheet technology:<br />
<br />
Pros:<br />
<br />
<ul><br />
<li> You can create complex shapes; the laser cutter can deal with them and provides a fair tolerance on the dimensions.<br />
<li> It is easy to drill and tap, even into the edge of a sheet. We used 4-40 screws in a 1/4-inch-thick sheet.<br />
<li> It's cheap. One 36x24-inch sheet costs around 30 dollars.<br />
</ul><br />
<br />
Cons:<br />
<br />
<ul><br />
<li> It is brittle. You need to be careful about the load you apply to it, particularly when the load is carried by a screw fixed in the edge of a sheet.<br />
<li> This technology makes it easy to create shapes in 2D but not in 3D. To create the ramp for ball storage, we had to assemble 10 parts.<br />
</ul><br />
<br />
<br />
<br><hr><br><br />
<h3>Software Design</h3><br />
<br><br />
<br />
The brains of Monsieur Robot were developed in Java over 4 weeks. After settling on the overall robot design and strategy, it was time to get started writing the software. In the end, an intelligence was developed that, although not quite self-aware, still managed to maneuver itself around the field. <br />
<br />
From the beginning, we decided to make Monsieur a state-based machine. This seemed the easiest to program and the most efficient method for collecting and scoring balls. However, to gain an edge, we knew that transitions between the states had to be very strategic, often exiting a state before its conclusion. Come the final competition, our robot had three distinct states:<br />
<br />
<ul><br />
<li>Exploring: Consists of two sub-states, StraightExplore and SpinExplore. In StraightExplore, the robot moves forward, attempting to keep its original angle but avoiding walls. In this way, it will go straight, but will also wall follow if it comes in contact with one. In SpinExplore, the robot turns 2*Math.PI, using its long-range IR sensors to find the most open direction. It combines this with knowledge of its original direction to choose a direction in which to proceed exploring. These exploring sub-states alternate until the robot finds something of interest. If at any point during these two sub-states an object of interest is found, the state changes to its corresponding action. </li><br />
<li>CollectBall: If the robot is not full of balls and is not only looking for walls (last 20 seconds), then upon seeing a ball the robot will change into the CollectBall state. During this state the robot uses a dual PID system to move towards the ball (angle and distance control). At the point where the ball is too close to be seen, the robot moves forward blindly until the ball triggers its breakbeam sensor. Then the robot actuates its lift arm to store the ball in its ramp hopper. Upon completion the robot returns to exploring state.</li><br />
<li>ScoreWall: If the robot believes it has collected a ball and it sees a yellow wall, it enters the ScoreWall state. This state uses dual PID to move towards the center of the yellow wall as seen by the camera. Once the yellow wall has reached a certain height and width in the camera frame (i.e. it is close and wide enough), the robot charges forward. Using its 2 front bump sensors to align itself, the robot then opens its ramp for a hard-coded amount of time, allowing the balls to fall on the opponent's side. It then taunts the opponent for luck.</li><br />
</ul><br />
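The transition logic between the three states above can be boiled down to a small function. A hedged sketch with illustrative sensor flags (not the actual Monsieur Robot code; the endgame flag models the "last 20 seconds, walls only" rule):

```java
public class BehaviorLoop {
    enum State { EXPLORE, COLLECT_BALL, SCORE_WALL }

    // One tick of the state machine. Flag names are illustrative:
    // endgame = final seconds when only scoring walls are sought.
    static State next(State s, boolean seesBall, boolean seesYellowWall,
                      boolean hasBalls, boolean full, boolean endgame) {
        switch (s) {
            case EXPLORE:
                if (seesBall && !full && !endgame) return State.COLLECT_BALL;
                if (seesYellowWall && hasBalls)    return State.SCORE_WALL;
                return State.EXPLORE;
            case COLLECT_BALL:
                return State.EXPLORE; // after the lift arm cycles, resume exploring
            case SCORE_WALL:
                return State.EXPLORE; // after the drawbridge opens, resume exploring
        }
        return State.EXPLORE;
    }
}
```

The strategic early exits described above correspond to the extra flags here: a full hopper or the endgame suppresses ball chasing even when a ball is visible.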
<br />
Of course, these states only represent high-level behavior. Behind the scenes, PIDController, VisionHandler, and Timer do all of the dirty work. <br />
<br />
<ul><br />
<li>PIDController runs in a separate thread in order to maintain smooth movement alongside behavior code and camera processing. It is activated by requesting a turn(angle) or a straightMove(distance). These can be combined to move in a curve. None of its methods are blocking, but programs can wait until it reaches its angle or distance thresholds by checking an isRunning() method. It uses the system time to calculate integral and derivative functions and implements optional low-level wall following using the IR sensors. It never directly interfaces with the camera -- the behavior code always passes camera coordinates to the PIDController. </li><br />
<li>VisionHandler runs un-threaded, only capturing on demand. VisionHandler has a getObjects() method that returns all the objects (type, coordinate, and shape info) in a List. Color recognition was implemented with hard-coded HSV ranges. Auto white balance and exposure were disabled for consistent color values. Objects were found using a recursive solid-color area function. These were then typed as wall-tops, balls, or yellow walls for the behavior code to use in decision-making. Typing uses shape proportions (height, width), shape area, and density (points/area). Shapes with sufficiently small areas, shapes above the blue wall line, and shapes within goals are filtered out.</li><br />
<li>Timer handles keeping track of the game time and killing the JVM when time is up, bringing the robot to a stop. The behavior code also uses Timer's getTimeRemaining() method to make strategic decisions.</li><br />
</ul><br />
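The heart of such a PIDController is a time-stamped update step. A minimal sketch of a system-time-based PID term computation (gains and structure are illustrative; the real class adds threading, turn/straightMove targets, wall following, and isRunning()):

```java
public class Pid {
    final double kp, ki, kd;
    double integral, prevError;
    long prevTimeNs;
    boolean first = true;

    Pid(double kp, double ki, double kd) { this.kp = kp; this.ki = ki; this.kd = kd; }

    // error: setpoint minus measurement. nowNs: System.nanoTime(), matching
    // the description of using system time for the integral and derivative.
    double update(double error, long nowNs) {
        double dt = first ? 0 : (nowNs - prevTimeNs) / 1e9; // seconds
        integral += error * dt;
        double derivative = (first || dt == 0) ? 0 : (error - prevError) / dt;
        prevError = error;
        prevTimeNs = nowNs;
        first = false;
        return kp * error + ki * integral + kd * derivative;
    }
}
```

Using wall-clock time rather than a fixed loop period keeps the I and D terms correct even when camera processing makes iteration times uneven, which is presumably why the original ran in its own thread.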
<br />
Many test methods were developed to observe individual actions, object detection, and PID control performance. Code for goal-scoring and barcode detection was partially developed but later abandoned to hone basic functionality. Java's audio package allowed us to taunt our opponent with clips from Monty Python and the Holy Grail, which was paramount in winning the audience's favor.<br />
<br />
<br />
<br />
<br><hr><br><br />
<h3>Suggestions to Future Teams</h3><br />
<br><br />
<br />
<br />
<ul><br />
<li>Mechanical: Design for bumps early on -- they can have drastic effects on your design.</li><br />
<li>Software & Electrical: Make sure fundamental features are adequate before moving on to more complex behaviors. For example, make sure the PID controller works very well before trying to use it to follow walls. Test all fundamental features extensively before moving on to complex behaviors. Spend time making strong cables and organizing wires well -- it will save you time later on. Read analog sensor outputs and write them down somewhere for later reference. Strategically place your camera and tell your Mech E's the required position -- some heights and angles are better than others. Test in all sorts of lighting conditions -- many things can change lighting during the competition. Make use of Java's audio library, not only for taunting, but also for debugging purposes. It is often easier to understand than LEDs if the robot simply says: "exploring" or "collecting ball". </li><br />
</ul></div>
Jamesw
https://maslab.mit.edu/2011/wiki/Team_Two/Final_Paper
Team Two/Final Paper
2011-01-26T01:28:08Z<p>Lbarnes: /* Electric Issues */</p>
<hr />
<div>[[Image: Putzputzbanner.png]]<br />
<br />
==Overview==<br />
[[File:front.jpg|thumb||right|Little putzputz won 1st place!]]<br />
[[File:finalshot.jpg|thumb||right|100px||CAD model finalized before IAP]]<br />
<br />
Putzputz is the result of 3 weeks of Asian parenting. We pushed hard for her to learn to explore her world and redesign it as she sees fit. Early on, she showed a clear aptitude for fetching balls, and we worked hard to teach her to face all her challenges head on. Aside from her occasional temper tantrums, little Putzputz has grown up so quickly and has made us very proud.<br />
<br />
In 26 days, as a team of 4 undergraduate engineering students, we designed, built, programmed, and relentlessly tested a fully autonomous robot that was capable of robustly finding balls and scoring them over walls. Our strategy was simple: go fast, score balls if you can, find and pick up balls if you can. Nothing better to do? Wander around until you see something of interest. Don't ever get stuck, don't ever jam. Random erratic behavior is better than being stuck. <br />
<br />
Mechanically, our robot had a circular footprint, which helped it maneuver, and a compact design that allowed for a flexible yet robust platform for the sensors and software. Every sensor we selected played an important role in the operation of our robot; nothing was extraneous. Software-wise, our robot was driven almost entirely by vision with layers and layers of behaviors and redundant checks to ensure she continued to run in any situation.<br />
<br />
The following details our mechanical, electrical, and software design choices, along with our testing framework, issues we came across, and our tips for future teams. We also want to give a huge thank you to the MASLAB staff for a wonderful adventure!<br />
<br />
<br />
__TOC__<br />
<br />
==Team Members==<br />
<br />
*Leighton Barnes - Course 18, 2013 - Focused on sensor design and electrical work. Instrumental in debugging robot in all disciplines.<br />
*Cathy Wu - Course 6-2, 2012 - Focused on major software components: vision, testing suite, multi-threading, ball collection and scoring behavior. Managed the team and made sure things got done. <br />
*Stanislav Nikolov - Course 6-2, 2011 - Focused on major software components: overall architecture, wall following, control, and stuck detection.<br />
*Dan Fourie - Course 2, 2012 - Focused on mechanical design. Got things done extremely quickly.<br />
<br />
==Mechanical Design==<br />
<br />
Our robot was designed for robustness and reliability. The robot serves as a reliable platform for the vision and control software systems. As such, it should be sturdy, constructed quickly, have extremely low mechanical failure rates, be able to withstand hours of testing, and be robust to positioning errors. The robot was designed with CAD to account for all components, to ensure optimum packing, and to facilitate fabrication with the laser cutter. Our team was fortunate enough to have 24 hour access to a laser cutter and waterjet, which made rapid assembly and adjustments possible. <br />
The robot's structural members were built primarily from 1/4" acrylic sheet. It utilizes a rubber band roller powered by a DC motor to collect balls and a 4-bar linkage hopper actuated by another DC motor to get balls over the wall. Geared DC motors drive no-slip wheels. The robot underwent brutal testing and survived severe battering valiantly.<br />
<br />
===Drive System===<br />
<br />
The high level design of the robot's drive system consists of three structural boxes secured together in a line. The boxes are incorporated within a 14" circle to ease navigation. The two outside boxes contain the direct-drive motors, which are mounted to aluminum plates for strength. The central box forms the majority of the rest of the robot's structure and primarily contains the hopper. The three boxes are fastened together with steel brackets (to leverage the powers of the laser cutter and to avoid excessive tapping) and locknuts (to ensure the final assembly did not disassemble).<br />
<br />
Toothed no-slip wheels were chosen to minimize slipping on the playing field carpet. This choice proved effective both in increasing the speed of the robot and in stalling the drive motors to provide current feedback for stuck detection. The wheels were not perfectly no-slip, however, and did not stall in all cases, even though the initial design had counted on stalling to obviate the need for bump sensors. The wheels were cut from 1/8" aluminum plate on an abrasive waterjet machine.<br />
<br />
Steel hubs were precision machined to provide a stiff, reliable coupling between the motor shafts and the wheels. They also allowed the wheels to be placed as close as possible to the motor in order to decrease bending torque on the gear boxes.<br />
<br />
===Electronics Mounting===<br />
<br />
Electronics were mounted to the robot with the goals of rigidity, interchangeability, and adjustability where necessary.<br />
<br />
The EeePC was completely disassembled in order to determine the best way to securely mount it to the robot. It was decided to remove the extraneous monitor and keyboard, but to retain the hard, white motherboard shell to protect the sensitive components. While other teams utilized tape or Velcro, our netbook is bolted to an acrylic plate and shock mounted (with foam padding) to an angled back plate. <br />
<br />
In addition, the Orc Board was secured to its own acrylic plate and provided with a protective cover to ward off balls possibly fired from the other side.<br />
<br />
The webcam was removed from its plastic housing and the PCB was potted in epoxy and attached to an acrylic backing plate. These adjustments saved an enormous amount of space and allowed the camera to be positioned in the ideal location on the robot. The camera angle was also adjustable which proved valuable in eliminating the need for blue line filtering.<br />
<br />
The bump sensor suite covering the front 160deg of the robot was an addition to the initial design. The need for immediate and precise digital feedback about the robot's surroundings became clear after initial testing showed that good obstacle avoidance using IR sensors was difficult to achieve. Each of the five bump sensors is made from a strip of spring steel and a small snap-action switch. The extended levers created by the strips provide a larger area of contact and also protect the switches themselves from damage. In addition to bump detection, the left and right bump sensors aid in aligning with a wall.<br />
<br />
A tiny limit switch is triggered at both the up and down limits of the hopper mechanism to signal the motor to stop.<br />
<br />
===Scoring Mechanism===<br />
<br />
Our scoring mechanism was designed to lift balls from low in the robot to high and well beyond the yellow wall as efficiently and smoothly as possible. 4-bar synthesis was used to generate a linkage that would move the hopper from a tilted-back low position to a tilted-forward position over the wall. The leading edge of the hopper extended more than an inch above and three inches beyond the top edge of the wall. This large tolerance in scoring position proved invaluable in getting balls over the wall from less-than-ideal orientations. The hopper's all-metal parts provided durability and compact construction. <br />
<br />
Having arrived at this mechanism, and constrained by footprint and form limitations, the rest of the components fell in place around it. <br />
<br />
The tried and true rubber band roller was used for picking up balls. <br />
<br />
<br />
<!--<br />
<gallery><br />
File:Mech1_s.jpg|stage 1<br />
File:mech2_s.jpg|stage 2<br />
File:mech3_s.jpg|stage 3<br />
</gallery><br />
--><br />
<br />
<br />
[[File:Mech1_s.jpg|thumb|none|574px||stage 1]]<br />
[[File:mech2_s.jpg|thumb|none|574px||stage 2]]<br />
[[File:mech3_s.jpg|thumb|none|574px||stage 3]]<br />
<br />
==Electrical Design and Sensors==<br />
<br />
<br />
===Motor Controllers===<br />
<br />
Because our robot design required four motors (2 drive motors, 1 to pick up balls, and 1 to score them) and our Orc Board only features three H-bridges, we had to design an additional circuit to control the last motor. The motor that drives the roller in the front to pick up balls only had to go in one direction, so we chose it as the one to be driven by this additional controller.<br />
<br />
Our first attempt at this additional controller was just a 40N10 power FET whose gate was driven by the digital out of the Orc Board (with a protection diode across the motor, of course). As we learned with this first attempt, the digital out of the Orc Board is somewhere around 3.7V instead of the nominal 5V, which could barely overcome the 2-4V gate threshold voltage of the FET (or of any other power FET we had on hand). Instead of spending the time to build a gate driver to get around this problem, we tried an L298 H-bridge package instead. This worked with the logic-level voltage provided by the Orc Board, although we stuck to one-directional capability in favor of using the standard four protection diodes instead of one.<br />
<br />
===Batteries===<br />
Throughout the build period Dan and Leighton continued to investigate different battery options and even constructed multiple kinds of battery packs. The input from previous teams' papers suggested that a high-voltage (18V or so) NiCd pack from a cordless power tool was the ideal battery. This type of pack was lighter and had a much higher power density than the standard lead-acid pack. In past years, the increased voltage and power density such a pack offers would have been a huge plus, but this year we were given adequately powerful drive motors even when driven at the standard 12V. We found that our NiCd packs ran down much too quickly with their 1.7Ah, while the standard lead-acid pack, rated at 8.7Ah, could last through hours of testing on end. We also briefly tested a pack of four 3.3V A123 cells, which seemed like it could have been the perfect choice: rated at 2.2Ah, the lightest of them all, and able to dump as much power as we wanted on demand. But it was a pain to charge -- we had access to a charger, but not one we could take with us anywhere.<br />
<br />
===Sensor Choice===<br />
<br />
'''Bump Sensors'''<br />
<br />
In the end our bot had 5 bump sensors in an arc across its front. We had originally planned for only two in the front to help align while scoring over the wall, but we realized late in the game that bump sensors are an effective tool for dealing with any obstacle. They are practically free, and there is no reason any bot shouldn't be covered with them.<br />
<br />
'''Break-Beam Sensor'''<br />
<br />
We implemented a break-beam sensor just beneath the roller such that we could detect when we picked up a ball. The sensor was just an IR LED on one side of the bot and a phototransistor in series with a 1Mohm resistor on the other side. We then measured the voltage across the 1Mohm resistor with an analog in on the Orc Board and compared it to some threshold in software. If we had bothered to tune the resistor such that the signal would read approximately 0 or 5V depending on whether the beam was broken or not we could have just as easily used a digital in port.<br />
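In software the detection reduces to a threshold compare on that analog reading. A tiny sketch (the 2.5 V threshold is an assumed value; in practice it would be tuned empirically):

```java
public class BreakBeam {
    // Voltage measured across the 1 Mohm resistor in series with the
    // phototransistor: high while the IR beam illuminates the transistor,
    // low when a ball blocks the beam. Threshold is illustrative.
    static final double THRESHOLD_V = 2.5;

    static boolean ballPresent(double resistorVolts) {
        // beam broken -> transistor off -> little current -> low voltage
        return resistorVolts < THRESHOLD_V;
    }
}
```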
<br />
'''Encoders'''<br />
<br />
The encoders that MASLAB gives you are unreliable and low resolution. We deliberated for some time on how to replace them. Good, high-res optical encoders can easily cost $35 each, which we were unwilling to spend. We ended up using little break-beam packages as gear-tooth sensors on our wheels. While this theoretically gave us 120 ticks/revolution and the sensor responded quickly and accurately, there were a couple of problems. First, there was no quadrature encoding, so we were forced to assume that the wheels were going the way we commanded them to go. The biggest problem, though, was that with so many threads running, the software didn't sample the signal fast enough to catch every tick. In the end, we didn't really end up using our encoders.<br />
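The sampling problem is easy to quantify: to catch every transition of a polled single-channel encoder you need at least two samples per tick period, and at 120 ticks/revolution even modest wheel speeds push the required polling rate into the hundreds of hertz. A back-of-the-envelope sketch (the wheel speed below is an assumed figure, not a measurement):

```java
public class EncoderMath {
    // Minimum polling frequency (Hz) needed to catch every tick of a
    // polled single-channel encoder: at least two samples per tick.
    static double minPollHz(double ticksPerRev, double wheelRevPerSec) {
        return 2.0 * ticksPerRev * wheelRevPerSec;
    }
}
```

At an assumed 3 rev/s, 120 ticks/rev already demands 720 Hz polling, which explains why a Java thread sharing the CPU with vision code dropped ticks.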
<br />
'''IR Range-Finders'''<br />
<br />
We used 3 long-range IR sensors in an arc across the front and one short-range IR sensor on each side. The idea was to detect obstacles from far away but still have accurate short-range readings for wall-following. The short-range sensors were much easier to deal with, as there is no noticeable dead zone at short distances and out-of-range readings can be easily filtered out.<br />
<br />
==Software Design==<br />
<br />
===Overview===<br />
<br />
Our software architecture emphasized simplicity and modularity. For the operation of our robot, we used a simple state machine that was mainly driven by vision, with an emphasis on speed. Within each state, we also performed stuck detection and took additional actions if bump sensors were triggered. <br />
<br />
We wrote classes that abstracted out each and every type of sensor we used and we forked a thread for each type to record and process readings. During a run, there are about 10 threads running. <br />
<br />
On top of abstracting out sensors, we also abstracted out everything else, including images, color statistics, and the on buttons, and had a function for just about everything. This is perhaps excessive, and in the end we had over 9000 lines of code, but it also came in useful again and again. During our numerous testing sessions, we were able to easily fix most issues because all the needed functions were already available.<br />
<br />
In addition, instead of trying to predict all kinds of situations our robot could be in, we interspersed our code base with the use of randomness and heuristics. For example, if we don't know whether to turn left or right, we will sometimes randomly generate a direction. If we don't know how much we've turned since the last iteration through a loop, we will make a reasonable guess. <br />
<br />
===State Machine and Robot Behaviors===<br />
<br />
We used a simple state machine design that heavily relied on vision. By default, the robot spins in place scanning the surroundings for balls or scoring walls. Detecting respective objects allows the robot to transition into its ball fetching or scoring behaviors. A timeout into a wall following behavior allows us to roam into new regions to find more objects. All behaviors default to scanning for objects. <br />
<br />
Our behavior for obtaining a ball involves lining up to the ball, getting closer to the ball, and then charging it for a short duration. We charge to make up for the complete lack of information when we are too close to a ball for the camera to be useful. This has worked well for us, since it is fairly accurate and also captures balls quickly. <br />
<br />
Our behavior for scoring involves lining up to a scoring wall, moving towards the scoring wall until the appropriate bump sensors trigger, extending the hopper to dump the balls, and retracting the hopper. We stop the roller when moving the hopper to prevent balls from getting stuck underneath the hopper. The bump sensors sometimes take several tries to trigger properly, and we often have problems with the robot thinking it is no longer at a scoring wall because the camera is so close that it can only see the blue tape on the wall.<br />
<br />
===Time and Ball Count===<br />
<br />
We leveraged time and ball-count information to improve robot performance. In the first 30 seconds of a round, our robot does not attempt to score, so that it can collect the easy balls. When the hopper is full of balls, the robot stops looking for balls and focuses instead on scoring. In addition, each ball that the robot obtains allows it to wall follow for more time. The idea is that with fewer balls on the field, the robot should be given more time to explore in order to increase its likelihood of finding new things.<br />
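These rules are simple enough to sketch directly; the constants below are illustrative stand-ins, not our tuned values:

```java
public class Strategy {
    // No scoring during the first 30 seconds: collect the easy balls first.
    static boolean mayScore(long elapsedMs) {
        return elapsedMs > 30_000;
    }

    // Stop chasing balls once the hopper is full.
    static boolean mayCollect(int ballsHeld, int hopperCapacity) {
        return ballsHeld < hopperCapacity;
    }

    // Each collected ball buys more wall-following (exploration) time,
    // since the remaining balls get scarcer as the round goes on.
    static long wallFollowMs(int ballsCollected) {
        long baseMs = 5_000, perBallMs = 2_000; // assumed constants
        return baseMs + perBallMs * ballsCollected;
    }
}
```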
<br />
===Vision===<br />
<br />
At a high level, we forked a thread for the camera that continuously takes an image, processes the image, saves statistics about various parts of the image, occasionally publishes the image to BotClient, and repeats. We worked with images in the HSV color space and focused on speed instead of accuracy or detail. That is, we attempted to process as many images as possible and actually did as little processing on each image as we could. The processing steps we took were down sampling 3x, converting from RGB to HSV, and generating statistics on the various colors in the image. Although we implemented many of the fancier image processing techniques (e.g. smoothing, edge detection, connected component labeling), we decided that the higher quality information was not worth slowing down our image processing. Instead, we focused our vision efforts on preprocessing, multi-threading, and various other performance optimizations. In the end, our camera thread was processing images at between 14 and 31 frames per second, depending on how much color there is in an image. (Disclaimer: The staff claims that this rate may not be accurate.)<br />
<br />
To avoid converting each image from RGB to HSV with the slow conversion algorithm provided by the MASLAB API, we allocate a 256x256x256 array at the start of each run that maps every RGB combination to its HSV value. Each image is then converted to HSV using this lookup table, which cuts the conversion time roughly in half. Allocating the array takes less than 4 seconds; it is created by reading a serialized form of the array from disk. <br />
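A sketch of the lookup-table idea. For self-containment this version computes the table directly rather than deserializing it from disk as we did, and the HSV packing scheme and conversion formula here are generic, not the MASLAB API's: <br />

```java
// Precompute HSV for every RGB triple once, then convert each pixel with a
// single array read. The table is 256^3 ints (~64 MB), allocated once.
class HsvLut {
    final int[] table = new int[256 * 256 * 256];

    HsvLut() {
        for (int r = 0; r < 256; r++)
            for (int g = 0; g < 256; g++)
                for (int b = 0; b < 256; b++)
                    table[(r << 16) | (g << 8) | b] = rgbToHsv(r, g, b);
    }

    int lookup(int rgb) { return table[rgb & 0xFFFFFF]; }

    // Generic integer RGB->HSV; packs h, s, v (each 0..255) into one int.
    static int rgbToHsv(int r, int g, int b) {
        int max = Math.max(r, Math.max(g, b)), min = Math.min(r, Math.min(g, b));
        int v = max, delta = max - min;
        int s = (max == 0) ? 0 : 255 * delta / max;
        int h;
        if (delta == 0) h = 0;
        else if (max == r) h = (43 * (g - b) / delta + 256) % 256;
        else if (max == g) h = 85 + 43 * (b - r) / delta;
        else h = 171 + 43 * (r - g) / delta;
        return (h << 16) | (s << 8) | v;
    }
}
```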
<br />
Mapping from the HSV color space to a color happens in a separate stage with the use of different HSV thresholds for each color. The two sets of color mappings are separate since the thresholds for each color could be different from day to day. The colors we handled were red, green, yellow, and black. Due to our camera placement and angle, we avoided the need to handle blue.<br />
<br />
To determine the thresholds for each individual color, we wrote a user-friendly color calibration utility that we used to adjust to different lighting situations. We place the robot in front of something of the color we want to calibrate (e.g. a yellow wall), start the utility, select the color (e.g. yellow), wait a few seconds, and check BotClient to see if we like the new calibration. The idea is very simple. After studying some images, we found that hue is resilient to lighting changes; it is saturation and value that change. Therefore, we pre-determined the hue values for each color based on a sample of images. The utility then takes an image and makes two passes through it. First, it collects all pixels within the hue thresholds. In the second pass, it uses connected component labeling and generates statistics on the largest component of the appropriate color to determine reasonable thresholds for saturation and value. Finally, the utility uses the new thresholds to process additional images so that we can move the robot around and evaluate the calibration. The utility did not handle black, but the thresholds for black were fairly straightforward.<br />
<br />
To reduce the number of pixels to process per image, we down sampled 3x. With fewer pixels to work with and less resilience to noise, down sampling really pushed our color calibration utility to its limits. 3x is probably the limit for our image processing code. If we did more filtering, more down sampling could be feasible. <br />
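The downsampling step itself is simple: keep every third pixel in each dimension, cutting the per-frame work roughly 9x. A minimal sketch (the array-of-rows representation is illustrative): <br />

```java
// 3x downsample by striding: out[y][x] = in[3y][3x].
class Downsample {
    static int[][] by3(int[][] pixels) {
        int h = (pixels.length + 2) / 3;       // ceil division
        int w = (pixels[0].length + 2) / 3;
        int[][] out = new int[h][w];
        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++)
                out[y][x] = pixels[3 * y][3 * x];
        return out;
    }
}
```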
<br />
Yellow occurs in two places on the field: on yellow scoring walls and along scoring goals. Both use the same yellow; the former is favorable to us, but we want to avoid goals. Our solution is very simple. We observed that the inside of a goal appears black, while there is usually very little black along scoring walls. Thus, we only approach yellow that is not accompanied by black. One complication is that when the staff decided to re-introduce bar codes into the field, the black on a bar code could sometimes make a scoring wall look like a goal to the robot. The upsides of our solution are its simplicity and that it prevents the robot from approaching goals from far away thinking they are scoring walls. The downside is that goal detection sometimes produces false positives, as mentioned above. Using a ratio of black to yellow or using connected component labeling could make goal detection more robust, but we decided to sacrifice accuracy for less processing.<br />
<br />
To reduce unnecessary processing, we publish an image to BotClient only once per 1.5 seconds. This has several benefits. First, the overhead of publishing to BotClient decreases. Additionally, we draw over images so that the audience (or the engineer) can tell what the robot sees, but this drawing is actually pretty slow. Having to manipulate images only once in a while cut the average image processing time by 72 to 109 msec per image.<br />
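The throttling logic amounts to a timestamp check. The 1.5 s interval is from the text; the class and method names are illustrative: <br />

```java
// Publish a debug image only if INTERVAL_MS has elapsed since the last one.
class PublishThrottle {
    static final long INTERVAL_MS = 1500;
    long lastPublish = -INTERVAL_MS; // so the very first check passes

    boolean shouldPublish(long nowMs) {
        if (nowMs - lastPublish >= INTERVAL_MS) {
            lastPublish = nowMs;
            return true;
        }
        return false;
    }
}
```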
<br />
===Control===<br />
<br />
We used PID position control for aligning with balls and P position control for wall following. We also used open-loop velocity control and abstracted out directly setting PWMs in two ways. The first abstraction was a drive method that takes a particular direction. The second abstraction was a setVelocity method that takes a forward velocity in m/s and a rotational velocity in rad/s. For the second abstraction, we came up with an approximate piece-wise linear model relating wheel velocity and PWM so that we could deal with velocities in a more user-friendly way.<br />
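A sketch of how such a piecewise-linear velocity-to-PWM model might look. The breakpoints below are invented for illustration; ours were fit empirically: <br />

```java
// Map a desired wheel velocity (m/s) to a PWM duty cycle by linear
// interpolation between measured (velocity, PWM) breakpoints.
class VelocityModel {
    // Assumed monotonic breakpoints; not our measured values.
    static final double[] VEL = {0.0, 0.2, 0.5, 1.0};
    static final double[] PWM = {0.0, 0.25, 0.5, 1.0};

    static double pwmFor(double v) {
        if (v <= VEL[0]) return PWM[0];
        for (int i = 1; i < VEL.length; i++) {
            if (v <= VEL[i]) {
                double t = (v - VEL[i - 1]) / (VEL[i] - VEL[i - 1]);
                return PWM[i - 1] + t * (PWM[i] - PWM[i - 1]);
            }
        }
        return PWM[PWM.length - 1]; // clamp above the last breakpoint
    }
}
```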
<br />
We considered doing closed-loop velocity control using wheel encoders, but we found that open-loop control was good enough to carry out our behaviors, and that it was tricky to estimate tick rates from somewhat noisy encoders. We experimented instead with scaling the PWMs supplied to each wheel to get the robot to drive slightly straighter, since it would veer off to one side.<br />
<br />
===Wall following===<br />
<br />
====Overview====<br />
We used a proportional controller to stay at a fixed distance from, and roughly parallel to, the wall. Each side of the robot had an IR sensor perpendicular to the wall and an IR sensor at roughly 45 degrees to the wall. If one of the side sensors was in range of a wall, we started wall following. This let us pick up walls from far away and stay farther from the wall while following it, which gave us more opportunities to see balls and walls. We set a desired distance for each of the two IR sensors. The robot moves forward at a constant velocity, and each in-range IR proposes a rotational velocity obtained by multiplying a gain by the difference between the actual and desired distances. The proposed rotational velocities are then averaged, and the average is commanded to the motors. We did not explicitly calculate the angle to the wall and try to remain parallel; the distance control implicitly took care of that.<br />
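A minimal sketch of the averaging controller, assuming out-of-range sensors report NaN. The gain and setpoints are illustrative, not our tuned values: <br />

```java
// Each in-range IR proposes a rotational velocity proportional to its
// distance error; the proposals are averaged into one command.
class WallFollower {
    static final double GAIN = 2.0;          // assumed proportional gain
    static final double DESIRED_SIDE = 0.20; // m, perpendicular IR setpoint
    static final double DESIRED_DIAG = 0.30; // m, 45-degree IR setpoint

    // side/diag are calibrated distances in meters; NaN means out of range.
    static double rotVel(double side, double diag) {
        double sum = 0;
        int n = 0;
        if (!Double.isNaN(side)) { sum += GAIN * (side - DESIRED_SIDE); n++; }
        if (!Double.isNaN(diag)) { sum += GAIN * (diag - DESIRED_DIAG); n++; }
        return n == 0 ? 0.0 : sum / n;
    }
}
```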
<br />
====IR Sensor Calibration====<br />
We found that the raw IR readings did not correspond to the correct distances. We manually calibrated them and came up with linear models for the short-range and long-range IRs, as well as the applicable ranges where the models are valid. The resulting transformed readings were roughly accurate distances in meters, which was good enough for us to work directly with distances rather than try to guess the corresponding raw IR readings.<br />
<br />
====IR Filtering====<br />
IR sensors have limited effective ranges. We found that short-range IRs start giving garbage readings above roughly 0.25 m, and long-range IRs start giving garbage readings below roughly 0.17 m. Garbage readings for the short-range IR were always either too low for the robot to ever experience or higher than 0.25 m, so we could safely trust any reading between 0.1 m (the minimum short-range reading we could get, given the way the short-range IRs were mounted) and 0.25 m.<br />
<br />
Unlike the short-range IR, whose sets of garbage readings and good readings are effectively disjoint, long-range IRs are a bit more tricky. If a long-range IR is less than about 0.17 m from an obstacle, it will start producing readings as high as 0.60 m. Thus it is difficult to tell whether we are too close to an obstacle or actually at 0.60 m. We considered long-range IR readings above 0.45 m out of range (and in the case of wall following, 0.75 m, since we would lose walls too often). Thus, when we were somewhat far from the wall but still capable of wall following, we would get garbage readings from the short-range IR, but our long-range IR would be in range and we would get closer to the wall. If we got too close, we would start getting garbage readings on the long-range IR sensor and good readings on the short-range sensor, which would push us away from the wall.<br />
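The gating described above can be sketched as a pair of validity windows. The numbers come from the text; the NaN-for-garbage convention is illustrative: <br />

```java
// Trust a reading only inside the window where that sensor type is reliable;
// report NaN otherwise so callers can treat the sensor as out of range.
class IrFilter {
    // Short-range IRs: trustworthy between 0.10 m and 0.25 m.
    static double shortRange(double meters) {
        return (meters >= 0.10 && meters <= 0.25) ? meters : Double.NaN;
    }

    // Long-range IRs: trustworthy above 0.17 m, up to 0.45 m normally,
    // or 0.75 m while wall following (losing walls too often otherwise).
    static double longRange(double meters, boolean wallFollowing) {
        double max = wallFollowing ? 0.75 : 0.45;
        return (meters >= 0.17 && meters <= max) ? meters : Double.NaN;
    }
}
```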
<br />
===Stuck detection===<br />
<br />
If your robot gets stuck and doesn't do anything about it, you're in trouble. If you have implemented timeouts, eventually your robot will switch states and possibly get unstuck. But what if you are really stuck, and your timeout behavior is inadequate at getting you unstuck? Even if it is adequate, waiting until the timeout to do something wastes precious time when you only have three minutes for a run. We wanted to very quickly detect being stuck (e.g. in a second or less) and to do something drastic to get unstuck, while not being too sensitive and generating false positives. This was a challenge. This section provides an overview of our final method, as well as some details on how we got there.<br />
<br />
==== Motor Current ====<br />
<br />
If the wheels are stalled while being commanded to move, then the current through the motors increases. Based on this principle, it is possible to detect if the robot is stuck. We briefly tried using single sensor readings combined with various threshold schemes (constant, and a step function depending on PWM). However, motor current is noisy and has transient peaks (Figure 1) when the PWMs change abruptly, which happens all the time in a typical run. This resulted in many false positives. Ultimately, we filtered motor current in three ways to detect being stuck:<br />
<br />
*Long sliding window average<br />
*Short sliding window average<br />
*Consecutive timesteps above threshold<br />
<br />
'''Figure 1:''' Motor current for a robot driving forward at PWMs of 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0 for 1500 ms each. Note the transient spikes.<br />
<br />
[[Image:Rampup.gif]]<br />
<br />
We used the long sliding window to filter out high frequency noise (Figure 3). We then required the current filtered with the long window to be above a threshold for a minimum number of consecutive timesteps, in order to ignore brief peaks and look only for sustained peaks like the one in Figure 2. If those conditions were met, we would check whether the short window average was also still above threshold. If it was, we decided that we were stuck. If it wasn't, we decided that we had very recently managed to get ourselves unstuck, and that the long window still thought we were stuck due to its longer memory. If we had used just the short window, we would have been far too sensitive. If we had used just the long window, we might already have gotten unstuck by the time we decided we were stuck. By using the long window statistics as a prerequisite to checking the short window statistics, we are robust to noise on the one hand and avoid sluggish memory on the other. This way, we can decide quickly whether we are stuck while avoiding false alarms.<br />
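A sketch of the three filters working together. The window sizes, threshold, and consecutive-sample count below are made-up stand-ins for our tuned values: <br />

```java
// Long window smooths noise, the consecutive counter demands a sustained
// peak, and the short window confirms we are *still* stuck right now.
class StuckDetector {
    static final int LONG_N = 50, SHORT_N = 10, MIN_CONSEC = 20; // assumed
    static final double THRESHOLD = 1.5; // amps, assumed

    final double[] buf = new double[LONG_N]; // ring buffer of current samples
    int count = 0, consecutive = 0;

    // Feed one motor-current sample; returns true if we decide we are stuck.
    boolean update(double current) {
        buf[count % LONG_N] = current;
        count++;
        // Long-window average must stay over threshold for MIN_CONSEC samples.
        consecutive = avg(LONG_N) > THRESHOLD ? consecutive + 1 : 0;
        // Then the short window must agree (no sluggish long-window memory).
        return consecutive >= MIN_CONSEC && avg(SHORT_N) > THRESHOLD;
    }

    // Average over the last n samples (fewer if we have not seen n yet).
    double avg(int n) {
        int m = Math.min(n, count);
        double s = 0;
        for (int i = 0; i < m; i++) s += buf[(count - 1 - i) % LONG_N];
        return s / m;
    }
}
```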
<br />
'''Figure 2:''' Motor current for a robot that drives forward for a bit and then gets stalled.<br />
<br />
[[Image:Stuck.gif]]<br />
<br />
'''Figure 3:''' Motor current (filtered and unfiltered) for a robot that drives forward for a bit and then gets stalled.<br />
<br />
[[Image:Stuck-filtered.gif]]<br />
<br />
====Encoders====<br />
We considered using encoder data to see if we are stuck. We thought this might come in handy if we are stuck at a low velocity and our motor current is not high enough to trigger stuck detection. Our encoder data turned out to be noisy and tricky to deal with, surprisingly because of the software and not the hardware. It turned out that the thread gathering encoder data was not being visited often enough to adequately sample ticks. We ended up not using encoders and instead relying on timeouts for handling situations where we are stuck at low velocities and the motor current is not high enough to indicate being stuck.<br />
<br />
==Testing==<br />
===Testing suite===<br />
<br />
We wrote a light testing suite consisting of 29 classes, each of which tested some piece of the robot's functionality (various sensors, color calibration, raising and lowering the hopper). These tests made it very easy to verify that all parts of our robot were still working, and they served as wonderful regression tests whenever we changed the robot electrically or mechanically. We ran a number of these tests before every competition, and they simplified debugging tremendously. They were great tools for the members of our team who did not write code for the robot. Additionally, the tests served as templates for how to use the various classes that were written, which was useful for the other developers.<br />
<br />
===LED debugging===<br />
<br />
We attached LEDs to our robot for color detection: red, green, yellow. If the robot was going for a red ball, the red LED would light up. If the robot was going for a yellow wall, the yellow LED would light up. This was a great way to effortlessly see if our robot was operating in the correct state and if our color calibration was off -- instead of trying to read screen output and watch the robot, we could actually just stalk our robot during test runs.<br />
<br />
In addition, at the start of a run, the yellow LED indicates that the robot is ready to be started. The robot also does a victory light dance every time she scores.<br />
<br />
===Logging===<br />
<br />
We found it useful to log a number of things, especially when debugging stuck detection. We logged motor current and encoder tick values and statistics and plotted them after each run. This was instrumental in understanding how the data looks and using it to detect when we are stuck.<br />
<br />
Towards the beginning, we used a utility to log images taken from the webcam for color training purposes. These images were instrumental in determining the hue thresholds for the various colors that we used.<br />
<br />
===Live Parameter Loading===<br />
Our testing was made easier by a config file from which parameters can be loaded in real time while the robot is running. This made it very quick and easy to tweak gains, thresholds, and other parameters; we could often converge on the right gains and thresholds for several behaviors in a single run.<br />
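A minimal sketch of such a live-reload mechanism, assuming a simple key=value file format. The file format, class, and method names are illustrative, not our actual utility: <br />

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Re-read the config file only when its modification time changes, so gains
// can be edited mid-run without restarting the robot.
class LiveParams {
    private final Path file;
    private long lastModified = -1;
    private Map<String, Double> values = new HashMap<>();

    LiveParams(Path file) { this.file = file; }

    // Parse key=value lines; malformed lines are skipped.
    static Map<String, Double> parse(List<String> lines) {
        Map<String, Double> m = new HashMap<>();
        for (String line : lines) {
            String[] kv = line.split("=", 2);
            if (kv.length != 2) continue;
            try {
                m.put(kv[0].trim(), Double.parseDouble(kv[1].trim()));
            } catch (NumberFormatException ignored) {
            }
        }
        return m;
    }

    double get(String key, double fallback) {
        try {
            long mod = Files.getLastModifiedTime(file).toMillis();
            if (mod != lastModified) { // reload only on change
                lastModified = mod;
                values = parse(Files.readAllLines(file));
            }
        } catch (IOException e) {
            // Keep the last good values if the file is mid-edit or missing.
        }
        return values.getOrDefault(key, fallback);
    }
}
```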
<br />
===Mechanical Issues===<br />
<br />
The most serious mechanical issue that arose in testing was balls jamming as the hopper rose to score. This was unacceptable because it often prevented reliable scoring for the rest of the run. The problem could have been hacked around by limiting the number of balls allowed in the hopper, but restricting the robot's performance was counterproductive, and a more complete solution was required. We found that adjusting the position of the roller motor by adding another gear to the train, as well as shifting the camera mount forward, eliminated the jamming but dropped balls below the hopper, jamming it again on the way down. A flap, coupled to the upward movement of the hopper, was added to confine dropped balls to the inside edge of the roller, where they could be picked up again when the hopper descended. This fix increased our ball capacity to eight. <br />
<br />
Along with this fix, the diameter of the roller was increased by a quarter inch, further improving the robot's ball collection capabilities.<br />
<br />
Since we were unfortunately not able to give our EeePC a static IP address, we had to repeatedly access the PC directly in order to read off its dynamic IP address. Because the PC's own monitor had been removed, this had to be done via an external monitor. In its initial position, the PC was mounted too low for easy access to the VGA port. To fix the problem, a new back plate was cut, which raised the PC while also freeing up space for the heavy SLA battery to move forward.<br />
<br />
The steel pins used to transmit torque on all of our shafts repeatedly fell out. This was a necessary evil: the pins had to remain removable in order to facilitate design changes and part replacements. For the final competition, they were fully press fit and epoxied in place.<br />
<br />
One of our drive motors failed on the day of seeding. This was quickly replaced and the robot continued to function properly. Another motor was immediately acquired and installed to forestall the possibility of the other drive motor failing during competition.<br />
<br />
At all times, possible mechanical failure modes were examined and countermeasures were developed to cancel their effects.<br />
<br />
===Electric Issues===<br />
We had several electrical connections break while rewiring because the single-conductor wire from the lab is unnecessarily stiff and torques solder joints. Find some good stranded wire from a different source. This will allow you to route wires compactly without risk of breaking connections.<br />
<br />
The Orc Board can only sample at 400 Hz on the analog inputs and 1 kHz on the fast digital inputs. Keep this in mind when attempting to implement high-resolution encoders or any other sensor that responds on the millisecond time scale.<br />
<br />
We needed a 5V source to power our fourth motor. PWMing 12V off of the battery could have been risky for the 3V motor (we didn't have a data sheet). It turns out the 5V rail that goes to the I/O is directly from the buck converter and can source a couple amps. This can be used to power lower voltage motors.<br />
<br />
===Software Issues===<br />
<br />
====Multi-threading====<br />
<br />
We ran into two multi-threading problems. First, we would sometimes read unfinished statistics (mid-computation) from the vision thread. Second, after we fixed that, our PID controller would always overshoot with the data from the vision thread. Debugging multi-threading is tricky, and Cathy spent a good few days tracking down concurrent programming issues. The problems arose from the fact that the vision thread is slow relative to the main state machine thread; it is difficult to generate nearly continuous commands to the robot from discontinuous and discrete vision data. <br />
<br />
We used a few techniques to combat these issues. First, always store and compute vision statistics in different variables. That is, do not overwrite the area variable until a new area has been computed completely; otherwise, the state machine thread that pings the vision thread for this data will almost always read something mid-computation (read: wrong). The trade-off is that the data that is read will almost always be slightly old, but never by more than 70 msec. Second, because the vision thread appears very discrete (read: slow) to the state machine thread, accumulating error (the I term) and reusing the same derivative (the D term) for 70 msec caused our robot to overshoot a lot. Thus, we only updated the I term when pinging the vision thread yielded a new statistic. Additionally, we reasoned that the derivative term should decrease over time if the robot was doing the right thing, so we applied exponential back-off and decayed the D term on each iteration through the state machine until a new statistic was generated. <br />
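A sketch of both fixes in one controller. The gains, the decay factor of 0.8, and the freshness flag are illustrative, not our actual values: <br />

```java
// Integrate error only on fresh vision data; exponentially decay the
// derivative term on iterations that reuse a stale measurement.
class VisionPid {
    static final double KP = 1.0, KI = 0.1, KD = 0.5; // assumed gains
    static final double D_DECAY = 0.8;                // assumed decay factor

    double integral = 0, dTerm = 0, lastError = 0;

    double update(double error, boolean freshMeasurement, double dt) {
        if (freshMeasurement) {
            integral += error * dt;                // accumulate only on new data
            dTerm = KD * (error - lastError) / dt; // fresh derivative estimate
            lastError = error;
        } else {
            dTerm *= D_DECAY;                      // back off the stale D term
        }
        return KP * error + KI * integral + dTerm;
    }
}
```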
<br />
====Ball capacity and jamming====<br />
<br />
Ball capacity was the last problem we dealt with as a team. On the software side, we dealt with jamming by capping the ball capacity of the robot at 8 balls. At this point, the robot will stop the roller and look exclusively for scoring walls. The hopper itself had a capacity of 5 or 6, so often, a few extra balls will stick against the roller while the hopper goes up and down to score. We also coded the break beam sensor that kept track of balls entering the hopper to accumulate on the off transition (when the ball stops breaking the beam, as opposed to when it starts breaking the beam). This was not entirely reliable, but enabled our robot to sometimes score twice in a row because, immediately after scoring, the extra balls would be pushed into the hopper. <br />
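Counting on the off transition is just falling-edge detection on the break beam. A minimal sketch (the capacity of 8 is from the text; the rest is illustrative): <br />

```java
// Increment only when the beam goes from broken back to unbroken, i.e. the
// ball has fully passed into the hopper.
class BallCounter {
    static final int CAPACITY = 8; // software cap from the text
    boolean wasBroken = false;
    int count = 0;

    void sample(boolean beamBroken) {
        if (wasBroken && !beamBroken) count++; // falling edge: ball passed
        wasBroken = beamBroken;
    }

    boolean full() { return count >= CAPACITY; }
}
```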
<br />
====Wireless====<br />
<br />
Not exactly a software issue, but the wireless situation in the 6.01 lab was terrible. We found it impossible to test reliably there (apparently even more so than the other teams), so we opted to test on our hall instead. The downside to this was that we did not have the real field pieces to work with during most of our testing time. The upside is that, since we didn't know what we would be dealing with on the actual field, our code was pretty robust in the end. We patched this issue by setting up test fields in 26-100 for a day before the seeding tournament and by testing rigorously during the mock competitions. <br />
<br />
====Wall Following Issues====<br />
<br />
Our biggest problems with wall following were caused by leaky abstractions. Leaky abstractions make assumptions about a situation and ignore unnecessary details, only to fail when the assumptions are incorrect and the details actually matter. <br />
<br />
We had a function that determined whether a direction was free or blocked, and used two sets of thresholds, one for short range and one for long range IRs. We used that to build other functions, such as whether we are following a wall, and on which side. Unilaterally applying the same notions of "free" and "blocked", with the same exact thresholds, for all behaviors was a mistake and led to headaches. <br />
<br />
For example, to determine if we were following a wall, we checked whether at least one of each pair of side IRs was "blocked". If we got far enough from the wall that both IRs on a side read "free", we would decide we had lost the wall and try to turn in an arc toward it, in order to go around what we perceived as a corner. However, the thresholds for determining "blocked" and "free" were set with obstacle avoidance in mind --- not wall following --- and were relatively low. We would often lose the wall not because the wall ended but because we got too far away. This would trigger the corner-rounding behavior, which, since we were far from the wall, would make the robot keep driving in a circle until it timed out. It took surprisingly long to find this because we didn't question these low-level abstractions.<br />
<br />
==Performance==<br />
<br />
* Day 8: '''First place''' in the first mock competition with 4 points. The runners-up had 1 point. Our robot spins around, aligns to balls, and charges at them. <br />
* Day 12: '''First place''' in the second mock competition with 21 points. The runners-up had 6 points. Our robot can now wall follow and score over walls.<br />
* Day 17: '''Second place''' in the third mock competition with 46 points. We lost to Team 3 with 49 points. Our robot now sees better, moves around better, and gets unstuck sometimes.<br />
* Day 23: '''Seeded first''' in the seeding tournament with 75 points. The runners-up had 23 points. Our robot does not jam, almost never gets stuck, and does some smart things. We basically froze our code at this point.<br />
* Day 26: '''Won''' the semifinals in the main tournament with 138 vs 54 points. '''Won''' the finals in the main tournament with 106 vs 56 points.<br />
<br />
==Suggestions==<br />
*Form a team early and commit to doing MASLAB for all of IAP. We formed our team before the start of the school year. <br />
<br />
*Have a well-balanced team. It's important to cover all the bases: software, mechanical, and electrical. Our 2 software + 1 mechanical + 1 electrical combination balanced us very well.<br />
<br />
*Work really really hard and stay motivated. We pulled endless all-nighters and never gave up. We continued to pester the staff mailing list with questions and even took a day to set up some legit practice fields in 26-100 and test before the seeding tournament.<br />
<br />
*Start before IAP and aim to have most of everything done in the first 2 weeks of IAP. Because we did most of the design before IAP, we managed to have a functional robot (not the pegbot) by the first mock competition, which helped us out greatly. We also were able to spend the last week and a half making fixes for various edge cases and had time to just polish up things. <br />
<br />
*Don't focus too much on the pegbot and the checkpoints. We had at most 1 or 2 people deal with each of the checkpoints, so the rest of the team could focus on machining the actual robot or designing the software framework. Our pegbot was scrapped in less than a week.<br />
<br />
*Test often and relentlessly. You'll find something wrong with your robot every time.<br />
<br />
*Redundancy is the name of the game. It is difficult to anticipate all the possible ways your robot can mess up. We had triple (or more) layers of failsafe behavior for some situations. For example, if we hit an obstacle we would rely on bump sensors, motor current peaks, and timing out to detect if we're stuck and escape (we had planned to have a fourth layer using encoder data, but it was tricky to get the right thresholds and didn't pan out in the time we had).<br />
<br />
*Beware of the leaky abstraction. Abstractions come about by making assumptions so that you can ignore unnecessary details. Having never been a robot, it is truly difficult to make assumptions about how the robot experiences the world with its sensors. Think carefully about specific situations and come up with tailor-made constants and behaviors. Avoid unilaterally using notions like "near" and "far" or "free" and "blocked" for example --- it really depends on the behavior what "near" and "far" mean. See the section on wall following.<br />
<br />
*Do not neglect mechanical design. Robots are crippled every year in the final competition because something breaks, not because their behavior is poor. Software can recover; physically broken things cannot. Do not use cardboard, do not use glue, do not use velcro or tape. Try not to use zipties. Be precise. Bash your robot into walls excessively. Fix anything that breaks with double strength.<br />
<br />
*On that note, do not neglect software design either. Many of our software fixes were trivial because of the infrastructure and abstractions we had set up. Different behavior needed? Define a state for it, and specify the state transitions. Different sensor variants? Plug 'em in, the main application doesn't care. <br />
<br />
*Testing will take up the vast majority of your time. Set up tools to make effective use of that time. We had a utility for loading parameters from a file in real time during a test run, and were able to iterate extremely quickly.<br />
<br />
==Photos==<br />
<br />
[[File:left.jpg|left]]<br />
[[File:right.jpg|right side]]<br />
[[File:back.jpg|back]]<br />
[[File:top.jpg|top]]<br />
[[File:IMG_1452s.jpg|testing on Putz]]<br />
[[File:IMG_1479s.jpg|testing in 6.01 lab]]<br />
[[File:mIMG_1516s.jpg|late night testing in 26-100]]<br />
<br />
==Video==<br />
<br />
[http://www.youtube.com/watch?v=1kvLC3O37OM| Final - Third Run (Final)]<br />
<br />
[http://www.youtube.com/watch?v=080yuAgGR7o| Final - Third Run (Final), Green Side]<br />
<br />
[http://www.youtube.com/watch?v=rry8o8ZYR9o| Final - Second Run (Winner's Bracket Final)]<br />
<br />
[http://www.youtube.com/watch?v=Wdug2pDay0M| Final - Second Run (Winner's Bracket Final), Green Side]<br />
<br />
[http://www.youtube.com/watch?v=bM4al-Xn8XQ| Final - Second Run (Winner's Bracket Final), Red Side]<br />
<br />
[http://www.youtube.com/watch?v=Y7vOot70TOc| Final - First Run, Green Side]<br />
<br />
[http://www.youtube.com/watch?v=5mp4Zuqz7n4| Maslab 2011 Teaser Trailer]<br />
<br />
[http://www.youtube.com/watch?v=RxiwAiYOfsk| Mock 3]<br />
<br />
[http://www.youtube.com/watch?v=plR4XmozLJo| Mock 2]<br />
<br />
[http://www.youtube.com/watch?v=yarssg8vlDA| Mock 1]</div>Dfourie
https://maslab.mit.edu/2011/wiki/Team_Six/Final_Paper Team Six/Final Paper 2011-01-24T23:22:13Z <p>Emolague: /* Software Design */</p>
<hr />
<div>__TOC__<br />
<br />
==Overall Strategy==<br />
<br />
Our initial overall strategy was as follows:<br />
<br />
'''Robot algorithm''': The algorithm runs as long as the timer reads less than 3 minutes (180 seconds). The first ball seen has its color noted and saved in a variable called our_color. As soon as this is established, search the map for goals. Goals are identified as follows: if a yellow wall is seen, drive up to it and use the IR sensor to determine whether the depth of the wall varies along its length. If the wall does vary, save the location of the goal in a list called goals_loc[]. Once the number of known goals reaches 2 (or more than 30 seconds have elapsed), begin to look for balls. Whenever a ball is found, look for the nearest goal and carry the ball to that goal; do this as long as the timer has not passed 2 minutes. After two minutes, whenever a ball is found, if the distance between the ball and the nearest known wall on the opponent's side is less than the distance to the nearest known goal, save the distance, compute a random number between 0 and 1, and save it to rand_n. If rand_n > e ^ -(d_togoal - d_towall), throw the ball over the wall; otherwise, throw the ball into the goal. Stop after 3 minutes.<br />
<br />
'''Robot strategy''': We decided that we are stronger on the course 6 side of the spectrum than the course 2 side of the spectrum. We're going to keep our robot design relatively simple, with a conveyor belt and accompanying pinball-machine-like doors to pull balls into the robot and drop them into a compartment. On the side, we will have a door that opens when told so that we can drop all our balls into the goal. We decided not to drop balls onto the other side in order to keep our robot simple so that we can focus on our code.<br />
<br />
'''Time-allotment strategy''': <br />
* Michael - Since Maslab is his main IAP commitment, he'll code in the evenings and possibly during the day.<br />
* Shawn - Work until 4pm every day, code at night with Michael.<br />
* Piper - Work on building between lecture (or late mornings when lecture isn't happening) and 7pm daily.<br />
* Xavier - Work on building between lecture (or late mornings when lecture isn't happening) and 7pm daily. <br />
<br />
<br />
==Mechanical Design and Sensors==<br />
<br />
We came to realize that our original robot design was problematic. The conveyor belt roller would have had to be very small for the ball to be pulled up rather than pushed away, and the doors pushing the ball onto the roller would have had to be well synchronized. We decided to start from scratch and came up with something we believed would work much better. Our new design was not only easier to build and functioned more predictably, but it also allowed us to score over the wall.<br />
<br />
We have a scoop with a slanted arm leading down, lined with teeth to scoop up a ball when the robot drives into it. Two motors raise the scoop so that the ball rolls back into our collection box. We had some issues earlier with not having enough torque, so we added long metal bars sticking out from the scoop to provide the leverage needed to raise it. Under the collection box, we keep our motor, Orc board, and computer. The collection box itself is slanted towards an escape hatch in the back. This hatch was initially designed as a drawbridge, but that became harder once we decided to use a servo to open and close it, so it is just a plate of metal. Since the escape hatch sits above the wall, our balls can fall into the other team's field. We do this with all our balls instead of scoring in goals. <br />
<br />
We have two bump sensors at the back of our robot (that is, the side with the escape hatch, not the side with the scoop and teeth). This way, when our robot backs up, it can detect when it hits the wall. Our camera faces front, looking through our scoop (which is made of plexiglass). On each side, we have two IR sensors used to estimate the robot's angle to the wall.<br />
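As a sketch of how that angle estimate works: with two IR sensors spaced a known distance apart along one side, the difference between their range readings gives the robot's angle to the wall. The spacing, names, and sign convention below are illustrative assumptions, not our actual code.<br />

```java
// Hypothetical sketch: estimating the robot's angle to a wall from two
// side-mounted IR range sensors. Spacing and method names are assumptions.
public class WallAngle {
    // Distance between the two IR sensors along the robot's side, in meters.
    static final double SENSOR_SPACING = 0.15;

    // Returns the angle to the wall in radians: 0 means parallel to the wall,
    // positive means the front of the robot is angled away from it.
    static double angleToWall(double frontRange, double rearRange) {
        return Math.atan2(frontRange - rearRange, SENSOR_SPACING);
    }
}
```

With readings of (0.40, 0.30) the front sensor sees the wall farther away, so the nose is angled away from the wall and the result is positive.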
<br />
==Software Design==<br />
In general, our software was fairly simple. It consisted primarily of two threads: one carrying out image processing, the other carrying out the actual motion of the robot. Apart from both threads was a class (public class Commander) containing the code that actually executes the robot's movements. That is to say, Commander contained the lowest-level code for how the robot moves. <br />
<br />
The image processing thread worked with the class ImageScanner. As the name implies, ImageScanner scans the image and makes the positions of various points of interest available to the other thread. The method ImageScanner.analyze() is at the heart of the class and calls four submethods that all work the same way. These methods (find_red_blob(), find_blue_blob(), etc.) identify the regions of the image that contain the respective colors. When the regions are found, their centers of mass and extremities are recorded and made available to the other thread. Whenever we wanted to inspect a method's output visually, we could uncomment the method ImageScanner.proc() and forward the result to a BotClient.<br />
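To illustrate the kind of pass find_red_blob() makes over the image: walk every pixel, collect the ones whose color passes a threshold, and record the blob's center of mass and extremities. The threshold values and packed-RGB pixel layout here are assumptions for the example, not our actual code.<br />

```java
// Illustrative blob scan: find the centroid and bounding box of "red" pixels
// in a packed-RGB image. Thresholds are crude placeholders.
public class BlobScan {
    // Returns {centroidX, centroidY, minX, minY, maxX, maxY}, or null if no
    // red pixel was found. Pixels are 0xRRGGBB ints.
    static int[] findRedBlob(int[][] rgb) {
        int height = rgb.length, width = rgb[0].length;
        long sumX = 0, sumY = 0;
        int count = 0, minX = width, maxX = -1, minY = height, maxY = -1;
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int p = rgb[y][x];
                int r = (p >> 16) & 0xFF, g = (p >> 8) & 0xFF, b = p & 0xFF;
                if (r > 150 && g < 80 && b < 80) { // crude "red" test
                    sumX += x; sumY += y; count++;
                    minX = Math.min(minX, x); maxX = Math.max(maxX, x);
                    minY = Math.min(minY, y); maxY = Math.max(maxY, y);
                }
            }
        }
        if (count == 0) return null; // no red region in this frame
        return new int[] { (int) (sumX / count), (int) (sumY / count),
                           minX, minY, maxX, maxY };
    }
}
```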
<br />
The finite state machine that comprises the main navigation thread is fairly simple. It starts off in a state called "SCAN" and takes several scans of the course in front of it. If it sees a ball, it changes to the state "FOLLOW_FOOBALL" (FOO being either RED or GREEN). In this state, it uses a simple proportional controller to zero in on the ball: whenever the ball is to the left, the robot turns slightly left, and likewise to the right. Once the ball is centered, the robot charges forward at full speed and attempts to pick it up in the state "PICKUP_BALL". That state involves two smaller threads originating in Commander that let the robot back up slightly while lifting the scoop. The actual scoring mechanism was never implemented, but methods similar to "SCAN" and "FOLLOW_REDBALL" were planned for searching out and approaching goals.<br />
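The proportional follow step can be sketched like this; the gain, image width, and wheel-speed interface are hypothetical stand-ins for our Commander calls, not the real values:<br />

```java
// Sketch of proportional steering toward a ball: the farther the ball's
// centroid sits from the image center, the harder the robot turns.
public class FollowBall {
    static final int IMAGE_WIDTH = 320;   // assumed camera resolution
    static final double KP = 0.004;       // steering gain per pixel of error

    // Returns {leftSpeed, rightSpeed} in roughly [-1, 1] given the ball's
    // centroid x coordinate in the image.
    static double[] steer(int ballX) {
        double error = ballX - IMAGE_WIDTH / 2.0; // positive: ball to the right
        double turn = Math.max(-0.5, Math.min(0.5, KP * error));
        double forward = 0.4;                     // cruise speed while tracking
        return new double[] { forward + turn, forward - turn };
    }
}
```

A centered ball drives both wheels equally; a ball at the right edge speeds up the left wheel and slows the right one, turning the robot toward it.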
<br />
==Overall Performance==<br />
<br />
Our robot's final performance was non-ideal (we were one of the first groups knocked out). We ran into scoop and code problems that didn't surface until the night before the competition, and we weren't able to fix them in time. Overall, we had a physical robot prepared, but it was too late to get our code working with it.<br />
<br />
==Conclusions==<br />
<br />
Our team came in with the purpose of learning more about building and coding, having a lot of fun (and sleep deprivation) doing it, and not worrying about being the most competitive team. For the most part, this worked out really well (other than a couple of panicked moments where we were behind on a checkpoint or two and wondered if we should drop out - the staff helped get us up to speed and this ended up not being a problem). Overall, we walked away with what we wanted to get out of MASlab, which is really awesome.<br />
<br />
We ended up deviating from our original scheduling plan quite a bit. Instead of having our strongest coders (Michael and Shawn) code and our builders (Wings and Xavier) build, we had a fuzzy division of labor that arose naturally. After discussing designs, Xavier would go build, and Wings would maintain the journal and keep a to-do list for our robot. Wings and Shawn would also solder and construct other parts (e.g., at the Edgerton Center). Michael worked on the code. With both the code and the building, we constantly sanity-checked each other and proposed alternative ideas, coming to a group agreement at every step. <br />
<br />
However, we ended up not being as effective with our code. Michael spent a lot of time coding, but because the robot wasn't built fast enough and we were never absolutely clear on how the code and robot should interact, we ran into a lot of problems when trying to tie everything together. With more time, we would have been able to debug enough to have our robot work as intended. In the end, however, we couldn't, and defaulted to the plan of just having the robot drive around and scoop up balls instead of using our scoring mechanism.<br />
<br />
But as stated before, we are very happy with what we got out of MASlab. We learned to build, code, and work as a team. No doubt we'll carry on the awesome lessons we learned here :)<br />
<br />
==Suggestions for future teams==<br />
<br />
When forming a team, make sure that everyone is on the same page as far as what they want to get out of MASlab, how competitive they want to be, and what their time commitments are expected to be. Also, there's absolutely no harm in thinking about ideas over winter break! It certainly means building can start earlier. Which brings us to...<br />
<br />
Build early, build often, and allot more time for this than you think you need! Our main problem was that we were all very new to building, and things tended to go wrong often. You never know when the laser cutter will stop working; you'll probably have to remachine parts several times over; you will find ''tons'' wrong with your robot that you didn't even think about. Building early means that you can make more mistakes and don't have to be afraid of them. A physical robot is ''very'' useful for debugging, so having one early for your coders is invaluable.</div>Wingshttps://maslab.mit.edu/2011/wiki/Team_Seven/AssignmentsTeam Seven/Assignments2011-01-18T17:43:26Z<p>Rafacb: /* Image Processing Sample Pictures */</p>
<hr />
<div><div style="text-align: center;"><br />
<br />
== Image Processing Sample Pictures ==<br />
[[File:capture3.png]] [[File:capture3result.jpg]]<br />
<br />
[[File:capture9.png]] [[File:capture9result.jpg]]<br />
<br />
These two pictures were used for early testing; they are not actual photos taken by our camera in 26-100.<br />
<br />
[[File:capture11.jpg]]<br />
[[File:capture11result.jpg]]<br />
<br />
This is the result of an actual photo taken by the team.<br />
<br />
</div><br />
<br />
<!-- Centralize page: <div style="text-align: center;"> --></div>Rafacbhttps://maslab.mit.edu/2011/wiki/Team_ThreeTeam Three2011-01-11T00:14:05Z<p>Cookies: </p>
<hr />
<div>Current Team Name: We-Ski!<br />
<br />
Current Robot Name: Monsieur Robot<br />
<br />
== Team and Our RoBot Naming Options==<br />
<br />
Still thinking about what to call ourselves. It should match the team spirit, the principles of the game and personality of the robot.<br />
<br />
Right now, the ideas include <br />
* Screwed (so we can say "we are Screwed")<br />
* We've Got Balls<br />
* 必胜 (Chinese, pronounced bisheng, means definitely wins)<br />
* neeu (actually the word new, but conceived from the last letter of our last names)<br />
* 一生懸命 (Japanese, pronounced isshoukenmei, means with all one's might)<br />
<br />
<br />
Our little RoBot puppy needs an awesome name too! So far we have:<br />
* Solene (French, OTZ to Audren, and it would work if we have a solenoid on board)<br />
* Nuts (screws and nuts, anyone?)<br />
* RoBot<br />
* Sprite<br />
* Yum! (don't forget to click your tongue when you get to the exclamation point)<br />
* Hovercraft Wanna Be<br />
* The Clock<br />
* One Up (the little green mushroom shaped robot, gives you one point when you eat it in Super Mario)<br />
* Onion With Layers<br />
<br />
== More Information About Our Robot ==<br />
[http://maslab.mit.edu/2011/wiki/Team_Three/Journal Journal]<br />
<br />
[http://maslab.mit.edu/2011/wiki/Team_Three/Assignments Assignments]<br />
<br />
[http://maslab.mit.edu/2011/wiki/Team_Three/Final_Paper Final Paper]</div>StInfinitihttps://maslab.mit.edu/2011/wiki/Team_TwoTeam Two2011-01-10T06:59:56Z<p>Dfourie: /* PUTZ PUTZ */</p>
<hr />
<div><br />
<br />
<br />
=== PUTZ PUTZ ===<br />
<br />
Journal: [http://maslab.mit.edu/2011/wiki/Team_Two/Journal]<br />
<br />
<br />
Final Paper: [http://maslab.mit.edu/2011/wiki/Team_Two/Final_Paper]</div>Dfouriehttps://maslab.mit.edu/2011/wiki/Team_ElevenTeam Eleven2011-01-09T23:39:29Z<p>Kiarash: </p>
<hr />
<div>[[File:How-to-draw-a-smiley-nerd.jpg]]<br />
<br />
<br />
Team members:<br />
<br />
<br />
Kiarash Adl, William Souillard-Mandar, Tim Robertson, Kristen Anderson</div>Kiarashhttps://maslab.mit.edu/2011/wiki/Team_Eleven/JournalTeam Eleven/Journal2011-01-09T21:46:50Z<p>Kranders: </p>
<hr />
<div>Day 1, Jan 3<br />
<br />
*team gets to know each other better <br />
*thinking about the idea<br />
<br />
Day 2, Jan 4<br />
<br />
*Mechanical design finalized<br />
*Some ideas for AI<br />
<br />
Day 3, Jan 5<br />
<br />
*Robot moves <br />
*working on the design and the code<br />
<br />
Day 4, Jan 6<br />
<br />
*Vision code works<br />
<br />
Day 5, Jan 7<br />
<br />
Day 6, Jan 8<br />
<br />
Day 7, Jan 9<br />
<br />
*Group meeting present: William, Kristen, Kiarash, Gil <br />
*New software architecture<br />
<br />
Jan 13,14,15: Stupid Laser cutter</div>Kiarashhttps://maslab.mit.edu/2011/wiki/Team_Three/AssignmentsTeam Three/Assignments2011-01-09T00:07:16Z<p>Cookies: </p>
<hr />
<div>== Maslab RoBot Build Schedule ==<br />
<br />
<table style="background:#D9FADD" border="1" cellpadding="2" cellspacing="0"><br />
<br />
<tr valign="top"><br />
<th width="100px" style="background:#93FAA0">Sunday</th><br />
<th width="135px" style="background:#93FAA0">Monday</th><br />
<th width="135px" style="background:#93FAA0">Tuesday</th><br />
<th width="135px" style="background:#93FAA0">Wednesday</th><br />
<th width="135px" style="background:#93FAA0">Thursday</th><br />
<th width="135px" style="background:#93FAA0">Friday</th><br />
<th width="100px" style="background:#93FAA0">Saturday</th><br />
</tr><br />
<br />
<tr valign="top"><br />
<td><b>1/2</b><br />
<table border="1"> <tr> <td><p> Welcome to IAP@MIT </p> </td> </tr> <br />
<tr> <td><p> Team 3 Presents: </p><br />
<p> Audren Cloitre</p> <p>Stephanie Lin</p> <p>Faye Wu</p> <p>James White</p> </td> </tr> </table><br />
</td><br />
<br />
<td><b>1/3</b><br />
<table style="background:#FE9B96" border="1"> <tr> <td style="background:#FEDFDD"><p> Checkpoint 1 </p> </td> </tr> </table><br />
<table style="background:#93EEF7" border="1"> <tr> <td style="background:#D7F4F7"><p> Build Pegbot</p> <p>Brainstorm Strategy and Robot Functionality</p> </td> </tr> </table><br />
<table style="background:#D8FD95" border="1"> <tr> <td style="background:#F1FDDC"><p> uOrcBoard Intro</p> </td> </tr> </table><br />
</td><br />
<br />
<td><b>1/4</b><br />
<table style="background:#FE9B96" border="1"> <tr> <td style="background:#FEDFDD"><p> Checkpoint 2 </p> </td> </tr> </table><br />
<table style="background:#93EEF7" border="1"> <tr> <td style="background:#D7F4F7"><p>Decide on Strategy and Robot Design</p> <p>CAD Day 1 of 5</p></td> </tr> </table><br />
<table style="background:#D8FD95" border="1"> <tr> <td style="background:#F1FDDC"><p> Bump Sensor, Encoder and Robot Reaction to Feedback </p> </td> </tr> </table><br />
</td><br />
<br />
<td><b>1/5</b><br />
<table style="background:#FE9B96" border="1"> <tr> <td style="background:#FEDFDD"><p> Checkpoint 3 </p> </td> </tr> </table><br />
<table style="background:#93EEF7" border="1"> <tr> <td style="background:#D7F4F7"><p> CAD Day 2 of 5 </p> </td> </tr> </table><br />
<table style="background:#D8FD95" border="1"> <tr> <td style="background:#F1FDDC"><p> Camera and Camera Vision </p> <p>Optimize Encoder </p> </td> </tr> </table><br />
<table border="1"> <tr> <td><p>Clean Lab@10pm</p> </td> </tr> </table><br />
</td><br />
<br />
<td><b>1/6</b><br />
<table style="background:#FE9B96" border="1"> <tr> <td style="background:#FEDFDD"><p> Checkpoint 4 </p> </td> </tr> </table><br />
<table style="background:#93EEF7" border="1"> <tr> <td style="background:#D7F4F7"><p> CAD Day 3 of 5 </p> </td> </tr> </table><br />
<table style="background:#D8FD95" border="1"> <tr> <td style="background:#F1FDDC"><p> Encoders, Gyro and PID Controller </p> </td> </tr> </table><br />
</td><br />
<br />
<td><b>1/7</b><br />
<table style="background:#FE9B96" border="1"> <tr> <td style="background:#FEDFDD"><p> Checkpoint 5 </p> </td> </tr> </table><br />
<table style="background:#93EEF7" border="1"> <tr> <td style="background:#D7F4F7"><p> CAD Day 4 of 5 </p> </td> </tr> </table><br />
<table style="background:#D8FD95" border="1"> <tr> <td style="background:#F1FDDC"><p> Main FSM and Structure of Code </p> <p>More PID</p> </td> </tr> </table><br />
</td><br />
<br />
<td><b>1/8</b><br />
<table style="background:#93EEF7" border="1"> <tr> <td style="background:#D7F4F7"><p> CAD Day 5 of 5 </p> </td> </tr> </table><br />
<table style="background:#D8FD95" border="1"> <tr> <td style="background:#F1FDDC"><p> Behavior 1 & 2 </p> <p>More PID</p></td> </tr> </table><br />
</td><br />
</tr><br />
<br />
<!--WEEK2--><br />
<tr valign="top"><br />
<td><b>1/9</b><br />
<table style="background:#FE9B96" border="1"> <tr> <td style="background:#FEDFDD"><p> CAD Model Complete </p> </td> </tr> </table><br />
<table style="background:#D8FD95" border="1"> <tr> <td style="background:#F1FDDC"><p> Behaviors 3 & 4 </p> </td> </tr> </table><br />
</td><br />
<br />
<td><b>1/10</b><br />
<table style="background:#FE9B96" border="1"> <tr> <td style="background:#FEDFDD"><p> Checkpoint 6 </p> <p> Mock Competition 1</p> </td> </tr> </table><br />
<table style="background:#93EEF7" border="1"> <tr> <td style="background:#D7F4F7"><p> Machining Day 1 of 3</p> </td> </tr> </table><br />
<table style="background:#D8FD95" border="1"> <tr> <td style="background:#F1FDDC"><p> Behaviors 5 & 6 </p> <p>Vision Code Improvement</p></td> </tr> </table><br />
</td><br />
<br />
<td><b>1/11</b><br />
<table style="background:#93EEF7" border="1"> <tr> <td style="background:#D7F4F7"><p>Machining Day 2 of 3</p> </td> </tr> </table><br />
<table style="background:#D8FD95" border="1"> <tr> <td style="background:#F1FDDC"><p> Behavior 7 & 8</p> <p>Vision Code Improvement</p></td> </tr> </table><br />
</td><br />
<br />
<br />
<td><b>1/12</b><br />
<table style="background:#93EEF7" border="1"> <tr> <td style="background:#D7F4F7"><p> Machining Day 3 of 3</p> </td> </tr> </table><br />
<table style="background:#D8FD95" border="1"> <tr> <td style="background:#F1FDDC"><p> Navigation Done</p> <p>Vision Code Improvement</p></td> </tr> </table><br />
</td><br />
<br />
<td><b>1/13</b><br />
<table style="background:#FE9B96" border="1"> <tr> <td style="background:#FEDFDD"><p> Final Robot Frame Built </p> </td> </tr> </table><br />
<table style="background:#D8FD95" border="1"> <tr> <td style="background:#F1FDDC"><p> Navigation Improvement</p> <p>Vision Code Improvement</p></td> </tr> </table><br />
</td><br />
<br />
<td><b>1/14</b><br />
<table style="background:#FE9B96" border="1"> <tr> <td style="background:#FEDFDD"><p> Checkpoint 7 </p> <p> Mock Competition 2</p></td> </tr> </table><br />
</td><br />
<br />
<td><b>1/15</b><br />
<table border="1"> <tr> <td><p> MIT Mystery Hunt </p></td> </tr> </table><br />
</td><br />
</tr><br />
<br />
<!--WEEK3--><br />
<tr valign="top"><br />
<td><b>1/16</b><br />
<table border="1"> <tr> <td><p> MIT Mystery Hunt </p></td> </tr> </table><br />
<table style="background:#FE9B96" border="1"> <tr> <td style="background:#FEDFDD"><p> Mechanical Improvement </p> </td> </tr> </table><br />
<table style="background:#D8FD95" border="1"> <tr> <td style="background:#F1FDDC"><p> Behavior Improvement</p><p>Sensor Calibration</p></td> </tr> </table><br />
</td><br />
<br />
<td><b>1/17</b><br />
<table style="background:#FE9B96" border="1"> <tr> <td style="background:#FEDFDD"><p> Mechanical Improvement </p> </td> </tr> </table><br />
<table style="background:#D8FD95" border="1"> <tr> <td style="background:#F1FDDC"><p> Behavior Improvement</p><p>Sensor Calibration</p></td> </tr> </table><br />
</td><br />
<br />
<td><b>1/18</b><br />
<table style="background:#FE9B96" border="1"> <tr> <td style="background:#FEDFDD"><p> Mechanical Improvement </p> </td> </tr> </table><br />
<table style="background:#D8FD95" border="1"> <tr> <td style="background:#F1FDDC"><p> Behavior Improvement</p><p>Sensor Calibration</p></td> </tr> </table><br />
</td><br />
<br />
<td><b>1/19</b><br />
<table style="background:#FE9B96" border="1"> <tr> <td style="background:#FEDFDD"><p> Checkpoint 8 </p> <p> Mock Competition 3</p></td> </tr> </table><br />
<table border="1"> <tr> <td><p>Sponsor Dinner</p> </td> </tr> </table><br />
<table border="1"> <tr> <td><p>CleanLab@10pm</p> </td> </tr> </table><br />
</td><br />
<br />
<td><b>1/20</b><br />
<table style="background:#D8FD95" border="1"> <tr> <td style="background:#F1FDDC"><p> Test and Adapt</p></td> </tr> </table><br />
</td><br />
<br />
<td><b>1/21</b><br />
<table style="background:#D8FD95" border="1"> <tr> <td style="background:#F1FDDC"><p> Test and Adapt</p></td> </tr> </table><br />
</td><br />
<br />
<td><b>1/22</b><br />
<table border="1"> <tr> <td><p> GSC Ski Trip </p></td> </tr> </table><br />
</td><br />
</tr><br />
<br />
<!--WEEK4--><br />
<tr valign="top"><br />
<td><b>1/23</b><br />
<table border="1"> <tr> <td><p> GSC Ski Trip </p></td> </tr> </table><br />
</td><br />
<br />
<td><b>1/24</b><br />
<table border="1"> <tr> <td><p> GSC Ski Trip </p></td> </tr> </table><br />
</td><br />
<br />
<td><b>1/25</b><br />
<table style="background:#FE9B96" border="1"> <tr> <td style="background:#FEDFDD"><p>Seeding</p></td> </tr> </table><br />
</td><br />
<br />
<td><b>1/26</b><br />
<table style="background:#D8FD95" border="1"> <tr> <td style="background:#F1FDDC"><p> Test and Adapt</p></td> </tr> </table><br />
</td><br />
<br />
<td><b>1/27</b><br />
<table style="background:#D8FD95" border="1"> <tr> <td style="background:#F1FDDC"><p> Test and Adapt</p><p> Final Code check</p></td> </tr> </table><br />
<table style="background:#FE9B96" border="1"> <tr> <td style="background:#FEDFDD"><p>Robot Impound at 5</p></td> </tr> </table><br />
</td><br />
<br />
<td><b>1/28</b><br />
<table style="background:#FE9B96" border="1"> <tr> <td style="background:#FEDFDD"><p> Final Competition </p></td> </tr> </table><br />
<table border="1"> <tr> <td><p>Competition Tear Down</p> </td> </tr> </table><br />
</td><br />
<br />
<td><b>1/29</b><br />
<table border="1"> <tr> <td><p>Clean Lab</p> </td> </tr> </table><br />
</td><br />
</tr><br />
<br />
</table><br />
<br />
== Maslab RoBot Hardware Design ==<br />
<br />
=== Handdrawn Design ===<br />
<br />
=== CAD Design (preliminary) ===<br />
<table> <br />
<tr> <td> [[Image:Isometric.PNG|x250px|alt="CAD Drawing Isometric"]] </td> <br />
<td> [[Image:Front.PNG|x250px|alt="CAD Drawing Front"]] </td> <br />
<td> [[Image:Left.PNG|x250px|alt="CAD Drawing Profile"]] </td> </tr> <br />
<br />
<tr align="center"> <td> Isometric View </td> <td> Front View </td> <td> Profile View </td> </tr><br />
</table><br />
=== CAD Design (final) ===<br />
[[Image:Final CAD.JPG|alt="CAD Drawing Final Isometric"]] <br />
=== Photographs ===<br />
<br />
== Maslab RoBot Software Architecture ==<br />
<br />
== Maslab RoBot Strategy ==<br />
<br />
<table cellpadding="5" cellspacing="5" rules="none" border="1"> <tr> <td><br />
[[Image:BlackBox.jpg|alt=Black Box]]<br />
</td> </tr><br />
<tr> <td align=center width=100px>Our strategy lies hidden behind a black box, waiting to be revealed.</td> </tr> </table></div>Linschttps://maslab.mit.edu/2011/wiki/Team_SixTeam Six2011-01-08T00:32:24Z<p>Xavier: added team members names</p>
<hr />
<div>Team SLAMBA<br />
<br />
Michael Olague<br />
<br />
Piper "Wings" Hunt<br />
<br />
Shawn Westerdale<br />
<br />
Xavier Jackson</div>Xavierhttps://maslab.mit.edu/2011/wiki/Build.xmlBuild.xml2011-01-06T20:25:06Z<p>Maslab: </p>
<hr />
<div><pre><br />
<project name="ant-tutorial" default="build" basedir="."><br />
<!-- CHANGE THESE THREE VALUES FOR AUTOMATIC UPLOAD --><br />
<property name="robotIP" value="18.62.31.60"/><br />
<property name="destDir" value="/home/maslab/code"/><br />
<property name="username" value="maslab"/><br />
<property name="binDir" value="bin"/><br />
<property name="srcDir" value="src"/><br />
<br />
<target name="build"><br />
<!-- This does deep dependency checking on class files --><br />
<depend srcdir="${srcDir}" destdir="${binDir}" cache="depcache" closure="true"/><br />
<!-- This compiles all the java --><br />
<javac srcdir="${srcDir}" destdir="${binDir}" includes="**/*.java" debug="true" classpath="lib/maslab.jar:lib/orc.jar"/><br />
</target><br />
<!-- Clean everything --><br />
<target name="clean"><br />
<delete><br />
<fileset dir="${binDir}" includes="**/*.class"/><br />
<fileset dir="${binDir}" includes="**/*~" defaultexcludes="no"/><br />
</delete><br />
</target><br />
<!-- Upload files to robot --><br />
<target name="upload" depends="build"><br />
<exec executable="rsync"><br />
<arg line="-e ssh -avr ${binDir} ${username}@${robotIP}:${destDir}"/><br />
</exec><br />
</target><br />
</project><br />
</pre></div>Maslabhttps://maslab.mit.edu/2011/wiki/Team_One/AssignmentsTeam One/Assignments2011-01-05T02:20:34Z<p>Eronsis: </p>
<hr />
<div>Scoring Strategy<br />
<br />
Upon activation, the bot will begin randomly driving around, picking up balls with a roller as it drives over them, and holding them in a storage area. It will not differentiate between our balls and the opposing team's balls, because it is advantageous to us to have both colors on the other side of the yellow wall. While collecting balls, the robot will watch for yellow walls even though it does not yet intend to shoot. If it sees any, it will attempt to record the location of the wall using a magnetometer (digital compass) and odometry, so it can return to face the wall more quickly when the time comes.<br />
<br />
As the balls enter, they trip a switch so the robot can keep track of how many balls it's holding. When it has picked up about 3 balls (exact quantity subject to change), it will switch to searching for a yellow wall to launch the balls over using a flywheel. The robot will measure the distance to the wall with an IR sensor and adjust the flywheel speed to ensure it clears the wall with minimal risk of overshooting. After expending its ammo, the robot will reenter search-and-gather mode.<br />
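The distance-compensated flywheel could look roughly like the sketch below. The linear mapping and every constant in it are placeholders that would have to come from calibration, not measured values:<br />

```java
// Hedged sketch: map an IR range reading to a flywheel speed that just
// clears the wall. All constants here are illustrative placeholders.
public class Flywheel {
    static final double MIN_SPEED = 0.5;        // enough to clear the wall point-blank
    static final double SPEED_PER_METER = 0.2;  // extra speed per meter of range
    static final double MAX_SPEED = 1.0;        // full throttle

    // Flywheel speed (fraction of full throttle) for a given wall distance.
    static double speedForRange(double rangeMeters) {
        double s = MIN_SPEED + SPEED_PER_METER * rangeMeters;
        return Math.min(MAX_SPEED, s); // never exceed full throttle
    }
}
```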
<br />
[[Image:harvester.png]]<br />
<br />
'''Team Calendar'''<br />
<br />
[[Image:Team Schedule.png]]<br />
<br />
'''Software Thread Structure'''<br />
<br />
[[Image:Software FSM.png]]</div>Eronsishttps://maslab.mit.edu/2011/wiki/Team_TenTeam Ten2011-01-04T20:54:49Z<p>Maslab10: </p>
<hr />
<div>Composed of: Arvin Shahbazi Moghaddam, Wojciech Musial, Tongji Li, Alex Teuffer<br />
<br />
Our Epic Journal! [http://maslab.mit.edu/2011/wiki/Team_Ten/Journal]<br />
or the assignments[http://maslab.mit.edu/2011/wiki/Team_Ten/Assignments]...</div>Maslab10https://maslab.mit.edu/2011/wiki/Team_Ten/AssignmentsTeam Ten/Assignments2011-01-04T19:56:31Z<p>Maslab10: </p>
<hr />
<div>'''January 04, 2011'''<br />
'''Tuesday 14:20'''<br />
'''Assignment 2'''<br />
----<br />
<br />
''Strategy'':<br />
So far, we've decided to concentrate on getting all the balls over the wall. However, we do have some other potential strategies about scoring in the goals that might be considered as well.<br />
<br />
''Software Design'': Our robot's software will be composed of several Java classes. A sensing class will process the optical and other data from the cameras and sensors to give the robot a sense of its surroundings, "telling" it the distance to the surrounding walls as well as to the balls in sight. A control class will communicate with a driving class to decide the robot's movements based on what the sensors and camera detect. The robot will pick up balls, then decide what color each ball is and whether to score it or keep it. A running timer will control which strategy our robot uses, since the best strategy varies with time.<br />
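A loose sketch of that wiring (all class and method names here are illustrative, not our actual code): the sensing class summarizes what the camera and sensors see, and the control class decides what the driving class should do.<br />

```java
// Illustrative class wiring: Sensing summarizes the world, Control decides,
// Drive executes. Names, methods, and thresholds are placeholders.
interface Sensing {
    double wallDistance();   // meters to the nearest wall ahead
    boolean ballInSight();   // did the vision code see a ball?
}

interface Drive {
    void setWheels(double left, double right); // speeds in [-1, 1]
}

class Control {
    private final Sensing sensing;
    private final Drive drive;

    Control(Sensing sensing, Drive drive) {
        this.sensing = sensing;
        this.drive = drive;
    }

    // One decision step: chase a ball if visible, otherwise avoid walls.
    void step() {
        if (sensing.ballInSight()) {
            drive.setWheels(0.5, 0.5);   // drive toward the ball
        } else if (sensing.wallDistance() < 0.2) {
            drive.setWheels(-0.3, 0.3);  // too close to a wall: turn away
        } else {
            drive.setWheels(0.3, 0.3);   // wander forward
        }
    }
}
```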
<br />
''Mechanical Outline'': Our robot will have three levels. The first level will be the entrance for the balls, which will be collected using a motor turning a horizontal spindle of rubber bands mounted across the front of the robot. The balls will be pushed by this spindle onto a very short, low ramp, after which they will roll into a channel leading to a conveyor belt. This conveyor belt will be housed in a vertical half-pipe and will drop the balls onto the third level of our robot. The third level is essentially a topless box, inclined so that balls tend to roll to the front of the robot. The very front edge of this level will have a door like the tailgate of a pickup truck, which lies flat to let out all of the collected balls.<br />
We want to keep the battery on the lowest level of the robot to keep our center of mass close to the ground. The camera and computer will be attached to the second level.<br />
<br />
''Schedule'' [[File:Schedule.jpg]]</div>Maslab10https://maslab.mit.edu/2011/wiki/Team_Six/AssignmentsTeam Six/Assignments2011-01-04T19:31:00Z<p>Wings: </p>
<hr />
<div>__TOC__<br />
<br />
==Checkoff One==<br />
<br />
Repository has our Hello World program.<br />
<br />
==Checkoff Two==<br />
<br />
Journal, day 2, has our plans for our robot.<br />
<br />
==Checkoff Three==<br />
<br />
Repository has our IR Drive code.<br />
<br />
==Checkoff Four==<br />
<br />
Result of our color detection: http://web.mit.edu/wings/Public/Maslab/picouttest.png<br />
<br />
==Rest of Checkoffs==<br />
<br />
See journal.</div>Wingshttps://maslab.mit.edu/2011/wiki/Team_ThirteenTeam Thirteen2011-01-04T17:47:28Z<p>Rhan: </p>
<hr />
<div>'''Lucky number thirteen. Hell yeah.'''<br />
<br />
<br />
Dmitri Megretski [dmegret@mit.edu]<br />
<br />
Rebecca Han [rebecca.han@mit.edu]<br />
<br />
Kevin Ellis [ellisk@mit.edu]<br />
<br />
Bayley Wang [bayleyw@mit.edu]</div>Rhanhttps://maslab.mit.edu/2011/wiki/Team_Thirteen/JournalTeam Thirteen/Journal2011-01-04T17:30:48Z<p>Rhan: </p>
<hr />
<div>== M. 01-03-2011 ==<br />
<br />
'''tl;dr''': Team 13 is operating at half-power for the first week; Kevin and Bayley will join us starting next Monday. Dima and Rebecca are working extra hard to meet the Checkpoints and be ready for Mock Contest 1.<br />
<br />
'''Current progress''': Absentees aside, we nonetheless completed Checkpoint 1 by the deadline. Pegbot is now capable of 1) moving in a circle or in a straight path and 2) communicating with eeePC via Bot Client.<br />
<br />
'''Future direction''': Strategy discussion. Rebecca will study what former groups have done, what has worked, what hasn't, etc. Our choice of strategy will likely dictate (and in turn, also be dictated by) the hardware details of our robot. Team 13 would like to receive input from all members, so efforts are being made to hold a real-time discussion with Kevin and Bayley. Dima hopes to have finished our robot by the end of the week.<br />
<br />
== T. 01-04-2011 ==<br />
<br />
'''tl;dr''': Successful analog input (specifically, a test voltage) reading from the orcboard. However, IRRangeFinder still eludes us. The value for range hovers around an apparently arbitrary ~0.12 for no good reason.<br />
<br />
'''Current progress''': Our robot will operate on a basic principle: it picks up as many balls as it can find using a roller, elevates them a small distance by means of an Archimedes screw (inspired by Team 3 from 2003), and stores them in a hopper, where they will all be deposited at the first goal/yellow wall that the robot finds.<br />
<br />
The software architecture consists of several states - explore/random motion, wall follow, scan surroundings, and three separate action states to handle balls/goals/yellow walls - that will be controlled by timers. If the robot has spent a certain amount of time on one task unsuccessfully, it may be stuck, and the timers are intended to protect against that. In addition, we plan on coding very robust explore/random and wall-follow behaviors, relying heavily on the camera vision code only when the robot stops intermittently to scan its surroundings. Having located something of interest, it then needs to drive in a straight line, which we can hopefully achieve by specifying velocity rather than PWM for the motors/wheels. <br />
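The timer guard can be sketched as follows; the state names and the timeout value are assumptions based on the description above, not our actual code:<br />

```java
// Sketch of a timeout guard for the state machine: if a state runs longer
// than its allotted time without finishing, fall back to EXPLORE so the
// robot never stays stuck on one task.
public class StateTimer {
    static final long TIMEOUT_MS = 15000; // placeholder per-state budget
    private long stateStart;
    private String state = "EXPLORE";

    // Enter a new state and restart its clock.
    void enter(String newState, long nowMs) {
        state = newState;
        stateStart = nowMs;
    }

    // Call periodically; returns the current state, reverting on timeout.
    String tick(long nowMs) {
        if (!state.equals("EXPLORE") && nowMs - stateStart > TIMEOUT_MS) {
            enter("EXPLORE", nowMs); // probably stuck: give up and wander
        }
        return state;
    }
}
```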
<br />
As far as Checkpoint 3 is concerned, after a lot of confusion with ADC ports, it transpires that we have been specifying the wrong port, and hence have received no meaningful feedback from the AnalogInputs instantiated. Now we are getting a voltage reading that does respond accurately when we hook it up to a test circuit.<br />
<br />
'''Future direction''': Kevin and Bayley are still AWOL. Team 13 does not panic yet, however (although Team 13 is beginning to worry/wonder where the rest of the team is). Rebecca is going to spend tomorrow drafting code for finding and approaching walls. Dima plans to finish CADing and ordering parts by tomorrow night at the latest. Team 13 will take the pegbot into lab tomorrow afternoon and maybe get some help with the IR sensors. We will probably also set up and start working on the webcam and vision code tomorrow night, after the Checkpoint.<br />
<br />
== W. 01-05-2011 ==<br />
<br />
== R. 01-06-2011 ==<br />
<br />
== F. 01-07-2011 ==<br />
<br />
== S. 01-08-2011 ==<br />
<br />
== M. 01-10-2011 ==</div>Rhanhttps://maslab.mit.edu/2011/wiki/Team_Seven/JournalTeam Seven/Journal2011-01-04T05:33:15Z<p>Rjmel: </p>
<hr />
<div>'''Pre-IAP:<br />
'''<br />
Our team met for 30 min on December 9th 2010 so we could talk about ideas and time availability of the team members.<br />
<br />
Present: N'Sink, Rafa, Roberto<br />
Absent: Javi<br />
<br />
'''MASLAB DAY 1:<br />
'''<br />
Due to flight problems, half the team could not make it the first day. Rafa and Roberto worked tirelessly against bad firmware and annoying battery clips to make the robot talk to the computer, as well as to move the robot forward for the first time.<br />
<br />
After getting the first checkpoint done, N'Sink came back to Boston and sat down with PeloLoco to discuss mechanical designs. We considered numerous design strategies for a ball thrower but we came across many difficulties and decided to build a dribbling robot first, and then explore the possibility of extending it.<br />
<br />
Now off to a good night's sleep!<br />
<br />
'''MASLAB DAY 2 Jan 4th'''<br />
<br />
Worked on the CAD in the morning. We are using a rubber-band mechanism as a dribbler and a foot-by-foot robot. After several flight and health problems, our entire team was able to meet for the first time and worked on getting a good prototype done for the coders to play with.<br />
<br />
[[File:CAd.jpg]]<br />
<br />
'''DAY 3 Jan 5th<br />
'''<br />
We got together and built a totally new pegbot in lab. We mounted the camera and one IR sensor and coded the robot to back up from a wall using the IR range.<br />
<br />
'''DAY 4 Jan 6th<br />
'''<br />
First parts of the dribbler were laser cut. Our two coders are tirelessly working on vision code, which is proving harder than we thought. We decided to use as few screws as possible in mounting our real robot, so we can mount and dismount it easily and quickly.<br />
<br />
'''DAY 5 Jan 7th'''<br />
The lower body of the robot was cut out of acrylic, and the design details for mounting the dribbler were finalized. The coder figured out the vision code.<br />
<br />
<br />
'''Day 8 Jan 10th'''<br />
First mock competition with our pegbot. The vision code was not implemented on this robot, so we navigated via a random walk using the IR sensors. However, right before the mock one of the IR sensors broke down and we had to run using only one. The pegbot was able to displace many balls and was also able to hold one ball in its possession. However, the way the wheels had been mounted was not very reliable and one of our wheels fell off. We learned a lot for our next robot.<br />
<br />
'''Day 9 Jan 11th'''<br />
We cut out a bigger and thicker chassis from 3/8" acrylic. We also redesigned the way the computer was going to be held in the robot. And for fun we cut out little acrylic ducks that took longer to cut than our chassis; thus our name, the Mighty Patos.<br />
[[File:Example1.jpg]]<br />
<br />
<br />
'''Day 10 Jan 12th'''<br />
Snow day!<br />
<br />
[[File:Example2.jpg]]<br />
<br />
'''Day 11 Jan 13th'''<br />
All-day building day. N'Sink and PeloLoco spent all day at Edgerton making the robot. The wheels have been mounted and the dribbler is almost finished. We need to get together tomorrow and finish the last small details to complete the mounting of the sensors and the ball storage space.<br />
<br />
'''Day 12 Jan 14th'''<br />
Second mock contest. We spent all morning trying to finalize the details to make the robot operational for the mock. After many improvised solutions to our robot we competed in the mock and managed to collect 4 balls using only a random walk. We especially liked the aggressiveness of the robot.<br />
<br />
'''Day 13 Jan 15th'''<br />
Solved the mounting of the IR sensors and also built a sturdier ball collection space using aluminum sheet.</div>Rjmelhttps://maslab.mit.edu/2011/wiki/Team_SevenTeam Seven2011-01-04T05:24:25Z<p>Rjmel: </p>
<hr />
<div>Team Seven Heaven;<br />
<br />
Members:<br />
<br />
Rafa "Noventa Mil" Crespo<br />
<br />
Roberto "PeloLoco" Melendez<br />
<br />
Javi "El Perdido" Ramos<br />
<br />
Christian "N'Sink" Segura</div>Rjmelhttps://maslab.mit.edu/2011/wiki/Checkoffs_and_Sensor_PointsCheckoffs and Sensor Points2011-01-04T03:01:12Z<p>Yichen: /* Seeding Results */</p>
<hr />
<div>==Sensor Points==<br />
Team 1: Extra motor, 2 extra drive wheels ~5 points<br />
<br>Team 2: Gyro, 2 IR sensors, 2 encoders, 3 long range IR, 2 extra motors (not drive) ~30 points<br />
<br>Team 3: Gyro, 2 Servos, 2 encoders, 4 IR sensors, 2 breakbeams ~28 points<br />
<br>Team 6: Gyro, 2 drive motors, 4 IR sensor, 1 servo ~35 points<br />
<br>Team 7: 5 IR sensors, 0 servo, 2 drive motors ~34 points<br />
<br>Team 9: Gyro, 2 Long range IR sensors, second battery charger, 2 encoders, 1 Servo, 3 motors ~28 points<br />
<br>Team 10: Gyro, Camera, 2 encoders, drive motor ~7 points<br />
<br>Team 11: Gyro, Drive Motor, 3 IR sensors, servo ~24 points<br />
<br>Team 13: 1 IR sensor, 2 encoders, 1 drive motor, 1 servo, 1 long, 2 short ~28 points<br />
<br />
==Checkpoint 1==<br />
Team 1 (checked off by Ellen)<br />
<br>Team 2 (checked off by Ellen)<br />
<br>Team 3 (checked off by Ellen)<br />
<br>Team 6 (checked off by Ellen)<br />
<br>Team 7 (checked off by Ellen)<br />
<br>Team 9 (checked off by Ellen)<br />
<br>Team 10 (checked off by Ellen)<br />
<br>Team 11 (checked off by Ellen)<br />
<br>Team 13 (checked off by Ellen)<br />
<br />
==Checkpoint 2==<br />
Team 1 (checked off by Ellen)<br />
<br>Team 2 (checked off by Ellen, David, Geza)<br />
<br>Team 3 (checked off by David, Geza)<br />
<br>Team 6 (checked off by David, Geza)<br />
<br>Team 7 (checked off by Ellen, Jessica)<br />
<br>Team 9 (checked off by Ellen, Jessica)<br />
<br>Team 10 (checked off by Ellen)<br />
<br>Team 11 (checked off by David)<br />
<br>Team 13 (checked off by Ellen, Buro)<br />
<br />
==Checkpoint 3==<br />
Team 1 (checked off by Ellen)<br />
<br>Team 2 (checked off by Eric)<br />
<br>Team 3 (checked off by Ellen)<br />
<br>Team 6 (checked off by Ellen)<br />
<br>Team 7 (checked off by Ellen)<br />
<br>Team 9 (checked off by Ellen)<br />
<br>Team 10 (checked off by Eric)<br />
<br>Team 11 (checked off by Ellen)<br />
<br>Team 13 (checked off by Eric)<br />
<br />
==Checkpoint 4==<br />
Team 1 (checked off by Ellen and Darthur)<br />
<br>Team 2 (checked off by Darthur)<br />
<br>Team 3 (checked off by Ellen)<br />
<br>Team 6 (checked off by Sam)<br />
<br>Team 7 <br />
<br>Team 9 (checked off by Darthur)<br />
<br>Team 10 (checked off by Darthur)<br />
<br>Team 11 <br />
<br>Team 13<br />
<br />
==Checkpoint 5==<br />
Team 1<br />
<br>Team 2 (checked off by Ellen)<br />
<br>Team 3 (checked off by Sam)<br />
<br>Team 6 <br />
<br>Team 7 (obviously they can do this. ^_^)<br />
<br>Team 9 (checked off by Eric)<br />
<br>Team 10 <br />
<br>Team 11 (checked off by Ellen)<br />
<br>Team 13 (obviously they can do this. ^_^)<br />
<br />
==Checkpoint 6==<br />
Team 1 (checked off by Ellen and Eric)<br />
<br>Team 2 (checked off by Ellen and Eric)<br />
<br>Team 3 (checked off by Ellen and Eric)<br />
<br>Team 6 (checked off by Ellen and Eric)<br />
<br>Team 7 (checked off by Ellen and Eric)<br />
<br>Team 9 (checked off by Ellen and Eric)<br />
<br>Team 10 (checked off by Ellen and Eric)<br />
<br>Team 11 (checked off by Ellen and Eric)<br />
<br>Team 13 (checked off by Ellen and Eric)<br />
<br />
=== Mock 1 Results ===<br />
<br />
{| class="wikitable"<br />
|-<br />
! Team<br />
! Score (Red)<br />
! Balls Displaced (Red)<br />
! Score (Green)<br />
! Balls Displaced (Green)<br />
|-<br />
| 1<br />
| 0<br />
| 0<br />
| 0<br />
| 0<br />
|-<br />
| 2<br />
| 1<br />
| 1<br />
| 3<br />
| 3<br />
|-<br />
| 3<br />
| 0<br />
| 1<br />
| 0<br />
| 0<br />
|-<br />
| 6<br />
| 0<br />
| 0<br />
| 0<br />
| 0<br />
|-<br />
| 7<br />
| 1<br />
| 1<br />
| 0<br />
| 0<br />
|-<br />
| 9<br />
| 0<br />
| 0<br />
| 0<br />
| 0<br />
|-<br />
| 10<br />
| 0<br />
| 0<br />
| 0<br />
| 0<br />
|-<br />
| 11<br />
| 1<br />
| 3<br />
| 0<br />
| 1<br />
|-<br />
| 13<br />
| 0<br />
| 0<br />
| 0<br />
| 0<br />
|}<br />
<br />
==Checkpoint 7==<br />
Team 1 (checked off by Sam and Jessica)<br />
<br>Team 2 (checked off by Sam and Jessica)<br />
<br>Team 3 (checked off by Sam, Jessica, and Ellen)<br />
<br>Team 6 (checked off by Sam and Ellen)<br />
<br>Team 7 (checked off by Sam, Jessica, and Ellen)<br />
<br>Team 9 (checked off by Ellen)<br />
<br>Team 10 (checked off by Ellen and Jessica)<br />
<br>Team 11 (checked off by Jessica and Sam)<br />
<br>Team 13 (checked off by Jessica and Ellen)<br />
<br />
=== Mock 2 Results ===<br />
<br />
{| class="wikitable"<br />
|-<br />
! Team<br />
! Score (Red)<br />
! Balls Displaced (Red)<br />
! Score (Green)<br />
! Balls Displaced (Green)<br />
|-<br />
| 1<br />
| 0<br />
| 1<br />
| 0<br />
| 1<br />
|-<br />
| 2<br />
| 2<br />
| 2<br />
| 19<br />
| 6<br />
|-<br />
| 3<br />
| 0<br />
| 0<br />
| 0<br />
| 0<br />
|-<br />
| 6<br />
| 0<br />
| 0<br />
| 0<br />
| 0<br />
|-<br />
| 7<br />
| 2<br />
| 3<br />
| 4<br />
| 4<br />
|-<br />
| 9<br />
| 0<br />
| 0<br />
| 0<br />
| 0<br />
|-<br />
| 10<br />
| 0<br />
| 0<br />
| 0<br />
| 0<br />
|-<br />
| 11<br />
| 0<br />
| 0<br />
| 0<br />
| 0<br />
|-<br />
| 13<br />
| 1<br />
| 1<br />
| 5<br />
| 5<br />
|}<br />
<br />
==Checkpoint 8==<br />
Team 1 (checked off by Ellen and Buro)<br />
<br>Team 2 (checked off by Ellen and Buro)<br />
<br>Team 3 (checked off by Ellen and Buro)<br />
<br>Team 6 (checked off by Ellen and Buro)<br />
<br>Team 7 (checked off by Ellen and Buro)<br />
<br>Team 9 (checked off by Ellen and Buro)<br />
<br>Team 10 (checked off by Ellen and Buro)<br />
<br>Team 11 (checked off by Ellen and Buro)<br />
<br>Team 13 (checked off by Ellen and Buro)<br />
<br />
=== Mock 3 Results ===<br />
<br />
{| class="wikitable"<br />
|-<br />
! Team<br />
! Score (Red)<br />
! Balls Displaced (Red)<br />
! Score (Green)<br />
! Balls Displaced (Green)<br />
|-<br />
| 1<br />
| 0<br />
| 0<br />
| 0<br />
| 0<br />
|-<br />
| 2<br />
| 21<br />
| 7<br />
| 25<br />
| 8<br />
|-<br />
| 3<br />
| 30<br />
| 6<br />
| 19<br />
| 4<br />
|-<br />
| 6<br />
| 1<br />
| 1<br />
| 1<br />
| 2<br />
|-<br />
| 7<br />
| 5<br />
| 5<br />
| 5<br />
| 6<br />
|-<br />
| 9<br />
| 0<br />
| 0<br />
| 0<br />
| 0<br />
|-<br />
| 10<br />
| 0<br />
| 0<br />
| 0<br />
| 0<br />
|-<br />
| 11<br />
| 0<br />
| 0<br />
| 0<br />
| 0<br />
|-<br />
| 13<br />
| 0<br />
| 0<br />
| 0<br />
| 0<br />
|}<br />
<br />
== Seeding Results ==<br />
<br />
{| class="wikitable"<br />
|-<br />
! Team<br />
! Score (Red)<br />
! Balls Displaced (Red)<br />
! Score (Green)<br />
! Balls Displaced (Green)<br />
! Total Score<br />
|-<br />
| 1<br />
| 2<br />
| 4<br />
| 0<br />
| 2<br />
| 2 (6)<br />
|-<br />
| 2<br />
| 44<br />
| 9<br />
| 31<br />
| 6<br />
| 75 (15)<br />
|-<br />
| 3<br />
| 12<br />
| 7<br />
| 0<br />
| 2<br />
| 12 (9)<br />
|-<br />
| 6<br />
| 0<br />
| 0<br />
| 0<br />
| 0<br />
| 0 (0)<br />
|-<br />
| 7-A<br />
| 4<br />
| 6<br />
| 1<br />
| 2<br />
| 5 (8)<br />
|-<br />
| 7-B<br />
| 0<br />
| 1<br />
| 0<br />
| 1<br />
| 1 (1)<br />
|-<br />
| 9<br />
| 0<br />
| 0<br />
| 0<br />
| 0<br />
| 0 (0)<br />
|-<br />
| 10<br />
| 1<br />
| 1<br />
| 1<br />
| 1<br />
| 2 (2)<br />
|-<br />
| 11<br />
| 5<br />
| 5<br />
| 6<br />
| 1<br />
| 11 (6)<br />
|-<br />
| 13<br />
| 13<br />
| 9<br />
| 10<br />
| 10<br />
| 23 (19)<br />
|}<br />
<br />
== Impounding ==<br />
<br />
{| class="wikitable"<br />
|-<br />
! Team<br />
! Team Picture and T-shirts<br />
! Interview<br />
! Sensor Points<br />
! Impound<br />
|-<br />
| 1<br />
| Yes<br />
| Yes<br />
| 17<br />
| Yes<br />
|-<br />
| 2<br />
| Yes<br />
| Yes<br />
| 31<br />
| Yes<br />
|-<br />
| 3<br />
| Yes<br />
| Yes<br />
| 22<br />
| Yes<br />
|-<br />
| 6<br />
| Yes<br />
| Yes<br />
| 14<br />
| Yes<br />
|-<br />
| 7-A<br />
| Yes<br />
| Yes<br />
| 12<br />
| Yes<br />
|-<br />
| 7-B<br />
| Yes<br />
| Yes<br />
| 20<br />
| Yes<br />
|-<br />
| 9<br />
| Yes<br />
| Yes<br />
| 42<br />
| Yes<br />
|-<br />
| 10<br />
| Yes<br />
| Yes<br />
| 9 + Camera<br />
| Yes<br />
|-<br />
| 11<br />
| Yes<br />
| Yes<br />
| <br />
| Yes<br />
|-<br />
| 13<br />
| Yes<br />
| Yes<br />
| 30<br />
| Yes<br />
|}</div>Yichenhttps://maslab.mit.edu/2011/wiki/Team_Two/AssignmentsTeam Two/Assignments2011-01-04T02:58:02Z<p>Cathywu: /* Software Design */</p>
<hr />
<div><br />
== Strategy ==<br />
<br />
Get balls over walls<br />
<br />
<br />
<br />
== Mechanical Design ==<br />
<br />
Our robot serves as a reliable platform for the software vision and control systems. As such it should be sturdy, constructed quickly, have extremely low mechanical failure rates, be able to withstand hours of testing, and be robust to positioning errors.<br />
<br />
The robot's structural members will be built primarily from acrylic sheet. It will use a rubber band roller powered by a DC motor to collect balls and a 4-bar-linkage hopper actuated by a servo to get balls over the wall. Geared DC motors will drive no-slip wheels.<br />
<br />
The robot's sensor suite will include the camera, 5 IR sensors, the gyroscope, encoders, and 2 bump sensors for scoring alignment.<br />
<br />
== Software Design ==<br />
<br />
Use a simple state machine. Have a good testing suite and several debugging tools. Write fast vision code. Fork off threads for camera and other sensors to operate.<br />
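The state-machine approach described above can be sketched roughly as follows. This is a minimal illustration, not the team's actual Java code; the state names and sensor keys are hypothetical.

```python
# Minimal state-machine sketch: each state is a function that reads a
# sensor snapshot and returns the name of the next state. All names
# here are hypothetical placeholders.

def wander(sensors):
    # Random-walk until the camera reports a ball.
    return "approach_ball" if sensors.get("ball_visible") else "wander"

def approach_ball(sensors):
    # Drive toward the ball; fall back to wandering if it is lost.
    if not sensors.get("ball_visible"):
        return "wander"
    return "score" if sensors.get("ball_captured") else "approach_ball"

def score(sensors):
    # Dump collected balls, then resume searching.
    return "wander"

STATES = {"wander": wander, "approach_ball": approach_ball, "score": score}

def step(state, sensors):
    """Advance the machine one tick and return the next state name."""
    return STATES[state](sensors)
```

In a real robot loop, `step` would run once per control cycle with fresh sensor data pulled from the camera and sensor threads.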
<br />
== Schedule ==<br />
<br />
[[File:Calendar.png|1600px|thumb|left|The calendar we set out to follow, which also turned out to be pretty accurate to what we did.]]</div>Dfouriehttps://maslab.mit.edu/2011/wiki/Team_Two/JournalTeam Two/Journal2011-01-04T01:51:39Z<p>Dfourie: /* Day 26, Friday January 28 */</p>
<hr />
<div>== Legend ==<br />
(L)eighton, (D)an, (S)tan, (C)athy<br />
<br />
== Pre IAP ==<br />
<br />
D, L, S, C met a couple of times for organizational purposes before winter break. Video conferenced about strategy and design details during winter break, after each of us read a few old papers and gave some thought to MASLAB<br />
<br />
D designed most of the mechanical aspects of the robot and drew it out in SolidWorks<br />
<br />
L looked into better batteries and encoders. Read about sensors<br />
<br />
C went through the vision tutorial and wrote some custom vision code, started setting up development environment (git, eclipse, ant, botclient, feh, v4l4j, etc.), started sketching out software architecture<br />
<br />
== Day 1, Monday January 3 ==<br />
L, C attended lecture and then assembled our pegbot and got it driving. To get a jump on the work ahead of us, we took pictures of the field and of the balls with our webcam for vision testing and started wiring up several IR sensors and a gyro.<br />
<br />
D finalized 4-bar synthesis for the mechanism that raises the ball hopper up over the wall. He also figured out how everything will fit inside the robot's circular footprint.<br />
<br />
== Day 2, Tuesday January 4 ==<br />
<br />
D got the majority of the detailed mechanical design finished.<br />
<br />
C tried to fix a problem with the v4l4j library by setting up a VM with Ubuntu 10.10 and a whole new dev environment on her machine, since certain camera functionality didn't work without v4l4j and v4l4j didn't work with her version of Ubuntu.<br />
<br />
C helped S and L set up eclipse projects and git working directories.<br />
<br />
C, S, L discussed software design.<br />
<br />
C went to get shop-trained.<br />
<br />
L rigged up some sensors.<br />
<br />
== Day 3, Wednesday January 5 ==<br />
<br />
D did more detailed CAD, set up files for rapid cutting, and created an assembly plan so we can start building this weekend.<br />
<br />
L, C attended lecture, troubleshot and tested sensors (IR, gyro), worked through checkpoint 3<br />
<br />
L assembled encoders, planning to put them on pegbot tomorrow<br />
<br />
S worked on setting up software framework<br />
<br />
C got camera working with personal machine (thanks, staff!), configured ant, set up convenient routine for running robot remotely<br />
<br />
== Day 4, Thursday January 6 ==<br />
<br />
L, S, C attended lecture<br />
<br />
L mounted encoders, wired up LED for debugging, helped debug checkpoint 4<br />
<br />
S continued working with software framework: main loop, state machine, sensor abstraction<br />
<br />
C wrote and tested a basic PID controller, worked on checkpoint 4, wrote some edge detection code<br />
<br />
== Day 5, Friday January 7 ==<br />
<br />
D, L, S, C met to get D up to speed, since he got in last night<br />
<br />
D measured components to make sure that they fit the mechanical design, made a prototype roller to test out functionality<br />
<br />
L wired up a controller for powering the roller's motor, helped C debug checkpoint 5<br />
<br />
S helped C debug checkpoint 5, worked on wall following code<br />
<br />
C worked on checkpoint 5, eventually got a PD controller running that tracks and approaches a red ball<br />
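The PD controller that tracks and approaches a red ball can be sketched like this. The gains, pixel-error convention, and function names are illustrative assumptions, not the team's tuned values.

```python
def pd_turn(error_px, prev_error_px, kp=0.004, kd=0.002):
    """PD steering term: error is the ball's horizontal offset from
    the image center in pixels; output is a differential wheel command.
    Gains kp and kd are placeholder values, not tuned constants."""
    derivative = error_px - prev_error_px
    return kp * error_px + kd * derivative

def drive_command(error_px, prev_error_px, base_speed=0.3):
    """Per-frame command: steer toward the ball while driving forward.
    A positive error (ball right of center) speeds the left wheel."""
    turn = pd_turn(error_px, prev_error_px)
    return base_speed + turn, base_speed - turn
```

Each vision frame supplies a new `error_px`; the derivative term damps the oscillation a pure proportional controller would show when the ball swings across the image center.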
<br />
== Day 6, Saturday January 8 ==<br />
<br />
D battled with obstreperous machines and got everything cut.<br />
<br />
L built a controller for our fourth motor, did other random electronics stuff, and built our very own field.<br />
<br />
C started writing sensor classes and finished checkpoints 4 and 5.<br />
<br />
S attacked wall-following.<br />
<br />
== Day 7, Sunday January 9 ==<br />
<br />
D machined hubs and didn't cut the hopper because he didn't have 1/16" Al.<br />
<br />
D and L did post-processing on the cut materials and assembled the majority of the robot. Left to do is mounting the hopper.<br />
<br />
C finished up sensor classes and wrote and tested color calibration code.<br />
<br />
== Day 8, Monday January 10 ==<br />
<br />
C wrote the checkpoint 6 code (button start, timing, color calibration) and wrote the ball picking up code in preparation for the mock competition.<br />
<br />
S finished some classes for velocity control and wall following (still untested).<br />
<br />
L mounted the buttons to start the robot for the mock competitions.<br />
<br />
C, L and S debugged the robot at the mock competition. Then we won the mock competition!<br />
<br />
C, S divided up the rest of the software areas.<br />
<br />
== Day 9, Tuesday January 11 ==<br />
<br />
L wired up limit switches and bump sensors and prototyped the break-beam sensor.<br />
<br />
D cut and installed the hopper and worked on the ball intake ramp.<br />
<br />
S debugged wall-following control.<br />
<br />
C wrote software to optimize image processing and ported our mock 1 code to the state machine framework.<br />
<br />
== Day 10, Wednesday January 12 ==<br />
<br />
C got most of the rest of the vision work done, including smoothing and down-sampling, and worked on blue-line filtering.<br />
<br />
S managed to get real-time parameter updating working! and succeeded with wall-following.<br />
<br />
L did lots of wiring, installed break beam sensor, defined I/O ports, got the roller motor PWM circuit going.<br />
<br />
D installed curved ball guide, wired up the new battery pack, and tweaked various things on the robot.<br />
<br />
== Day 11, Thursday January 13 ==<br />
<br />
C tested the code from mock 1 that was ported to the state machine framework, worked on more sensor abstractions, implemented timeouts.<br />
<br />
S finished writing wall following.<br />
<br />
D, L, S, C tested wall following on a simple field. <br />
<br />
C wrote scoring code.<br />
<br />
D, L, S, C tested wall following, picking up balls and scoring all together.<br />
<br />
== Day 12, Friday January 14 ==<br />
<br />
L, S, C tested, tweaked our code from the first mock competition ported over to a state machine framework.<br />
<br />
L, C tested, tweaked our scoring mechanism and code<br />
<br />
S, C made a state machine that incorporates wall following, mock competition 1, and scoring.<br />
<br />
S debugged wall following<br />
<br />
C added timeouts in code, wrote better color calibration utility<br />
<br />
L, D, S, C tested, tweaked, tested, broke the robot (oops), tested, tested, tested, won mock 2, got a lot of sleep<br />
<br />
D, L repaired the robot<br />
<br />
== Day 13, Saturday January 15 ==<br />
A slow day for us. We recovered from mock comp 2 and the all-nighter that came before.<br />
<br />
D fabbed robust "whiskers" for the bump sensors. Made a pack of A123 lithium polymer batteries that the bot could use.<br />
<br />
L prototyped replacing encoders with geartooth break beam sensors. Attempted to acquire a static IP on MIT's wireless. Started rewiring.<br />
<br />
S started to revise the wall-following code after mock 2.<br />
<br />
C started improving vision code w/ down-sampling and reduced the number of stills sent to botclient.<br />
<br />
== Day 14, Sunday January 16 ==<br />
<br />
S coderodered all night and day and made a stronger, faster, smoother, better wall following state.<br />
<br />
L got break beam encoders working!<br />
<br />
D made more battery options.<br />
<br />
C added a better vision calibration test, and made vision processing faster.<br />
<br />
== Day 15, Monday January 17 ==<br />
<br />
L, S fixed wall following - now it's a lot smoother. <br />
<br />
S started working on a wheel velocity controller. Tabled it -- encoders aren't good enough, and motion is fine for now.<br />
<br />
C started fixing synchronization issues in the vision code. Before, the robot was acting on incomplete vision data. Goal is to only pull complete (but slightly old) vision data.<br />
<br />
C implemented more ball fetching and scoring logic and LED debugging (color indicators), started optimizing generation of RGB to HSV lookup table, trimmed timeouts<br />
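The RGB-to-HSV lookup-table optimization mentioned above trades a one-time precomputation for a cheap per-pixel lookup at runtime. A sketch of the idea (the 5-bit quantization is an assumption, not the team's actual resolution):

```python
import colorsys

# Precompute HSV for quantized RGB (5 bits per channel -> 32^3 entries),
# so per-pixel color classification becomes a single table lookup
# instead of a full RGB-to-HSV conversion per pixel.
BITS = 5
SIZE = 1 << BITS

def build_lut():
    lut = {}
    for r in range(SIZE):
        for g in range(SIZE):
            for b in range(SIZE):
                # Map each quantized channel back to [0, 1] for colorsys.
                lut[(r, g, b)] = colorsys.rgb_to_hsv(
                    r / (SIZE - 1), g / (SIZE - 1), b / (SIZE - 1))
    return lut

def lookup_hsv(lut, r8, g8, b8):
    """Look up HSV for an 8-bit RGB pixel by dropping the low bits."""
    shift = 8 - BITS
    return lut[(r8 >> shift, g8 >> shift, b8 >> shift)]
```

With 32 levels per channel the table has 32,768 entries, small enough to build at startup; the quantization error is usually negligible for coarse red/green/yellow classification.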
<br />
== Day 16, Tuesday January 18 ==<br />
<br />
D, S, L, C tested and tweaked robot, modified gains, broke the robot, fixed the robot, test test test<br />
<br />
C fixed concurrent programming issues, worked in stuck handling, bump detection, added more SM logic, added some random behavior<br />
<br />
C, S worked on stuck detection<br />
<br />
== Day 17, Wednesday January 19 ==<br />
Mock 3 and sponsor dinner! We went in and did our run early in an attempt not to spend hours optimizing to each field.<br />
<br />
== Day 18, Thursday January 20 ==<br />
A slow day after mock 3 and sponsor dinner. We have begun to pinpoint failure modes and work relentlessly to resolve them.<br />
<br />
C worked on getting stuck detection with respect to bump sensors working robustly. Wrote code for new bump sensors. Implemented basic memory scheme used exclusively for helping stuck detection. Reserved 26-100 for a full day of testing on Sunday.<br />
<br />
S worked on stuck detection with respect to motor currents.<br />
<br />
L was a debug demon.<br />
<br />
D cut and installed new back angled plate, new front bump sensor, and moved angled IR sensors to the top plate.<br />
<br />
== Day 19, Friday January 21 ==<br />
<br />
S got relatively robust stuck detection using motor current and a double filter (sliding window + minimum number of time steps above threshold). Started working on encoder tick visualization and stuck detection. Started on straighter wall following.<br />
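The double-filter idea, a sliding-window average over motor current plus a minimum number of time steps above threshold, can be sketched as follows. The window size, current threshold, and step count here are illustrative, not the team's tuned values.

```python
from collections import deque

class StuckDetector:
    """Stuck detection from motor current using a double filter:
    a sliding-window average smooths sensor noise, and the robot is
    only declared stuck after the averaged current stays above the
    threshold for min_steps consecutive ticks. All parameter values
    are placeholders."""

    def __init__(self, window=10, threshold=2.5, min_steps=5):
        self.samples = deque(maxlen=window)  # recent current readings
        self.threshold = threshold           # amps, hypothetical
        self.min_steps = min_steps
        self.steps_above = 0

    def update(self, current_amps):
        """Feed one current sample; return True once stuck."""
        self.samples.append(current_amps)
        avg = sum(self.samples) / len(self.samples)
        if avg > self.threshold:
            self.steps_above += 1
        else:
            self.steps_above = 0  # any quiet tick resets the count
        return self.steps_above >= self.min_steps
```

The two filters serve different failure modes: the window average rejects single-sample current spikes, while the consecutive-step requirement rejects brief stalls such as bumping a ball.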
<br />
D and L mounted side bump sensors on the top half of the robot.<br />
<br />
C fixed false scoring where the robot would hit a tower and see the yellow wall and try to score. More reliable scoring over convex yellow walls. More reliable ball counting with breakbeam. Faster ball scanning.<br />
<br />
== Day 20, Saturday January 22 ==<br />
<br />
D fixed the hopper, which was starting to get stuck when lowering<br />
<br />
D, L, C debugged<br />
<br />
S continued work on motor current<br />
<br />
C improved scoring, ball collection, ball counting reliability and logic, added some time-based strategies, added code to prevent balls from jamming under the hopper<br />
<br />
== Day 21, Sunday January 23 ==<br />
With the blessing of MIT and the MASLAB staff we set up a full field in 26-100 and tested from 11am to 5am. The major fix of the day was ball-jamming, which is no longer a significant problem thanks to some repositioning and a ramp that blocks balls from going underneath the hopper while it is extended.<br />
<br />
== Day 22, Monday January 24 ==<br />
dan<br />
<br />
== Day 23, Tuesday January 25 ==<br />
stan<br />
<br />
== Day 24, Wednesday January 26 ==<br />
<br />
S, L, D, C continued work on final paper<br />
<br />
D switched out the drive motor that didn't break during the seeding tournament<br />
<br />
== Day 25, Thursday January 27 ==<br />
leighton <br />
<br />
== Day 26, Friday January 28 ==<br />
WE WON!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!1</div>Lbarneshttps://maslab.mit.edu/2011/wiki/Team_Ten/JournalTeam Ten/Journal2011-01-04T00:15:43Z<p>Maslab10: </p>
<hr />
<div>'''Monday January 3rd, 2011'''<br />
<br />
We managed to write the HelloWorld code, implement a simple robot, and run a simple program on its motors to show that it works. In the course of doing this, we confronted two problems. The first was that we were unable to communicate with the uorc board; the problem was with the board itself and was solved by short-circuiting the Estop. The second was that the motors were attached so tightly to the wooden board that they did not move. We solved the problem by loosening them. We're hungry!!!<br />
<br />
----<br />
<br />
'''Tuesday January 4th, 2011'''<br />
<br />
We have discussed strategy, structural and mechanical design and are still discussing strategy!! Plus we are deciding who gets to keep their computer plugged into the wall. Unfortunately, plugs only come in pairs. We have a really good schedule set up and are planning to follow it even if it means building the robot in the next 5 days! However, we are quite pleased with our design and feel that any obstacles will only make us stronger!!<br />
<br />
----<br />
<br />
'''Wednesday January 5th, 2011'''<br />
<br />
Today we completed checkpoint 3. We ate Subway sandwiches. Two of us got trained at Edgerton. We chose the design for our robot. We failed at naming our robot. Voitek is trying to do all the coding tonight for both checkpoints 4 and 5. YES! <br />
The design of our robot consists of two Archimedes screws and a collection basket on the top level of our robot to throw balls over the opposing team's wall. Simple and effective.<br />
Over and out.<br />
<br />
<br />
----<br />
<br />
'''Thursday January 6th, 2011'''<br />
<br />
Completed two Archimedes screws which are mirror images of each other. Preliminary ball-elevating operations succeeded, but we need to attach three shafts for it to work correctly.<br />
We also completed two checkpoints (4 and 5) in one day today.<br />
<br />
<br />
----<br />
<br />
'''Friday January 7th, 2011'''<br />
<br />
Completed two shafts that will make the Archimedes screws spin. <br />
Ordered gpu online. <br />
Began work on encoders.<br />
<br />
<br />
----<br />
<br />
'''Saturday January 8th 2011'''<br />
<br />
Continued work on encoders.<br />
<br />
----<br />
<br />
''' Sunday January 9th 2011'''<br />
<br />
Continued work on encoders.<br />
Cut wood for the screw casing which will be attached to the back of the robot. <br />
We plan to have an operational robot by Tuesday.<br />
<br />
----<br />
<br />
''' To be continued :] '''<br />
<br />
----<br />
<br />
''' Tuesday January 11th '''<br />
<br />
After some tinkering it seems that 1 screw will work much better than 2 screws, especially in terms of stability. So we reconfigured the designs. <br />
<br />
----<br />
<br />
''' Wednesday January 12th '''<br />
<br />
Epically snowed out of the lab.<br />
<br />
----<br />
<br />
''' Friday January 14th 2011 '''<br />
<br />
Mock contest today!! We found out that our robot was slightly too close to the ground, so we had to cut the guides on the bottom of the robot, and we took the robot apart after the mock to work on it. During the interview we put our robot together for a picture and it looked awesome! (with the exception of no orc board, computer, or battery) It looked as though our robot would run on willpower!!<br />
The front door was made and will hopefully trap all the balls bwahahaha... <br />
----<br />
<br />
''' Saturday January 15th 2011 '''<br />
<br />
Took the robot apart again :] this time to put on the new epic mouse encoders. They are attached with a spring system, making them adaptable to changes in terrain. Now just gotta code them perfectly >.<<br />
<br />
----<br />
<br />
''' Sunday January 16th 2011 '''<br />
<br />
Wrote code for PID!!<br />
<br />
----<br />
<br />
''' Monday January 17th 2011 '''<br />
<br />
Measured camera angles and positions relative to our center for the parameters of the vision program. Found an infinite loop in the vision program. Everything should be working better now, although the mice are not being cooperative and only one works at a time.<br />
<br />
----<br />
<br />
''' Tuesday January 18th 2011 '''<br />
<br />
Found that the mice input different data than what the program expected, which is why we couldn't find the mouse input when we initiated the program! Calibrating our cameras by using a graphical ground to travel on.<br />
<br />
----<br />
<br />
''' Wednesday January 19th 2011 '''<br />
<br />
So the robot will be named HITMAN: AGENT 47.<br />
With some more coding we may be able to graphically show the audience not only where the balls are, but also a map of the maze that we have traversed so far. And keep calibrating the cameras!!<br />
<br />
----<br />
<br />
''' Friday January 21st 2011 '''<br />
<br />
The front gate is officially on!! And it works quite well?! Less than a dollar's worth of wire and cardboard with a bit of hot glue does the job! The motherboard and orc board are officially installed and ready to go!<br />
<br />
----<br />
<br />
''' Saturday January 22nd 2011 '''<br />
<br />
So we are getting a new battery supply!! Two sets of car batteries, which will power the motherboard and orc board! They even fit quite well! And we still have to keep calibrating the cameras for the best results.<br />
<br />
----<br />
<br />
''' Monday January 24th 2011 '''<br />
<br />
The gear seems to have stopped staying attached to the screw. We can fix that! Also, we put sliders on the mice to prevent them from getting stuck and breaking. More camera calibration is in effect :]<br />
<br />
----<br />
<br />
''' Tuesday January 25th 2011 '''<br />
<br />
Mock 4.</div>Maslab10https://maslab.mit.edu/2011/wiki/Team_Three/JournalTeam Three/Journal2011-01-03T23:19:46Z<p>Cloitre: </p>
<hr />
<div><h3>January 3rd: Where Everything Started</h3><br />
<br><ul><br />
<li>Day one. Team is ready. Team is willing to show off... but uORC is not.</li><br />
<li>After assembling the pegbot and linking the (fancy) eeePC to our PC via the bot client, we encountered a serious problem. The uORC would not respond because of firmware.</li><br />
<li>Good news: the same program works on a previous version of the uORC (all our work is not in vain!)</li><br />
<li>Final positive event: we tore down the webcam to get at the essential part (the printed circuit) and it still works.</li><br />
</ul><br />
<br><hr><br><br />
<br />
<h3>January 4th: The Grand Design</h3><br />
<br><ul><br />
<li>Day two. Met in the morning, discussed the game. We have a plan: watermill+waterpark slides+catapult style robots with layers, just like an onion (or a cake).</li><br />
<li>After the group interview, we put all our sensors together and tested them. Everything works! We are on a roll.</li><br />
<li>What's next? CADed most of the components we have already today and will finish the robot architecture tomorrow.</li><br />
</ul><br />
<br><hr><br><br />
<br />
<h3>January 5th: The Grand Design Part II</h3><br />
<br><ul><br />
<li>CADing more stuff and started vision code</li></ul><br><hr><br><br />
<br />
<h3>January 6th: The All Seeing Eye</h3><br />
<br><ul><br />
<li>Yesterday involved much work, and we were all too tired to journal.</li><br />
<li>Video code and Cinnamon Toast Crunch abounds, and we are recognizing objects and taking names. (Of the objects)</li><br />
<li>With the rest of tonight we'll be wrapping up the PID controller using the quad encoders -- so far seems good. Tomorrow during the day, we will combine this with the camera error and be ready to win Mock 1.</li><br />
</ul><br />
<br><hr><br><br />
<br />
<h3>January 7th: The Puppy</h3><br />
<br><ul><br />
<li>Some camera/PID communication code went slightly kaput at the last minute, so we took the less elegant approach of not using PID for Checkpoint 5. We still completed it just fine, found balls, etc -- and noted that the robot looked like a puppy chasing balls.</li><br />
<li>Everybody is still helping out with the CAD design, which is looking more and more like a finished product. Where to put the laptop, though?</li><br />
<li>Some people will be working on finishing the CAD and fabricating the robot, while some are working on making the pegbot able to score some goals to win pizza at Mock 1.</li><br />
</ul><br />
<br><hr><br><br />
<br />
<h3>January 8th: Sensory Saturday </h3><br />
<br><ul><br />
<li>Time to place sensors on the pegbot. Lets check all of them, two encoders, gyro, bumps, IR, breakbeam.</li><br />
<li>Audren and Faye are working around the clock to finish the robot from their CAD models. Final check on the CAD design before sending it to the lasercutter.</li><br />
<li>??? What is the best way to design the ball ramp? Respects to waterpark designers. </li><br />
<li>??? What is the best way to design the ball scooper? Thinking strongly of using a kitchen utensil, can you guess which one?</li><br />
</ul><br />
<br><hr><br><br />
<br />
<h3>January 9th: CADing Continues </h3><br />
<br><ul><br />
<li>Unsatisfied with the design, we made more modifications. </li><br />
</ul><br />
<br><hr><br><br />
<br />
<h3>January 10th: Mock 1 </h3><br />
<br><ul><br />
<li>Lasercutting delayed because with the preset parameters, the 1/4" acrylic didn't get cut all the way through.</li><br />
<li>Mock 1 run does exactly what we want it to do. Spin and aim for the closest ball. We displace one. Yes! </li><br />
<li>Filling our bellies with our own pizza, Feta and Jalapeno, Sausage and Mushroom, coding continues. </li><br />
</ul><br />
<br><hr><br><br />
<br />
<h3>January 11th: Lasercutter Mastered </h3><br />
<br><ul><br />
<li>Central Machine Shop is out of 1/4" acrylic, so we make do with 3/8". To the reader: buy acrylic early.</li><br />
<li>Cut, cut, cut ... </li><br />
</ul><br />
<br><hr><br><br />
<br />
<h3>January 12th: Snow Day </h3><br />
<br><ul><br />
<li>[http://voices.washingtonpost.com/capitalweathergang/2011/01/snowstorm_breezes_through_nyc.html Boston Buried, We Were Trapped]</li><br />
<li>Audren walked to lab and found it empty. Proceeded to assemble the camera mount.</li><br />
<li>Jamie is coding up a storm -- PID, wall detecting, ball finding -- all a piece of cake.</li><br />
<li>Chances at winning Mock 2 without a properly tested robot are small, but we might be able to pull it off.</li><br />
</ul><br />
<br><hr><br><br />
<br />
<h3>January 13th pm - 14th am: Race to Mock 2</h3><br />
<br><ul><br />
<li>Coding happened all night long, and a lot got done, but there are still some screw-ups, and the finished robot differs from the testing platform</li><br />
<li>After debugging some crashing into wall behavior for quite some time, it was found that a wheel was loose. Hard to avoid walls with a loose wheel; that's what I always say.</li><br />
<li>Robot completion is nearing in the AM. For Mock 2 we will not be able to score over the yellow wall; however, we plan to use our incomplete robot to score in regular goals. Retrofitting of a makeshift arm+servo mechanism is underway, and could be promising.</li><br />
<li>A Mock 2 victory is looking very distant at the moment. However, unlike Mock 1, it is a possibility.</li><br />
</ul><br />
<br><hr><br><br />
<br />
<br />
<h3>January 14th: Mock 2</h3><br />
<br><ul><br />
<li>Whole day assembly in preparation for the mock competition. No ball slide. Arm hastily put together. No drawbridge. Minimal scoring mechanism. </li><br />
<li>Ready, Set, Charge, ---> Ram the Wall</li><br />
<li>Wheel determined to be too low; it would get stuck on carpet bumps. Plan to raise it in the next CAD revision.</li><br />
<li>Before final run, Wheel Falls Off</li><br />
<li>Hurried off to Rental Equipment Fitting. (We are going on a team ski trip)</li><br />
</ul><br />
<br><hr><br><br />
<br />
<br />
<h3>January 15th: Extreme Makeover - The Robot Edition</h3><br />
<br><ul><br />
<li>+BallSlide, ++Drawbridge</li><br />
<li>CADing new wheel design</li><br />
</ul><br />
<br><hr><br><br />
<br />
<br />
<h3>January 16th: Second Sunday</h3><br />
<br><ul><br />
<li>Bought an aluminum tube as a backup for the BallSlide</li><br />
<li>Final final CAD layout</li><br />
</ul><br />
<br><hr><br><br />
<br />
<h3>January 17th: Tinkering</h3><br />
<br><ul><br />
<li>Lasercut new wheels</li><br />
<li>Playing with all the bump sensors, Count:8</li><br />
<li>Attempted arm design and finished drawbridge design</li><br />
</ul><br />
<br><hr><br><br />
<br />
<h3>January 18th - 19th am: Race to Mock 3</h3><br />
<br><ul><br />
<li>Finished arm design</li><br />
<li>Remounted bump sensors because the epoxy came off.</li><br />
<li>All-Night Code</li><br />
</ul><br />
<br><hr><br><br />
<br />
<h3>January 19th: Mock 3</h3><br />
<br><ul><br />
<li>Found out that the bump sensors were mounted above the wall, and need to lower them.</li><br />
<li>Give the Monsieur a moustache.</li><br />
<li>Change strategy at the last moment to WIN!</li><br />
<li>Food was delicious, sponsors liked our robot.</li><br />
</ul><br />
<br><hr><br><br />
<br />
<br />
<h3>January 20th: Code code code !</h3><br />
<br><ul><br />
<li>As Monsieur Robot is somehow operational, James could test his code on it<br />
<li>MechE designers chill out<br />
</ul><br />
<br><hr><br><br />
<br />
<br />
<h3>January 21st: Rush before skiing</h3><br />
<br><ul><br />
<li>Code keeps on improving<br />
<li>Start the design of an acrylic ramp that would be more reliable than the wooden prototype<br />
<li>Left lab early to catch the bus for Sunday River.<br />
</ul><br />
<br><hr><br><br />
<br />
<h3>January 22nd to 24th: On the slopes at -20 degrees Fahrenheit</h3><br />
<br><ul><br />
<li>The team takes some rest before the last week.<br />
<li>Sunday River is awesome.<br />
</ul><br />
<br><hr><br><br />
<br />
<h3>January 25th: Back in business</h3><br />
<br><ul><br />
<li>Back to code improvement<br />
<li> The design of the acrylic ramp is finalized <br />
</ul><br />
<br><hr><br><br />
<br />
<h3>January 26th: Fixing details</h3><br />
<br><ul><br />
<li>The ramp is laser cut and assembled. It works just fine.<br />
<li> Breakbeam sensors are built and mounted on Monsieur Robot. Now we can catch balls more reliably.<br />
<li> The two front bump sensors are replaced with new ones. They are sturdier and look like they can handle several hours of use.<br />
</ul><br />
<br><hr><br><br />
<br />
<h3>January 27th: What else?</h3><br />
<br><ul><br />
<li>A small modification in the code makes a huge difference. Monsieur Robot goes slower and makes fewer mistakes.<br />
<li>The wheels became loose again. We decided to replace the screws with unused ones and Loctite them. We never had any problem with those again.<br />
<li> We impound Monsieur Robot for the night. Tomorrow is gonna be a big day for him!<br />
</ul><br />
<br><hr><br><br />
<br />
<h3>January 28th: This is it!</h3><br />
<br><ul><br />
<li>Final competition. We finished 2nd after losing twice to Team 2, in the winners' bracket final and the overall final.<br />
<li>We did not encounter any kind of failure during the entire competition, nor did we change the code between rounds. Monsieur Robot worked perfectly!<br />
<li> Most importantly, Monsieur Robot won the Best Dressed award - the secret goal of our team from the very beginning :)<br />
</ul><br />
<br><hr><br><br />
<br />
<br />
<!-- Title: The final Cut --></div>Cloitrehttps://maslab.mit.edu/2011/wiki/Team_Six/JournalTeam Six/Journal2011-01-03T19:31:15Z<p>Wings: </p>
<hr />
<div>__TOC__<br />
<br />
==January 3rd, 2011==<br />
<br />
First time writing the date this year, and I (Piper) got it right! W00! Anyway, our team began Maslab in an extremely sleep-deprived state, which made things very amusing. (I was incredibly giggly...) (I am journaling this sleep-deprived. Expect lots of exclamation points and smiley faces!)<br />
<br />
Problems we ran into with our code: our "Hello, World!" statement won't print without being in an infinite loop. We're not entirely sure why, but this problem didn't carry over to our Drive class (which we used to get our second component of the checkoff - the robot drove forward for three seconds). This is a good thing, because interrupting the infinite loop didn't work, and we can't make infinity last only three seconds :). After a little debugging, our code successfully drove the robot forward for three seconds and stopped.<br />
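One way to make "infinity last only three seconds" is to poll a clock inside the drive loop and stop explicitly afterwards. This is only a sketch under assumptions: the `Motors` interface and `driveForward` helper below are made up, not the actual Maslab/OrcBoard API the team used.

```java
// Sketch: drive forward for a fixed duration, then stop.
// `Motors` is a hypothetical stand-in for the real motor API.
public class TimedDrive {
    interface Motors {
        void setVelocities(double left, double right);
    }

    /** True while fewer than durationMs milliseconds have elapsed since startMs. */
    static boolean shouldKeepDriving(long startMs, long nowMs, long durationMs) {
        return nowMs - startMs < durationMs;
    }

    static void driveForward(Motors m, long durationMs) throws InterruptedException {
        long start = System.currentTimeMillis();
        while (shouldKeepDriving(start, System.currentTimeMillis(), durationMs)) {
            m.setVelocities(0.5, 0.5); // keep commanding "forward"
            Thread.sleep(50);          // don't spam the board
        }
        m.setVelocities(0.0, 0.0);     // always stop explicitly at the end
    }

    public static void main(String[] args) throws InterruptedException {
        // Print the commands instead of driving real hardware.
        driveForward((l, r) -> System.out.println("L=" + l + " R=" + r), 3000);
    }
}
```

The key design point is the explicit `setVelocities(0, 0)` after the loop: the motors keep their last command, so the program must command zero rather than just exiting.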
<br />
With the actual board, we were able to attach our wheels, motors, and castor to our base. We had trouble securing all our wires to the microboard, since the crimping didn't seem to clamp the wires down all the way. We also had to short our emergency stop (due to Maslab adjusting some of the code this year) before our robot could work. But worked it did! And we got a checkoff! And then it broke again! After replacing a fuse and soldering the insides of our battery clips, our attachments were more secure and the robot worked again :).<br />
<br />
<br />
'''[x] Since one of our motors was initially wired backwards, our ground and power do not follow the standard color convention. We should rewire this.<br />
<br />
'''[x] Double check to see that when our code decides the robot is moving forward, the robot is moving forward instead of backwards.<br />
<br />
==January 4th, 2011==<br />
<br />
'''Possible algorithm for robot''': The algorithm runs as long as the timer reads less than 3 minutes (180 seconds). The first ball seen has its color noted and saved in a variable called our_color. As soon as this is established, search the map for goals; goals are detected as follows: if a yellow wall is seen, drive up to it and use the IR sensor to determine whether or not the depth of the wall varies along its length. If the wall does vary, save the location of the goal in a list called goals_loc[]. When the number of goals is greater than 2 (or more than 30 seconds have elapsed), begin to look for balls. Whenever a ball is found, look for the nearest goal and transfer the ball to that goal. Do this as long as the timer has not passed 2 minutes. After two minutes, whenever a ball is found, compare the distance to the nearest known wall on the opponent's side against the distance to the nearest known goal, then draw a random number between 0 and 1 and save it to rand_n. If rand_n > e ^ -(d_togoal - d_towall), throw the ball over the wall; else, throw the ball into the goal. Stop after 3 minutes.<br />
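The over-the-wall-versus-goal decision at the end of that algorithm is compact enough to sketch directly. The function below just encodes the stated rule rand_n > e^-(d_togoal - d_towall); the class and parameter names are ours for illustration, and both distances are assumed to be in the same units.

```java
// Sketch of the ball-disposal rule described above: the farther away the
// nearest goal is compared to the nearest opponent-side wall, the more
// likely the robot is to throw the ball over the wall instead.
public class BallDisposal {
    /** True if the ball should go over the wall, false if into the goal. */
    static boolean throwOverWall(double dToGoal, double dToWall, double randN) {
        return randN > Math.exp(-(dToGoal - dToWall));
    }

    public static void main(String[] args) {
        double randN = Math.random(); // drawn once per ball, in [0, 1)
        // Goal far, wall close: exp(-(10-1)) is about 1.2e-4, so the ball
        // almost always goes over the wall in this situation.
        System.out.println(throwOverWall(10.0, 1.0, randN));
    }
}
```

Note that when the goal is closer than the wall, the exponent is positive, so e^-(d_togoal - d_towall) exceeds 1 and the ball always goes to the goal, which matches the intent.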
<br />
'''Robot strategy''': We decided that we are stronger on the course 6 side of the spectrum than the course 2 side of the spectrum. We're going to keep our robot design relatively simple, with a conveyor belt and accompanying pinball-machine-like doors to pull balls into the robot and drop them into a compartment. On the side, we will have a door that opens when told so that we can drop all our balls into the goal. We decided not to drop balls onto the other side in order to keep our robot simple so that we can focus on our code.<br />
<br />
'''Schedule'''<br />
<br />
* Michael - Since Maslab is his main IAP commitment, he'll code in the evenings and possibly during the day. <br />
* Shawn - Work until 4pm every day, code at night with Michael.<br />
* Piper - Work on building between lecture (or late mornings when lecture isn't happening) and 7pm daily.<br />
* Xavier - Work on building between lecture (or late mornings when lecture isn't happening) and 7pm daily.<br />
<br />
'''Other'''<br />
* Piper did shop training today.<br />
* Xavier did laser training today.<br />
<br />
<br />
'''[x] TA gave us a suggestion for fixing the code problem we had on the first day. Try it out. [it works now! yay!]'''<br />
<br />
== January 5th, 2011 ==<br />
<br />
'''New robot strategy''': We came to realize that our original robot design was problematic. Our design had a lot of failure modes. The conveyor belt roller would have to be very small so that the ball would be pulled up it rather than pulled away, and the doors pushing the ball onto the roller would have to be well synchronized. We decided to start from scratch, and came up with something we believe will work much better. So, yay! :)<br />
<br />
We're going to have a scoop with a slanted arm leading down, away from the robot. We will scoop the ball, then rotate the arm up so that it rolls down the arm into our collection box. This requires another motor with a decent amount of torque, and will likely be our main expense. Our collection box will be at least 6" above the ground. Our computer, battery, and orc board will live under the collection box. The collection box itself will be slanted towards a drawbridge on the back, which we will use a servo to open and close. With this design, we believe that building will be easier, ball collection will be more predictable, and we will now be able to score over the wall! <br />
<br />
We are currently working on a SolidWorks design. <br />
<br />
'''Other labwork''': We were able to successfully attach our new sensor to the orc board and calibrate it. (The distance is 16.93*output+1.624.) We decided that for consistency, we're going to be American and use inches. We got the robot to go up to a wall, sense it, stop, and turn. Thus earning Checkpoint 3! Yay! We started working on our image processing/camera, as well as making some cutouts to envision our robot better.<br />
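Since the calibration above is a linear fit, converting a raw IR reading to inches is a one-liner. The constants are the ones quoted in the entry; the class and method names are illustrative.

```java
// Sketch: apply the linear IR calibration quoted above
// (distance in inches = 16.93 * output + 1.624).
public class IrCalibration {
    static final double SLOPE = 16.93;
    static final double INTERCEPT = 1.624;

    /** Convert a raw sensor output to a distance in inches. */
    static double irToInches(double output) {
        return SLOPE * output + INTERCEPT;
    }

    public static void main(String[] args) {
        System.out.println(irToInches(1.0)); // 18.554 inches
    }
}
```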
<br />
'''Other''': Piper is no longer as sleep-deprived or as sick! Thus, the journaling is less scatterbrained. Bwaha. Shawn is somehow getting his thesis work done despite also doing Maslab! Xavier is sore from Ultimate frisbee (and has practice again tonight lol)! Michael is having trouble constantly switching between Mac OS, Ubuntu, and Windows.<br />
<br />
'''Other other''': Our print statement during testing was "Scully Scully Scully Scully Scully". This amused Michael greatly. Piper may or may not be obsessed with the X Files.<br />
<br />
<br />
<br />
'''[x] Possibly move wheels more towards the middle of our board. [we're probably going to go with a two-wheel-two-caster operation]<br />
<br />
'''[x] Figure out the torque needed on the new motor (weigh the balls?) [balls are super light]<br />
<br />
==January 6th, 2011==<br />
<br />
'''Lab''': Shawn and Xavier worked on robot design while Michael worked on the Checkoff 4 imaging problem. Piper joined Michael after the H-bridge tutorial. Also soldered a header together so that we could short our ESTOP in a less painful way :).<br />
<br />
'''Other''': Piper went to the H-bridge tutorial. Below are her random notes.<br />
<br />
* Current sense lines, always ground them. Both enable A and enable B should be tied to Vss. Input 1 and input 2 are tied to enable A and 3/4 are supplied to B. Orcboard output 5V needs to be connected to 5V line. Need to ground battery to ground and ground orcboard to ground. Supply voltage needs to be connected to the 12V line on battery.<br />
* Control-c might stop the code but not the motor. Make sure to clear the ports so that the motor stops in time.<br />
* Data sheet here: http://web.mit.edu/6.186/2011/Lectures/L298.pdf<br />
* There was an email sent out about shutdown hooks. We should look here: : http://download.oracle.com/javase/6/docs/api/java/lang/Runtime.html#addShutdownHook(java.lang.Thread)<br />
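The shutdown-hook link in those notes addresses the "Control-C might stop the code but not the motor" problem: a registered hook runs even when the JVM is killed with Ctrl-C, so it can zero the motors. A sketch, assuming a made-up `stopAll`/`motorSpeeds` stand-in rather than real OrcBoard calls:

```java
// Sketch: stop the motors on JVM shutdown (e.g. Ctrl-C), using the
// Runtime.addShutdownHook API linked above. The motorSpeeds array is a
// hypothetical stand-in for real motor port commands.
public class SafeStop {
    static final double[] motorSpeeds = {0.5, 0.5}; // pretend both motors are running

    /** Zero every motor command; on real hardware this would clear the ports. */
    static void stopAll() {
        java.util.Arrays.fill(motorSpeeds, 0.0);
    }

    public static void main(String[] args) {
        // Register the hook once, up front; it fires on normal exit and Ctrl-C.
        Runtime.getRuntime().addShutdownHook(new Thread(SafeStop::stopAll));
        // ... normal robot loop runs here; even if it is interrupted,
        // the hook commands the motors to zero before the JVM exits.
    }
}
```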
<br />
'''Other other''': <br />
<br />
* Michael and Piper have been arguing about who's a better Mexican for the past half hour.<br />
* And then a Spanish-speaking machine called Xavier's phone. He might be part of the Mexican mafia, and thus be most Mexican of all.<br />
* Robot beta plan (Piper's way): Robot is a giant skewer. Robot skewers other team's robot until dead. Robot then skewers own balls and tosses skewer onto other side. Profit!<br />
* Robot charlie plan (Shawn's way): robot is a giant vacuum. Robot vacuums up all balls. Robot fires all balls at the other team's robot. Profit!<br />
* Violence is the answer.<br />
* Robot delta plan (Shawn's other way): Build a plane. Collect balls upon takeoff. Dump on other side. Should be easy, right? :D<br />
* We were really happy to realize that our journal was long, thus writing the final paper will likely be super easy. Then we realized only about a quarter of the information here would actually be useful to our papers >.> :D<br />
<br />
<br />
'''[x] Figure out why our motors are so temperamental.<br />
<br />
'''[x] Decide where we want our sensors and camera.<br />
<br />
==January 7th, 2011==<br />
<br />
'''Labwork''': Our code can find red, blue, green, and yellow things! Yay, Checkoff Four! We also now have a full cardboard mockup of our robot, so we can start building. We're currently hard at work on the Checkoff Five code.<br />
<br />
'''Weekend plan''':<br />
* CODE CODE CODE<br />
* SolidWorks ? [decided not to do this because we have a physical model]<br />
<br />
'''Future plan''':<br />
* Use Monday's mock competition as a place to test our detection code and robot movement<br />
* Make robot parts and build this week!<br />
<br />
==January 10th, 2011==<br />
<br />
'''Lab''': SO MUCH CODING!!!!!! We did a lot of coding today, as well as revised our strategies for post-mock-competition. (Our robot isn't fully built yet, so our idealized code wouldn't have worked for this mock competition.) We ran into a lot of problems getting our camera to work :( Hopefully we'll be able to spend Tuesday-Thursday building and constructing for our robot, and testing our more finalized code by Thursday evening.<br />
<br />
'''Other''':<br />
* New robot strategy: Our robot will totally not be a trained puppy with an orcboard taped to it. It may ''look'' that way, but we promise the drool is purely synthetic.<br />
<br />
'''Other Other''':<br />
* "How can Farrah Fawcett die? She was so cool!"<br />
* Batman hasn't died because he is ''actually'' cool.<br />
<br />
<br />
'''[x] Figure out why our repository is sad. (We ended up not using the repository.)<br />
<br />
==January 11th, 2011==<br />
<br />
'''Lab''': EVEN MORE CODING OMGZ!!!! We also made some progress on making our design REAL! We have dimensions ready and hinges designed so that we can spend the rest of the week building, hoping to be done by Thursday evening. We got the camera to supply tracking! Now it accurately gives information to the controller and gives recommendations on what way the robot should turn.<br />
<br />
Michael then doubled the speed of image processing. We were doing the main process twice.<br />
'''Other''':<br />
* Beakman is better than Bill Nye, but no one ever saw Beakman :/<br />
* Xena was way more attractive than Bill Nye.<br />
* We have been in Maslab forever. <br />
* Piper hates computers. (But she really loves them.)<br />
* The passage of time is truly a mystery. "How did it 6pm?!"<br />
* Michael is obsessed by the song "Vagabond" from 500 Days of Summer (a movie that makes him cry).<br />
<br />
==January 12th, 2011==<br />
<br />
Snow day!!!! Michael did some angle from image data calculations and got stressed when he couldn't work it out so he came to annoy Piper.<br />
<br />
==January 13th, 2011==<br />
<br />
'''Lab''': Since lab was closed yesterday, we're now working on getting our pieces for the bot today. We decided to go with plexiglass, which is a bit more difficult to cut, so we've changed to lasering it. Our robot can now hopefully work its way ok around walls. Yay for metalwork and running to the hardware store to get hinges. (If only Economy Hardware weren't closed :(.)<br />
<br />
'''Other''':<br />
* Descent was the best game ever.<br />
* Scully is really, really short. Like, 5th grader short. And she's the same height as Piper.<br />
<br />
==January 14th, 2011==<br />
<br />
'''Lab''': After many interesting events last night/this week where the world just did NOT want us to build, we finally got our lasercutting-of-the-plexiglass done! Yay! We also made l-bars and other things to prepare for finally building this thing, which we will do tomorrow. We got checkoff 7 by discussing our current code as well as our plans for our robot design.<br />
<br />
'''Other''':<br />
* Shawn and Piper attended a wedding, and hence look incredibly snazzy today.<br />
<br />
== January 17th, 2011 ==<br />
<br />
'''Lab''': Our robot is finally a Real Thing now! It's assembled and shiny. We had to make some small modifications/redrilling. It looks like for our scoop, we'll be attaching the motors to string, having drawbridges for both our scoop and our open door. The Plexiglas might distort some of our images, so Michael will be writing code tonight to account for this. <br />
<br />
'''Other''':<br />
* Mystery Hunt was ''fantastic''! Xavier was on the winning team, and thus will be writing next year's Hunt! Shawn and Piper have finally thawed out from running around Harvard at 3am.<br />
<br />
== January 18th, 2011 ==<br />
<br />
'''Lab''': We can program our servo to do what we want! We also learned to use a lathe (at the Edgerton), and thus made nifty pulleys. We placed our center wedge in, positioned such that balls will roll out of the opening. The teeth for our scoop are complete.<br />
<br />
'''Other''':<br />
* We have figured out our strategy utilizing Portal guns. Unfortunately, GLaDOS does not exist. Yet.<br />
<br />
== January 19th, 2011 ==<br />
<br />
'''Lab''': Our robot is almost done! We currently have our teeth and servo taped to our robot instead of bolted in; we still need to plug in our IR sensors and bump sensors, attach string to our drawbridge motors, and rewire those motors. We'll probably add a spacer so that our scoop doesn't encounter enough friction to stop our robot. This should all be doable within the next day or so. After that, it looks like we'll just have debugging ahead of us (haha, "just"). We need to recalibrate our colors and integrate our new sensors into the code.<br />
<br />
'''Other''':<br />
* Piper's love allows aliens to cure cancer. Do not question X Files logic.<br />
* Did we ever mention our most dastardly winning scheme? Teams don't seem to change their laptop passwords, so it'd be awfully easy to ssh in and...<br />
* Our robot's particularly ferocious appearance has led us to finally decide on a name... '''Steampunk Cthulhu'''. (But the OrcBoard's name is still scully.)<br />
<br />
== January 20th, 2011 ==<br />
<br />
'''Lab''': We are no longer using tape to keep our servo and teeth and various other parts together! We have lost our ghetto! We remachined a few parts and got some new ones (like our servo attachment) at Edgerton. Our servo is finally attached and done. Our drawbridge is ACTUALLY FUNCTIONAL!!!1!!eleventy!!! Our drawbridge uses pink strings, which only make Steampunk Cthulhu more fierce. It turns out Gorilla Glue is not sufficient for our teeth, so we're currently using tape. Yay, ghetto again!<br />
<br />
<br />
'''[x] Redo teeth?<br />
<br />
'''[x] Latch down collector plate so that the motors don't start drawing itself up instead of the scoop<br />
<br />
== January 24th, 2011 ==<br />
<br />
OH MAN LAST-MINUTE STUFF NOTHING COULD POSSIBLY GO WRONG!<br />
<br />
'''Lab''': Today's a day for odds and ends, in that the bulk of our robot is built and we're attaching our last (but crucial) bits. Our teeth are now bolted to our robot. Our collector plate will be bolted to the rest of our robot so that it does not raise when the scoop is supposed to raise.<br />
<br />
'''Other''': <br />
* IN THE NAME OF COMPASSION AND MEDICAL SCIENCE, I CAN SAVE MANY LIIIIIIVVESSSS IF YOU GIVE ME ONE MAN. (Jekyll & Hyde is coming to MTG soon!)<br />
<br />
'''[x] Attach bump sensors and IR to OrcBoard {though no longer using IR sensors}<br />
<br />
''[x] Do we want to play with a gyro? {No}<br />
<br />
''[x] Maybe something for our balls to slide off of when released, to make sure they go over the wall<br />
<br />
''[x] Scotch tape to smooth out teeth (or a stop for our scoop? {no})<br />
<br />
== January 25-27th, 2011 ==<br />
<br />
'''Lab''': CODE CODE CODE CODE c'mon Mrs. Cthulhu, why don't you work?? ;_;<br />
<br />
<br />
'''Other''':<br />
* JOSIE AND THE PUSSSYCATTSSSSS<br />
<br />
"What are you guys doing?" <br />
"We're spoiling Star Wars for wings again."<br />
"What? I already know what happens. That one dude is the guy's father and everything is fine."<br />
"Just like that, huh?"<br />
<br />
<br />
'''[ ] Make sure to tighten all our bolts before we turn in our robot<br />
<br />
''[ ] String/motor relationship should be improved for consistent winding.<br />
<br />
<br />
== Additional Notes ==<br />
<br />
<br />
Notes From Xavier <br />
<br />
<br />
'''tldr for teams that want to know what we learned.'''<br />
<br />
Our team did what we came to Maslab to do. Everyone had their own goals in mind, and I think that we all ended up getting more than we were asking or hoping for. One thing that we did not do, which I imagine there have been heated discussions about, is define roles. This is only in the sense that we did not tell anyone "This is your job, you must do this." It was more of a "Thank you for picking this thing to do to help the team."<br />
<br />
<br />
This worked out really well in all but one instance, and that was with respect to our code. The roles that we intended to have were split into two groups: building and coding. Our two strongest coders would write and handle the code, and everyone else got to build. What developed was a complex hierarchy of task management where I did most of the building after discussing the design with Shawn, and Wings kept us on track by not only updating our list of things we planned to do and checking that we had done them, but also writing our journal entries and eventually contributing heavily to the final paper. Wings and Shawn would also solder and construct other parts that would be used to build the robot. This system worked great for the part of the team that used it. We would talk to each other, sanity check all of our ideas, and even propose alternative ideas of varying quality so that we could come to an agreement on what a good solution would be. This did not happen for our code, and that was our downfall.<br />
Before I potentially offend Michael, it should be noted that he was perfectly competent and friendly, and wrote a lot of time-consuming code, all the while looking for more ways in which he could help the team. There was simply a disconnect between communicating how the code acted and what the code needed. The rest of the team was always aware of what code had been written, such as the state machine or the vision processing code. What we did not know is how it intended to handle the information that it got. This made Michael too important to the team: he was the only member whose loss would leave the others unable to function productively, and when he got sick during the final week of IAP it brought our progress to zero until he felt well enough to write more code (which he proceeded to do for > 24 of the next 32 hours). During this time we finally got a chance to really debug the code for the first time, and a large percentage of the issues we were dealing with could have been dealt with previously if we had been having productive conversations about the code, and also if Michael had had some form of physical thing he could have used to test with. Overall I am glad that we got through Maslab together; we would most certainly do better if we knew what we now know, and I hope that if you skipped to the end, you take the time to have more than a single person who understands and can work on the code.<br />
<br />
== Competition Day ==<br />
<br />
YAYYYYY it's done!<br />
<br />
Shawn, Xavier, and Michael had a running bet on how many times Wings would say "ponies" in the course of the month. Shawn won at around 50 ponies. Wings was incredibly amused to find this out.</div>Wingshttps://maslab.mit.edu/2011/wiki/Team_OneTeam One2011-01-03T17:58:42Z<p>Andreali: </p>
<hr />
<div>Maslab team 1<br />
<br />
Note: Our team name is part of a game. The game's objective is to come up with the most bizarre/awesome/awkward team name EVER!<br />
<br />
List of names thus far:<br />
<br />
* Sheila [Allan]<br />
* Troll Ex Machina [Erons]<br />
* Ghost of the Navigator [Erons]<br />
* Ghost of the Captivator [Erons]<br />
* Infidel Castro Motor Oil [Erons]<br />
* S.T.R.O.N.G. A.R.M. [For Muth]<br />
* The Pink Yeast Beast from the East [Red]<br />
* ROFLCOPTER [Erons]<br />
* Instruments of Pain [Erons]<br />
* if(config){18.111.97.222} [Andrea]</div>Andrealihttps://maslab.mit.edu/2011/wiki/Team_One/JournalTeam One/Journal2011-01-03T17:57:21Z<p>Andreali: </p>
<hr />
<div>'''Day 0 (Jan 02):'''<br />
<br />
-Discussion over Skype of strategy, organization and task allocation. (E,C,Al,M)<br />
<br />
'''Day 1 (Jan 03):'''<br />
<br />
*Lectures 0, 1 and 2 were attended. (C,Al,An)<br />
*Built the peg bot with two casters<br />
*Sent messages from the eeePC to Allen's laptop<br />
*Made robot drive forward to fulfill checkpoint 1<br />
<br />
'''Day 2 (Jan 04)<br />
<br />
*Andrea attended laser cutter training at Edgerton<br />
*Attended lecture (C, Al, An)<br />
*Discussed our schedule, strategy, code architecture, mechanical design with Ellen for checkoff 2<br />
*Cory attended general shop training at Edgerton<br />
*Allan worked on vision code<br />
*Andrea worked on sensor implementation<br />
*Cory and Erons began prototypes of the shooter and roller component. We decided to work in Random's shop and EE lab for the evening because we had ready access to various odds and ends to make our prototypes out of.<br />
*Andrea and Allan survived a day with each other coding.<br />
<br />
[[Image:AndreaAllan.jpg]]<br />
<br />
''' Day 3 of captivity (Jan 05)<br />
*Captors have completed checkpoint three last night<br />
*Botclient continues to provide problems, will address<br />
*Decided to re-porpoise robot to kill captors in sleep<br />
*Matt has returned from the sunny west!<br />
*Cory slept through lecture on accident because taking the red-eye back to boston from California destroyed his sleep cycle. He apologizes.<br />
*Replaced left motor<br />
*Turned in checkpoint three<br />
*Purchased materials for the robot<br />
*Set up git (Thanks Matt)<br />
*Erons found a potential candidate for a shooter motor underneath his bed<br />
[[Image:What9000.jpg|frame|none|alt=WHAT 9000?!?| *This motor is rated at 25,500 RPM at 7.2Volts. Our design only requires 2000 RPM to launch balls about 5 metres]]<br />
<br />
''' Day 4 of captivity (Jan 06)<br />
*Captors made me identify a ball<br />
*Weapon system (shooter) prototyped, the fools plan to arm me<br />
*Structure CAD work well under way, should be finished soon<br />
*Wander code began<br />
<br />
''' Day 5 of captivity (Jan 07)<br />
*CADs being finalized for laser cutting on Saturday and Sunday<br />
*Worked on the interface between the high RPM motors and the rollers<br />
*Worked on PD code<br />
<br />
'''Day 6-7'''<br />
*Light workload for the weekend because going full throttle for the entire month would kill us. Coders programmed a little, meches refined the design in anticipation of laser cutting and waited for parts to arrive.<br />
<br />
''' Day 9 of captivity (Jan 11)<br />
*My battle armor has been laser cut, assembly in progress (waiting for parts in the mail)<br />
*PID controller for ball targeting complete<br />
*Vision code being reworked<br />
*Ball conveyor being constructed<br />
<br />
'''Days 10-12'''<br />
*Cory editing here. I have grown bored of the captivity joke that erons was running. I am writing these journal entries retroactively. We did not sleep properly in this time window. Dates lost all meaning. <br />
*The mechEs put together the laser cut bits. the force of friction alone turned out to be enough to hold the pieces of plastic together for testing purposes, so we didn't bolt anything. <br />
*The roller functions but needs to be streamlined before competition.<br />
*In an attempt to reduce friction in the shooter and roller, we inserted bushings into the holes. HOWEVER, in doing so, we both made the bot ugly and (in the shooter's case) threw our carefully machined holes out of whack. Parts of the robot will probably be re-cut.<br />
*lots of work was done at weird times of day at MITers. Holy crap people are seriously there at all times of day.<br />
*The coders continued developing their respective code sections. Matt did vision, Allan's doing PID, and I think Andrea was doing the wandering/ball-searching algorithm <br />
<br />
''' Day 13 (Jan 15)<br />
<br />
[[Image:RoboRage.png]]<br />
<br />
<br />
<br />
'''Weekend - 16th and 17th'''<br />
*Due to a combination of crappy sleep cycles and other things, we took a light weekend. The coders either rested or went to participate in Mystery Hunt. Meches re-CAD-ed.<br />
<br />
<br />
'''MLK Jr Day:'''<br />
*The replacement laptop appears to have various issues that the coders have been dealing with while trying to test various code snippets. Some bin files were messed up, and things like that. Ideally one of them will update this journal entry with specifics.<br />
*The new parts were re-laser-cut, with extra allowances for things like bushings and the electronics. Also the back plate now features a sweet phoenix-looking thing that was painted on the wall in Allan's room by a previous resident.<br />
<br />
<br />
'''Jan 24:'''<br />
*The issues with the drive motors have been fixed. Our shooter is being replaced with a servo dumping gate<br />
*Ball rollers are still flaky, but being updated and worked on<br />
*On the new robot: wall following + ball gathering + wall acquisition are all checked and working<br />
*Magnetometer code is being written. Here is an image of the...errr ''fabulous'' data we have:<br />
[[Image: MagnitomiterDataBad.jpeg]]<br />
[[Image: MagnitomiterDataBad2.jpeg]]<br />
<br />
<br />
'''Jan 25:'''<br />
*The Magnetometer is working! Look at the beautiful data. <br />
[[Image: MagnitomiterDataGood.jpeg]]</div>Andrealihttps://maslab.mit.edu/2011/wiki/Team_Nine/JournalTeam Nine/Journal2011-01-03T17:52:02Z<p>Cdville: /* 1/12/11 */</p>
<hr />
<div>VEGA AND THE VOLTRONS<br />
<br />
Black lion is go.<br />
<br />
Yellow lion is go.<br />
<br />
Blue lion is go.<br />
<br />
Red Lion is go.<br />
<br />
Green lion is go.<br />
<br />
Ready to form Voltron! Activate interlocks! Dyna-therms connected. Infra-cells up; mega-thrusters are go! Let's go, Voltron Force!<br />
<br />
<br />
== 1/3/11 ==<br />
<br />
Blue lion: We could only get the Hello world program to run if BotClient was running on BOTH the eeePC and the laptop. Unsure why, but hey, it works.<br />
<br />
Yellow lion: Yes, the Hello World program would not operate with a TextChannel alone. We ended up needing a BotClient running on both ends in order to establish a connection.<br />
<br />
Green lion: Today, 1/3/11, our pride was split into three. After lecture, both the blue and yellow lion focused on coding, and focused they were indeed. The hello world program on botclient didn't take much time, but talking to the orc board and running the motors posed a serious problem. Luckily the problem seemed to be common among the other teams and it was resolved once everyone realized that it was the new orc board's default ESTOP connection that was defective. The red lion and myself dealt with connecting the uorc, via soldering, to the various components. Outside of the fact that we soldered directly to our orc board at first, everything turned out functional. Our team leader, the black lion, finished mounting the motor brackets and then left us to fight for ourselves in the jungle that is MASLAB! He eventually joined us again and we finally conglomerated to discuss our strategy to protect the universe and compete in the final competition. Solid first day.<br />
<br />
Red lion: Still 1/3/11, we had a meeting at Voltron headquarters to discuss strategy and design. First we ordered a lot of pizza and had a Voltron: Lion Force marathon. After that we got to work. Blue lion was coding nonstop to develop our AI (read: playing video games) and Yellow lion was hard at work designing our camera interface (read: sleeping). We have agreed to go for what we originally planned before IAP, which is a bot that will exclusively go for scoring balls over the yellow dividing wall. We feel this not only maximizes our possible score each round, but also allows us to defend against other robots dumping balls into our half of the field. This way we don't need to discern the colors of our team's balls vs the other team's balls; we just pick up any ball we encounter and put it over the wall. In terms of our structural design, we have decided to use a combination of rollers. We'll have a wide opening in front with a rubber band roller to gather the balls. Then we'll have a large drum roller behind a collecting area for the balls. This drum will carry the balls from the ground level collecting bay to an elevated collecting bay that is slanted downwards towards the front of the bot. Both of these rollers will turn continuously throughout the round to collect balls. The front of our top collecting bay will have a door, hinged on the bottom, about 6" tall. This will lower via a servo to release the balls when we encounter the yellow dividing walls, then tilt back upwards to contain any more balls collected. The one control on this ball collecting and lifting mechanism will be two break sensors, one in between the front and rear rollers, and one right before the top collection bay. This way we know if each ball makes it up to the top, and if the bottom sensor is tripped without the top being tripped soon afterwards, we can reverse the rollers and undo the jam. 
We also decided that our camera will be front mounted, and that on the rear we will have a marker dropping system. The marker will be orange or purple and tall, and the bot will place it on the ground when it is near the yellow dividing wall. This way, wherever the bot is on the field, if it wants to return to the wall to score, it needs only to locate the marker over all the other walls and find its way back to the marker. For a driving system, we definitely want the ability to strafe and also the ability to rotate around the central axis. Therefore we decided to use omniwheels. If we can find mecanum wheels that are cheap enough, we want to use those since they will be easier to mount, and have a pretty significant swag-factor. If we can't though, we will use 4 omniwheels, one at each corner, mounted 45 degrees to the walls of the robot, to give us the same abilities. Our eeePC will be mounted in a slot directly above the bottom (floor level) ball collecting bay, and below the top bay. This way our center of gravity will still be low, and we like the sound of that. We may also have a trackball mouse for odometry. We like the idea of a trackball better than optical because we feel that will be better for rolling. We aren't quite sure yet if it's needed though. As space explorers, we have also found a way to combat the evil forces of the universe. We will have a tall antenna with a 360 degree array of IR beams. These will shine out to confuse/disable the other team's IR sensors, while at the same time being far enough away from the robot to not interfere with our own IR sensors (which will be mounted on the front to aid in wall locating and following). Green voltron is currently drawing our first sketch. Black voltron is making our schedule. Blue voltron is doing something productive. Yellow lion is still sleeping 2 hours later.<br />
<br />
<br />
Black lion: Kept everyone in line.<br />
<br />
== 1/4/11 ==<br />
<br />
Blue lion: I've noticed that both green and red lions are having issues with telling time, as they believe it was Sunday yesterday. Besides that, we've finalized the design "structure" for our programming and AI. That is currently in a Word document in my possession. Black lion has reviewed it and approves. I am working on getting everyone set up and instructed in SVN + organizing our Java structure. Update: SVN is up and running with vision code as well.<br />
<br />
Black lion: We made a lot of design progress today. We've divided up tasks for coding, and started the CAD process. I'd estimate we're about a quarter done with CAD; we've got a number of individual components done, but little integration. Still, what we've done has already led us to a number of design decisions, and we're on pace with our schedule.<br />
<br />
== 1/5/11 ==<br />
Yellow lion: Code for Checkpoint 3 is complete. Our team is now shifting focus on dividing up the labor and getting ready for an all out blitz of work to get this robot in working order as soon as possible. We're going to start in on Checkpoint 4 and hopefully we'll actually be able to finish that by today. It seems much more difficult than Checkpoint 3 but we already have some vision code written which should help a great deal.<br />
<br />
== 1/6/11 ==<br />
Yellow lion: Went to the H Bridge tutorial and the SSH tutorial. Both were extremely helpful. May have to ask about speed controlling a motor from an H Bridge.<br />
<br />
Green Lion: Spent a lot of time solidifying our mechanical and structural design. Got 'solid work' done CADing today, so now the red lion and I hope to cut out a few of our major structural foundations from acrylic tomorrow. Our robot is slowly coming to life.<br />
<br />
Blue Lion: Went to SSH tutorial. The vision code gave us serious problems, despite being almost entirely working. Black Lion eventually figured it out for Checkpoint 4. Serious discussion with Green, Yellow, and Red Lions about the AI strategy.<br />
<br />
== 1/7/11 ==<br />
Blue Lion: Finally implemented a fix for the BotClient needing to be run on both computers. (The entire issue was that the code terminated on the eeePC, which I realized a while back. A simple Thread.sleep(50000) works as a fix for now.)<br />
<br />
== 1/9/11 ==<br />
Yellow Lion: "I look like Claus Valca" Also the robot is coming along nicely.<br />
<br />
Red Lion: Yo check it, build team (me and black and green lions) have been CADing all day. Like for real, so much CADing. We have enough of a robot now that if someone were to look at it they would recognize it as a robot. But it's still not enough detail to actually start laser cutting anything, so we're gonna have another CAD marathon tomorrow. We'll try to get some sketches as pictures up on the wiki finally. The coders are still doin their thang.<br />
<br />
== 1/10/11 ==<br />
Red Lion: The virtual robot is finally starting to take shape in detail. We've switched the design to a two-floor idea, and have decided to use solenoids instead of servos since they involve fewer sensor points. We are hoping for the model to be done either today or tomorrow, so we can then get to cutting everything and ordering the parts. This way, while the programmers finish up their code in the week to come, we can just put everything together.<br />
<br />
Yellow Lion: 0% pizza was attained today. We are deeply saddened, but will persevere through these tough times nonetheless.<br />
<br />
== 1/12/11 ==<br />
Yellow Lion: Still working on the sensor and local map construction code. It seems to be coming along nicely although much testing is needed. A complete solid model of our robot is done and we will start cutting pieces very soon. <br />
<br />
P.S.<br />
<br />
[[Image:Votronteampic.jpg |thumb|1300px|Glorious]]<br />
<br />
<br />
== 1/15/11 ==<br />
Blue Lion: AI design strategy finalized.</div>Blacklionhttps://maslab.mit.edu/2011/wiki/Team_NineTeam Nine2011-01-03T17:49:15Z<p>Yellow Lion: </p>
<hr />
<div>== VEGA AND THE VOLTRONS ==<br />
<br />
Five Galaxy Alliance space explorers named Keith, Lance, Pidge, Hunk and Sven come upon Planet Arus, which has been ravaged by the evil King Zarkon's forces. When they arrive, they are captured and taken to the Planet Doom, where they are imprisoned. After their escape, they head back to Arus. After making it to the ruins of the Castle of Lions, they convince the beautiful Princess Allura and her advisor, Coran, to allow them to seek out the five Lions of Voltron. From then on, they are known as the Voltron Force and eventually succeed in driving Zarkon's forces back with Voltron's help. Only together can the five space explorers triumph over the evil King Zarkon and emerge as champions of MASLAB 2011.<br />
<br />
== Team Members ==<br />
<br />
Will "Keith" Vega-Brown - Black Lion<br />
<br />
Billy "Lance" Thalheimer - Red Lion<br />
<br />
Aaron "Hunk" Prindle - Yellow Lion<br />
<br />
Joel "Pidge" Santisteban - Green Lion<br />
<br />
Chris "Sven" Dessonville - Blue Lion<br />
<br />
[[Image:VegaAndTheVoltrons.png|thumb|250px|Greatness]]</div>Red Lionhttps://maslab.mit.edu/2011/wiki/Tools_and_Stock_RequestTools and Stock Request2011-01-03T02:29:41Z<p>Yichen: /* Staff Parts */</p>
<hr />
<div>Request tools and other stock materials here. Supply as much detail as possible including your name.<br />
<br />
==Staff Parts==<br />
* Red Paint / Ellen yichen@mit.edu <font color="red">(purchased)</font><br />
* Green Paint / Ellen yichen@mit.edu <font color="red">(purchased)</font><br />
* Yellow Paint / Ellen yichen@mit.edu <font color="red">(purchased)</font><br />
* Tower Material / Ellen <font color="red">(purchased by Eric)</font><br />
* White Paint / Ellen yichen@mit.edu <br />
* Dark Blue Carpet Tape / Ellen<br />
* White Tape for walls / Ellen<br />
* 6 MiniDV tapes for final competition / Ellen<br />
<br />
==Tools==<br />
* Calipers / Team 3 <font color="red">(purchased)</font><br />
* Allen Wrenches <font color="red">(purchased)</font><br />
* C-clip pliers <font color="red">(purchased)</font><br />
<br />
==Mechanical==<br />
* Acrylic Sheet / Ellen <font color="red">(purchased by Eric)</font><br />
* Scroll Saw blades / Ellen<br />
* Zip Ties / Ellen <font color="red">(purchased by Eric)</font><br />
* Shears / Ellen<br />
* Hand saw blades / Ellen<br />
<br />
==Electrical==<br />
* Black standard electrical tape. / Submitted by team 3 <font color="red">(purchased by Eric)</font><br />
* 220 ohm resistors 1206 or smaller / Ellen <font color="red">(purchased by Eric)</font><br />
* 3-pin female header / Ellen <font color="red">(purchased by Eric)</font><br />
* Additional multi-pin header / Ellen<font color="red">(purchased by Eric)</font></div>Yichenhttps://maslab.mit.edu/2011/wiki/Battery_ConnectionBattery Connection2011-01-02T22:14:13Z<p>Powerss: </p>
<hr />
<div>== Layout ==<br />
<br />
When looking at the plastic piece that connects the battery to the μOrc with the clip side up and the larger holes facing to the left, the black wire should go on the bottom and the red wire (with the fuse) should go on top.<br />
<br />
[[File:Sample_battery_connector.JPG]]</div>Powersshttps://maslab.mit.edu/2011/wiki/Team_JournalsTeam Journals2010-12-28T03:28:08Z<p>Yichen: Created page with "Each team should make a short entry in their '''team journal''' for each day of IAP. Include what you accomplished, outstanding issues, and any planning you did (all of this nee..."</p>
<hr />
<div>Each team should make a short entry in their '''team journal''' for each day of IAP. Include what you accomplished, outstanding issues, and any planning you did (all of this need not be more than 3–5 sentences). The journal serves several purposes:<br />
* It allows teams to reflect on their accomplishments and develop short and long-term goals.<br />
* It helps staff members to track teams' progress. (Note: if you have a specific issue that a staff member should address, send an email as well as including it in your journal.)<br />
* It preserves what you've learned from your experience, for the benefit of other teams in both this year and future years. (Note: if this knowledge includes fixes to important bugs, you should also email the staff and post your fix to the [[student tips]] page.)<br />
Journal entries need not be formal, but they should be informative, clear, and concise. All journal entries will be checked by the staff before noon on the following day.<br />
<br />
To create or edit your team journal, click on its link from the [[Main Page]].</div>Yichenhttps://maslab.mit.edu/2011/wiki/Lab_CleanupLab Cleanup2010-12-28T03:26:26Z<p>Yichen: Created page with "Each team has been assigned a day in which they are to help clean up lab at the end of the day. The assignments are listed on the Calendar. If you can't make your assigned ..."</p>
<hr />
<div>Each team has been assigned a day in which they are to help clean up lab at the end of the day. The assignments are listed on the [[Calendar]]. If you can't make your assigned shift, it is your responsibility to trade with another team. If that is not possible or if you cannot make the end-of-Maslab cleaning, contact the staff to discuss how to make up your missed shift.<br />
<br />
We also require all teams to show up at noon on the Saturday after IAP ends to help restore lab to the pristine state we got it in. At that time, we require you to return all Maslab-owned equipment that you have, such as the computer from your kit. Cleaning up lab after the end of Maslab is extremely important. If you fail to show up for end-of-lab cleaning (and don't make this up) or fail to return our property, we will fail you.</div>Yichenhttps://maslab.mit.edu/2011/wiki/Orc_LayoutOrc Layout2010-12-28T00:53:32Z<p>Yichen: Created page with "File:uORC2011.jpg"</p>
<hr />
<div>[[File:uORC2011.jpg]]</div>Yichenhttps://maslab.mit.edu/2011/wiki/SubversionSubversion2010-12-28T00:04:48Z<p>Yichen: Created page with "'''[http://subversion.tigris.org/ Subversion]''' is a version control system (like CVS, only without a decade of cruft). Its main advantage is the fact that it tracks changes..."</p>
<hr />
<div>'''[http://subversion.tigris.org/ Subversion]''' is a version control system (like [[CVS]], only without a decade of cruft). Its main advantage is the fact that it tracks changes to your source code base as a whole together, not to individual files.<br />
<br />
== Using SVN ==<br />
<br />
* To check out a working copy of the project from your team's repository, see [[Repositories]].<br />
* To update your (already checked out) archive: type <code>svn update</code>. This updates to the latest version in the repository, letting you know if you've made any incompatible changes locally. <br />
* To commit any changes you've made (in all files): <code>svn commit</code>.<br />
* To see what files have been changed: <code>svn status</code> (<code>M</code> means modified).<br />
* To add a new file to be version controlled: make the file like usual, then do <code>svn add ''file''</code>.<br />
* To copy or rename a file: <code>svn cp ''file1'' ''file2''</code> or <code>svn mv ''file1'' ''file2''</code>. (If you don't use svn here, it won't know that the file has moved!)<br />
* To get help: <code>svn help</code> or <code>svn ''subcommand'' help</code>.<br />
<br />
Full documentation: [http://svnbook.org/ The Subversion Book]<br />
<br />
== Using SVN with Eclipse ==<br />
<br />
[http://subclipse.tigris.org/install.html Installing Subclipse]</div>Yichenhttps://maslab.mit.edu/2011/wiki/SSHSSH2010-12-28T00:04:04Z<p>Yichen: Created page with "What does it mean when we say "'''SSH''' into your robot"? Say your eeePC reports an address of <code>18.111.0.300</code>. Just type <code>ssh maslab@18.111.0.300</code> at an At..."</p>
<hr />
<div>What does it mean when we say "'''SSH''' into your robot"? Say your eeePC reports an address of <code>18.111.0.300</code>. Just type <code>ssh maslab@18.111.0.300</code> at an Athena prompt. You'll be prompted for the password, and then you can run remote commands on your robot.<br />
<br />
== Passwordless login ==<br />
To set up passwordless login (which you'll need unless you want to type your password every time you do <code>[[ant]] upload</code>), do this:<br />
<br />
* On Athena, do <code>ssh-keygen -t rsa</code>. When you're prompted for a file name, accept the default. Leave a blank password. <br />
* Copy your public key to the eeePC like this (use your robot's correct IP address): <code>scp ~/.ssh/id_rsa.pub maslab@18.111.0.300:/home/maslab</code><br />
* SSH to the eeePC: <code>ssh maslab@18.111.0.300</code><br />
* On the eeePC, make a <code>.ssh</code> directory if it doesn't already exist: <code>mkdir .ssh</code><br />
* Set restrictive permission on it (SSH requires this): <code>chmod 700 .ssh</code><br />
* Add your public key to the authorized keys file: <code>cat id_rsa.pub >>.ssh/authorized_keys2</code><br />
* Set restrictive permission on this file, too: <code>chmod 600 .ssh/authorized_keys2</code><br />
* Now when you type <code>ssh maslab@18.111.0.300</code> on Athena, you should be logged in automatically with no password prompt.<br />
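As a sanity check, the permissions in the two <code>chmod</code> steps matter: SSH silently ignores the authorized keys file if either mode is too permissive. This sketch recreates the same layout in a throwaway directory so you can see the expected modes (the key line is a placeholder, not a real key):

```shell
# Recreate the .ssh layout from the steps above in a temp dir and
# confirm the permissions SSH requires: 700 on the dir, 600 on the file.
set -e
DEMO=$(mktemp -d)
mkdir -p "$DEMO/.ssh"
chmod 700 "$DEMO/.ssh"
# Placeholder public key line; a real one comes from ssh-keygen.
echo "ssh-rsa AAAAB3...placeholder maslab@athena" >> "$DEMO/.ssh/authorized_keys2"
chmod 600 "$DEMO/.ssh/authorized_keys2"
stat -c '%a' "$DEMO/.ssh"                   # prints 700
stat -c '%a' "$DEMO/.ssh/authorized_keys2"  # prints 600
```

If passwordless login mysteriously fails, these modes (and the permissions on your home directory itself) are the first thing to check.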
<br />
== Copying files ==<br />
To copy files to your robot, do <code>scp ''files'' maslab@18.111.0.300:''/destination/directory''</code>. If passwordless login is configured, <code>scp</code> will also use it.</div>Yichenhttps://maslab.mit.edu/2011/wiki/EmacsEmacs2010-12-28T00:03:38Z<p>Yichen: Created page with "== Tips == To turn on syntax highlighting, parentheses matching, and region highlighting, add these lines to your <code>~/.emacs</code> file: (global-font-lock-mode 1) (show-p..."</p>
<hr />
<div>== Tips ==<br />
<br />
To turn on syntax highlighting, parentheses matching, and region highlighting, add these lines to your <code>~/.emacs</code> file:<br />
(global-font-lock-mode 1)<br />
(show-paren-mode 1)<br />
(transient-mark-mode 1)<br />
This makes writing Java in Emacs more convenient.</div>Yichenhttps://maslab.mit.edu/2011/wiki/CVSCVS2010-12-28T00:03:09Z<p>Yichen: Created page with "''Note that Subversion is a more powerful alternative to CVS; you should use it instead unless you have a good reason.'' == Setting up a CVS respository == * To set up a '..."</p>
<hr />
<div>''Note that [[Subversion]] is a more powerful alternative to CVS; you should use it instead unless you have a good reason.''<br />
<br />
== Setting up a CVS repository == <br />
<br />
* To set up a '''CVS''' repository for your team in your Athena locker:<br />
*# <code>mkdir ~/maslabcvs</code> (you can name it whatever you want) <br />
*# <code>fs setacl -dir ~/maslabcvs -acl system:maslab-2009-team-''N'' all</code> (where ''N'' is your team number; see [[Athena lists]])<br />
*# <code>setenv CVSROOT $HOME/maslabcvs</code><br />
*# <code>cvs init</code><br />
*# Change to the directory that contains your code. Make sure it contains only files/directories that you want to put in the repository. <br />
*# <code>cvs import ''modulename'' init head</code> (where <code>''modulename''</code> is a name you pick)<br />
* If one of your teammates has set up a repository for your team, do this:<br />
*# <code>attach ''teammate''</code> (where <code>''teammate''</code> is the username of the person who made the repository) <br />
*# <code>setenv CVSROOT /mit/''teammate''/maslabcvs</code><br />
*# If you put the previous two lines in your [[Athena settings|<code>.environment</code> file]], you won't ever need to type them again. You might also want to set <code>CVS_EDITOR</code> to your favorite text editor. It will be used for recording log messages when you commit.<br />
<br />
== Using CVS ==<br />
* To checkout a copy of the code: <code>cvs co ''modulename''</code>.<br />
* To update your copy to the latest version: <code>cvs up</code>.<br />
* To commit your changes: <code>cvs commit</code> (you must update before you can commit).</div>Yichenhttps://maslab.mit.edu/2011/wiki/Athena_settingsAthena settings2010-12-28T00:02:45Z<p>Yichen: Created page with "Most settings in Athena are controlled by your ''dotfiles'', files located in your home directory that begin with a "<code>.</code>". They're hidden by default; to see them, type..."</p>
<hr />
<div>Most settings in Athena are controlled by your ''dotfiles'', files located in your home directory that begin with a "<code>.</code>". They're hidden by default; to see them, type <code>ls -a</code>. By adding the following to your <code>~/.environment</code> file, you can avoid typing additional commands at startup.<br />
<br />
# From your home directory, type <code>emacs .environment</code> to edit your <code>.environment</code> file using [[Emacs]] (or use favorite text editor).<br />
# Add the following lines:<br />
#* <code>add java</code><br />
#* <code>add sipb</code> (needed for [[Ant]])<br />
#* <code>add 6.186</code> (needed for the Maslab classes and [[BotClient]])<br />
#* <code>setenv CLASSPATH /mit/6.186/2009/maslab.jar:.</code> (needed for Java to compile the Maslab classes)<br />
# Type Ctrl-X Ctrl-S, Ctrl-X Ctrl-C to save and close (if using Emacs).<br />
<br />
(If you don't have a <code>.environment</code> file in your home directory, the steps above will create one for you.)</div>Yichenhttps://maslab.mit.edu/2011/wiki/AntAnt2010-12-28T00:01:59Z<p>Yichen: Created page with "'''[http://ant.apache.org/ Ant]''' is a tool for automatically building your code. '''NOTE: This assumes you have ssh and rsync installed, Windows users will most likely need to..."</p>
<hr />
<div>'''[http://ant.apache.org/ Ant]''' is a tool for automatically building your code.<br />
<br />
'''NOTE: This assumes you have ssh and rsync installed; Windows users will most likely need to install Cygwin'''<br />
# Put <code>[[build.xml]]</code> in the directory with your source. <br />
# To build your code just type <code>ant</code>. Ant will try to be intelligent about which files need to be recompiled based on the changes you have made. <br />
#* Sometimes you need to rebuild the whole tree. In that case, type <code>ant clean</code> to delete all your classfiles, and then type <code>ant</code> to rebuild.<br />
#* Ant creates a directory named <code>depcache</code>; this is a dependency cache that helps it build faster. You can safely delete it, but your next build will be slightly slower as it is recreated.<br />
<br />
Ant can also automatically upload your files to the eeePC. Open <code>build.xml</code> and set the appropriate IP address (<code>robotIP</code>) and destination directory (<code>destDir</code>). Then typing <code>ant upload</code> will build all your software and then upload it to the eeePC. You should set up [[SSH|passwordless login]] to avoid typing your password every time.<br />
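For orientation, a <code>build.xml</code> along these lines is what the text assumes. This is a hypothetical minimal sketch (the target names and the rsync invocation are illustrative); use the actual <code>build.xml</code> linked above for real builds:

```xml
<!-- Hypothetical minimal build.xml sketch. The robotIP and destDir
     property names match the text; everything else is illustrative. -->
<project name="maslab" default="compile">
  <property name="robotIP" value="18.111.0.300"/>
  <property name="destDir" value="/home/maslab/code"/>

  <!-- Compile all Java sources in place. -->
  <target name="compile">
    <javac srcdir="." destdir="." includeantruntime="false"/>
  </target>

  <!-- Build, then rsync everything to the eeePC
       (assumes passwordless SSH is already set up). -->
  <target name="upload" depends="compile">
    <exec executable="rsync">
      <arg line="-az --exclude=depcache . maslab@${robotIP}:${destDir}"/>
    </exec>
  </target>
</project>
```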
<br />
Ant is in the <code>sipb</code> locker (<code>add sipb</code>).</div>Yichenhttps://maslab.mit.edu/2011/wiki/SoftwareSoftware2010-12-27T23:57:07Z<p>Yichen: /* Maslab Software Distribution */</p>
<hr />
<div>== General API Information ==<br />
* [http://java.sun.com/javase/6/docs/api/ Java 1.6 API]: documentation on the standard Java classes. <br />
* [http://web.mit.edu/6.186/2011/doc/maslab/api/ Maslab Java API 2011]: documentation on the Maslab custom software.<br />
* [http://web.mit.edu/6.186/2011/doc/uorc/api/ OrcBoard Java API]: documentation on the OrcBoard library.<br />
* [http://math.nist.gov/javanumerics/jama/doc/ Jama numerics library API]: documentation on the Jama numerics library.<br />
* [http://java.sun.com/docs/books/tutorial/ Sun's Java tutorials]: online tutorials in many Java topics. <br />
* [http://maslab.lcs.mit.edu/2004/lectures/javareference.txt Ed Faulkner's Java quick reference]: examples of all the important language features.<br />
<br />
== Application Information ==<br />
* [[Ant]]: A tool that can automatically build and upload your code. Use it, it's nice. <br />
* [[Athena settings]]: Things you should set in your athena locker to make development easier. <br />
* [[BotClient]]: The tool for viewing remote channel data. <br />
* [[CVS]]: A tool for managing your source code amongst your teammates (but see [[Subversion]] instead).<br />
* [[Emacs]]: Some notes on getting the most out of the GNU Emacs text editor. <br />
* [[SSH]]: Tool for logging into other machines remotely. Also has info on <code>scp</code>, which copies files to and from other machines. <br />
* [[Subversion]]: A tool for managing your source code amongst your teammates.<br />
<br />
== Maslab Software Distribution ==<br />
<br />
We're running [http://www.ubuntulinux.org/ Ubuntu 10.10 Linux] (a Debian-based distribution).<br />
<br />
Your eeePC will have maslab.jar and orc.jar installed. <br />
If you are developing code on your personal computer, you will need to download the current versions of both maslab.jar and orc.jar.<br />
<br />
NOTE: Place both files in <code>/usr/share/java/</code>. You will need admin privileges to do this; see <code>man sudo</code>.<br />
<br />
You will then need to add these to your class path. You can do this by adding <code>export CLASSPATH=/usr/share/java/orc.jar:/usr/share/java/maslab.jar:</code> to your .bash_profile file.<br />
<br />
<code>echo export CLASSPATH=/usr/share/java/orc.jar:/usr/share/java/maslab.jar: >> .bash_profile</code><br />
<br />
<code>echo source .bash_profile >> .bashrc</code><br />
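Since a stale or mistyped CLASSPATH is a common source of NoClassDefFoundError, a quick sanity check can save debugging time. <code>check_classpath</code> below is a hypothetical helper (not part of the Maslab distribution) that just verifies each entry exists:

```shell
# Hypothetical helper: verify every entry in a colon-separated
# classpath actually exists on disk before launching Java.
check_classpath() {
  local IFS=':'
  local missing=0
  for entry in $1; do
    [ -z "$entry" ] && continue
    [ -e "$entry" ] || { echo "missing: $entry"; missing=1; }
  done
  return $missing
}

check_classpath "$CLASSPATH" || echo "some classpath entries are missing"
```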
<br />
The freshest version of the files is also available on Athena at <code>/mit/6.186/2011/orc.jar</code> and <code>/mit/6.186/2011/maslab.jar</code>.<br />
<br />
[http://web.mit.edu/6.186/2011/maslab.jar Download maslab.jar]<br />
<br />
[http://web.mit.edu/6.186/2011/orc.jar Download orc.jar]</div>Yichenhttps://maslab.mit.edu/2011/wiki/Lecture_ListLecture List2010-12-27T22:51:08Z<p>Yichen: </p>
<hr />
<div>{| border="1" cellpadding="4" cellspacing="0" style="border: 1px solid #000000; border-collapse: collapse;"<br />
|- style="text-align: center; background: #666666; color: #FFFFFF"<br />
! Lecture Topic<br />
! Date and Time<br />
! Professor<br />
|- <br />
| Introduction<br />
| 1/3 at noon<br />
| Ellen Chen<br />
|- <br />
| Sensors<br />
| 1/3 <br />
| Arthur Petron<br />
|-<br />
| Strategy and Mechanical<br />
| 1/3 <br />
| Sam Powers<br />
|-<br />
| Behavior<br />
| 1/4 at noon<br />
| Buro Mookerji<br />
|-<br />
| Software Architecture/Threading<br />
| 1/4<br />
| Geza Kovacs<br />
|-<br />
| Controls<br />
| 1/5 at noon<br />
| Ellen Chen<br />
|-<br />
| Vision<br />
| 1/5<br />
| Geza Kovacs<br />
|-<br />
| Navigation<br />
| 1/6<br />
| TBD<br />
|-<br />
| Guest Lecture 1: Entrepreneurship<br />
| 1/6 at 2pm<br />
| Mina Hsiang<br />
|-<br />
| Guest Lecture 2<br />
| 1/11 at 1pm<br />
| Prof. Russ Tedrake<br />
|-<br />
| Guest Lecture 3<br />
| 1/11 at 2:15pm<br />
| Prof. Seth Teller<br />
|-<br />
|}</div>Yichenhttps://maslab.mit.edu/2011/wiki/FAQ_forumFAQ forum2010-12-27T20:31:12Z<p>Yichen: /* Motors */</p>
<hr />
<div>==Porch==<br />
Q: Is the porch "the 14" by 14" area on the floor centered in front of each mousehole" (Playing Field) or "within a 12 inch radius of the center of the yellow goal" (Contest rules)?<br />
<br />
A: The porch is within a 12" distance measured with a ruler from the goal edges in all directions in front of the goal. The porch is the goal area plus 12" in all directions.<br />
<br />
==Motors==<br />
Q: What is the part number of the provided motors?<br />
<br />
A: The specs for the drive motors are:<br />
*Voltage = 12vdc<br />
*RPM = 120<br />
*Reduction = 50:1<br />
*Stall Torque = 123.20 oz-in (8.8 kg-cm)<br />
*Outside Diameter = 37mm<br />
*Length (motor and gear) = 1.92" (4.87cm)<br />
*Diameter (motor and gear) = 1.45" (3.68 cm)<br />
*Diameter (shaft) = 6mm<br />
*Outside Diameter = 37mm<br />
*Current (at 12v no load) = 90mA<br />
*Stall Current (at 12v locked shaft) = 1.5A <br />
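To put the 120 RPM figure in context, here is a back-of-envelope ground-speed estimate. The 3-inch wheel diameter is an assumption for illustration only, not part of the kit specs:

```shell
# Ground speed = (RPM / 60 s) * pi * wheel diameter.
# The 3 in wheel diameter is an assumed example value.
awk 'BEGIN { rpm = 120; dia = 3.0; printf "%.1f in/s\n", rpm/60 * 3.14159265 * dia }'
# prints: 18.8 in/s
```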
<br />
These motors are twice as strong as the ones that were used last year. You can purchase them <br />
[http://www.trossenrobotics.com/store/p/4259-Gear-Head-Motor-12vdc-50-1-120rpm-6mm-shaft-.aspx here]. We have other stock motors that can be used for other purposes but we do not provide full matching sets.</div>Dfouriehttps://maslab.mit.edu/2011/wiki/BotClientBotClient2010-12-25T00:11:13Z<p>Yichen: Created page with "'''BotClient''' allows you visualize your robot's processing from another computer. == Running BotClient == # Run <code>add 6.186</code>. # Run <code>/mit/6.186/bin/botclient</..."</p>
<hr />
<div>'''BotClient''' allows you visualize your robot's processing from another computer.<br />
<br />
== Running BotClient ==<br />
<br />
# Run <code>add 6.186</code>.<br />
# Run <code>/mit/6.186/bin/botclient</code>.<br />
#* If this doesn't work, do <code>java -cp /mit/6.186/2006/maslab.jar maslab.telemetry.botclient.BotClient</code>.<br />
# In the "control panel" window, enter your robot's IP address. <br />
# Channels published by the robot will appear in the control panel. Double-click a channel to display it.<br />
<br />
== Running BotClient from Windows ==<br />
<br />
One of the advantages of BotClient is that it is a platform-independent program, because it is written in Java. Any Java code can run on any machine as long as there is a Java Virtual Machine designed for that machine. If you have the latest Java Runtime Environment (1.6.0 or greater) installed on your Windows PC, you can run BotClient from the command line and it will function just as if you were running it on Athena. <br />
<br />
Assuming you have Java 1.6.0 or later installed: (If you are not sure, go [http://java.com/en/download/index.jsp Download Java]).<br />
<br />
To run BotClient on your Windows PC, do the following:<br />
# Copy maslab.jar onto your PC, and remember the path you stored it to. (For advanced users, add the jar to your CLASSPATH environmental variable)<br />
# Click Start, hit Run<br />
# In the box, type : <code>java -cp "c:/path/to/maslab.jar" maslab.telemetry.botclient.BotClient</code><br />
# Click ok.<br />
<br />
The botclient window you are used to should pop up momentarily.<br />
<br />
== Channel Types ==<br />
<br />
(See <code>maslab.telemetry.channel</code> in the Maslab API for details.)<br />
<br />
; ImageChannel<br />
: As the mouse pointer hovers over a pixel, the RGB and HSV values are reported. Click to see where the color falls on an HSV colorwheel. You can save an image (the captured image is saved in the directory where BotClient was invoked; the filename is <code>capture''NN''.png</code>), as well as pause/resume a video feed. <br />
; TextChannel<br />
: Yup, text. <br />
; ScopeChannel<br />
: For plotting values graphically over time. <br />
; VectorChannel<br />
: Channel for sending vector graphics to the BotClient. Useful for annotating an image or plotting sensor data in space.<br />
<br />
You can also have your robot listen for commands from the BotClient using the "command" field in the control panel.</div>Yichen