Team 2: C.H.I.L.E.
- Claire Traweek
- Henry Clawbot
- Isaac Metcalf
- Luis Trueba
- Eric Boehlke
This week we designed our robot, assembled the kitbot, and began experimenting with the NUC.
We began this week with a dump-truck-like design that earned points primarily by pressing the Novartis button, catching the falling box, and then raising the bed of the robot to stand the blocks up. We prototyped the design and then laser-cut it from MDF. After assembling it, we found a problem with getting the blocks to stack: they would often slide down and jam. In response, we designed another robot that drives around with a block-shaped tube and picks blocks up directly. We plan to assemble the second prototype next week.
On the software front, we began working with ROS and were able to both power motors and collect sensor data. Next week we plan to combine that sensor data with motor control to make our robot drive in a square.
Our shift in mechanical design prompted a shift in strategy. Before, our strategy was to collect Novartis blocks; now it is to identify block stacks of our color and approach them.
Hardware Reflection

This week saw a dramatic shift in our team's mechanical design. After finishing our first prototype of the robot last week, we decided that the drawbacks that design possessed (including lack of space for electronics, low scoring potential, and an unreliable stacking mechanism) were considerable enough to explore other designs in tandem. The idea of a "claw machine"-type mechanism to pick up and stack blocks of our color was proposed, and it was almost fully CADed by the end of the weekend in a flurry of SolidWorks excitement.
The claw-bot gives enhanced stacking accuracy and much more room for electronics than the dump-truck design, but pays for it with increased mechanical complexity. We designed this second robot, as we did the first, to minimize the number of active electronic systems. The design uses two motors for driving, one motor for raising and lowering the claw, one servo for opening the claw to drop the stack, and a second servo for knocking opposing-colored blocks onto the robot. Hopefully the code for the robot's movements will be simple enough for our software team to handle (we're rooting for them).
On Tuesday and Wednesday, the working prototype of the robot body and its two-foot "claw" were laser-cut and assembled, and the mechanisms were tested. Initial tests were promising: the claw mechanism seems sturdy enough to pick up more than 12 blocks, and it can open with relative ease to drop stacks smoothly. We 3D printed the linear bearing mounts needed to attach the claw to the robot, but the acid bath was disconnected, so we delayed attachment to next week. The rest of our time was spent troubleshooting mechanical systems and designing new parts to compensate for unforeseen problems. By the end of the week, the arm of our robot looked like an elementary school craft project held together with gaff tape and hot glue.
With the Novartis dispenser completed by Thursday, our mechanical team had a new set of mechanisms to design. We gave significant thought to activating Novartis in such a way that our robot could easily acquire the blocks, and to account for possible trouble stacking 12 high in the future, we built a two-option system into our robot's frame. We can have the robot face forward when activating Novartis, letting the blocks dispense into the block-acquisition area below our claw mechanism. Alternatively, we can have it drive in backwards and hit the button to dispense the six blocks directly onto its bed, earning 12 points rather than 36 or more, but earning them reliably. The specifics of the pushing system have yet to be designed, but in general we want one that doesn't require electronics or super-accurate color-sensor-enabled steering. One possibility is to simply have two vertical walls on the front of our robot, one on the left and one on the right, and to lift up whichever wall corresponds to the side of the Novartis mechanism we don't want to mess with.
Next week we're putting the two halves of our robot together and handing it over to the software team to program. We'll continue working to perfect the delicate mechanical systems our design is contingent upon, such as the plastic claws on our stacking device. At the same time, we'll address new problems as they come up and monitor the performance of different components to see if they need to be redesigned. One component we're currently worried about is our bot's angular base, which may get stuck in corners of the game area. Another is the small arm on our claw mechanism used to push neighboring blocks far enough away from the block we're picking up that they don't interfere. It's nearly certain that this part will have to be redesigned, but we'll need to see it in action before we choose to replace it.
In the event that claw-bot doesn't perform any better than your standard claw machine game, we'll return to last week's design and see what we can do to improve it. Some ideas were proposed this week, such as a second level to hold electronics and an enclosed lifting mechanism to stop blocks from falling out prematurely. These deserve exploration, and seeing as we have unlimited MDF, we'll cut some new parts for dumptruck-bot next Monday or Tuesday.
Software Reflection

This week in software, we were able to run the motors forwards and backwards using the Teensy. By the end of the week, we could make the robot drive in a square using an open loop. We can collect data from our sensors, and we are close to driving that square using the gyroscope instead. We have taught ourselves ROS and are using it to control the robot. Simultaneously, we are experimenting with computer vision: we started by reading the numpy documentation and are now in the middle of the cv2 documentation, and we can isolate all single-colored pixels in an image. In the coming week, we hope to attach sensors to the completed robot, complete our square-driving quest, and identify cubes.
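The open-loop square amounts to a fixed schedule of motor commands. Here is a minimal sketch, assuming a hypothetical (left_power, right_power, duration) command format and placeholder speed/timing constants; the real robot issues equivalent commands through the Teensy via TAMProxy/ROS, with timings tuned by hand on the floor.

```python
DRIVE_SPEED = 0.5   # assumed normalized motor power, not a measured value
SIDE_TIME = 2.0     # seconds to drive one side (would be tuned on the robot)
TURN_TIME = 0.8     # seconds for a 90-degree spin (would be tuned on the robot)

def square_commands():
    """Return (left_power, right_power, duration) tuples for one open-loop
    square: four straight segments, each followed by a 90-degree spin."""
    commands = []
    for _ in range(4):
        commands.append((DRIVE_SPEED, DRIVE_SPEED, SIDE_TIME))   # drive straight
        commands.append((DRIVE_SPEED, -DRIVE_SPEED, TURN_TIME))  # spin in place
    return commands
```

With the gyroscope in the loop, the spin step would instead run until the integrated heading change reaches 90 degrees, rather than for a fixed duration.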
This week, we focused heavily on robot assembly and software. We built the initial robot, identified design flaws, and then re-cut and modified parts as needed. For example, in our original design, we failed to account for the space taken up by the coupler used to mount the threaded rod to the robot. This meant that our threaded rod was higher up than we thought, and would not allow the claw to lower enough to pick up blocks. We considered replacing the block storage device completely, but eventually decided to just make the grabbers on the claws longer. This appears to have solved the issue.
After spending the previous two weeks learning about ROS and TAMProxy, we attempted to write code for our robot. The beginning of the week focused heavily on computer vision, and by Wednesday we could consistently identify cubes.
We did this by using cv2's color filtering to isolate pixels of a single color, adjusting the ranges to ensure that the cubes were (for the most part) the only things detected. We then used blob detection to estimate the size (which is related to distance) of the cube silhouettes we were detecting. By defining a minimum blob size, we eliminated much of the noise. The blob detection class did not tolerate the holes in the blocks, so we wound up using the fill function to make sure blocks with visible holes were still recognized. We also blurred the image to improve our accuracy.
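The filter-then-size-threshold logic can be sketched in miniature without cv2. This toy, standard-library-only version treats an image as a grid of RGB tuples and uses made-up threshold values; our actual code relies on cv2's built-in color filtering and blob detection rather than these hand-rolled helpers.

```python
from collections import deque

def color_mask(img, lower, upper):
    """Binary mask: 1 where every channel of a pixel lies within
    [lower, upper] (the range we tuned so cubes dominate the mask)."""
    return [[int(all(lo <= c <= hi for c, lo, hi in zip(px, lower, upper)))
             for px in row] for row in img]

def blobs_at_least(mask, min_size):
    """Group mask pixels into 4-connected blobs and keep only blobs with
    at least min_size pixels, mirroring the minimum-blob-size noise filter."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    blobs = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                # flood fill to collect this blob's pixels
                blob, queue = [], deque([(y, x)])
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    blob.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if len(blob) >= min_size:
                    blobs.append(blob)
    return blobs
```

For example, a 2x2 patch of red-ish pixels survives the size threshold while an isolated stray pixel is discarded as noise, which is exactly why the minimum blob size cleaned up our detections.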