Menu

Team 8 MassLads

Members

Reng Zheng

  • Electrical component assembly
  • Sensor software
  • SWE/EE integration
  • Less Jank Control (still very jank)

Haris Imamovic

  • CADing
  • Manufacturing

Yue Chen Li

  • CV

Shreya Chaudhary

  • Autonomy Stuff + CV + Jank Controls

Links

Github:

Pages for Links:

Strategy

Have fun!

This is an IAP class, and with the variety of other commitments we have, our strategy was to maximize our fun above all else. While we are trying to compete, scoring is a secondary concern to how much we are learning and how much we are enjoying the class.

Scoring

That being said, beating the opposition is also a nice bonus. So, what is our strategy?

Don't Reinvent the Wheel

Literally. PID control has been around for decades and is taught at intro levels by the MechE and EECS departments because it's easy and, more often than not, it just works. PID for driving is therefore the obvious solution, given that it's mostly a second-order problem.
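For the curious, the drive control amounts to something like this. A minimal sketch with made-up gains and names, not our actual tuned code:

```python
# Minimal discrete PID sketch for heading control. Gains, dt, and the
# motor-split convention below are illustrative assumptions.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        # Accumulate integral and estimate derivative over one tick.
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Usage: split the correction across the two drive motors
# (positive error = target is to the left, so slow the left wheel).
pid = PID(kp=1.0, ki=0.1, kd=0.05, dt=0.02)
turn = pid.update(0.3)  # 0.3 rad of heading error this tick
left_cmd, right_cmd = 0.5 - turn, 0.5 + turn
```

The nice part is that this one loop covers both "drive straight" and "turn to angle"; only the error signal changes.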

Cameras and Rangefinders

We have a camera, we have RGB blocks, and we have an IR rangefinder. Together, what do we have? Hopefully a vector (direction and magnitude) toward the nearest block to stack. If we don't, uh, well, that is very likely because EE is not our strong suit. But! Ramming into a block at warp speed is still good, since a stack of 1 is... 1 point! At minimum! I think (don't quote me on this).
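The block-vector idea, as a hedged sketch. The FOV, resolution, and function names here are illustrative assumptions, not measured calibration values:

```python
import math

# Sketch: fuse a camera detection (the pixel column of the block) with an
# IR rangefinder distance into a (direction, magnitude) vector.
H_FOV_DEG = 60.0   # assumed horizontal field of view
IMG_WIDTH = 640    # assumed image width in pixels

def block_vector(block_px_x, range_m):
    """Return (bearing_rad, distance_m) toward the detected block."""
    # Pixel offset from image center, scaled linearly to an angle.
    offset = block_px_x - IMG_WIDTH / 2
    bearing = math.radians(offset / IMG_WIDTH * H_FOV_DEG)
    return bearing, range_m

bearing, dist = block_vector(480, 0.75)  # block right of center, 0.75 m away
```

The linear pixel-to-angle map is a small-angle approximation; it's fine for "steer at the blob" even if it wouldn't survive a photogrammetry class.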

Actuators

If we do manage to do more than ram multiple blocks onto the ground, now what? We grab them with our elevator and a servo connected to a pincer. Does it work? I sure hope so. Integration is hard, but this is why we're here! To learn.

Hail Mary

So like, if we flood the field with green light, everything is green, so opposition CV fails. But we have a white light, so our CV doesn't fail. Will this work? No. But will it look pretty? Possibly.
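A toy illustration of the idea, with made-up pixel values (not real calibration data): a naive "green channel dominates" detector can't tell a green block from a green-lit floor.

```python
import numpy as np

# Toy demo of the green-flood gambit. All threshold and pixel values are
# invented for illustration.

def green_mask(img):
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    return (g > 150) & (g > r + 50) & (g > b + 50)

# Under white light: gray floor, one green block at pixel (1, 1).
white_lit = np.full((3, 3, 3), 120, dtype=np.int32)
white_lit[1, 1] = (40, 200, 40)

# Under a green flood: the floor itself now reads green.
green_lit = white_lit.copy()
green_lit[..., 1] += 100

print(green_mask(white_lit).sum())  # only the block pixel passes
print(green_mask(green_lit).sum())  # every floor pixel passes too
```

In other words, the flood doesn't break the camera; it breaks the separability of the classes the detector relies on.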

What about points?

So... we don't have time left to do integration tests on an actual attachment, but you know what? The blocks have a hole that goes through them, so maybe our robot can just build a kebab. That gives a non-zero score at least, right? Well, if CV works.

Team Motto

At least we had fun!

Things we learned/future advice

  1. More time == more results.
  2. You probably want a dedicated EE. This cost so, SO many hours (>= 20 hours of Reng doing SWE debugging on an EE issue that he didn't recognize, because he's a SWE by training and not an EE. Silent faults waste time, folks).
  3. 2 MechEs? Make sure you have 2 MechEs. People often underestimate the MechE component; we kind of thought it was mostly solvable in software. Turns out, stacking involves complex physical systems too. The controls simulations class lied. It is not, in fact, importable from a library. Claws, uh, don't work as well as you think they do when they don't have physical stops. That was an important lesson.
  4. Coding from scratch? Bad idea. Debugging alone? Bad idea. Talk to a TA; real life, unfortunately, has integration errors that are not well documented, unlike a class that 50+ people take every semester and have therefore ironed out all the issues in by week 2. Oops.
  5. Build the bot and its SWE by EOW2, not EOW3. EOW2 gives you EOW3 to fix stuff, instead of realizing BOW4 that, uh, the problem is intractable and it's too late to change the stratagem. Oops.
  6. KISS -- Keep It Simple, Stupid. Don't overcomplicate it. Complex == more failure points == more errors. Simple == easy to debug.
  7. EE wiring -- rat's nests of wires are hard to debug. Things disconnect sometimes. Uh, a dedicated EE could've done such a beautiful wiring job. Alas, the team had Reng. Shreya says Reng was great; Reng says Reng could've done better if he were a pure EE. Alas, 6-1s are supreme over my (Reng's) 6-4 training in this regard. I give them my deepest admiration.
  8. Have fun -- despite the failures, fun and learning were had. You don't need to score points, you just need to score fun. It's not g(x), your score, that matters; it's h(x), your happiness.
  9. Reng wrote this tired, but he's happy to answer questions when he's less tired. Email rengz@mit.edu and you may get a response (please bump if you don't get a response).

Progress Reports

Week 0: ROS + OnShape

  1. We just set up ROS and OnShape.
  2. ROS refuses to install on Mac.

Week 1: Mock Comp

  1. Goals
    1. Baseline
      1. Working drivetrain and motors - Reng
        1. This was achieved! We got a working drivetrain integrated with the basic kitware stuff.
    2. Stretch
      1. Sensor-drivetrain coordination - Reng

Jan 17: YC's Notes

[OK] v4l2 camera node

Test camera node: ros2 run v4l2_camera v4l2_camera_node --ros-args -p camera_info_url:=file:///home/team8/team-8/src/kitware/cv/c270cal.yaml

Published to /image_raw.

TODO launch config

TODO homography
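A sketch of what the TODO launch config could look like, wrapping the test command above. This assumes the standard ROS2 Python launch API (`launch`/`launch_ros`) and is untested:

```python
# camera.launch.py -- sketch of the TODO launch config for the v4l2 camera
# node, using the same camera_info_url as the ros2 run test command.

from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        Node(
            package='v4l2_camera',
            executable='v4l2_camera_node',
            parameters=[{
                'camera_info_url':
                    'file:///home/team8/team-8/src/kitware/cv/c270cal.yaml',
            }],
        ),
    ])
```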

Week 2-3 Updates

Stuff happened but also didn't. I really don't know how the weeks went by.

  1. ROS2 refused to compile on Mac. Reng had to work physically on the NUC for a while; he got tired, so he dedicated the rest of the week to figuring out how to set up a Linux VM on Apple Silicon and then compile ROS on that (finished EOW2).
    1. VSCode and Nano are apparently very finicky in VMs (text typed would just disappear into the ether until a newline was entered). At the beginning of week 3 we set up SSH into ROS2 and began PID control without the IMU to test.
  2. Allegedly (Reng finds out later this is wrong), the Arduino only accepts up to 3.3V into its main rails and has big warnings that most pins aren't 5V compatible. So the encoders, which output a minimum of 3.5V, cannot be used directly. Reng's solution? Voltage dividers! Z1 = 30 Ohms, Z2 = 10 Ohms; an input voltage of 12V gives 3V reads, so this should work, right?
  3. YC and Shreya messed around with vision
  4. Software moved surprisingly slowly; we honestly don't know why this took 2 weeks until reality slapped us. Update by Reng: Integration sucks; apparently TAMProxy can create shadow threads, and if you accidentally spin up multiple TAMProxies, this creates race conditions where you thread-lock the serial port. That makes the serial port unusable, which we didn't figure out until way too late. Oops.
  5. Basically we just waited for IMUs? And messed around with scary software stuff. PID is impossible without an angular and acceleration reference frame to correct against. So that's fun.
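For the record, here is the divider arithmetic from point 2. It checks out on paper; as Week 4 shows, being right on paper wasn't enough:

```python
# Voltage divider: Vout = Vin * Z2 / (Z1 + Z2).
def divider_out(vin, z1, z2):
    return vin * z2 / (z1 + z2)

print(divider_out(12.0, 30.0, 10.0))  # 3.0 V, exactly as hoped on paper
```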

Weekend Before Competition Grind

Update (Weekend Before Comp.): We got the camera working and got autonomy working (very jank: essentially just calculate coordinates and drive toward the values, not at all smooth, but we've got PID working in parallel).
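The "calculate coordinates and drive toward the values" autonomy is roughly this shape. Names, gains, and thresholds are illustrative, not our actual node:

```python
import math

# Sketch of a jank go-to-goal controller: turn toward the target, and only
# drive forward once roughly facing it.

def go_to_goal(x, y, heading, gx, gy, k_turn=1.5, fwd=0.4):
    """Return (left, right) wheel commands toward goal (gx, gy)."""
    bearing = math.atan2(gy - y, gx - x)
    # Wrap the heading error into [-pi, pi].
    err = math.atan2(math.sin(bearing - heading),
                     math.cos(bearing - heading))
    turn = k_turn * err
    drive = fwd if abs(err) < 0.3 else 0.0  # drive only when mostly aligned
    return drive - turn, drive + turn

left, right = go_to_goal(0.0, 0.0, 0.0, 1.0, 0.5)
```

Not smooth (the drive cutoff makes it stop-and-turn), which matches the "not at all smooth" above; smoothing would just mean blending `drive` with `abs(err)` instead of thresholding.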

Week 4 Updates

  1. Encoder is getting better

    1. Reng update: the voltage dividers apparently aren't very good at conveying signal. Who knew? Putting 3.3V into VCC so that it becomes the output level apparently was the solution. For future reference, the encoder documentation for these MASLAB motors is wrong.
    2. Reng update: the encoder IS WORKING. I REPEAT, WE HAVE MOVEMENT. (If it goes faster than 4 rotations/second with 3 7/8" wheels it flings its battery off. I call this a single-use kinetic weapon. Definitely not a bug of setting the wheel speeds too fast/not securing the battery. Nope. Good news though: the bot starts tipping too far forward if we go too fast before then, so like... not a SWE issue. Consult MechE /jk).

      Video:

  2. Vision-based control is getting closer, needs tuning
  3. We've done more EE to get the ToF sensor ready and got the TAMProxy connection working
    1. Next time we need to conscript an EE. I bet they already knew voltage dividers wouldn't work (and I should've googled it). :( -- Reng
  4. We've considered just juggling instead of competing. ++fun;

TODO: need to update matrix:

rqt_image_view will be a useful debugging tool here. If you enable mouse clicking (there is a checkbox next to the topic name), then rqt_image_view will publish the pixel coordinates of points you click on in the image to a topic like this: /zed/rgb/image_rect_color_mouse_left. Publish a marker to RVIZ using this pixel (we've provided you with a function draw_marker in scripts/homography_transformer.py), and you should be able to quickly tell if your homography matrix is doing its job. Note that publishing messages from your local computer to the car's ROS master can be finicky depending on your OS, so you may need to record a ROS bag of camera data and do this testing locally.

Day of Competition (+ All-Nighter Before) Updates

  • Our NUC died. Just didn't power on (staff is currently trying to figure out why). Perhaps it's a sign. So we got a new NUC. (Thank you, staff!)
  • We essentially decided to go with Week 1 strategy. So we used the scrapbook glue from the BLo to glue a stick on our robot and reduced it down to a peg-in-hole problem
  • Then the less jank CV turned out to be more jank. So we reverted back to the old code (which no longer had that awful delay for some reason) and integrated the encoders
  • The flood lights were a lot less powerful than expected. So, instead, we just had the back red lights and kept a slight white light in front for us
  • Camera angle matters!!

Competition

  • Our final code was our Week 1 backup plan: stick, peg-in-hole, reliant purely on CV. We also had the red lights to interfere with others' CV, but that didn't really work.
  • We did pretty well in our first round, getting 5 points (the max for the round!)
  • Then our motors died. Tragic.
  • But still! We're overall pretty happy with what we were able to do!

Takeaways (more Lessons Learned)

  • Do more practice on the actual mat! (We did a ton of work on the ground, but the mat was a different experience)
  • Hit deadlines more? It would've been nice if we had more consistently done the end-of-week competitions to get more competition experience
  • There's a lot more but I think we're tired now