Team Six/Final Paper


Introduction

Janky the cocoabot won the MASLAB 2015 competition. It had a very simple design and its code was very modular.

Software

The whole code can be found in our GitHub repository: [1]

Language and Libraries

The whole code was written in C++11, with the exception of the code running on our Arduino, which was responsible for communicating with the color sensor and was written in C.

We first decided to use C++ since most of the example code for the sensors was in C++ and no one in the group felt inclined to use JNI to interface with the hardware. Later, we moved to C++11 to make use of the thread support library "<thread>" and the date and time utilities "<chrono>".

We used OpenCV for our computer vision code. We also used standard C++ and POSIX headers such as "string", "unistd", "cmath", "cstdio", "fstream", "iostream", and "signal.h", as well as container libraries such as vector.

Design

We decided that modularity and easy integration were very important for our design. Because of this, we made 3 decisions about the code that later helped us considerably:

    • Multi-thread:

The code would be multithreaded, so that it could be debugged more easily and timeouts would be simpler to implement.

    • Config file

We would have a main config file holding all the configuration of the robot, such as the pin numbers used by the sensors, proportional constants and gains, definitions of what was and was not present on the robot, etc. There was a considerable amount of macro preprocessing in the code for convenience: by changing a 0 to a 1, we would add new members to our sensors module or change how they were updated. A sketch of this idea appears after this list.

    • Single main function

Since early in development, we tried to avoid writing code that could not easily be reused. Instead, we had a single main function in the source that took parameters, and those parameters redirected the flow of the main function to run the tests. This made sure that all the code could be compiled together, and that it was easy to implement tests that made use of multiple modules of the robot. A sketch of this pattern also follows the list.
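As an illustration of the config-file decision above, here is a minimal sketch of what such a config header can look like; all of the names (HAS_GYRO, ANGLE_KP, and so on) are hypothetical stand-ins, not the actual identifiers from our repository.

    // config.h -- hypothetical sketch of a central config header.
    // Flipping a 0 to a 1 compiles new members into the sensors module
    // and changes how they are updated.
    #ifndef CONFIG_H
    #define CONFIG_H

    #define HAS_GYRO        0    // gyroscope present on the robot?
    #define HAS_ULTRASONIC  0    // ultrasonic sensors present?
    #define HAS_ENCODERS    1    // wheel encoders present?

    #define LEFT_MOTOR_CHANNEL   0   // servo shield channels (illustrative)
    #define RIGHT_MOTOR_CHANNEL  1

    #define ANGLE_KP         0.8     // PD gains for the angle controller
    #define ANGLE_KD         0.05
    #define POSITION_KP      0.5     // proportional gain on position
    #define MAX_MOTOR_POWER  0.6     // maximum power threshold for safe testing

    #endif // CONFIG_H

Elsewhere, the sensors module would guard the corresponding members and update code with, for example, #if HAS_GYRO ... #endif.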
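Similarly, a minimal sketch of the single-main-function pattern: one entry point that either runs the robot or dispatches to a test depending on its arguments. The test names here are invented for illustration.

    // main.cpp -- hypothetical sketch of a single entry point that
    // dispatches to tests based on a command line argument.
    #include <cstdio>
    #include <cstring>

    // Stand-ins for the real robot code and test routines.
    int runRobot()   { std::printf("running full robot code\n"); return 0; }
    int testMotors() { std::printf("running motor test\n");      return 0; }
    int testCamera() { std::printf("running camera test\n");     return 0; }

    int main(int argc, char* argv[]) {
        if (argc < 2)                            return runRobot();   // no argument: run the robot
        if (std::strcmp(argv[1], "motors") == 0) return testMotors();
        if (std::strcmp(argv[1], "camera") == 0) return testCamera();
        std::printf("Unknown test: %s\n", argv[1]);
        return 1;
    }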

Everything in the code had unit test cases to show that it worked on the robot, as well as simple integration tests to show that it didn't conflict with other parts of the code. This made it easy to debug when something was going wrong.

Since the beginning of development we had a makefile to compile all the code together, so that we wouldn't have to spend time later figuring out which .o files were missing for our code to compile.

Threads

We had 9 threads running on the robot: a high level decision thread, a sensors thread, a motor control thread, a servo control thread, an actuators thread, a logging thread, an image processing thread, and two threads listening for interrupts from the encoders.

Data was shared among the threads through shared memory (public class member variables, with getters and setters).

Sensors thread

Our sensors thread (or sensors module) updated the data from all our sensors and applied a simple filter to the acquired data. Its update rate was considerably higher than that of any of the other threads, since all of them depended on the data from the sensors. This thread was also responsible for updating the elapsed time (in microseconds), which was used by the rest of the robot to calculate speeds or the time elapsed for tasks.

All the sensors were private members of the main class of this thread, and the other threads could only access the sensor data by reading the public variables of this class. A pointer to the sensors module object was passed in the constructor of many of the other classes.
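A rough sketch of this shared-memory pattern, with invented names and assuming std::atomic members for the shared readings (the real class in our repository differs):

    // Hypothetical sketch of the sensors module: one thread updates the
    // readings, other modules hold a pointer to the object and read its
    // public members.
    #include <atomic>
    #include <chrono>
    #include <thread>

    class Sensors {
    public:
        std::atomic<bool>   running{true};
        std::atomic<long>   timeMicros{0};    // elapsed time in microseconds
        std::atomic<double> leftEncoder{0};   // filtered encoder readings
        std::atomic<double> rightEncoder{0};

        void run() {                          // body of the sensors thread
            auto start = std::chrono::steady_clock::now();
            while (running) {
                auto now = std::chrono::steady_clock::now();
                timeMicros = std::chrono::duration_cast<std::chrono::microseconds>(now - start).count();
                // ...read each sensor and apply a simple filter here...
            }
        }
    };

    int main() {
        Sensors sensors;
        std::thread sensorThread(&Sensors::run, &sensors);
        // Other modules would receive &sensors in their constructors and
        // read sensors.timeMicros, sensors.leftEncoder, and so on.
        std::this_thread::sleep_for(std::chrono::milliseconds(10));
        sensors.running = false;
        sensorThread.join();
        return 0;
    }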

Actuators thread

The communication with all our servos and motors went through a servo shield. A problem with this is that we could only communicate with one motor or servo at a time (even though the delay was very small). We had a thread that did nothing but read the desired values and write them to the servo shield, so that our main code wouldn't have to wait for this communication to happen.

This thread also helped us with the problem that the clock rate of the Edison board was not constant, probably because of noise from the DC converters. Some of our writes to the motors and servos would fail, and having a thread writing values to them non-stop made sure we were resilient to those failed writes, even if only 10% of our writes worked.

This thread had 5 pointers that pointed to the desired values to be written to the motors or servos, which, in our case, were set by our servos control thread and motors control thread.

It implemented a maximum power threshold for the motors, so that the robot could never go above a certain speed, set by the config file, which made our robot safe to test.
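A minimal sketch of the non-stop write loop described above, with invented names; writeChannel() stands in for whatever call the servo shield driver actually exposed.

    // Hypothetical sketch of the actuators thread: it keeps writing the
    // values behind its five pointers to the servo shield, so an
    // occasional failed write is simply overwritten on the next pass.
    #include <algorithm>

    struct ServoShield {                      // stand-in for the real driver
        void writeChannel(int channel, double value) { /* I2C write here */ }
    };

    void actuatorsLoop(ServoShield& shield,
                       const double* leftMotor, const double* rightMotor,
                       const double* armServo,  const double* hookServo,
                       const double* sortServo,
                       double maxPower,         // maximum power threshold from the config file
                       const bool* running) {
        while (*running) {
            // Clamp the motor powers so the robot can never exceed the configured speed.
            shield.writeChannel(0, std::max(-maxPower, std::min(*leftMotor,  maxPower)));
            shield.writeChannel(1, std::max(-maxPower, std::min(*rightMotor, maxPower)));
            shield.writeChannel(2, *armServo);
            shield.writeChannel(3, *hookServo);
            shield.writeChannel(4, *sortServo);
            // No error handling needed: failed writes are retried on the next iteration.
        }
    }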

Motors control thread

This thread was responsible for deciding what power should be written to the motors by the actuators thread. It implemented the following (a rough sketch of the resulting control loop appears at the end of this subsection):

    • PD controller on angle, so that we could set a desired direction for the robot and make it turn quickly and efficiently, go straight, or make dynamic turns.
    • Proportional control on position, so that the robot could effectively move a given distance.
    • Torque limiter for the wheels. We used the approximation that the torque is roughly proportional to the difference between the normalized wheel speed and the normalized power set on the motors, and limited this value. This ensured that the wheels would never slip on the floor and improved our error in distance moved from 10% to less than 5%.
    • Minimum power threshold. There is a minimum amount of power you have to apply to the motors before the wheels start moving. Since we were using a proportional gain, without this threshold our robot would have been incapable of moving very small distances or turning very small angles. It improved our error in position from ~5 inches to < 0.5 inch, and our error in angle from 5 degrees to < 1 degree. It had the drawback that our robot never settled into a static state; it would always shake a few tenths of a degree from left to right trying to fix its angle.
    • Position, angle and angular speed tolerance. To keep the robot from shaking all the time, we had an angle error and position error tolerance, so that once the angle or position error got small enough, the robot would stop trying to correct itself.

This thread had a pointer to the sensors class, and it didn't directly communicate with the actuators: the actuators thread pointed to this class's variables and read the values from them.

It could receive only 2 instructions, "How much to move?" and "How much to turn?", which made a great abstraction layer for the high level threads.
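A rough sketch, under many simplifying assumptions and with invented gains and names, of how one step of such a control loop might combine proportional control on position, PD control on angle, the minimum power threshold, and the tolerances (the torque limiter and the maximum power clamp are left out for brevity):

    // Hypothetical sketch of one step of the motor control loop.
    #include <cmath>

    struct MotorCommand { double left = 0, right = 0; };  // read by the actuators thread

    MotorCommand motorControlStep(double positionError,    // inches left to travel
                                  double angleError,       // degrees left to turn
                                  double angularSpeed) {   // degrees per second
        // Illustrative gains and thresholds; the real ones lived in the config file.
        const double kPosP = 0.05, kAngP = 0.02, kAngD = 0.004;
        const double minPower = 0.12;          // below this the wheels do not move
        const double posTol = 0.5, angTol = 1.0, angSpeedTol = 2.0;

        MotorCommand cmd;
        // Tolerances: stop correcting once close enough and nearly stopped.
        if (std::fabs(positionError) < posTol &&
            std::fabs(angleError) < angTol &&
            std::fabs(angularSpeed) < angSpeedTol)
            return cmd;                        // both powers stay at zero

        double forward = kPosP * positionError;                      // P on position
        double turn    = kAngP * angleError - kAngD * angularSpeed;  // PD on angle

        cmd.left  = forward - turn;
        cmd.right = forward + turn;

        // Minimum power threshold so small corrections still move the wheels.
        if (cmd.left  != 0 && std::fabs(cmd.left)  < minPower)
            cmd.left  = std::copysign(minPower, cmd.left);
        if (cmd.right != 0 && std::fabs(cmd.right) < minPower)
            cmd.right = std::copysign(minPower, cmd.right);
        return cmd;
    }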

Servos control thread

The servos control thread is very similar to the motor control thread, but a lot less complex. It had variables containing what the angle of each servo should be, which were read by the actuators class. It could set the sorting mechanism to sweep mode, raise or lower the robot arm, and hook or unhook a block.

It provided the abstraction layer that let the higher level classes not have to deal with the exact angles of the servos anymore.
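A minimal sketch of that abstraction layer, with invented method names and angles; the actuators thread would read the public angle members:

    // Hypothetical sketch of the servos control interface. The angles are
    // illustrative; the real values lived in the config file.
    constexpr double ARM_UP = 100, ARM_DOWN = 10;        // degrees
    constexpr double HOOK_CLOSED = 140, HOOK_OPEN = 40;
    constexpr double SORT_CENTER = 90;

    class ServoControl {
    public:
        double armAngle  = ARM_DOWN;    // read by the actuators thread
        double hookAngle = HOOK_OPEN;
        double sortAngle = SORT_CENTER;
        bool   sweeping  = false;       // the update loop sweeps sortAngle while true

        void raiseArm()    { armAngle  = ARM_UP; }
        void lowerArm()    { armAngle  = ARM_DOWN; }
        void hookBlock()   { hookAngle = HOOK_CLOSED; }
        void unhookBlock() { hookAngle = HOOK_OPEN; }
        void startSweep()  { sweeping  = true; }
        void stopSweep()   { sweeping  = false; }
    };

Higher level code only calls methods such as raiseArm() or hookBlock(); it never touches raw servo angles.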

Encoders thread

Many sensors had to use their own thread. In the case of the encoders, each thread was responsible only for listening for the rising and falling edges of a pin in order to keep track of how much the wheel had spun. Other sensors that made use of their own threads, but that we decided not to use, were the gyroscope and the ultrasonic sensors.
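A rough sketch of one encoder thread, assuming a blocking waitForEdge() helper; that helper is a placeholder for however the GPIO library actually delivers pin interrupts, not a real API:

    // Hypothetical sketch of an encoder thread: it blocks until the pin
    // changes state and counts edges; the sensors module reads the counter.
    #include <atomic>

    std::atomic<long> leftTicks(0);    // edges seen so far on the left wheel

    // Placeholder: the real version would block on a GPIO interrupt and
    // return true on every rising or falling edge of the given pin.
    bool waitForEdge(int pin) { return true; }

    void leftEncoderThread(int pin, const std::atomic<bool>& running) {
        while (running) {
            if (waitForEdge(pin))
                ++leftTicks;           // each edge is a known fraction of a wheel turn
        }
    }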

Logger thread

Our logger appended information from all the modules (sensors, high level, motor control, servo control, image processing, etc.) to a file that could be read later for debugging. It also updated the last picture taken by the robot on a server (python -m SimpleHTTPServer 80) running on the robot, which allowed us to see what the robot was seeing. It also saved all the transitions that our state machine was making.

Our logger class was not thread safe, and it was never intended to be, since a messy or partially garbled log file was not a problem for us.
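A minimal sketch of such an append-only logger, with invented names; note the deliberate absence of any locking:

    // Hypothetical sketch of the logger: every module calls log() with a tag
    // and a message, and everything is appended to one file. It makes no
    // attempt at thread safety; interleaved lines were acceptable to us.
    #include <fstream>
    #include <string>

    class Logger {
    public:
        explicit Logger(const std::string& path) : out(path, std::ios::app) {}

        void log(const std::string& module, const std::string& message) {
            out << module << ": " << message << "\n";   // no locking on purpose
            out.flush();                                // keep the file usable after a crash
        }

    private:
        std::ofstream out;
    };

    int main() {
        Logger logger("robot.log");                     // hypothetical file name
        logger.log("motor_control", "target angle set to 90 degrees");
        return 0;
    }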

Image Processing thread

This thread would take images from the camera, remove the image data above the walls, detect blocks, and report their position and angle with errors of at most 2 inches and less than 4 degrees for blocks that were at most 30 inches away. It could also detect blocks that were farther away, but either the errors were slightly larger or the robot was incapable of driving with such precision.

It implemented 3 functions that could be used by the high level decision thread:

    • foundCube()
    • nearestCubeDistance()
    • nearestCubeAngle()

It also averaged the data across multiple frames to increase the precision of the position and angle of the block found.
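A rough sketch of this pipeline using OpenCV; the HSV thresholds, the wall cutoff, the camera field of view, and the distance model here are all illustrative assumptions rather than our actual values:

    // Hypothetical sketch of block detection: threshold a color range in
    // HSV, drop the part of the mask above the walls, take the largest
    // contour, and convert its image position into an angle and a crude
    // distance estimate.
    #include <opencv2/opencv.hpp>
    #include <cstdio>
    #include <vector>

    int main() {
        cv::VideoCapture cap(0);                  // robot camera
        cv::Mat frame, hsv, mask;
        while (cap.read(frame)) {
            cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);
            cv::inRange(hsv, cv::Scalar(40, 80, 80), cv::Scalar(80, 255, 255), mask);  // "green" blocks
            mask(cv::Rect(0, 0, mask.cols, mask.rows / 3)).setTo(cv::Scalar(0));       // ignore above the walls

            std::vector<std::vector<cv::Point> > contours;
            cv::findContours(mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);

            double bestArea = 0;
            cv::Rect best;
            for (size_t i = 0; i < contours.size(); ++i) {
                double area = cv::contourArea(contours[i]);
                if (area > bestArea) { bestArea = area; best = cv::boundingRect(contours[i]); }
            }
            if (bestArea > 200) {                                  // roughly foundCube()
                double cx = best.x + best.width / 2.0;
                double angle = (cx / frame.cols - 0.5) * 60.0;     // assuming ~60 degree horizontal FOV
                double distance = 300.0 / best.width;              // crude inverse-width model, inches
                std::printf("cube: %.1f in away, %.1f deg off center\n", distance, angle);
            }
        }
        return 0;
    }

Averaging the reported distance and angle over several consecutive frames, as described above, would then smooth out single-frame noise.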

High Level decision thread

This thread consisted of a main function that ran a state machine implementing procedures (which could themselves contain states). It had access to all the other threads, but didn't communicate directly t
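A minimal sketch of the kind of state machine described here; the state names and helper functions are invented for illustration and stand in for the modules described above.

    // Hypothetical sketch of the high level state machine: each state is a
    // procedure that decides the next state.
    #include <cstdio>

    enum State { EXPLORE, DRIVE_TO_CUBE, PICK_UP_CUBE, DONE };

    // Stand-ins for the lower level modules.
    bool foundCube()             { return true; }
    double nearestCubeDistance() { return 12.0; }    // inches
    double nearestCubeAngle()    { return 5.0;  }    // degrees
    bool cubeCaptured()          { return true; }
    void setTargets(double distanceInches, double angleDegrees) {
        std::printf("move %.1f in, turn %.1f deg\n", distanceInches, angleDegrees);
    }

    int main() {
        State state = EXPLORE;
        while (state != DONE) {
            switch (state) {
            case EXPLORE:                       // wander until the camera sees a cube
                setTargets(10.0, 30.0);
                if (foundCube()) state = DRIVE_TO_CUBE;
                break;
            case DRIVE_TO_CUBE:                 // use the motor control abstraction
                setTargets(nearestCubeDistance(), nearestCubeAngle());
                state = PICK_UP_CUBE;
                break;
            case PICK_UP_CUBE:                  // servo control: hook the block
                if (cubeCaptured()) state = DONE;
                break;
            default:
                state = DONE;
            }
        }
        return 0;
    }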
