Team Six/Final Paper

Introduction

Janky the cocoabot won the MASLAB 2015 competition. Its design was very simple and its code was highly modular.

Software

The full source code can be found in our GitHub repository: [1]

Language and Libraries

All of the code was written in C++11, with the exception of the Arduino code responsible for communicating with the color sensor, which was written in C.

We first chose C++ because most of the example code for the sensors was in C++ and nobody in the group felt inclined to use JNI to interface with the hardware. We later moved to C++11 to take advantage of the thread support library "<thread>" and the date and time utilities "<chrono>".
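A minimal sketch of how these two headers fit together, assuming a simplified two-thread layout (the function and variable names here are illustrative, not taken from our repository):

    #include <thread>
    #include <chrono>
    #include <atomic>

    // Flag checked by every loop; flipped to stop the robot cleanly.
    std::atomic<bool> running(true);

    // Placeholder worker loops standing in for real modules.
    void motorLoop()  { while (running) std::this_thread::sleep_for(std::chrono::milliseconds(20)); }
    void sensorLoop() { while (running) std::this_thread::sleep_for(std::chrono::milliseconds(5)); }

    int main() {
        std::thread motors(motorLoop);   // <thread>: spawn one thread per module
        std::thread sensors(sensorLoop);
        std::this_thread::sleep_for(std::chrono::seconds(1));  // <chrono>: durations and sleeps
        running = false;                 // ask the loops to exit
        motors.join();
        sensors.join();
        return 0;
    }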

We used OpenCV for our computer vision code. We also used standard C++ and POSIX headers such as "string", "unistd", "cmath", "cstdio", "fstream", "iostream", and "signal.h", besides container libraries such as "vector".

Design

We decided that modularity and easy integration would be our top design priorities.

Threads

We had eight threads running on the robot: a high-level decision thread, a motor control thread, a servo control thread, an actuator thread, a sensors thread, a logging thread, and two threads listening for encoder interrupts. Data was shared among the threads through shared memory (public class member variables with getters and setters), as sketched below.
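As a rough illustration of that shared-memory pattern (the class and method names are made up, and the mutex is our own addition here for safety, not a detail taken from the original code):

    #include <mutex>

    // Hypothetical shared-state class: one thread writes, the others read.
    class SharedState {
    public:
        void setHeading(double h) {
            std::lock_guard<std::mutex> lock(mutex_);
            heading_ = h;
        }
        double getHeading() {
            std::lock_guard<std::mutex> lock(mutex_);
            return heading_;
        }
    private:
        std::mutex mutex_;       // guards access from multiple threads
        double heading_ = 0.0;   // example of a value shared between modules
    };

    int main() {
        SharedState state;
        state.setHeading(1.57);        // written by one thread...
        double h = state.getHeading(); // ...read by another
        return h > 0 ? 0 : 1;
    }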

Sensors thread

Our sensors thread (or sensors module) updated the data from all of our sensors and applied a simple filter to the acquired data. Its update rate was considerably higher than that of any of the other threads, since all of them depended on the sensor data. This thread was also responsible for updating the elapsed time (in microseconds), which the rest of the robot used to calculate speeds or to time tasks.

All the sensors were private members of this thread's main class, and the other threads could only access the sensor data by reading its public member variables, along the lines of the sketch below.
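A sketch of what such an update loop could look like, assuming a single hypothetical IR sensor and an exponential smoothing filter standing in for the "simple filter" (readIrSensor, filteredIr, and elapsedUs are illustrative names only):

    #include <thread>
    #include <chrono>
    #include <atomic>
    #include <functional>

    std::atomic<long long> elapsedUs(0);  // elapsed time published to the other threads
    double filteredIr = 0.0;              // filtered reading, exposed via a getter in practice

    // Stub standing in for the real hardware read.
    double readIrSensor() { return 42.0; }

    void sensorLoop(std::atomic<bool>& running) {
        auto start = std::chrono::steady_clock::now();
        const double alpha = 0.3;         // smoothing factor of the filter
        while (running) {
            double raw = readIrSensor();
            filteredIr = alpha * raw + (1.0 - alpha) * filteredIr;  // exponential smoothing
            auto now = std::chrono::steady_clock::now();
            elapsedUs = std::chrono::duration_cast<std::chrono::microseconds>(now - start).count();
        }
    }

    int main() {
        std::atomic<bool> running(true);
        std::thread t(sensorLoop, std::ref(running));
        std::this_thread::sleep_for(std::chrono::milliseconds(50));
        running = false;
        t.join();
        return 0;
    }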
