Friday, June 20, 2014

End Course Project

Dates + duration of activity:
project duration: from 15/5 to 19/6
hours spent: approximately 79 hours each over 11 work days (237 man-hours in total)

Group members:
Benjamin Grønlund
Christina Gøttsche
Levi á Torkilsheyggi




The project at its presentation point

The following pictures show the track, the delivery truck and the dispenser station as they were at the final stage.




A video (that can be found on vimeo.com/98587382) shows how the entire system works.
Here the dispenser station starts by filling the delivery truck's bed with red M&Ms, and stops the rotation of the conveyor belt as soon as it registers an M&M with a color different from red. The delivery truck then travels to the starting point on the track for red, follows the black line, and stops when it sees the red colored area, then rotates counter-clockwise until it sees the black line (which has been extended a bit in that area). It then returns to the dispenser station through three coordinate points before rotating about 90 degrees, so that it stands (almost) ready to receive new M&M(s) onto the truck bed.
We then have to position the truck exactly and press the touch sensor, enabling the dispenser station to rotate the conveyor belt once more and dispense M&Ms of the same color onto the truck bed. This time a single green M&M is loaded, and only one, since the next M&M on the conveyor belt is yellow.
The clip also shows yellow M&Ms and a blue M&M being delivered, along with another perspective of the truck delivering green M&Ms.



Process



1) Working with the Bluetooth connection


Sending and receiving data between a LEGO NXT and a computer was covered in lesson 5 [1], and the experience gained there helped with implementing the Bluetooth communication needed for this project.
Connecting two NXTs had not been tried before, however, so some coding and experimentation was needed. After trying for a while to get the two NXTs to communicate via Bluetooth without success, it was discovered that they were not paired properly and therefore could not communicate. After pairing the NXTs properly, the connection started to work, and it was possible to send integer values between the two NXTs.
It was, however, decided to restructure the communication to make it easier to monitor the data passing between the devices, and to allow connecting more than two NXTs should that prove necessary. So instead of letting the communication happen directly between the NXTs, the project uses a computer as a communication base. In lesson 3 [2] we solved the problems we had with getting the computer to search for and find a new NXT, and we thus knew how to connect the computer to new and multiple NXTs. We did this (on Windows) by deleting the file nxj.cache from the user account on the machine (C:\Users\<Username>\nxj.cache) and running the program nxjbrowse.bat. With the file deleted, the program searches for all NXTs upon opening, creating a new nxj.cache file containing the NXTs found in a Bluetooth inquiry.


The following describes a simple program we constructed to make the multi-connection work:
First, both NXTs wait for an incoming Bluetooth connection when their program starts, which is shown by a text on the LCD. The computer has previously been paired with the NXTs and thus recognises them. When the computer program runs, it connects to both NXTs. It then sends an integer, 1, to NXT1, which updates NXT1's LCD to "I received!" for one second. After that second, NXT1 sends 1 back to the computer and changes its LCD to "Waiting". When the computer receives 1 from NXT1, it sends the integer 1 to NXT2, and NXT2 reacts the same way as NXT1. This runs in a loop for steady testing.

This provided a base for a working Bluetooth connection, and allowed for making a Bluetooth connection between the dispenser station and an NXT (a stand-in for the truck, since the truck was being used for development by another group member). It was decided to only send integers between the devices, as strings can behave differently depending on character encoding. Furthermore, the dispenser station's color sensor's getColor() method returns an integer depending on the color read.
First, the NXT outputs the integer 1 to the computer whenever its left button is pressed. The computer prints the received integer in the console and forwards it to the dispenser station. The dispenser station then drops an M&M on the chute, and outputs an integer depending on the color of the dropped M&M; for example, red is 0.
Using that base it was possible to get the computer to act as a relay point for Bluetooth connections between the truck and the dispenser station, and by printing the values to the console we could monitor the communication.
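The core of the relay point can be sketched as below. This is a simplified illustration, not the project's actual code: in the real program the streams would come from leJOS' Bluetooth connections to the two NXTs, while here they are plain java.io streams so the logic can be shown (and run) in isolation.

```java
import java.io.*;

public class BtRelay {
    // Read one integer from the source device, print it so the communication
    // can be monitored in the console, and forward it to the target device.
    // Returns the relayed value, or -1 if a stream fails.
    public static int relayOne(DataInputStream from, DataOutputStream to) {
        try {
            int value = from.readInt();   // e.g. a color code from the dispenser station
            System.out.println("relaying: " + value);
            to.writeInt(value);           // forward to the other device
            to.flush();
            return value;
        } catch (IOException e) {
            return -1;
        }
    }
}
```

The console printout in relayOne is exactly what made it possible to survey the traffic between truck and dispenser station.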



2) Working with the delivery truck


First of all, a truck bed was needed that could hold an M&M without it falling off due to the movement of the truck. The truck bed therefore has a slight tilt towards the truck cabin (the NXT) so that the M&M(s) can be transported safely without falling off.
The truck bed is attached both to the motor and to two adjacent holes next to the hole for attaching to the motor; this is done so that the truck bed can be tilted upwards and the M&M(s) can slide off.
The pictures below show the described construction.




The actions of the delivery truck are implemented using a program that has methods with sequential control and methods with reactive control.
As Fred Martin [3] describes: “sequential control refers to a procedural series of steps or phases that a robot’s program moves through in service of accomplishing some task”, and “Reactive control refers to a collection of stimulus-response behaviors that dynamically trigger and retire as the robot moves through the physical environment” [3, p. 190].
Our program is sequential in that it has to perform certain actions (some for a certain amount of time) in a certain order, and it is reactive when it has to act depending on what it registers from e.g. the light sensor or the color sensor.
For instance: the line following is a reactive method, and reaching a colored zone and changing its actions accordingly is reactive. Rotating the wheels until some tachocount value is reached is somewhat reactive, but we don't handle its 'reactiveness' (the NXTRegulatedMotor class does); we just use it sequentially when we make the wheels travel to one point and then to another. In this way, traveling to coordinates can be said to be sequential, as the truck doesn't react to anything while performing the method of going e.g. back to the dispenser station.

Implementation
The behavior of the delivery truck, and how it is implemented, is described here along with the challenges that were faced along the way.

The driveToContainer() method
Making the truck deliver the M&M to the correct container is done by letting the truck follow a black line (black tape on white paper) leading to the container. A light sensor is used to infer whether the truck should turn a bit right or left in order to stay on the line.
In order for the truck to follow the correct black line to the correct container, the truck is first driven to a 'starting zone'. This zone has the color of the currently transported M&Ms and has a black line going from it all the way to the unloading area.
Something was needed to register that the truck had reached the unloading area. This was implemented by making a colored area in front of the container and using a color sensor to recognize this color. The color sensor approach was chosen because experiments with the already attached light sensor showed poor separation between white/black and the other colors: the values of the different colors lie in the same range as the values between white and black.

A drawing of how we figured the arrangement of the 'factory' should (roughly) be, using black lines and colored areas, is shown below.

A picture of the colored spot that lets the truck recognize it has reached the unloading area can be seen below.


The turnTruck(...) method
When the truck recognizes that it has reached the unloading zone, it should turn 180 degrees in place, so that the truck bed faces the container. This was implemented using the rotateTo() method with the argument 375 for one wheel and -375 for the other. Both motors had their tachometers reset prior to this. The value of 375 was found by trial and error, as it depends on the distance between the motors.
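As a sanity check, the trial-and-error value can be compared with simple geometry: for a turn in place, each wheel must rotate turnDeg * trackWidth / wheelDiameter degrees. The sketch below assumes a track width of 11.7 cm and the 5.6 cm diameter of a standard NXT wheel; neither number is a measurement from the project.

```java
public class TurnGeometry {
    // Degrees each wheel must rotate (in opposite directions) for the truck
    // to turn turnDeg degrees in place. Each wheel traces an arc of length
    // (trackWidth / 2) * angle, which converts to wheel rotation by
    // dividing by the wheel radius.
    public static long wheelDegrees(double turnDeg, double trackWidthCm, double wheelDiameterCm) {
        return Math.round(turnDeg * trackWidthCm / wheelDiameterCm);
    }
}
```

With those assumed dimensions, a 180 degree turn comes out close to the 375 that was found empirically, which matches the observation that the value depends on the distance between the motors.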

The tipTruckBed() method
Once the turn is accomplished, the truck bed tilts (also using the tachometer count, with the motor's tachocount reset first) to about 50 degrees, waits for 1.5 seconds and then rotates back to the position it was reset from (the 'relaxed' position).
This activity, which successfully unloads M&Ms, can be seen in the video below.



The code for the action of the truck once it has reached the unloading zone can be seen below.


It was later discovered that whether this method rotated the truck to stand correctly in front of the container after a 180 degree turn depended heavily on the truck having the right orientation when it stopped in front of the container. Since that was difficult to achieve, and other methods depended on it, this method was changed later in the process. This change is described later.

The main method
Once the above methods had been implemented, the program was tested by making it perform all of them in a while loop, only breaking when the escape button was pressed.
It was found that the car functioned properly in the first run through the loop, but that the driveToContainer method didn't perform the same way the second time it was called. This time the truck stood still, moved a bit forward, then a bit backward, shaking in place.
It was suspected that this had something to do with the program controlling the motors using the NXTRegulatedMotor class (the reason for using this class is explained later) in some methods and the MotorPort class in others.
After some trial-and-error experimenting and advice from our lecturer, Ole Caprani, it was found that this was indeed the root of the problem: using the NXTRegulatedMotor class and its rotateTo() methods automatically enables regulation [4], and this regulation interferes with the Motor class trying to control the motors. The problem was solved with a method from the NXTRegulatedMotor class called suspendRegulation(). This method was called once all activities in turnTruck(...) had been performed, eliminating the NXTRegulatedMotor class' problematic interference and removing the problem of the car not performing the driveToContainer method properly.

This finding was used to suspend the regulation of the motors at the end of all methods where the NXTRegulatedMotor class was used. Later, however, it was found preferable to maintain the tachocount registration (because it was initially thought that this would make it possible to keep a continuously updated coordinate system around the track). This meant that we had to stop using the Motor class and instead use the NXTRegulatedMotor class everywhere for the driving motors.
This change caused some problems: the line following stopped working properly when using the NXTRegulatedMotor to steer the motors right and left according to whether the light sensor registered black or white. The code for this version of the line following is shown below.



This code caused a delay between the execution of the if/else statements and the next evaluation of the condition “lightValue > blackWhiteThreshold”. This meant that between the car's light sensor passing over a black line and the car performing the corresponding action, between 1 and 5 seconds could pass. A lot of trial-and-error time went into this without solving the problem in a satisfactory way. To produce a line follower that at least was able to follow the line, even if neither smoothly nor prettily, one motor was set to rotate a small amount and the other an even smaller amount, instead of moving one motor forward and stopping the other. This produced a 'bumpy' line follower, but it actually worked.
Optimizing this line following was not a priority at the time, as it was important to focus on getting the entire system to work together. An optimization is described in a later chapter.
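The per-step decision of the 'bumpy' follower can be sketched as a pure function. This is an illustration only: the degree values, and the convention that readings above the threshold mean white, are our assumptions, not the truck's actual numbers.

```java
public class BumpyFollower {
    // One step of the 'bumpy' line follower: instead of stopping one wheel,
    // both wheels rotate, one a small amount and the other an even smaller
    // amount. Returns {leftDegrees, rightDegrees} for this step.
    public static int[] step(int lightValue, int blackWhiteThreshold) {
        if (lightValue > blackWhiteThreshold) {
            return new int[]{2, 6};   // bright reading (white): steer left, back toward the line
        } else {
            return new int[]{6, 2};   // dark reading (black): steer right, back toward the edge
        }
    }
}
```

Because neither wheel ever fully stops, each step nudges the truck forward, which is what made this follower workable despite the slow sensing loop.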

Calibrating the DifferentialPilot for use with Navigator
For the creation of a navigator and for piloting the truck, we used lessons learned from week 10 [5] and course literature [6].

A program for calibrating the DifferentialPilot's wheelDiameter and trackWidth arguments [7] was created. It could send the mentioned arguments via Bluetooth to the truck, which then was supposed to travel 20 cm forward and perform a 180 degree turn.
A pen was attached to the front of the truck, and the values for the arguments were found empirically: the drawn line was measured against the intended travel length to set wheelDiameter, and trackWidth was fine-tuned until the truck turned exactly 180 degrees (so that it would return exactly to the line it had drawn when traveling 20 cm forward).
Once correct values were found, the pen was taken off the truck, because the friction between pen and paper affects the values. Using a pencil and ruler, the values were then fine-tuned further to fit the truck without the pen attached. The drawings from the experiments performed to tune these values can be seen below.
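The arithmetic behind the wheelDiameter calibration can be written down directly. This is a sketch of the underlying relationship; the 5.6 cm wheel diameter used in the example is an assumed value, not a measurement from the project.

```java
public class PilotCalibration {
    // Ground distance covered by one full wheel rotation. This is what the
    // wheelDiameter argument effectively encodes, and what the pen drawing
    // of the actual traveled length was measured against.
    public static double cmPerRotation(double wheelDiameterCm) {
        return Math.PI * wheelDiameterCm;
    }

    // Wheel rotations needed to travel a given distance, e.g. the 20 cm
    // test run used during calibration.
    public static double rotationsFor(double distanceCm, double wheelDiameterCm) {
        return distanceCm / cmPerRotation(wheelDiameterCm);
    }
}
```

If the drawn line is shorter or longer than the commanded 20 cm, the wheelDiameter value is scaled accordingly; trackWidth is then tuned separately against the 180 degree turn.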



Coordinate system
The coordinate system of the truck was found to have its x axis pointing forward from the car and its y axis pointing perpendicular to this, to the right of the car. The picture below illustrates the found coordinate system.


Even though the DifferentialPilot that handles the truck's coordinate system had been calibrated, the coordinate system seemed to drift from the true one as the truck traveled the track. The reason seemed to be that small errors accumulated over time, probably in part because of friction from the back wheels (although they could rotate 360 degrees), and probably because even small calibration errors accumulate into larger ones. We would have liked to use other wheels, but they had to be the right size in order not to block the truck's movement and to make it tilt the right amount, and the chosen wheels were the best we could do at the time.
The erroneous traveling was discovered when a program, BTctrlTruck [8], made specifically for sending coordinates over Bluetooth that the truck should travel to, was run on the truck.
The GUI for the program is shown here:

A video that shows the program sending the truck to the correct coordinate found for the starting point of the line to the red container is shown below.




When the truck had been sent around the track through a series of coordinates, it was found every time that what used to be (0,0) ended up being something else, like (3,-9). The following drawing, with coordinates noted using the program BTctrlTruck, shows the found coordinates for the line-following starting positions and some positions for returning the truck.


Problems
When running the program on the truck for delivering the M&Ms and returning to the base station, it also seemed that the coordinate system didn't get updated while the robot followed the line. This caused the coordinate system to be totally off from the true one when the robot was supposed to return to its base station (the loading area). While pursuing the reason for this problem, it was discovered that calling methods on the individual NXTRegulatedMotors didn't cause the position handled by the Navigator to be updated, as previously believed.
This was found in part by printing the Navigator's position each time a light value was acted upon (turning either a bit left or a bit right) to the RConsole [9], which was found to be a handy tool. A picture of the printouts in this console is shown below.

In the line-following method the same coordinates are sent over and over while the truck follows the line (20.92017, 18.19832). When the truck has reached the unloading area, turned around and tipped the truck bed, it uses the Navigator to travel to the base station, where it again uses the Navigator to go to the colored starting spot and then starts line following once more. Now new coordinates are sent (20.83061, 18.26406), but these coordinates also stay the same while the truck follows the line.

Solution
As it was necessary to control the individual motors when performing e.g. line following and rotation of the truck (which isn't possible with the Navigator class), instead of giving absolute coordinates at all times, the idea of having a coordinate system that is updated at all times was discarded.
It is possible that using methods from DifferentialPilot could achieve the same actions, but seeing as the coordinate system couldn't be fully exploited anyway when errors accumulated, there wasn't much need for a constantly updated coordinate system (it wouldn't be true to the actual coordinates anyway).
The idea was then to reset the coordinate system when needed and to find the required coordinates empirically. This means that when the truck has unloaded the M&Ms, the coordinate system is reset and coordinates for sending the truck back to the dispenser station/loading area are found. Handling it this way means that there are essentially five different coordinate systems: one where (0,0) is where the truck stands at the base station while getting an M&M loaded, and four where (0,0) is at the front of the respective colored unloading area with the truck's bed facing the M&M container.


This way of handling it also meant that it was no longer necessary to use the NXTRegulatedMotor class everywhere, since doing otherwise wouldn't cause problems with the position (as previously thought). It is therefore possible to suspend the regulation whenever needed and use the Motor class instead if preferred.
This also allowed the driveToContainer method to be changed to use the Motor class and its controlMotor method. Even later, a PID controller was experimented with. This is described in a later chapter.


Coordinates for returning to dispenser station
When resetting the coordinate system every time the truck has reached the unloading area and turned around, the cohesion between this new coordinate system and the coordinates for returning the truck to the base is important: the truck must always face the same way after having rotated 180 degrees to have the truck bed face the container. If not, the coordinates empirically found when the truck had the correct orientation will not match the coordinate system of a truck facing a more slanted direction.
In practice it was found that the truck doesn't always face the colored area in the unloading area with the same orientation, meaning that it doesn't always stand perpendicular to the container.
The pictures below illustrate the problem of the truck not being perpendicular to the container before rotating, and therefore not perpendicular after rotating either.




A way to solve this was to modify the area where the color sensor should recognize that the truck should turn around.
This was achieved by changing the 'tag', in effect 'prolonging' the line on the left that the light sensor sees and follows. This change removed the risk of the truck veering further to the right because it kept seeing the black line.
The evolution of the colored spot can be seen below; the last picture shows the way the spots ended up.



This solution worked acceptably at this point in the project.

When dealing with the different coordinate systems, a lot of empirical data has to be collected regarding which coordinates to send the truck to in order to get it back to base. This depends on where the truck comes from and on the truck's orientation once it has turned around in front of the container.

In order to get it back to the base in the right manner, the truck is first sent to a coordinate that has the same x-coordinate as the base point but a greater y-coordinate. Then the truck is sent to the base coordinate. This two-step traveling ensures that the truck reaches the base area parallel to the orientation it had when it left, and the truck can then be set to rotate 90 degrees so that it parks under the chute of the dispenser station. If the truck were sent directly to the base area from the containers, its displacement from its correct orientation would be different for each of the four coordinate systems. Handling it in this (at least) two-step way makes it possible to exploit the fact that it is known exactly how much to rotate the truck to get it into the correct orientation.
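The two-step return can be organized as a small per-color lookup of waypoints in the reset coordinate system. The sketch below is illustrative only: the coordinate values and color keys are made up, not the empirically found ones.

```java
import java.util.*;

public class ReturnRoutes {
    // Waypoints driven after unloading, keyed by container color. The first
    // waypoint shares its x-coordinate with the base point but has a greater
    // y-coordinate; the second is the base point itself, so the truck arrives
    // parallel to its departure orientation and only needs a known 90 degree
    // turn to park under the chute.
    private static final Map<String, double[][]> ROUTES = new HashMap<>();
    static {
        ROUTES.put("red",   new double[][]{{40.0, 25.0}, {40.0, 0.0}});
        ROUTES.put("green", new double[][]{{35.0, 25.0}, {35.0, 0.0}});
    }

    public static double[][] routeFor(String color) {
        return ROUTES.get(color);
    }
}
```

Keeping the routes in a table like this makes it cheap to re-measure and swap in new empirical values for one color without touching the others.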

The coordinates found for traveling back are shown here:



As described earlier, for the coordinates to be correct for getting the truck properly back to the base area each time, the truck has to have the same orientation after having turned around in front of the container as it had when the coordinates were first found. We tried to solve this problem by 'prolonging' the black line as described, but after having found the coordinates empirically it was clear that this wasn't enough.
To solve the issue it was deemed necessary to either:
a) make sure the truck 'enters' the colored unloading area in the same direction and orientation every time and then turns exactly 180 degrees (roughly what was implemented when this was considered), or
b) make sure that the truck turns a variable number of degrees when it reaches the colored unloading area, until it reaches an exact orientation that it should be able to turn to every time.


Seeing as it is difficult to make sure that the truck has the exact same orientation every time it observes the colored area (as previous efforts could not eliminate this problem completely), we implemented option b, which meant changing the behavior of the turnTruck method. The method now first turns the truck 90 degrees (counterclockwise for red and green, clockwise for blue and yellow), and then keeps turning until the light sensor sees black. This means that:
i) for red and green: it has seen the left side of the black line that it followed to get there;
ii) for blue and yellow: it has seen the right side of the line that it followed to get there.


An illustration showing the rotation direction and the part of the black tape at which the truck stops is shown below.



Both cases should result in the truck having an orientation fairly perpendicular to the container, which also meant that the M&M would actually roll into the container when the truck's bed is lifted.
Experiments with this implementation showed that the truck turned too far into the black line, or stopped too far from it, depending on which color container it had traveled to. To solve this, we experimented with different motor power values, added a time delay between seeing the black line and actually stopping, and added black tape to parts of the black line so the light sensor would register black earlier in the turn. The way it finally worked was this:


The method checks whether the variable MnMColor is red/green or blue/yellow. The truck rotates counterclockwise for red/green and clockwise for blue/yellow. It first uses the tachocount to rotate about 90 degrees. Then it continues rotating and starts using the light sensor to check whether black has been spotted. If it has, a delay, individual for each color, is performed before the motors are stopped.
The black line on the track to the red and green containers has been extended with some extra black tape on the left side of the line (the side the light sensor doesn't follow). This was because the sensor needed to register black earlier in order to stop at the correct orientation to the container, and one obviously cannot tell the truck to stop before it has seen black.
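The color-dependent parts of this logic can be sketched as two small lookups. The direction rule follows the description above; the millisecond delay values are placeholders, since the real ones were tuned by trial and error on the track.

```java
public class TurnTruck {
    // Rotation direction per container color: counterclockwise for
    // red/green, clockwise for blue/yellow.
    public static boolean turnsCounterclockwise(String mnmColor) {
        return mnmColor.equals("red") || mnmColor.equals("green");
    }

    // Per-color delay between spotting black and stopping the motors.
    // These millisecond values are illustrative placeholders.
    public static int stopDelayMs(String mnmColor) {
        switch (mnmColor) {
            case "red":    return 120;
            case "green":  return 150;
            case "blue":   return 100;
            case "yellow": return 80;
            default: throw new IllegalArgumentException("unknown color: " + mnmColor);
        }
    }
}
```

Tuning the orientation by adjusting a per-color delay constant was easier than repeatedly moving the tape on the track.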
A part of the turnTruck method (for the red/green colors) is shown below.


A picture that marks where this tape should be for the truck to have the correct orientation, and that also shows the implemented modification, is shown below. To fine-tune the orientation, some small delays were inserted after seeing black (this was easier than moving the tape around).

No extra tape was added to the lines going to the blue and yellow containers. Here, simply using delays, again tuned to reach the correct orientation before stopping the motors, gave the sought-for outcome. This is due to the direction in which the truck rotates and the positioning of the light sensor on the truck.
A video that shows how the truck behaves with this tape inserted for the line to the green container is shown below.

Although the truck should end up in the correct place with the correct orientation through all the maneuvers described, many things affect the truck's ability to do so. One issue is the traction between the back wheels and the track: the back wheels sometimes create a lot of friction because of the way they are turned around/dragged behind the motorized wheels. We describe further down how we tried to remove these issues as well as we could, but it turned out that we couldn't obtain the exact position and orientation needed for the truck after returning to the base (in order for the coordinates used in the next run through the main method to be correct). We solved this by adding a touch sensor (connected to the dispenser station's NXT) that prevents the dispenser station from rotating the conveyor belt, even though the truck has signaled that it has returned, until the touch sensor has been pressed. This allows us to manually position the truck correctly before pressing the touch sensor, so the system can perform one more run.


Optimizing the linefollower with PID
As described, the implementation of the line follower changed throughout the project, from using the Motor class to using NXTRegulatedMotors, until it was found possible to go back to the Motor class. The simple line follower using the Motor class would only check for black or white, and then turn left or right respectively at a static speed.
However, it was preferable to follow the line more precisely, in order to reduce the risk of erroneous following and get a smoother line follower. Implementing a PID controller was therefore looked into. A custom car was built, because the delivery truck was being used to empirically find coordinates etc.
A picture that shows the custom car can be seen below.


Most of a day was spent trying to implement a good PID controller for the car, using course literature [10], among others, for inspiration. But the efforts kept coming up short. Trial and error was used, in addition to searching the internet for solutions (even finding that leJOS has a built-in PID controller class), but no attempt came close to making the car follow a straight line. On the verge of giving up, the idea came that maybe it wasn't the code but the construction that was the problem. Looking at the car, two potential problems with the construction were found:
1) First, not enough weight was on the wheels, which could result in wheelspin at high accelerations.
2) Second, and most importantly, came the realization that having the light sensor that close to the car's center of rotation meant that even small value changes could have a major impact on how the line follower behaved. The car was then reconstructed to put more weight on the wheels and to extend the distance from the light sensor to the car's center of rotation, as shown below.


Around half an hour later a useful PID controller had been constructed. It was inspired by a user web page from Humboldt State University [11], which utilizes leJOS' built-in PIDController class. However, it was chosen to copy and customize the PIDController class instead, in order to be able to read the error variable and edit the integral variable. The values we used for P, I and D were found through trial and error.

When implementing the PID on the truck, it was found that even though the physical difference between the custom car and the delivery truck was small (in terms of wheel distance and light sensor position), the delivery truck behaved significantly differently while driving. The custom car was modified so that its wheel distance was precisely equal to the truck's. To match the two cars' driving behavior it was necessary to modify the values for the proportional, integral and derivative arguments (the P, I and D values) and to adjust the truck's sensor positions slightly. Furthermore, it was discovered that the truck would sometimes approach the line it had to follow perpendicularly, causing the PID to understeer and drive to the other side of the black line, and thus see white again and drive off. To take this into account, it was decided to instead let the car slowly turn left to approach the black line, and once it reached it, drive a bit into it. The car would then slowly turn right to approach the white area, and start the PID controller once it reached white. This ensured that the truck's orientation was much more closely aligned with the line than before, giving the PID controller a much better starting point.
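The update step of a PID controller in the spirit of the customized class can be sketched as below. This is not the project's actual code: the gains are illustrative, and the class name is ours. It does show the two properties the customization was made for: the error is handed in by the caller, and the integral term can be edited (here, reset).

```java
public class LinePid {
    private final double kp, ki, kd;   // gains, found through trial and error in the project
    private double integral;
    private double lastError;

    public LinePid(double kp, double ki, double kd) {
        this.kp = kp;
        this.ki = ki;
        this.kd = kd;
    }

    // error = lightValue minus the midpoint between the black and white
    // readings; the return value is used as the steering difference
    // between the two wheels.
    public double update(double error) {
        integral += error;
        double derivative = error - lastError;
        lastError = error;
        return kp * error + ki * integral + kd * derivative;
    }

    // Editing the integral term was one reason for copying the class;
    // resetting it avoids windup carrying over between line segments.
    public void resetIntegral() {
        integral = 0;
    }
}
```

Moving the light sensor further from the center of rotation effectively increases the error signal per degree of heading change, which is why the reconstruction mattered as much as the gains.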



3) Working with the dispenser station
The first challenge with the dispenser station was to make a construction that takes only a single M&M at a time. We found that other people have worked on similar problems, so research was done online to see their solutions. The following videos were found interesting:

However, a large difference between most of the research and this project is that they sort balls of a static size and shape. The M&Ms used in this project vary a lot in size and shape, and sometimes their surface becomes a bit sticky due to the sugar coating. The size-and-shape problem was reduced by manually sorting out the large, small and odd-shaped M&Ms using the survival-of-the-fittest method (literally): consuming the sorted-out M&Ms (sometimes you just have to make a sacrifice for the sake of science).
Various solutions were experimented with to solve the problem of getting one M&M at a time.
One of our attempts was a funnel whose small end could output just a single M&M. However, it turned out that the stickiness of the M&Ms made them get stuck near the end due to the shallowness, so it was discarded. Another attempt was a sort of catching shovel, where the M&M would be scanned for color while in the shovel. After scanning, the shovel would open and let the M&M drop onto the chute. However, it proved very difficult to get the M&Ms one at a time with this solution, so it was discarded as well.

In the final construction the dispenser station was perfectly able to pick one M&M at a time. This was done by first making a new funnel with a hole just large enough to hold M&Ms in a single-file row. We were informed that it would be OK to manually shake or pry the M&Ms loose if they got stuck due to their stickiness. Second, the dispenser station utilised the 5x1.5-module belt (ID no. 6014648) and sprockets (ID no. 4582792) from the LEGO EV3 core set [15]. These parts are shown below.














Furthermore, "boxes" that could each contain just a single M&M were constructed on the belt using other LEGO parts. Pictures of this are shown below.



With this solution the dispenser station was able to deliver exactly one M&M at a time, every time.


Color Sensor
The color sensor was needed to read the color of each M&M and remember its location on the conveyor belt. This was at first done using an ArrayList to hold the scanned colors, adding the most recently scanned color to the end of the list and removing the first element on every run. The color sensor was positioned as the picture below shows.

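The ArrayList bookkeeping can be simulated off-robot: since the sensor sits some number of boxes before the drop-off, each freshly scanned color is appended to the list, and the element removed from the front is the color about to be dropped. This is a minimal sketch; the queue length of three boxes and the class name are assumptions, while the color codes (0 = red, 7 = black/empty) follow the NXT color sensor values used elsewhere in the report.

```java
import java.util.ArrayList;
import java.util.List;

// Simulation of the color bookkeeping: the sensor is a few "boxes" ahead
// of the drop-off, so scanned colors are queued, and the element leaving
// the front of the list is the one about to fall into the truck bed.
public class ColorQueue {
    private final List<Integer> queue = new ArrayList<>();

    public ColorQueue(int boxesBetweenSensorAndDropOff) {
        // Until real M&Ms pass under the sensor, every box reads black (7).
        for (int i = 0; i < boxesBetweenSensorAndDropOff; i++) queue.add(7);
    }

    /** One belt step: enqueue the freshly scanned color and return the
     *  color that has now reached the drop-off. */
    public int step(int scannedColor) {
        queue.add(scannedColor);
        return queue.remove(0);
    }

    public static void main(String[] args) {
        ColorQueue q = new ColorQueue(3);
        // Feed a red (0), a green (1) and a yellow (3) M&M onto the belt.
        System.out.println(q.step(0)); // 7: drop-off still empty
        System.out.println(q.step(1)); // 7
        System.out.println(q.step(3)); // 7
        System.out.println(q.step(7)); // 0: the red M&M reaches the drop-off
    }
}
```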
Behavior-based control, as used in lesson 10 [16], was used to operate the conveyor belt. The behavior system was implemented in the following way: the Onward behavior moves the conveyor belt; DetectColor suppresses Onward when the tacho counter reports that the belt has rotated 107 degrees, resets the tacho counter, scans the M&M's color, and then releases control back to Onward, which restarts the conveyor belt.
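As an off-robot illustration, the two behaviors can be sketched with the leJOS-style takeControl/action/suppress interface; the motor and sensor calls are replaced by a simulated tacho counter and a stubbed scan. Apart from Onward, DetectColor and the 107-degree step, all names and values are assumptions.

```java
// Minimal re-creation of the two behaviors, runnable without a robot.
interface Behavior {
    boolean takeControl();
    void action();
    void suppress();
}

public class DispenserBehaviors {
    static int tacho = 0;        // simulated tacho counter
    static int lastScanned = 7;  // 7 = black, i.e. nothing scanned yet

    // Onward: lowest priority, always wants control, keeps the belt moving.
    static final Behavior ONWARD = new Behavior() {
        public boolean takeControl() { return true; }
        public void action()   { tacho += 1; /* Motor.A.forward() on the NXT */ }
        public void suppress() { /* Motor.A.stop() on the NXT */ }
    };

    // DetectColor: fires after a 107-degree rotation, resets the tacho
    // counter and "scans" the M&M (stubbed here to always read red, 0).
    static final Behavior DETECT_COLOR = new Behavior() {
        public boolean takeControl() { return tacho >= 107; }
        public void action() {
            tacho = 0;       // reset the tacho counter
            lastScanned = 0; // stubbed color scan
        }
        public void suppress() { }
    };

    public static void main(String[] args) {
        Behavior[] byPriority = { DETECT_COLOR, ONWARD }; // highest first
        for (int step = 0; step < 200; step++) {
            for (Behavior b : byPriority) {
                if (b.takeControl()) { b.action(); break; }
            }
        }
        System.out.println(lastScanned); // 0 after the first 107-degree rotation
    }
}
```

On the real NXT the same structure is driven by the leJOS Arbitrator instead of the hand-rolled priority loop above.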

When the dispenser starts, the conveyor belt turns, the color is scanned, and the scanned color is stored in the array. This runs in a loop until the currently scanned color differs from black. When that happens, the loop stops and waits until the truck reports that it is back and ready to be loaded. The loop then runs again, and the truck drives off to unload the M&M at its corresponding color station.
However, we preferred to iterate further and take into account that several M&Ms in a row may have the same color. If so, they should all be loaded onto the truck together instead of being driven out one at a time. This proved difficult with the behavior-based code and led to several problems, some concerning program logic and some concerning the implementation of the behavior system. In some instances the conveyor belt dropped off all M&Ms of the same color except the last one. This and other issues seemed to stem from the array used to hold the colors. We therefore moved the color sensor closer to the drop-off point, removing the need for an ArrayList, as shown below.


The code needed to run the system became very simple after this iteration. In fact, it was so simple that we realized it had become strictly sequential: one behavior checked the color of the M&M and, if it differed from the previous one, set a boolean which another behavior used as a trigger. That behavior would then transmit the color to the truck and wait for the truck to report that it had returned, at which point it released control and the main behavior took over again. This worked very well, and the system delivered the right amount of M&Ms every time. Because of this apparently sequential behavior, we decided to drop the arbitrator of the behavior-based control in favor of a program using sequential control, which worked.
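The batching logic, stripped of motor and Bluetooth calls, can be sketched as a minimal simulation. The method and class names are illustrative; color codes are those of the NXT sensor (0 = red, 1 = green, 7 = black/empty).

```java
// Sequential batching: keep dispensing as long as the next M&M has the
// same color as the previous one; stop the belt at the first different
// color, which is when the truck would be sent off.
public class SequentialDispenser {
    public static int countDispensedBatch(int[] belt) {
        int previousColor = 7; // nothing loaded yet (black = empty)
        int loaded = 0;
        for (int currentColor : belt) {
            if (currentColor == 7) continue; // empty box: keep belt turning
            if (previousColor == 7 || currentColor == previousColor) {
                loaded++;                    // same color: drop onto the truck bed
                previousColor = currentColor;
            } else {
                // different color reached the sensor: stop the belt here and
                // transmit previousColor to the truck (omitted in this sketch)
                break;
            }
        }
        return loaded;
    }

    public static void main(String[] args) {
        // Two reds followed by a green: the first batch is the two reds.
        System.out.println(countDispensedBatch(new int[]{7, 0, 0, 1})); // 2
    }
}
```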

Before running the program on the NXT, the conveyor belt should be rotated to a position where the color sensor is directly above a "box", not between boxes. This ensures that the M&M falls off the conveyor belt's "box" and into the truck's bed.


Connecting the dispenser station to the delivery truck
To start with, the delivery truck and the dispenser station were developed individually. However, it was kept in mind that they would have to communicate, so both were prepared for this during development. As previously mentioned, while working with the dispenser station a stripped NXT, named DriveIt, was used as a stand-in for the delivery truck, making it possible to develop and test the Bluetooth connection.
A few problems were encountered when connecting the dispenser station to the delivery truck. One was in the program running on the computer, Remote.java [17]: it was assumed that the line if(DriveItDis.available() == 0) - DriveItDis being the DataInputStream from the NXT to the computer - inside the while-loop would only trigger if the device had actually sent something to the computer.
However, the if-statement turned out to trigger on every iteration of the while-loop, even when DriveItDis.readInt() had nothing to read, which blocked the rest of the code until something was read from the DriveIt NXT. While not optimal, this was not a critical problem either, since the behavior of the whole system is sequential: Raphael (the dispenser NXT) should not send anything before DriveIt has, and vice versa. Initially the computer listened to DriveIt first, and to Raphael only after DriveIt had sent something. We found, however, that the logical order was to listen to Raphael first, since the delivery truck is known to be available from the beginning, and changed the code accordingly.
The code snippet for the final while-loop is shown below.
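As a hedged sketch of that relay order (listen to Raphael first, then DriveIt), the loop body can be illustrated with in-memory streams standing in for the Bluetooth connections; apart from the names Raphael, DriveIt and Remote, the method and variable names here are assumptions.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

// One pass of the PC-side relay: block on Raphael (the dispenser) first,
// forward the color to DriveIt (the truck), then block on DriveIt's
// "returned" message and forward it back to Raphael.
public class Remote {
    public static void relayOnce(DataInputStream raphaelDis, DataOutputStream driveItDos,
                                 DataInputStream driveItDis, DataOutputStream raphaelDos)
            throws IOException {
        // readInt() blocks, so available() checks are deliberately avoided:
        // the protocol is sequential and each side speaks strictly in turn.
        int color = raphaelDis.readInt();   // dispenser announces the color
        driveItDos.writeInt(color);         // forward it to the truck
        driveItDos.flush();
        int back = driveItDis.readInt();    // truck reports it has returned
        raphaelDos.writeInt(back);          // tell the dispenser to reload
        raphaelDos.flush();
    }

    public static void main(String[] args) throws IOException {
        // Off-robot demo with in-memory streams in place of Bluetooth:
        ByteArrayOutputStream fromRaphael = new ByteArrayOutputStream();
        new DataOutputStream(fromRaphael).writeInt(0);  // Raphael says: red (0)
        ByteArrayOutputStream fromDriveIt = new ByteArrayOutputStream();
        new DataOutputStream(fromDriveIt).writeInt(1);  // DriveIt says: returned
        ByteArrayOutputStream toDriveIt = new ByteArrayOutputStream();
        ByteArrayOutputStream toRaphael = new ByteArrayOutputStream();
        relayOnce(new DataInputStream(new ByteArrayInputStream(fromRaphael.toByteArray())),
                  new DataOutputStream(toDriveIt),
                  new DataInputStream(new ByteArrayInputStream(fromDriveIt.toByteArray())),
                  new DataOutputStream(toRaphael));
        System.out.println(new DataInputStream(
                new ByteArrayInputStream(toDriveIt.toByteArray())).readInt()); // 0
    }
}
```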


Another problem was that the dispenser would sometimes send the wrong color to the delivery truck. The cause was that previousColor was set equal to currentColor unconditionally, so previousColor could become 7 (black) - meaning that conveyor belt position was empty - which interfered with our if-else statement. This was fixed with a simple if-statement at the end that only updates previousColor when currentColor differs from 7.
A further problem was that the truck always received the integer for red as the first color. It was then realized that the integer sent from the dispenser station was left at its default value of 0 - and red is 0 in the color sensor's encoding - so 0 was sent to the truck before anything had been scanned. This was fixed by initializing that integer to 7 (black), which the rest of the code already handles.
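Both fixes can be shown in isolation as a minimal sketch; the class and method names are illustrative, and the color codes are those reported above (0 = red, 7 = black).

```java
// The two color-bookkeeping fixes, isolated from the robot code.
public class ColorState {
    // Fix 2: initialize to 7 (black) instead of Java's default 0, which the
    // color sensor would interpret as red; 7 means "nothing scanned yet".
    private int previousColor = 7;

    // Fix 1: only remember real colors. A reading of 7 means the belt
    // position was empty and must not overwrite the last real color.
    public void observe(int currentColor) {
        if (currentColor != 7) {
            previousColor = currentColor;
        }
    }

    public int previousColor() { return previousColor; }

    public static void main(String[] args) {
        ColorState s = new ColorState();
        System.out.println(s.previousColor()); // 7 before anything is scanned
        s.observe(0);   // a red M&M passes the sensor
        s.observe(7);   // an empty box passes: previousColor keeps its value
        System.out.println(s.previousColor()); // 0
    }
}
```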



Conclusion

We got our M&M sorting logistics factory to work almost autonomously: the M&Ms were sorted correctly every time, as long as the truck was manually righted after each return.
The behavior of both the delivery truck and the dispenser station changed throughout this project, and approaches we at first thought were the best solutions to some issues later turned out to be better solved differently. This was either because we learned new things along the way, or because other issues interfered and required us to change the affected methods accordingly.
All the code to run the project and code used for calibration is available here: https://drive.google.com/folderview?id=0B7Jrxb5izHr8b3NLSXJqU2FBM00 [18].


Future work

More time to work on the project would have allowed us to make it fully autonomous by getting the truck back to the dispenser station in a way that ensured the correct orientation and position, for example by using additional motors to forcefully push the truck into place. To eliminate the need for placing the truck exactly right every time it returned, another method for finding the correct line to follow could be implemented: for example, first searching the area in front of the car for the color of the M&M it carries, and then following lines of that color instead of a black line on white paper.
During our presentation we learned that the speed of the truck could also have an impact on the precision of the navigator. Additional testing could possibly have led to more precise navigation.
Additionally, we would like to improve our PID controller further, as it would sometimes oscillate while following a line. Improving the positioning would also remove the need to 'find the line' before fully starting the PID controller.

Even more time would have allowed us to pursue some extensions to the project.
First, we would have liked some sort of device to bring a person M&Ms of a chosen color. This could be a car with a shovel or some sort of grappling device, driving from a start position to a container and grabbing some M&Ms. If the collector were another car driving on the track, it would have to take the delivery truck's position into consideration so the two vehicles would not collide.
Another solution for bringing M&Ms to a user could be a crane, like a plush-toy grabbing machine, which the user could either control manually or instruct by selecting a color, after which the crane would fetch it.
Second, we would have liked to improve the dispenser station so it could keep sorting colors while the truck was out delivering. The dispenser would drop each M&M into a small container for its color, and the truck would then somehow empty the fullest container into its bed and deliver the contents to the large container.




References


[0] Project video: http://vimeo.com/98587382

[1] Lab report lesson 5: http://driveit-lego2014.blogspot.dk/2014/03/week-6-lesson-5.html


[2] Lab report lesson 3: http://driveit-lego2014.blogspot.dk/2014/02/week-4-lesson-3-notebook.html


[3] Fred G. Martin, Robotic Explorations: A Hands-on Introduction to Engineering, Chapter 5, pp. 179-190, Prentice Hall, 2001


[4] NXTRegulatedMotor class methods: http://www.lejos.org/nxt/nxj/api/lejos/nxt/NXTRegulatedMotor.html#suspendRegulation()

[5] Lab report lesson 9: http://driveit-lego2014.blogspot.dk/2014/04/week-10-lesson-9.html

[6] Wheeled Vehicles literature: http://www.lejos.org/nxt/nxj/tutorial/WheeledVehicles/WheeledVehicles.htm

[7] CalibrateDP.java: https://drive.google.com/file/d/0By0ku5y7gpWyZl9HZE5jU2NMMlU


[8] BTctrlTruck.java: https://drive.google.com/file/d/0By0ku5y7gpWyTll2WkZzWXR4Ym8


[9] RConsole class: http://www.lejos.org/nxt/nxj/api/lejos/nxt/comm/RConsole.html

[10] Inspiration for PID controller: http://www.inpharmix.com/jps/PID_Controller_For_Lego_Mindstorms_Robots.html

[11] Line follower example: http://users.humboldt.edu/aschmidt/linefollowv3.php

[12] Fast colored ball sorter: https://www.youtube.com/watch?v=t0Ls-x26tWU

[13] Sorting machine - Skittles and M&M’s: https://www.youtube.com/watch?v=H7HTQai7Wwg

[14] LEGO Mindstorms M&M sorting machine: https://www.youtube.com/watch?v=mjo8n9HvenE

[15] EV3 set: http://shop.legoeducation.com/gb/product/lego-mindstorms-education-ev3-core-set-45544-198

[16] Lab report lesson 10: http://driveit-lego2014.blogspot.dk/2014/05/week-11-lesson-10.html


[17] Remote.java: https://drive.google.com/file/d/0B7Jrxb5izHr8ZFdUM3lUaENZTjQ


[18] All programs: https://drive.google.com/folderview?id=0B7Jrxb5izHr8b3NLSXJqU2FBM00