Thursday, January 20, 2011


Embodied Agents Evolution
Journal for final project in LEGO Lab.


Lab session 1
Date: 02-12-2010
Participants: Knud, Nikki & Amos
Duration: 11-18 (7 hours)

Goals:
Specifying the project in detail. Checking for robot availability (we’ll need a group of robots). Creating a first draft of the project plan.
Creating basic behaviour patterns:
  • Avoid Clash

This is needed so that the robots can register an object (wall, other robot) in front of them and avoid running into it.
  • Find food

The robots need to be able to find a food source within 3 meters.
  • Eat food

The robot must detect that it has found food and then actually eat it. This behaviour, if done properly, contributes to making the robots look like real creatures.

Plan for achieving the goals:
We will ask Ole Caprani for availability of extra robots.
We will create the basic behaviours by using our experience/code from prior lab sessions.
  • Avoid Clash

Test if the SonicSensorTest.java from prior lab session can be used (reference to lab session and code)
  • Find Food

Test if it is possible to use the Lightfinder [1] based on Braitenberg vehicles [2] from the prior lab session, and test whether it is usable in daylight and how far away it can detect and move towards light sources.
  • Eat Food

Modify the playsounds behavior to play sounds and make a shaking “eating food” motion depending on whether the color sensor detects that the food area (green surface) has been reached.
Results:
Checking robot availability:
Two extra robots were handed out. More can be obtained if needed.

We chose the project we wanted to work on and defined three basic behaviours the robots should act on. The first behaviour was to avoid clashing into objects in front of the robot. The second was to detect a certain color on the surface the robot is placed on. And the last was to find a light source. These basic behaviours will be combined so that each robot looks and acts like an animal looking for food and possible mating partners.

Avoid Clash:
This feature is obtained with the ultrasonic sensor available in the Mindstorms kit. Using the SonicSensorTest.java programme (the one from the exercise in lesson 2) as a starting point, the robot was set to react at a distance of 20 cm. This was done by testing the value returned by getDistance() in the UltrasonicSensor class available in the NXT library. The function returns an integer equal to the distance in cm to the nearest detectable object. With a simple test against the value 20, it is possible to stop ca. 20 cm from an object. The behaviour we hoped to develop was for the robot to stop and then turn away from the object it is about to run into. This requires a bit more space around the robot when it stops, so the trigger distance was changed to 25 cm, enough room for the robot to make a turn.
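Below is a minimal sketch of what this behaviour boils down to, assuming the ultrasonic sensor sits on port S1 and the drive motors on ports A and C; the port assignment and timing values are illustrative, not our exact code.

import lejos.nxt.Motor;
import lejos.nxt.SensorPort;
import lejos.nxt.UltrasonicSensor;
import lejos.util.Delay;

public class AvoidClash {
    // Trigger distance in cm; 25 leaves room for the robot to turn.
    private static final int STOP_DISTANCE = 25;

    public static void main(String[] args) {
        UltrasonicSensor sonar = new UltrasonicSensor(SensorPort.S1);
        Motor.A.forward();
        Motor.C.forward();
        while (true) {
            // getDistance() returns the distance in cm to the nearest object.
            if (sonar.getDistance() < STOP_DISTANCE) {
                Motor.A.backward();          // back off slightly
                Motor.C.backward();
                Delay.msDelay(500);
                Motor.A.forward();           // turn away on the spot
                Motor.C.backward();
                Delay.msDelay(700);
                Motor.A.forward();           // resume driving
                Motor.C.forward();
            }
            Delay.msDelay(50);
        }
    }
}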

Find Food:
It was possible to use the Lightfinder [1] in daylight to detect and move towards a light source. We also tested whether putting LEGO blocks around the light sensors, to shade them from ambient light, would improve the direction accuracy; in our tests it did not improve significantly. Under the test room lighting conditions, the robot was able to find and navigate towards a bicycle light from around 3 m.
We considered implementing a calibration function, as the sensitivity of the different light sensors varies a bit. But this variance makes the robot turn in circles when it is not pointing towards a light source, which is actually useful for finding the light source when not pointing towards it, so we did not implement a calibration function.
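As a rough illustration of the Lightfinder idea, a Braitenberg 2b style follower can be sketched as below, assuming the light sensors sit on ports S2 and S3 and the drive motors on A and C; the base speed and gain are made-up values.

import lejos.nxt.LightSensor;
import lejos.nxt.Motor;
import lejos.nxt.SensorPort;
import lejos.util.Delay;

public class LightFinder {
    public static void main(String[] args) {
        LightSensor left = new LightSensor(SensorPort.S2);
        LightSensor right = new LightSensor(SensorPort.S3);
        while (true) {
            // Cross-coupled excitation (Braitenberg 2b): more light on one
            // side speeds up the opposite wheel, turning the robot towards
            // the light source. readValue() returns a light level of 0-100.
            Motor.A.setSpeed(200 + 4 * right.readValue()); // left wheel
            Motor.C.setSpeed(200 + 4 * left.readValue());  // right wheel
            Motor.A.forward();
            Motor.C.forward();
            Delay.msDelay(20);
        }
    }
}

The sensor variance mentioned above shows up here as a slight built-in turning bias, which is what makes the robot circle until it faces a light source.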

Eat Food:
The behavior was made such that when the robot detects with the color sensor that it is over a green surface, it goes into food behavior, which is shaking and making sound. We did not yet implement how the robot detects that it is full and doesn’t have to eat anymore.
The idea here is to use a counter or timer combination that increases towards a full level while the robot eats and decreases at some rate when it does not. Certain thresholds will then define how hungry the robot is.
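A minimal sketch of the counter idea, with illustrative constants and names:

public class EnergyLevel {
    private static final int FULL = 100;
    private static final int HUNGRY_THRESHOLD = 40;
    private int energy = FULL;

    // Called periodically by the behaviour loop.
    public void tick(boolean eating) {
        if (eating) {
            energy = Math.min(FULL, energy + 5); // eating restores energy
        } else {
            energy = Math.max(0, energy - 1);    // slow decay otherwise
        }
    }

    public boolean isHungry() {
        return energy < HUNGRY_THRESHOLD;
    }
}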

Conclusion:
We were happy to have tested some of the basic behaviours, so that we knew what would work in a daytime environment at the location. With these simple behaviours as building blocks for creating embodied agents for our Evolutionary Robot project, we could look into the more difficult tasks at hand, such as finding and identifying other robots and getting them to behave like creatures living within a confined area (their own little world).

References:
[1] Lab 8 - http://nikki-and-knud-labnotes2010.blogspot.com/2010/11/nxt-programming-lesson-8.html
[2] Braitenberg Vehicles - http://www.cs.brown.edu/people/tld/courses/cs148/02/introduction.html


Lab session 2
Date: 09-12-2010
Participants: Knud, Nikki & Amos
Duration: 10:30 - 13:30 (3 hours)

Goals:
Figure out how to do mating detection and mating communication between embodied agents. Start on a monitor program to track the state of all the robots.

Plan for achieving the goals:
Mating Call/Detecting/Communication
Research if sensors are available for IR-link detection and communication
Research alternative methods for detection (each robot should be discernible from the others by the mating signal) and possibility for using BT-link for communication.

Monitor Robots:
Research if there exists some form of leJOS API we can use for monitoring the robots over Bluetooth.

Results:
Mating Call/Detecting/Communication Method:
We have looked into several different options for detecting mating-ready robots. One option was to use an IR link, which could serve as both detection and communication: establishing an IR connection requires the robots to be close, so the fact that a connection could be made is itself an indication that the robots are mating ready.
We found two IR transmitters [1], [2] but were not able to find an IR receiver.
This option was not possible to implement within the desired time frame, because no IR sensor is available that can work as both transmitter and receiver. Furthermore, we did not have any more sensor input ports available on the robot.

We discussed the possibility of using the NXT’s Bluetooth for communicating the genes between the robots. It is not possible to keep a connection between an NXT and a server for monitoring while the NXT connects to another NXT. Since we want to monitor the robots all the time, we came to the conclusion that the communication of genes should go through the server.

For detection we discussed using bumpers on each side of the robot to detect when a robot has run into something. The problem here is that positioning information is needed to make sure that the robot you have bumped into is the robot you are mating with; the robots need to be able to distinguish themselves from each other. One could imagine two robots bumping into each other while a third robot runs into something else close by, ending with a robot mating with a robot that is not the nearest one.
This option seemed too difficult, because of the added complexity of the positioning information needed, and the extra sensor input port required.

We also discussed the option of having the robots emit mating calls/sounds, and then have a microphone detect nearby robots that were ready for mating. This could be done by having each robot burst out Morse-code-like calls and having the other robots detect these. It would require some simple signal processing, but also an additional microphone sensor. Since there are not many sensor ports available, we chose an option that in its nature looks a lot like the audio calls but does not require an additional sensor.

We chose to go for light-emitting mating signaling, which lets us reuse the light sensors already used in the Find Food behaviour. To distinguish the robots from each other, each robot will send out a coded sequence of light identifying itself. This is done by connecting a diode to one of the motor output ports on the NXT and writing a class that produces the coded signal.
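In leJOS the raw motor ports can be driven directly, so a diode on an output port can be pulsed from a background thread. A sketch, assuming the diode sits on port B and with illustrative timing (each robot would run this with its own on/off times to make its call unique):

import lejos.nxt.MotorPort;
import lejos.util.Delay;

public class MatingCall implements Runnable {
    private final int onMs;   // robot-unique on-time
    private final int offMs;

    public MatingCall(int onMs, int offMs) {
        this.onMs = onMs;
        this.offMs = offMs;
    }

    public void run() {
        while (true) {
            MotorPort.B.controlMotor(100, 1); // full power, mode 1 = forward
            Delay.msDelay(onMs);
            MotorPort.B.controlMotor(0, 4);   // mode 4 = float, diode off
            Delay.msDelay(offMs);
        }
    }
}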

Monitor Robots:
We looked into the existing BT communication already available in leJOS and found that it had some shortcomings, which made us develop a communication protocol of our own. The LCP functionality provided in leJOS enabled us to monitor the robots in the nxjmonitor [3] program.
The problem was that we could only see the status of the robots in separate windows (one for every robot), and we would not be able to send the robots any commands other than the ones already implemented. What we needed was to send and receive information about values in our own code, which is not supported by the LCP library. There is also RConsole, which can be used to send messages from the NXT to a PC; the problem here is that it gives no opportunity to send messages from the PC to the NXT.

Conclusion:
Mating Call/Detecting/Communication Method:
We researched different options for detection and communication of mating. We concluded that the best option, with the time and sensors available to us, was to use Bluetooth for communicating genes via a server (maybe through the monitoring server program), and to use the existing light sensors and blinking diodes for mating detection, with a unique detectable blink for each robot.

Monitor Robots:
We are going to implement our own Bluetooth connection to the robots, as the existing protocols for communicating with the robots are insufficient for our needs. Both RConsole and LCP are only useful for monitoring the NXT; neither of them can be used to receive messages or values from the PC on the NXT. We spent a lot of time implementing LCP in our code, only to discover that it did not support sending information to the NXT. We had a quick glance at RConsole, but luckily we quickly found that it was pretty much the same story as with LCP and therefore stopped exploring this option. We ended up agreeing on making our own Bluetooth protocol, giving two-way communication with the robots and the ability to connect to more than one robot at a time. To monitor the robots we would also make a monitor program to run on the PC, displaying the data sent and received by all the robots on a single screen.

LCP has the same nature as RConsole: it can monitor the NXTs but not receive messages to be used in the code. To use LCP the PC and NXT must also have been paired, but LCP offers a more comprehensive set of monitoring tools. The window to use is nxjmonitor, which opens a screen containing both a console in which printed text lines appear and a monitor of all the output ports and sensor ports on an NXT. If more than one NXT is to be monitored, an nxjmonitor window must be opened for each robot.

References:
[1] IR Link - http://shop.lego.com/ByTheme/Product.aspx?p=MS1046&cn=17
[2] NRLink-NX - http://www.mindsensors.com/index.php?module=pagemaster&PAGE_user_op=view_page&PAGE_id=59
[3] nxjmonitor - http://lejos.sourceforge.net/nxt/nxj/tutorial/PC_GUI/PCGUITools.htm#8


Lab session 3
Date: 11-1-2011
Participants: Knud, Nikki & Amos
Duration: 12.00 - 01.00 (13 hours)

Goals:
  1. Create the environment for the robots.
  2. Create the monitoring program.
  3. Create mating call/detection.
  4. Implement motivation network and add all behaviors
  5. Assemble robots and test them in the environment.


Plan for achieving the goals:
Monitor program:
Try out tutorials from the leJOS homepage to achieve the proper Bluetooth communication. Look at RConsole.

Mating call/detection:
In the last lab session we concluded that the best way to do a mating call is with diodes blinking in a unique pattern, detected through the existing light sensors.

Motivation Network:
As written in the project description, we will base the robot architecture on a simplified version of the Motivation Network [2].

Environment of the robots:
Make a simple arena with a clearly marked food area.

Assemble Robots:
Try out different builds with the sensors needed for the behaviors.
The sensors needed:
  • Color sensor
  • Sonic sensor
  • Light sensors


Results:
Monitor program:
RConsole works better as a debugging tool in which you can insert debugging messages to trace errors in your code. Instead we will look into implementing our own Bluetooth communication, using the Bluetooth library available in the underlying Linux OS.

Mating call/detection:
The mating call was made by having 4 lights on the front of the robot blink with a frequency unique to each robot, as concluded in lab session 2. The detection was inspired by the sound-clap concept from lesson 3 [1]: detect whether the sensor value is above a certain threshold, and check that the time it stays above the threshold falls within a certain number of milliseconds.
We made 3 unique blinking patterns, one for each robot:
  1. Diodes on for 100ms, diodes off for 100ms, repeat
  2. Diodes on for 200ms, diodes off for 100ms, repeat
  3. Diodes on for 300ms, diodes off for 100ms, repeat

We were able to detect the patterns and distinguish them from each other with the light sensors. The sensors could detect a pattern from an angle of up to around 150 degrees and from a distance of up to around 30 cm.
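A sketch of the pulse-timing detection, with an illustrative threshold and tolerance; a real version would confirm the same pulse length over several blinks (and time out if no pulse arrives) before trusting the classification:

import lejos.nxt.LightSensor;

public class CallDetector {
    private static final int THRESHOLD = 55; // illustrative light level
    private static final int TOLERANCE = 40; // ms slack around each pattern

    // Measures one "on" pulse and maps it to a robot id (1-3), or 0.
    public static int detect(LightSensor eye) {
        while (eye.readValue() < THRESHOLD) { /* wait for pulse start */ }
        long start = System.currentTimeMillis();
        while (eye.readValue() >= THRESHOLD) { /* wait for pulse end */ }
        long on = System.currentTimeMillis() - start;
        if (Math.abs(on - 100) < TOLERANCE) return 1;
        if (Math.abs(on - 200) < TOLERANCE) return 2;
        if (Math.abs(on - 300) < TOLERANCE) return 3;
        return 0; // not a known mating call
    }
}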

Motivation Network:
We implemented the Motivation Network based on the leJOS behaviour architecture, as suggested in Lab 10 [3]. We changed the takeControl method of each behaviour to return a motivation factor - an integer from 1 to 100 instead of a boolean. We changed the decision maker to run through the list of all behaviours and choose the one with the highest motivation factor. If two behaviours have the same motivation, the behaviour placed last in the list has the higher priority.
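In outline, the modified architecture looks like this; the names are illustrative, and in the real arbitrator the chosen action() runs until it is suppressed by the next winner:

// takeControl now returns a motivation factor instead of a boolean.
interface MotivatedBehavior {
    int takeControl();  // motivation, 1-100
    void action();
    void suppress();
}

class MotivationArbitrator {
    private final MotivatedBehavior[] behaviors;
    private MotivatedBehavior active;

    MotivationArbitrator(MotivatedBehavior[] behaviors) {
        this.behaviors = behaviors;
    }

    // One round of the decision maker.
    void step() {
        MotivatedBehavior winner = null;
        int best = 0;
        for (MotivatedBehavior b : behaviors) {
            int m = b.takeControl();
            if (m >= best) { // ">=" makes later list entries win ties
                best = m;
                winner = b;
            }
        }
        if (winner != active) {
            if (active != null) active.suppress();
            active = winner;
            if (active != null) active.action();
        }
    }
}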

Environment of the robots:
The arena has a green food area; the idea is that when a robot looking for food enters the green area, it starts the eating food behavior. To direct the robots towards the green area we placed a bicycle front light in the corner.
Pictures: floor plan of the arena, the green feeding area, and our light source.

Assemble Robots:
3 robots were built with the setup shown in the following pictures:
  1. Light sensors for FindFood behavior and DetectMatingCall behavior.
  2. Blinking lights to create mating call.
  3. Sonic Sensor for AvoidFront behavior.

The height of the tail was increased in order to make the sensors and lights on the front more horizontal:
ColorSensor for detecting the food area (green paper), marked by a red circle:

Conclusion:
Monitor program:
The monitoring program proved to be a more difficult affair than first expected, because of the trouble of getting two-way Bluetooth communication to work.
How RConsole works:
On the NXT you have to include the following line to establish a connection:
RConsole.openBluetooth(10000);
If no value is passed to the function, it will wait forever for a BT connection. During this wait one should start the nxjconsoleviewer, and the Bluetooth connection will be established (provided that the NXT and PC have been paired so that they "know" each other). Once this connection is up and running, you can send messages from within your code by adding a line like:
RConsole.println("Code has reached the RConsole print line.");
This will be printed in the nxjconsoleviewer window.
Since we needed to both send and receive custom information (different states and genes) between the robots and the server, we chose to use the default Bluetooth serial port profile (SPP).
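On the NXT side the SPP link boils down to waiting for a connection and wrapping its streams. A minimal sketch; the REPORT line shown is illustrative:

import java.io.DataInputStream;
import java.io.DataOutputStream;
import lejos.nxt.comm.BTConnection;
import lejos.nxt.comm.Bluetooth;

public class MonitorLink {
    public static void main(String[] args) throws Exception {
        // Block until the monitor PC connects over Bluetooth SPP.
        BTConnection conn = Bluetooth.waitForConnection();
        DataOutputStream out = conn.openDataOutputStream();
        DataInputStream in = conn.openDataInputStream(); // commands from the PC

        // Line-based protocol: a command word, its payload, then a linefeed.
        out.write("REPORT energy=87 state=FindFood\n".getBytes());
        out.flush();
    }
}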

Mating call/detection:
We succeeded in creating a unique mating call for each of the 3 robots, and the ability to tell them apart. If we need more robots, and thereby more unique mating calls, we will need to implement another type of blinking pattern, as we cannot keep increasing the delay: it lengthens the time it takes to detect a pattern and thereby decreases the chance of detection, since the robots move about all the time when searching for a mate.

Motivation Network:
We succeeded in implementing a form of Motivation Network.

Environment of the robots:
A basic environment was created from one of the basic arenas provided in the Zuse lab.

Assemble Robots:
A decent build for the robots was achieved; however, the light sensors were not mounted robustly enough and would often fall off when the robots drove into the arena wall from different angles.

Yet to do:
  1. Improve the RandomDrive behavior to make it more continuous.
  2. Improve the robustness of the sensor mounts with LEGO bricks.
  3. Implement a global motivation or energy level for each robot, controlling when to look for food and when to look for mating.
  4. Implement the exchange of genomes / the actual result of mating.
  5. Finish the monitoring program.


References:
[1] Lab lesson 3 - http://nikki-and-knud-labnotes2010.blogspot.com/2010/09/nxt-programming-lesson-3.html
[2] Thiemo Krink (in prep.), Motivation Networks - A Biological Model for Autonomous Agent Control (http://legolab.daimi.au.dk/DigitalControl.dir/Krink.pdf)
[3] Lab lesson 10 - http://nikki-and-knud-labnotes2010.blogspot.com/2010/11/nxt-programming-lesson-10.html

Lab session 4
Date: 12-1-2011 & 13-1-2011
Participants: Knud, Nikki & Amos
Duration: 12 - 7 (19 hours)

Goals:
  1. Implement an energy level for each robot, controlling when to look for food and when to look for mating.
  2. Implement the exchange of genomes / the actual result of mating.
  3. Finish the monitoring program and get communication between PC and NXT working through a bluetooth connection and make a GUI for it.
  4. Additional testing of the mating call/detection in the actual environment


Plan for achieving the goals:
Internal States:
Testing the motivation functions on a robot and seeing how the behaviour changes with different gene values. Introducing genes in the InternalStates class, which holds the current state of the robot.

Monitor Program:
Use NetBeans to create a GUI, since it is Nikki’s experience that NetBeans has a good GUI editor.
Implement the monitor program using the basic Bluetooth SPP with streams, as we still have nothing working.

Exchange of Genomes:
Implementing a protocol for exchange of genomes through the monitor program, since it will always have a connection to all robots.
The robots should be able to send their genes and the name of the detected mate to the monitor program. The robots should also be able to receive new genes from the monitor program. When new genes are received, a robot should replace its own genes with the received ones, reset its age and energy level, and increment the generation count.
When the monitor program receives genes, it also fetches the genes from the mate. For each gene pair it randomly selects one of them. Because all robots start with the same genes, a 5% chance of mutation is applied; this lets different genes, and thereby different behaviours, emerge. After the new genes are selected, the monitor program sends them out to both robots.
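A sketch of this selection rule on the server side; the genome representation and value range are assumptions:

import java.util.Random;

public class GeneCrossover {
    private static final double MUTATION_RATE = 0.05;
    private final Random rnd = new Random();

    // Builds a child genome by picking each gene from either parent,
    // with a 5% chance of replacing it with a random mutation.
    public int[] combine(int[] parentA, int[] parentB) {
        int[] child = new int[parentA.length];
        for (int i = 0; i < child.length; i++) {
            child[i] = rnd.nextBoolean() ? parentA[i] : parentB[i];
            if (rnd.nextDouble() < MUTATION_RATE) {
                child[i] = rnd.nextInt(100) + 1; // new value in 1..100
            }
        }
        return child;
    }
}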

Testing of Mating call/detection:
Setting some robots up and running them in the environment. Tweak the setup so that we get the desired behaviours from the robots.

Results:
InternalStates:
The motivation functions worked fine; we tried giving the robots different gene values and observed whether their behaviour changed. Robots with a high metabolic gene value were at the feeding ground more often than robots with a small metabolic gene value, and robots with a high reproductive gene value were quicker to become mating ready after having eaten at the feeding grounds. This was as we expected, and one of the important steps towards making the evolutionary robots was finished.

Monitor Program
We used NetBeans to make a simple yet powerful monitoring GUI on the server PC.
We succeeded in finishing the monitor program using Bluetooth SPP and streams. The communication is plain text, with a linefeed signalling the end of a command. The robots send the command “REPORT” followed by all the reporting information and a linefeed.
Monitoring the robots also includes setting the genes on the robots. Further details follow below.
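On the server, each connected robot gets its own reader thread for the line-based protocol. A sketch; the exact REPORT fields are illustrative:

import java.io.BufferedReader;

public class MonitorReader implements Runnable {
    private final BufferedReader in; // wraps the robot's SPP input stream
    private final String robotName;

    public MonitorReader(BufferedReader in, String robotName) {
        this.in = in;
        this.robotName = robotName;
    }

    public void run() {
        try {
            String line;
            // Each command is one text line terminated by a linefeed.
            while ((line = in.readLine()) != null) {
                if (line.startsWith("REPORT")) {
                    System.out.println(robotName + ": " + line);
                } else if (line.startsWith("GENES")) {
                    // hand off to the gene-exchange handling described below
                }
            }
        } catch (Exception e) {
            System.out.println(robotName + " disconnected");
        }
    }
}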

Exchange of Genomes:
We implemented a command called “GENES” in the monitor protocol. When sent from a robot, the command is followed by the name of the detected robot. When the monitor program receives it, it selects the genes as described in the plan and sends the command “GENES”, followed by the new genes, to both robots (the robot that sent the command and the detected robot).

Testing of mating call/detection:
When we first set a couple of robots free, we could see that they behaved as we expected, but they did not go into mating mode often enough for us to register it, so we changed the motivation function that controls this behaviour a bit, making the robots seek each other more.
There were problems getting the robots to detect each other’s mating calls, because the AvoidFront behavior takes over when the robots are too close to each other. And while driving around, the robots moved and turned too rapidly for the pauses in driving to be long enough for the blinking lights to be detected often; they did, however, succeed in detecting each other on a few occasions.

Conclusion:
InternalStates:
The InternalStates class ended up containing both the genes and the state of the robot, including functions that the behaviours can use to set and read the values. InternalStates also holds the motivation functions for the two genes, i.e. the calculation of the motivation for doing a specific behaviour. Adding a new gene would mean writing a motivation function in InternalStates and then using this function in the takeControl method of a behaviour, so that the gene affects the likelihood of that behaviour taking control of the robot.
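To illustrate the structure (names and formulas are ours for this sketch, not the exact project code):

public class InternalStates {
    private int energy = 100;          // current energy level, 0-100
    private int metabolicGene = 50;    // gene values in 1..100
    private int reproductiveGene = 50;

    // Hungrier robots (low energy, high metabolism) get more motivated.
    public int findFoodMotivation() {
        return Math.min(100, (100 - energy) * metabolicGene / 50);
    }

    // Well-fed robots with a strong reproductive gene seek mates.
    public int findMateMotivation() {
        return Math.min(100, energy * reproductiveGene / 100);
    }
}

A behaviour’s takeControl would then simply return the matching motivation function’s value.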

Monitor Program
The GUI was finished separately in NetBeans. The monitor program was made in Eclipse, so we still need to merge the two.
We ended up being able to monitor multiple robots on the server. All information about the robots is printed in the server console window. Some of the Bluetooth functions were not implemented or only half working; this took a lot of time to figure out, since it was not documented in the leJOS documentation.

Exchange of Genomes:
We ended up exchanging the genomes over the BT connection to the server; this was needed because the NXTs were not able to keep a BT connection to both the PC and another NXT at the same time.

Testing of mating call/detection:
We had a hard time making the NXTs find each other in order to mate. We did not have a mating behaviour that took the robots’ positions relative to each other into account, so we relied heavily on chance for the NXTs to find each other.



Lab session 5 (incl. presentation )
Date: 13-1-2011
Participants: Knud, Nikki & Amos
Duration: 14 - 15.30 (1½ hours)

Goals:
Finish the GUI for the monitor program
Present Project
Receive project feedback and evaluation of the presentation from Ole Caprani

Results:
GUI
We created the GUI for the monitoring program in NetBeans, where each robot has its own section with information on all its internal states and the motivation factors for each behaviour.

We had problems getting Bluetooth to work when compiled with NetBeans, so we were not able to connect the GUI to the information received from the robots. Instead all the information was printed to the console of the server. Not pretty, but it worked and gave us the information needed to monitor the evolution of the robots.

Evaluation


Summary of the project Evobots
Evobots was inspired by the Sex Bots/Embodied Evolution project, and specifically by Krink’s articles [1] [2], where agents evolve through the exchange of genomes: the behavior of the agents evolves as the robots reproduce, the strong genes survive, and the weak ones die out.

We initially implemented the following basic behaviors:
  • Randomdrive
  • AvoidClash
  • Find Food
  • Eat Food
  • Find Mate
  • Mating


An energy level slope was also implemented. The motivations for when a robot would look for food or for mating were based on the genes and the energy level.
Later, some of the behaviors were merged into one multi-behavior, because the arbitrator cannot run 2 or more behaviors at the same time. We ended up using the light sensors to detect both the food area and other robots. To monitor agent status and distribute genes we used Bluetooth SPP (serial port profile) for communication and made a simple protocol customized for this project.

Listed here are some of the things we could have done differently in parts of the project.
Genes:
They only affected the motivation for finding food or finding a mate. Even though we could see different behaviour patterns for robots with “strong” and “weak” genes, this could have been made more evident by having a gene that affected the speed at which the robot could move. This would have given an extra dimension to the robots and perhaps made it easier to demonstrate a fitness goal.

Detecting a mate:
We had problems detecting a mate while the robots were moving around; we discussed the following alternatives for detection.
Colors:
A better way of detecting a mate would have given us a better picture of how genes evolve over time. One idea could have been to mark each robot with a distinct color and then use a color sensor to distinguish the robots from each other. This would limit the number of robots we could place in the arena to the number of colors available to coat the robots and the number of colors the sensors could distinguish. The advantage of such a setup is that the area for detecting the color would probably be larger than the area of the robots’ flashing diodes, and there is no timing issue as with the flashing diodes.
Alternate mating behaviour:
Light:
We could have placed the blinking lights on top of the robot and made them omnidirectional, thereby making the orientation of the signalling robot irrelevant for detection.
Position:
If we implemented some kind of positioning technique, the server would know when two mating ready robots were close to each other. The server could send out a command to the two robots so they would know they were close and change their movement behaviour accordingly. If the positioning also included direction, the server could send out the direction each robot was facing, making mating detection even easier.
Movement:
Implementing the mating behaviour so that the robots stop moving around randomly when close to each other would also increase the chance of getting the robots to exchange genes, but this still depends on the robots detecting each other’s presence, as discussed in the suggestions above.

Fitness Goals
The fitness goal for the robots was to adapt the eating and mating genes to a balanced state, where the better genes would need less time for eating and therefore have more time to look for a mate and spread themselves. The bad genes would eventually die out, because they would spend too much time eating instead of spreading.

Bluetooth
We had a lot of problems getting Bluetooth to work as desired. First of all, we ran into the problem of getting Bluetooth on the PC to connect: we used a 64-bit version of Windows with a Bluetooth dongle, but it turned out that this did not work. We looked for a long time for the solution to this problem, and finally remembered that you should not have 64-bit Java installed, as this conflicts with connecting the PC to an NXT. Uninstalling that Java and using the Windows Bluetooth stack made it possible to connect.
Furthermore, we spent a lot of time looking into the Bluetooth communication, and in the end made our own implementation on top of the standard Bluetooth SPP. A lot of time could have been saved if we had been aware of these problems.

Conclusion of project:
The project proved to be more time consuming than expected, and we did not manage to test the evolution of genes as we would have liked, but the project work produced all the building blocks for such a test. All we needed in the end was time to verify that only the “strong” genes would survive. Collecting the data would involve setting up different starting points for the robots in the environment and then seeing what kinds of genes would survive. If we had had time to do this a couple of times, we would have a better picture of which genes were the most important in our environment. We were satisfied with the project results, including:
- Making embodied agents that could survive in our little arena by using a motivation network
- Making a monitor program and a bluetooth protocol which could handle multiple connections and send/receive messages to the NXT’s.

Source code:
GUI in NetBeans
EvoBot source code