
Demeter robot

Field Robot Event 2006

Competition Web Page

In August 2005 a group of six students began planning a robot to take part in the Field Robot Event 2006 in Hohenheim. A team from the same universities had taken part in the 2005 competition under the name Smartwheels. After sharing experiences with that team, it was concluded that the new robot should be started from a clean table. This time a considerable amount of effort was put into planning and design: a number of different design alternatives were evaluated before the final one was chosen, and great care was taken to make the mechanical platform perform as well as possible. This delayed the project, and it was not until April that the first tests could begin. That, and various unexpected problems, kept the team occupied until the last minute.

Team members

  • Miika Telama (TKK, captain)
  • Jukka Turtiainen (TKK)
  • Pekka Viinanen (TKK)
  • Jari Kostamo (UH)
  • Ville Mussalo (UH)
  • Tuomas Virtanen (UH)

The team was supervised by researchers.


Chassis

The main task of the chassis is simple: it should carry the measurement devices and give the robot the ability to operate in field conditions. There are numerous ways to achieve this, and the selection of the technical solution depends on many factors. At least the following properties of the chassis are desirable:

  • Reliability
  • Low cost
  • Good off-road properties
  • Good accuracy and ability to respond to instructions
  • Low power consumption

In practice the low budget of a student project forces some trade-offs, and all of these properties cannot always be fulfilled in the desired way. It should be kept in mind, however, that the chassis is the base on which all the sensor and computer systems are built. It is an essential requirement that the chassis can drive all the equipment to the position defined by the controlling computer.

The chassis is based on a Kyosho Twin Force R/C monster truck. Building began with the original chassis, which needed some modifications, but eventually a completely new chassis was built. Only the tires, the axles and some parts of the power transmission were kept in the final version of the robot.

The new chassis was built from self-made parts without existing drawings, though loosely based on the original Twin Force parts. Aluminium was chosen for its light weight. The motor compartments are separated from the rest of the chassis, and the microcontrollers are placed in a separate slot to allow easy access. The chassis is four-wheel driven, each motor powering one axle. All of the springs have been removed; instead there is a large articulated joint in the middle of the chassis, giving the robot manoeuvrability somewhat similar to that of a farm tractor.

The vehicle is electrically driven by two Mabuchi 540 motors, each controlled by a PWM amplifier. The chassis did not originally have four-wheel steering, so it had to be built with optional parts provided by Kyosho; four-wheel steering was necessary to give the robot a small enough turning radius. Two HS-805BB+ MEGA servos were used for steering, and they provided enough steering power for the robot.


Sensors

Four Devantech SRF08 ultrasonic sensors were used together with a Devantech CMPS03 I2C-compatible compass, all connected to the I2C sensor bus. For machine vision the robot has a Logitech QuickCam Pro 5000 camera attached to a carbon fibre pole, with a servo motor that allows it to turn 180 degrees. The compass was placed on the camera pole, high enough to avoid magnetic fields that would interfere with it.
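For reference, the SRF08 returns its first-echo range as a two-byte register pair (in centimetres when the 0x51 ranging command is used), and the CMPS03 reports its bearing as a word in tenths of a degree. A minimal Python sketch of the decoding, leaving out the I2C bus driver itself (the actual firmware ran on the PIC):

```python
def srf08_range_cm(high_byte, low_byte):
    # SRF08 first echo: high byte in register 2, low byte in register 3;
    # after the 0x51 ranging command the value is in centimetres.
    return (high_byte << 8) | low_byte

def cmps03_bearing_deg(high_byte, low_byte):
    # CMPS03 word bearing (registers 2-3): 0-3599, tenths of a degree.
    return ((high_byte << 8) | low_byte) / 10.0
```

For example, SRF08 bytes 0x01/0x2C decode to a 300 cm echo, and CMPS03 bytes 0x02/0xD0 to a bearing of 72 degrees.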


Microcontrollers

The robot has two microcontrollers: an ATmega128 and a PIC 18F2220. The first controls the Msonic power controllers for the DC motors, the steering servos and the camera servo. The second collects sensor input from the compass and the ultrasonic sensors, and connects to the trailer's inputs and outputs.

Machine vision

A Logitech QuickCam Pro 5000 was used for machine vision. It is a higher-end webcam that is still quite cheap. The camera is capable of 640×480 resolution, but 320×240 was used to improve performance and reduce the amount of data transfer. The camera sends about 25 images per second to the laptop, which processes about 20 images per second, calculating parameters and image conversions for each frame. The machine vision software was developed with Microsoft Visual C++ 2005 and the Open Source Computer Vision Library (OpenCV), which provides libraries and tools for processing the captured images.

Image processing was based on the EGRBI colour transformation (Excess Green, Red-Blue, Intensity). At the beginning of the algorithm the image is split into its three channels: red, green and blue. The EGRBI transformation then creates a new image space whose components are excess green, red-blue (the cross product of the green and intensity directions) and intensity, where intensity is (R+G+B)/3.

The main idea of this method was to separate the green maize rows from the dark soil. Green pixels can be picked out of the camera image by putting more weight on green and by 'punishing' red and blue.
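The original implementation was C++/OpenCV, but the per-pixel transformation itself is only a few lines. The basis vectors in this Python sketch are a reconstruction from the description above (intensity along (1,1,1), excess green along (-1,2,-1), and red-blue = green × intensity = (1,0,-1)); the exact weights used on the robot are not stated here, so treat them as an assumption:

```python
import math

def egrbi(r, g, b):
    """Map one RGB pixel into the (assumed) EGRBI space."""
    eg = (-r + 2 * g - b) / math.sqrt(6)   # excess green: rewards G, punishes R and B
    rb = (r - b) / math.sqrt(2)            # red-blue = green x intensity direction
    i = (r + g + b) / 3.0                  # intensity as defined in the text
    return eg, rb, i
```

A grey pixel maps to zero excess green and zero red-blue, while a maize-green pixel gets a large positive excess-green value, so a simple threshold on the first component yields the green mask.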

Row detection

The robot's position between the maize rows and its heading angle were estimated with a Hough transform. The Hough transform is computationally heavy: for each pixel in the binary image, two parameter values (angle, distance) are calculated, so the number of pixels in the binary image has to be kept as low as possible. The transform returns many lines that fit the pixels; the ten best lines on each side are taken and their mean is calculated, giving a left and a right line. From these the position and angle errors are calculated and sent to the controller, which then calculates the correct steering controls.
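The robot used OpenCV's Hough transform in C++; the voting scheme can be sketched in plain Python (illustrative only, the names and bin sizes are my own):

```python
import math
from collections import Counter

def hough_accumulate(points, n_theta=180, rho_step=1.0):
    """Vote in (theta, rho) space: rho = x*cos(theta) + y*sin(theta)."""
    acc = Counter()
    for x, y in points:
        for t in range(n_theta):
            theta = math.pi * t / n_theta
            rho = x * math.cos(theta) + y * math.sin(theta)
            acc[(t, round(rho / rho_step))] += 1   # one vote per (angle, distance) bin
    return acc

def best_line(points, n_theta=180, rho_step=1.0):
    """Return (theta, rho, votes) of the strongest line."""
    acc = hough_accumulate(points, n_theta, rho_step)
    (t, r), votes = acc.most_common(1)[0]
    return math.pi * t / n_theta, r * rho_step, votes
```

On the robot the ten strongest lines on each side were averaged; with the accumulator above that corresponds to taking `acc.most_common(10)` separately for the left and right halves of the image.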

Dandelion detection

While driving between the maize rows, dandelions must be detected and counted. EGRBI with a yellow mask was used to detect yellow; this was done by finding proper intensity levels and different weightings of the R/G/B values. After binarization each dandelion appears as a contour with a position and an area. Because each dandelion should be counted only once, the positions of the contours are compared between image frames.
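The frame-to-frame matching can be sketched as follows (the distance threshold and the function name are hypothetical, not from the report): a contour whose centroid has no nearby centroid in the previous frame is counted as a newly seen dandelion.

```python
def count_new(prev_centroids, curr_centroids, max_dist=20.0):
    """Count contours in the current frame with no match in the previous one."""
    new = 0
    for cx, cy in curr_centroids:
        matched = any((cx - px) ** 2 + (cy - py) ** 2 <= max_dist ** 2
                      for px, py in prev_centroids)
        if not matched:
            new += 1   # no nearby contour last frame: a fresh dandelion
    return new
```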


Software architecture

The microcontrollers function only as I/O devices and do very little processing of the data. One of them runs the speed controller, but apart from that all the intelligence is in the main program, which runs on a laptop PC. The main program was implemented in C++ .NET with MS Visual Studio. The camera and machine vision functions were written using the OpenCV library. The controllers are designed in Simulink and compiled into DLLs, which are loaded by the main program.


Controllers

The control logic was developed in Matlab/Simulink, and each task was given its own controller. The controllers were developed as a set of Simulink library blocks that could be used both in the simulator and in the controller-export models. C code was generated from the Simulink models with Real-Time Workshop, and the final product of each model was a Dynamic Link Library (DLL) file; this made importing the controllers into the main program easier. The DLL exports a function that the main program calls every 50 milliseconds. The function is given all the sensor information (ultrasonic, camera, odometry) and some parameters as input, and it returns the steering angles for the front and rear wheels and the speed, together with other controls and debugging information depending on the task.
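The shape of that exported function can be illustrated with a hypothetical proportional row-following step. The real controllers were generated from Simulink, and their structure and gains are not given in the report, so everything below (gains, the opposite-phase rear steering) is an illustration, not the robot's actual control law:

```python
def controller_step(pos_err, ang_err, k_pos=0.5, k_ang=1.0, speed=0.3):
    """One 50 ms control step: steering angles and speed from the row errors.

    pos_err: lateral offset from the row centre line
    ang_err: heading angle relative to the rows
    """
    front = k_pos * pos_err + k_ang * ang_err
    rear = -front   # hypothetical opposite-phase four-wheel steering
    return front, rear, speed
```

The main program would call such a function every 50 ms with the latest position and angle errors from the Hough transform and pass the returned angles on to the steering servos.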


Simulator

A simulator was implemented to support the design of the controllers. It was written in Matlab and Simulink and can simulate the use of the ultrasonic sensors. The simulator has an editor for creating test tracks; each track also contains a correct route that is compared to the simulated route to calculate error values, which measure how good the controller is.
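One simple form such an error value can take is the mean distance from the simulated route to the reference route. The report does not specify the metric, so this Python sketch is an assumption about how the comparison could be done:

```python
import math

def route_error(sim_route, ref_route):
    """Mean distance from each simulated point to the nearest reference point."""
    def nearest(p):
        return min(math.dist(p, q) for q in ref_route)
    return sum(nearest(p) for p in sim_route) / len(sim_route)
```

A controller that tracks the reference route perfectly scores 0; larger values mean larger deviations, which makes different controllers directly comparable on the same track.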

Freestyle: Soil property measurement trailer

The trailer was built from scratch. The platform had to be big enough to fit all of the equipment, with enough ground clearance for the linear motor to operate. Old Tamiya Clod Buster wheels were used together with a newly made spindle to increase manoeuvrability.

The main piece of equipment in the trailer is a linear motor provided by Linak, operated by two relays; all of the electronics are fitted into two boxes. The idea of the freestyle task was to measure the penetration resistance of the soil (cone index) and simultaneously estimate the moisture of the soil by measuring its electrical conductivity.

The linear motor was used to thrust two spikes into the soil. The soil penetration resistance could then be estimated from the changes in the current consumption of the linear motor. At the same time the electrical conductivity of the soil was measured, with a 5 V input voltage taken directly from the robot's microcontroller. This was an experiment, and it has to be remembered that soil conductivity measurements depend heavily on other properties of the soil, especially on the amount of different salts in it.


Conclusions

Quite a few algorithms and other approaches were researched and implemented but never used in the final product. This was somewhat wasted time as far as building the robot goes, but not such a waste as far as learning goes. Although there were some small problems with the laptops, they are still a reasonable choice for this kind of project: a laptop provides enough processing power for machine vision and can also be used during development. It was noticed, however, that the user interface should have been on a remote computer. The machine vision should also be made more adaptive, to reduce the need for manual calibration, before real use can be made of this kind of application. The use of Matlab and Simulink turned out to be very helpful in the development of the control algorithms, especially in the initial testing and debugging: being able to first test a controller in the simulator and then export it directly to the main program was a great help. The Stateflow block was also found useful, as it made the process easier for the other team members to understand and made debugging faster. The mechanics performed well; in particular, the middle joint instead of spring suspension seems to have been a good choice.

Final report

FRE2006_demeter.pdf


The team was sponsored by:


Timo Oksanen