Wheels of Corntune
Field Robot Event 2007
The robot was built by four students from the Automation department of the Helsinki University of Technology (TKK) and two from the Agrotechnology department of the University of Helsinki (UH).
|Thomas Maksimow (captain) (TKK)|Timo Oksanen (TKK)|
|Jussi Hölttä (TKK)|Johannes Tiusanen (UH)|
|Erkki-Juhani Lämsä (TKK)|Juho Säteri (UH)|
|Juho Junkkala (UH)| |
|Petri Koskela (TKK)| |
|Mikko Posio (UH)| |
The robot's chassis is the same as Demeter's (a participant in the Field Robot Event 2006), but we have improved the wiring and installed new motor controllers and sensors. The chassis is custom-built for this purpose and offers stability and maneuverability. The robot has four-wheel drive and four-wheel steering.
We added new infrared sensors to the lower hull, and a gyro and infrared sensors to the upper hull. We have also improved the wiring to make the robot easier to maintain, and modified the upper hull to support docking, which is our freestyle task. For the weeding task we have built two servo-controlled “rakes” that are lowered when there are weeds to be raked.
The processing of camera and sensor information is done on a laptop computer on board the robot. The computer is connected to the robot through two ATMEGA microcontrollers that provide a hardware interface to the servos and sensors. The microcontrollers are connected to the laptop via serial ports.
- Logitech QuickCam Pro 5000 webcam
- 5 ultrasonic sensors
- 2 infrared sensors on the sides
- 3 infrared receivers in the front
- 2 optoencoders
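The optoencoder readings above can be turned into a basic odometry estimate. A minimal dead-reckoning sketch (the encoder resolution and wheel circumference below are assumed values, not the robot's real constants, and in practice the heading would be fused with the gyro reading):

```python
import math

# Hypothetical calibration constants -- the real values depend on the hardware.
TICKS_PER_REV = 360          # optoencoder resolution (assumed)
WHEEL_CIRCUMFERENCE = 0.40   # metres per wheel revolution (assumed)

def dead_reckon(x, y, heading_rad, left_ticks, right_ticks):
    """Advance the pose estimate from one pair of encoder readings.

    Distance travelled is taken as the mean of the two encoders; the
    heading would come from the gyro in a real sensor-fusion loop.
    """
    dist = (left_ticks + right_ticks) / 2 / TICKS_PER_REV * WHEEL_CIRCUMFERENCE
    return (x + dist * math.cos(heading_rad),
            y + dist * math.sin(heading_rad),
            heading_rad)

# One full revolution of both wheels, heading straight ahead:
x, y, h = dead_reckon(0.0, 0.0, 0.0, 360, 360)
```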
The GUI is the main program; it contains and coordinates all the other modules. It is a graphical user interface from which the robot can be operated. The other components provide the methods needed for operating the robot, but it is the GUI that calls them and acts on their results.
The camera program reads images from the webcam using OpenCV image processing tools. In normal driving mode the robot grabs an RGB image and binarizes it with the EGBRI transform (Extended Green, Blue-Red, Intensity). From this binary image it tries to fit Hough lines, and after finding the two best-fitting ones it calculates the robot's orientation relative to the rows.
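The exact EGBRI transform is not spelled out here, but the idea of binarizing on an "extended green" channel can be illustrated with a simplified excess-green threshold (the formula and threshold below are assumptions, not the team's actual transform):

```python
def excess_green(r, g, b):
    """Simplified excess-green score: vegetation pixels score high (assumed)."""
    return 2 * g - r - b

def binarize(pixels, threshold=40):
    """Return a binary mask (1 = plant) for a list of (R, G, B) pixels."""
    return [1 if excess_green(r, g, b) > threshold else 0 for r, g, b in pixels]

# A green leaf pixel versus a grey soil pixel:
mask = binarize([(30, 180, 40), (120, 110, 100)])
```

On the real robot, lines would then be fitted to the mask with OpenCV's Hough transform rather than in plain Python.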
In dock search mode the robot makes two binary images from the original RGB image, one for finding red circles and the other for finding blue circles. It searches these images for Hough circles, and after finding one in each image close enough to the other, simple geometry tells it in which direction and how far away the dock is.
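The direction-and-distance step can be sketched as plain geometry on the two circle centres. The focal length and physical marker separation below are made-up calibration constants, and the pinhole-camera estimate is only one plausible way to do it:

```python
import math

def dock_pose(red_center, blue_center, image_width,
              focal_px=500.0, marker_separation_m=0.20):
    """Estimate dock bearing and distance from two circle centres (pixels).

    focal_px and marker_separation_m are hypothetical calibration values.
    """
    rx, ry = red_center
    bx, by = blue_center
    # Bearing: horizontal offset of the markers' midpoint from the image centre.
    mid_x = (rx + bx) / 2
    bearing_rad = math.atan2(mid_x - image_width / 2, focal_px)
    # Distance: pinhole model using the known physical separation of the markers.
    pixel_sep = math.hypot(bx - rx, by - ry)
    distance_m = focal_px * marker_separation_m / pixel_sep
    return bearing_rad, distance_m

# Markers centred in a 640-pixel-wide image, 40 pixels apart:
bearing, dist = dock_pose((300, 240), (340, 240), image_width=640)
```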
In weed detection mode the robot makes one binary image in addition to the one used for row detection. It is for detecting the weeds and is made with a transform similar to EGBRI, but with yellow (essentially red + green) as the extended color. From this binary image the robot can track the weeds and decide when to remove them.
Sensor fusion and control algorithms are built in Simulink, and the logic of the robot is implemented with Simulink's Stateflow tool. Matlab's Real-Time Workshop is used to generate C++ code from the model; the generated source code is then used in the main program.
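The kind of mode logic a Stateflow chart encodes can be pictured as a small state machine. The states and transition conditions below are illustrative only, not the team's actual chart:

```python
# Illustrative mode logic: follow a row until its end, turn at the
# headland, then resume row following. The real Stateflow chart is
# more elaborate (docking, weeding, etc.).
class RobotLogic:
    def __init__(self):
        self.state = "FOLLOW_ROW"

    def step(self, row_end_detected, turn_completed):
        if self.state == "FOLLOW_ROW" and row_end_detected:
            self.state = "TURN"
        elif self.state == "TURN" and turn_completed:
            self.state = "FOLLOW_ROW"
        return self.state
```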
The functionality of the model can be simulated with an in-house simulation tool. The maize field is modeled, as are all the sensors and actuators of the robot. The simulator's graphical user interface is also written in Matlab.
The IO module communicates with the two ATMEGA128 microcontrollers. One microcontroller is used for controlling the robot and the other for sensor inputs. The microcontrollers are not used for complex control; they mostly provide a hardware interface to the laptop computer on board the robot.
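The actual serial protocol between the laptop and the microcontrollers is not documented here, but the IO module's parsing job can be illustrated with a hypothetical line-based frame format, say `S;<sensor id>;<value>`:

```python
# Hypothetical framing -- the real ATMEGA128 protocol may differ entirely.
def parse_sensor_frame(line: str):
    """Parse one 'S;<id>;<value>' frame into (sensor_id, value).

    Returns None for malformed frames so the IO loop can skip them.
    """
    parts = line.strip().split(";")
    if len(parts) != 3 or parts[0] != "S":
        return None
    try:
        return int(parts[1]), int(parts[2])
    except ValueError:
        return None
```

On the robot, frames like these would arrive over the serial ports and be dispatched to the sensor-fusion code.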
RemoteGUI Interface, Parameters (C#)
The RemoteGUI Interface provides the methods for transmitting and receiving data to and from the Remote GUI. All data is sent over WLAN using the UDP protocol. The RemoteGUI Interface parses the received data and stores it for the GUI. It also supports saving and loading a Parameters class in XML format; the Parameters can then be accessed by the GUI or sent to the Remote GUI for modification.
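The interface itself is written in C#; a rough Python sketch of the same idea, serializing parameters to XML and shipping them over UDP, could look like this (the parameter names and the port number are invented):

```python
import socket
import xml.etree.ElementTree as ET

def params_to_xml(params: dict) -> bytes:
    """Serialize a flat parameter dictionary to an XML byte string."""
    root = ET.Element("Parameters")
    for name, value in params.items():
        ET.SubElement(root, name).text = str(value)
    return ET.tostring(root)

def send_params(payload: bytes, host="127.0.0.1", port=5005):
    """Fire the XML payload at the remote GUI over UDP (connectionless)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(payload, (host, port))

xml = params_to_xml({"MaxSpeed": 1.2, "RowWidth": 0.75})
```

UDP fits this use well: parameter and telemetry updates are frequent and a lost datagram is simply superseded by the next one.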
The team was sponsored by: