
Algorithms

Berkay Köksal edited this page Jan 14, 2018 · 2 revisions

Besides several helper functions and main(), the code is organized into coroutines. These coroutines handle the core functionality of the robot. "Coroutines are computer program components that generalize subroutines for nonpreemptive multitasking, by allowing multiple entry points for suspending and resuming execution at certain locations. Coroutines are well-suited for implementing more familiar program components such as cooperative tasks, exceptions, event loops, iterators, infinite lists and pipes." [wikipedia] Every coroutine has its own stack, its own local variables, and its own instruction pointer; it shares global variables, and most other state, with the other coroutines.

We have three coroutines: drive, proximity, and touch.

  • drive coroutine: receives a movement command (move forward, turn, ...) and executes it.
  • proximity coroutine: handles two cases. When the robot touches an obstacle with its right touch sensor, it backs up to fix its orientation. When it sees an obstacle in front of it, it scans over a 10-degree angle, turns towards the minimum distance, and reads the color; this is also where we calibrate the orientation. The proximity coroutine also decides when to drop the obstacle.
  • touch coroutine: handles the keyboard of the brick (e.g. pressing the cancel button kills the robot).

Handling of Obstacles

There are two types of obstacles: movable and non-movable. Both need to be dodged, if possible without touching them. The movable object is a small, red ball. The non-movable objects are walls, chairs, and other office inventory. When the robot approaches an obstacle, it measures the distance to it, moves in slowly, and once it "knows" the color it passes the obstacle on the left or on the right.

Pseudo Code: Movable or non-movable?:

while the current distance is between 50 and 120

    if the color sensor returns a value and that value equals 5 // 5 = COLOR_RED

        a movable object is detected

    if the color sensor returns a value and that value is not equal to 5

        a non-movable object is detected

continue going forward
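The classification loop above can be sketched as a C function. This is a hypothetical sketch, not the robot's actual code: the sensor readings are passed in as parameters, the distance bounds (50 to 120) and the red color code (5) come from the pseudo code above.

```c
#include <stdbool.h>

#define COLOR_RED 5

typedef enum { OBSTACLE_NONE, OBSTACLE_MOVABLE, OBSTACLE_NONMOVABLE } obstacle_t;

/* Classify an obstacle while closing in: a decision is only made while
 * the measured distance is between 50 and 120 and the color sensor
 * actually returns a value; otherwise keep going forward. */
obstacle_t classify_obstacle(int distance, bool color_valid, int color) {
    if (distance < 50 || distance > 120 || !color_valid)
        return OBSTACLE_NONE; /* no decision yet, continue going forward */
    return (color == COLOR_RED) ? OBSTACLE_MOVABLE : OBSTACLE_NONMOVABLE;
}
```

Calling this once per loop iteration mirrors the while-loop in the pseudo code: the function keeps returning OBSTACLE_NONE until the robot is inside the decision window and a color reading is available.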

Exploring the Arena Semi-Randomly

if touch sensor activated
    go back one step
else
    if sensor_distance < 120 // something is in the way
        turn 5 degrees in each direction, keep measuring distance
        turn towards minimum distance // we are now looking directly at the object, i.e. 90° angle
        get close to the object // distance = 50
        // read color sensor (see previous code snippet)
    else
        if Y_value < 2 and direction == south // this check is necessary as the arena is not enclosed on the southern end
            // we are out of the arena

turn 90° in random direction
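The exploration pseudo code can be condensed into a single decision function. Again a hypothetical sketch: the enum names and the idea of returning an "action" are assumptions, while the priorities (touch first, then distance < 120, then the open southern end at y < 2) are taken directly from the pseudo code.

```c
#include <stdbool.h>

typedef enum { NORTH, EAST, SOUTH, WEST } heading_t;
typedef enum { ACT_BACK_ONE, ACT_SCAN_OBJECT, ACT_TURN_AROUND, ACT_FORWARD } action_t;

/* One decision of the semi-random exploration loop.  The thresholds
 * (120 = "something is in the way", y < 2 while facing south = "we are
 * leaving the open end of the arena") come from the pseudo code above. */
action_t explore_step(bool touched, int distance, int y, heading_t dir) {
    if (touched)
        return ACT_BACK_ONE;    /* collision: go back one step */
    if (distance < 120)
        return ACT_SCAN_OBJECT; /* scan, align at 90°, approach, read color */
    if (y < 2 && dir == SOUTH)
        return ACT_TURN_AROUND; /* arena is open to the south: turn 90° */
    return ACT_FORWARD;
}
```

Keeping the decision pure (sensors in, action out) makes the priority order explicit and easy to test without the robot.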

Handling of Touches

The code is designed so that touching is avoided. However, due to the limitations of the ultrasonic sensor, it makes sense to include functionality that handles a collision with an object. We were only allowed to use four sensors, so we have a single touch sensor on the right side, not the left. We intentionally start the match facing East, so the touch sensor initially points in the direction we are moving.

Cartography Strategy

The robot takes one step at a time, stops, and sends its current position together with the measured value (R/G/B). This is repeated until the distance or touch sensor raises an interrupt. For each step, we hardcoded the motor time and speed so that the robot travels exactly 5 cm, which makes it easier for us to keep track of where it is on the map. We hold a 256x256 array for position information. It is a character array and accepts the following values:

'G' // Green, means that coordinate is free
'B' // Blue, means there's an immovable object at that position
'R' // Red, means there's a movable object there

For every object seen or released, the robot derives the obstacle's map position from its own position and its current heading (South, West, North, East).
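Putting the two previous paragraphs together, the map update could look like this. A hypothetical sketch: the 256x256 character array and the values 'G'/'B'/'R' are from the text, but the coordinate convention (north = +y, east = +x) and the function name are my assumptions.

```c
typedef enum { NORTH, EAST, SOUTH, WEST } heading_t;

/* 256x256 character map: 'G' = free, 'B' = immovable object,
 * 'R' = movable object, as listed above. */
static char arena[256][256];

/* Mark the cell directly in front of the robot, derived from the
 * robot's own cell and its current heading. */
void mark_ahead(int x, int y, heading_t dir, char value) {
    static const int dx[] = { 0, 1, 0, -1 }; /* N, E, S, W */
    static const int dy[] = { 1, 0, -1, 0 };
    int nx = x + dx[dir], ny = y + dy[dir];
    if (nx >= 0 && nx < 256 && ny >= 0 && ny < 256)
        arena[nx][ny] = value;
}
```

Because each step is exactly 5 cm, one map cell corresponds to one step, and the robot's (x, y) can be tracked by pure dead reckoning.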

Competitors

When the distance sensor is triggered, the trigger may have been caused by another robot rather than an object. So while turning towards the minimum distance (aligning at a 90° angle to the object), the object may already have moved away because it was a robot. We therefore implemented a break routine that aborts the algorithm if the minimum distance can't be found. Alternatively, we could simply wait between readings: if it is a robot, it will have moved away by the time we take the second value. But that would cost us time, which is why we settled for the former approach.
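One way to express the break routine is shown below. This is a hypothetical sketch of the idea, not the actual implementation: it takes a finished sweep of readings and aborts when the last reading has drifted far above the minimum, which suggests the target moved during the scan.

```c
#define SCAN_ABORT -1

/* Find the minimum distance in one scan sweep.  If the final reading
 * sits well above the minimum, the target has probably moved away
 * (another robot): abort instead of chasing it. */
int find_minimum_distance(const int *readings, int n, int tolerance) {
    int best = readings[0];
    for (int i = 1; i < n; i++)
        if (readings[i] < best)
            best = readings[i];
    if (readings[n - 1] - best > tolerance)
        return SCAN_ABORT;
    return best;
}
```

The tolerance parameter trades robustness against sensor noise for sensitivity to moving targets; a sweep of a static wall should end close to its minimum, while a passing robot leaves a large gap.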
