Marc Hanheide edited this page Feb 18, 2025 · 25 revisions

Besides the following tasks, make sure you have caught up with all the previous tasks.

IMPORTANT: You are expected to bring your source code from the previous workshop(s) to every workshop. You are also expected to take your own notes (e.g. the commands you used and how things worked) and to have them available for every session. The workshops build successively on top of each other, so you need the knowledge from the previous workshops to engage in this one. It is strongly advised to push all your code to your GitHub repository after every session. Why not use the README.md file in your repository as the perfect place for all your notes?

Task 1: Exploring Kinematics

In the lecture, the kinematics of a robot with a differential drive (like the Limo robots we use) were introduced.

You already know that your robot accepts a geometry_msgs/Twist message on the topic /cmd_vel to receive motion commands. This Twist message defines the linear and angular velocity of the robot's body, which the Limo driver converts into wheel angular velocities ω using the "inverse kinematics".

However, assume you want to command your robot by wheel angular velocity yourself, i.e. you want to set the angular velocity ω for each wheel. Then you need the forward kinematics to compute the values for a suitable Twist message from the given ω values. Your task is to extend the fragments from the lecture to

  1. subscribe to a new topic called /wheel_vel_left of type std_msgs/Float32 (which allows sending single float values)
  2. Take the input from the topic above and use it as the angular velocity (in rad/s) of the left wheel of your robot, then compute a geometry_msgs/Twist message and publish it on the topic that makes the robot move (remember, it has "velocities" in its name). Assume that the angular velocity of the right wheel is ω = 0.0.
  3. adjust the robot_radius and wheel_radius in your code to the correct values.
  4. Which behaviour of your robot do you expect when you run ros2 topic pub /wheel_vel_left std_msgs/Float32 "data: 1.0" -r 10 while your implemented node is running? Does your robot behave as expected?
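The forward-kinematics computation behind steps 2 and 3 can be sketched as plain Python, independent of the ROS plumbing (the wheel_radius and robot_radius defaults below are placeholders, not the Limo's real dimensions — look those up yourself):

```python
def forward_kinematics(omega_left, omega_right, wheel_radius=0.05, robot_radius=0.1):
    """Convert wheel angular velocities (rad/s) into body velocities.

    wheel_radius and robot_radius (half the distance between the wheels)
    are placeholder values -- substitute the real Limo dimensions.
    Returns (linear, angular): the values for Twist.linear.x and
    Twist.angular.z respectively.
    """
    v_left = omega_left * wheel_radius    # linear speed of the left wheel (m/s)
    v_right = omega_right * wheel_radius  # linear speed of the right wheel (m/s)
    linear = (v_left + v_right) / 2.0                     # forward velocity of the base
    angular = (v_right - v_left) / (2.0 * robot_radius)   # yaw rate (rad/s)
    return linear, angular

# Task scenario: left wheel spins at 1.0 rad/s, right wheel is held at 0.0,
# so the robot arcs forward while turning towards the stationary wheel.
v, w = forward_kinematics(1.0, 0.0)
```

In your node, the Float32 callback would feed its data field into this function as omega_left and publish the result in a Twist message on /cmd_vel.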

Task 2: continuation of last week's colour chaser (working towards assignment)

Following on from last week's colour segmentation and chaser and the example given in the lecture, now implement a simple proportional controller:

  1. As usual launch the tidybot simulation.
  2. Write a little bit of Python code to drive towards one of the coloured boxes. In addition to your own solution from last week, you may take the colour chaser robot implementation, which simply rotates the robot to line up with a coloured object, as a starting point.
  3. All you need to do is to control the robot's linear and angular velocity according to the position of the detected box. So, your robot's task is to chase after one specific colour.
  4. You have learned about proportional control in the lecture. Can you implement a proportional controller when aiming for the object? What would be the benefit of this as opposed to an if...then construct?
  5. Think about how to model the situation where you have driven so close that you can no longer see the object but still want to push it. How would you know when to stop?
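The proportional control from step 4 can be sketched as a single function (the gain k_p here is an arbitrary placeholder you would tune on your robot; cx is assumed to be the pixel x-coordinate of the detected blob's centroid from your colour segmentation):

```python
def p_controller(cx, image_width, k_p=0.002):
    """Proportional angular-velocity command from a detected blob centroid.

    cx: x-coordinate of the colour blob's centroid (pixels).
    k_p: placeholder gain -- tune it on your robot.
    The command scales with the error: a blob far off-centre produces a
    fast turn, a blob near the centre a gentle correction. An if...then
    construct would instead jump between fixed speeds and tend to
    oscillate around the target.
    """
    error = cx - image_width / 2.0   # positive if the blob is right of centre
    return -k_p * error              # turn towards the blob (Twist.angular.z)
```

A usage pattern: publish this value as Twist.angular.z, and set Twist.linear.x from a second proportional term (e.g. on the blob's area or a laser range) so the robot slows down as it approaches the box.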

Task 3: Real Robot

Attempt this only if you have completed your work in simulation. By now you should have a working dev container that runs the simulation, and you should have a basic robot behaviour that rotates the robot towards a coloured object. You should be familiar with how to manage your code and how to subscribe to images and laser scan topics, and how to publish to make the robot move. If you are not sure about any of this, ask a member of the delivery team!

  • Work in the same pairs as last week if you are already part of a pair, and use the same robot. The delivery team will help you.
  • Read Using the real Limo Robot to get started. Remember to check the robot out for use by you and your team mate.
  • Specifically, follow the instructions to connect your dev container to the real robot and check that you can see all the topics from the actual robot.
  • Take your implementations from the past weeks (turning towards coloured objects, using the laser scanner to avoid obstacles) and make them work on the real robot. You may have to change a few topic names and adjust parameters. Try to make your code configurable enough that it works in both simulation and reality.
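One lightweight way to keep a single codebase working in both simulation and reality is to look up topic names from a small map instead of hard-coding them. This is only a sketch: the topic names below are hypothetical examples, not the Limo's actual topics — check `ros2 topic list` in each environment and fill in what you see there:

```python
import os

# Hypothetical topic maps -- replace the names with whatever
# `ros2 topic list` actually shows in simulation and on the robot.
TOPICS = {
    "sim":  {"image": "/limo/camera/image_raw", "cmd_vel": "/cmd_vel"},
    "real": {"image": "/camera/image_raw",      "cmd_vel": "/cmd_vel"},
}

def select_topics(mode=None):
    """Pick a topic map from the LIMO_MODE environment variable.

    Defaults to the simulation map, so the same node runs unchanged in
    the tidybot simulation and, with LIMO_MODE=real, on the robot.
    """
    mode = mode or os.environ.get("LIMO_MODE", "sim")
    return TOPICS[mode]
```

ROS 2 parameters (declared with declare_parameter and set from a launch file) achieve the same goal more idiomatically; the dictionary above just illustrates the idea of separating topic names from behaviour code.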

Remarks

  • This is the last of the mini tasks, so make sure you complete it and that you fully understand what the code you produced does. From now on, you will be working towards your individual assignment. Make sure you read the assessment brief again and ask about anything unclear as soon as possible.
