The goal of this repository is to implement expressive motion in robots using Laban Effort settings.
This ROS1 package has been tested on ROS Melodic, installed as ros-desktop-full-install, on Ubuntu 18.04.4 (Bionic), running across a laptop and a Raspberry Pi 4 Model B. This package was tested on a Neato robot.
- move_base
- map_server
On the Raspberry Pi:
- https://github.com/charisma-lab/neato_controller (7dwarves branch)
On the laptop:
- https://github.com/charisma-lab/expressive_motion (master branch)
- `git clone` them into `catkin_ws/src`
- From the `catkin_ws` directory, run `catkin_make` followed by `catkin_make install`
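The clone-and-build steps above can be sketched as the following shell session. It assumes your catkin workspace lives at `~/catkin_ws`; adjust paths to your setup:

```shell
cd ~/catkin_ws/src

# On the Raspberry Pi: low-level Neato driver (7dwarves branch)
git clone -b 7dwarves https://github.com/charisma-lab/neato_controller.git

# On the laptop: expressive motion package (master branch)
git clone -b master https://github.com/charisma-lab/expressive_motion.git

# Build and install from the workspace root
cd ~/catkin_ws
catkin_make
catkin_make install
```

Note that each machine only needs its own repository cloned, as listed above.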
Using https://chev.me/arucogen/, print ArUco markers 1 and 5, plus one for the obstacle, from the 4x4 dictionary, with a marker size of at least 200 mm.
Place the ArUco markers as follows: marker 1 on the robot, marker 5 on the hat (the object the person will wear), and the obstacle marker at one corner of the area in which the robot will operate.
On both the laptop and the Raspberry Pi, make sure you have the following bash variables set up:
- NEATO_NAME
- ROS_MASTER_URI
- NEATO_PORT
You will need to connect the robot and the laptop to the same network, and then run a ROS master-slave configuration between them (see the ROS network setup documentation).
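A minimal sketch of this environment setup, typically placed in `~/.bashrc` on both machines. The robot name, serial port, and IP addresses below are placeholders; substitute your own values (and note that `ROS_IP` is the standard ROS variable for advertising a machine's address, not one required by this package's list above):

```shell
# Placeholder values -- substitute your own robot name, serial port, and IPs.
export NEATO_NAME=neato                           # name used by the launch files
export NEATO_PORT=/dev/ttyACM0                    # Neato's serial port on the Pi
export ROS_MASTER_URI=http://192.168.1.10:11311   # laptop runs roscore
export ROS_IP=192.168.1.10                        # this machine's IP on the shared network
```

On the Raspberry Pi, `ROS_MASTER_URI` still points at the laptop's IP, while `ROS_IP` is set to the Pi's own address.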
You will need multiple terminals. Make sure not to confuse the laptop's terminals with the ones running an SSH session into the Raspberry Pi.
- On the laptop, edit `catkin_ws/src/neato_localization/scripts/tracking_aruco_markers.py` to change the camera number.
- In the first terminal on the laptop, run `roslaunch neato_localization neato_localization.launch` to start the localization package. Using the overhead camera, it starts the localizer and publishes the poses and orientations of all the markers it sees.
- To verify that every marker and the robot are positioned correctly, run `rviz` in a new terminal on the laptop and select the appropriate pose topics.
- In the first terminal on the Raspberry Pi, run `roslaunch neato_controller bringup.launch` to start the Neato robot's low-level driver.
- In the second terminal on the laptop, run `roslaunch neato_localization generate_waypoints_for_motion.launch` to generate the path for a particular emotion. This node expects user input on the `/TSI_values` topic in order to generate a motion pattern for the robot to follow.
- In the third terminal on the laptop, run `rosrun neato_localization test_gui.py`. This ROS node lets the user provide inputs to the system, either via terminal commands or via the GUI. As input, the user can either choose time-space-interaction values or select a robot state of their choice.
- Input via terminal: run `rosrun neato_localization test_gui.py -t 0.1 -s 0.2 -i 0.3` to set time-space-interaction values, or run `rosrun neato_localization test_gui.py -state happy` to set the robot state. You can choose happy/grumpy/sleepy for robot states.
- Input via GUI: run `rosrun neato_localization test_gui.py`. This will bring up the GUI. Choose your inputs and then hit deploy.
All of these user inputs will be published on the `/TSI_values` topic.
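To confirm that the inputs are actually arriving, you can inspect the topic from any laptop terminal using standard ROS introspection commands (no assumptions here beyond the `/TSI_values` topic name given above):

```shell
# Confirm the topic exists among the active topics
rostopic list | grep TSI_values

# Print each message published on /TSI_values as the GUI/terminal node sends it
rostopic echo /TSI_values
```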
- In the second terminal on the Raspberry Pi, run `roslaunch neato_planner pure_pursuit.launch`. This node fetches the path generated by `gen_trajectory_behavior` and has the robot execute it.
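Putting the steps together, the full launch order described above can be summarized as follows (each command in its own terminal, laptop vs. Raspberry Pi as noted):

```shell
## Laptop, terminal 1: overhead-camera localization
roslaunch neato_localization neato_localization.launch

## Raspberry Pi (via SSH), terminal 1: Neato low-level driver
roslaunch neato_controller bringup.launch

## Laptop, terminal 2: waypoint generation for the chosen emotion
roslaunch neato_localization generate_waypoints_for_motion.launch

## Laptop, terminal 3: user input (TSI values or robot state), published on /TSI_values
rosrun neato_localization test_gui.py

## Raspberry Pi, terminal 2: pure-pursuit path follower
roslaunch neato_planner pure_pursuit.launch
```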