For moving the Frankas:
- `franka_description` for the Franka URDFs;
- `franka_ros2` for the ROS 2 Franka integration. This, in turn, requires `libfranka` to be installed on the machine;
- `franka_ros2_multimanual` for the bimanual setup hardware interface.
For the motion planner:
- clone `mdv_cpp_lib` on the `inverse` branch into the ROS 2 workspace;
For the human body tracker:
- install the `torchure_smplx` library (more instructions there);
- put the `smpl_ros` library in the workspace `src` by following the instructions in the corresponding repository.
- Change the F/T sensor name under `bota_ft_sensor_driver/rokubimini_serial/rokubimini_serial.launch.py` to the right model (e.g. BFT-DENS-SER-M8).
- If compilation errors in `mdv_cpp_lib` arise, comment out the line `#include <rerun/archetypes/series_points.hpp>` in both `rerun.cpp` and `rerun.hpp`.
All other dependencies for the ROS 2 packages can be easily installed through rosdep:
- update the package list: `sudo apt-get update`;
- update the rosdep database: `rosdep update`;
- install the dependencies: `rosdep install --ignore-src -y --from-paths </path/to/ws>/src`
- Task 1: The robot on the right picks and places the small block on the right onto the mounting, and the operator screws in the two screws.
- Task 2: The robot on the left picks and places the big block on the left over the right hole using a visuo-tactile peg-in-hole strategy, then keeps it in place while the human screws it in.
- Task 3: The robot on the left, fully autonomously, picks the cable and puts it on the plug using a visuo-tactile peg-in-hole strategy.
The following scheme is the block diagram of the demo:
```mermaid
---
config:
  theme: 'neutral'
---
graph TD;
    A(Orchestrator)-->B(Motion Planner & Action Monitoring);
    C(Registered Mesh)-->B;
    B-->D(Impedance Controller Robot Left);
    B-->E(Impedance Controller Robot Right);
    F(Human Body Tracker)-->A;
    D-->G(Task 2-3);
    E-->H(Task 1);
```
On one terminal launch the following:
```
ros2 launch inverse_bringup inverse_ur.launch.py
```

This will:
- create the connection with the UR robot; the robot will start using the `cartesian_compliance_controller` with the parameters defined in the `ur_controllers.yaml` file;
- spawn RViz;
- TODO: add all frames to the launch file
- start the motion planner.
To start the execution of the actual task, run the orchestrator on another terminal as follows:

```
ros2 run inverse_orchestrator orchestrator
```

To properly decide which task to execute, you must first make sure to update the `orchestrate` function within the `Orchestrator` node defined in `orchestrator.py`. As of now, within the program execution, human input is required as a trigger to continue the motion plan. This can simply be achieved by pressing Enter on the keyboard whenever the robot is in a grasp/release position. The robot will then take care of resuming the execution.
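The wait-for-Enter pattern described above can be sketched as follows. The step names and helper functions here are illustrative assumptions, not the actual API of the `inverse_orchestrator` package:

```python
# Hypothetical sketch of the human-trigger pattern described above.
# Step names and helpers are illustrative, not the real
# inverse_orchestrator API.

def wait_for_operator(prompt: str) -> None:
    """Block until the operator presses Enter (the human trigger)."""
    input(prompt)

def orchestrate(plan_steps):
    """Run the plan, pausing at grasp/release steps for the operator."""
    executed = []
    for step in plan_steps:
        executed.append(step)
        if step in ("grasp", "release"):
            # The robot holds its pose; execution resumes only once the
            # operator confirms with Enter.
            wait_for_operator(f"Robot at '{step}' pose, press Enter to resume: ")
    return executed
```

The key design point is that the pause is a blocking call at well-defined plan steps, so the motion plan itself never needs to be re-generated after a human intervention.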
To record a human skill (DMP based) that can then be executed by the robot:
- Use the following launch file to start the robot in gravity compensation: `ros2 launch inverse_bringup teach.launch.py`
- Once the robot is in the desired initial configuration, call the `/learn_skill` service as follows:

```
ros2 service call /learn_skill inverse_msgs/srv/LearnSkill "
skill_name: ''
registration_duration_secs: 10.0
num_basis: 30"
```

Feel free to adjust the parameters at will. For the number of basis functions, choose a low number (10-12) for a simple task, and increase up to 40-50 for very complex motions.
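As background for the `num_basis` choice: in a standard DMP, the forcing term is a normalized, weighted sum of Gaussian basis functions over the phase variable, so more basis functions allow the learned weights to reproduce finer motion detail. A minimal sketch in plain Python (not the project's actual implementation):

```python
import math

def forcing_term(x, weights, centers, width=10.0):
    """Normalized weighted sum of Gaussian basis functions, as in a
    standard DMP forcing term. x is the phase variable in [0, 1]."""
    psi = [math.exp(-width * (x - c) ** 2) for c in centers]
    total = sum(psi)
    return sum(w * p for w, p in zip(weights, psi)) / total if total > 0 else 0.0

# With uniform weights the normalized term is constant (a trivial shape);
# varying the weights across many centers is what shapes a complex profile.
centers = [i / 9 for i in range(10)]  # 10 basis functions
print(forcing_term(0.5, [1.0] * 10, centers))  # → 1.0
```

With only 10-12 centers the weight vector can encode a few smooth bumps, which is enough for a simple reach; 40-50 centers give the resolution needed for motions with many direction changes.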
Note:
When learning, only one trajectory of the robot is actually recorded. The recorded pose is that of the TF2 transform of the frame specified in the `node_parameters.yaml` file, specifically in the arguments of the `skill_learner` node.
For the UR setup, the `tool0` frame is the reference for the robot's TCP.
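For intuition on what "recording the pose of a TF2 frame" amounts to, a TF2 lookup chains homogeneous transforms from the base frame to the requested frame. A minimal sketch in plain Python (no ROS; the frame names and offsets are made up for illustration):

```python
# Illustration only: composing 4x4 homogeneous transforms, which is
# what a TF2 lookup of base -> tool0 does under the hood.
# The translations below are invented example values.

def compose(a, b):
    """4x4 homogeneous transform multiplication: result = a * b."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

# base -> wrist: pure translation of 0.5 m along z (example value)
T_base_wrist = [[1, 0, 0, 0.0],
                [0, 1, 0, 0.0],
                [0, 0, 1, 0.5],
                [0, 0, 0, 1.0]]
# wrist -> tool0: translation of 0.1 m along z (example value)
T_wrist_tool0 = [[1, 0, 0, 0.0],
                 [0, 1, 0, 0.0],
                 [0, 0, 1, 0.1],
                 [0, 0, 0, 1.0]]

T_base_tool0 = compose(T_base_wrist, T_wrist_tool0)
# tool0 origin expressed in the base frame (the translation column):
print([row[3] for row in T_base_tool0[:3]])  # → [0.0, 0.0, 0.6]
```

This is why pointing the `skill_learner` node at a different frame changes what gets recorded: the learner simply logs the composed transform of whatever frame name it is given.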