This ROS package uses YOLOv8 for real-time object detection on images received from a camera topic. It detects specific objects, such as stop signs, tires, and pedestrians, on request.
Dependencies:
- ROS (Robot Operating System)
- YOLOv8
- OpenCV
- Ultralytics
Installation:
Clone this repository into your ROS workspace's src folder:

    git clone https://github.com/LTU-Actor/Route-Yolov8.git
Build the ROS package:

    cd <ros_workspace>
    catkin_make
Launch the ROS node:

    roslaunch ltu_actor_route_yolov8_detector yolov8_detector.launch
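As a rough sketch of what a minimal launch configuration for this node could look like (the node name, script name, and parameter names below are assumptions, not the package's actual values; check `yolov8_detector.launch` for the real ones):

```xml
<!-- Hypothetical sketch; the real names live in yolov8_detector.launch -->
<launch>
  <node pkg="ltu_actor_route_yolov8_detector" type="detect.py" name="yolov8_detector" output="screen">
    <!-- Camera image topic the detector subscribes to (assumed parameter name) -->
    <param name="imgtopic" value="/camera/image_raw" />
  </node>
</launch>
```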
Usage:
Subscribe to the relevant topics to trigger object detection. Set these parameters during launch:
- Image Topic: `<imgtopic>`
- Object Detection Topic: `<look_for_object_topic_name>` (string type; add unique strings and models in `detect.py` as needed)
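With the node running, detection for a given object is requested by publishing its string to the object-detection topic. A hedged example using the standard `rostopic` tool, assuming the topic is named `/look_for_object` and that `detect.py` recognizes the string `stop sign` (both are assumptions; substitute your actual `<look_for_object_topic_name>`):

```
rostopic pub -1 /look_for_object std_msgs/String "data: 'stop sign'"
```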
View the results:
The package publishes detection results for stop signs, tires, and pedestrians. Detected objects, along with their sizes, are published to the topics listed under Outputs.
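The README does not spell out how "size" is computed; one plausible convention (an assumption, not the package's documented behavior) is the pixel width, height, and area of the detected bounding box, which YOLOv8 reports in `xyxy` form:

```python
def box_size(x1: float, y1: float, x2: float, y2: float) -> tuple:
    """Width, height, and area (in pixels) of an xyxy bounding box.

    Hypothetical helper illustrating one way a 'size' value could be
    derived from a YOLOv8 detection; detect.py may use another metric.
    """
    w = max(0.0, x2 - x1)
    h = max(0.0, y2 - y1)
    return w, h, w * h

# A 20x40 px box has area 800 px^2
print(box_size(10.0, 20.0, 30.0, 60.0))  # → (20.0, 40.0, 800.0)
```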
Debugging:
If debugging is enabled (via rqt dynamic reconfigure), annotated images showing the detected objects are published to the debug topics.
The package supports dynamic reconfiguration through the ROS Parameter Server. Parameters such as image resizing, flipping, and debugging can be adjusted on the fly.
YOLOv8 models can be added or swapped out in the /models folder, with their names updated in the launch file.
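Since new request strings and models are registered in `detect.py`, that extension point can be sketched as a lookup from request string to model file. All names and paths below are hypothetical, not the package's actual code:

```python
# Hypothetical mapping from request strings to model files in /models;
# the real association is defined inside detect.py and the launch file.
MODEL_FOR_REQUEST = {
    "stop sign": "models/yolov8n.pt",  # COCO-pretrained model covers stop signs
    "person": "models/yolov8n.pt",     # COCO-pretrained model covers people
    "tire": "models/tire.pt",          # custom-trained model (assumed filename)
}

def model_for(request: str) -> str:
    """Normalize a request string and return the model file to load."""
    key = request.strip().lower()
    if key not in MODEL_FOR_REQUEST:
        raise KeyError(f"unknown object request: {request!r}")
    return MODEL_FOR_REQUEST[key]
```

Under this sketch, adding a new detectable object amounts to dropping a model into /models, registering its request string, and updating the launch file.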
Inputs:
- Image Topic: `<imgtopic_name>`
- Input String Topic: `<look_for_object_topic_name>`
Outputs:

Stop Sign Detection:
- Detected Topic: `<stop_sign_detected_topic_name>`
- Size Topic: `<stop_sign_size_topic_name>`
- Debug Topic: `<stop_sign_debug_topic_name>`
Person Detection:
- Detected Topic: `<person_detected_topic_name>`
- Size Topic: `<person_size_topic_name>`
- Debug Topic: `<person_debug_topic_name>`
Tire Detection:
- Detected Topic: `<tire_detected_topic_name>`
- Size Topic: `<tire_size_topic_name>`
- Debug Topic: `<tire_debug_topic_name>`
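While the node runs, the output topics above can be inspected from the command line with the standard `rostopic` tool; for example (substitute the actual topic name from your launch configuration):

```
rostopic echo <stop_sign_detected_topic_name>
```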
Contributing:
Contributions are welcome!

License:
This project is licensed under the MIT License.