flatypus/umbrella

Umbrella

World's first homemade, autonomous tracking, flying drone umbrella.

It's done. My high school dream.


"No one is going to do this. Not a SINGLE person who reads this code will do this" - John IBuildStuff

This project has been in the works for three years now, since July 2022, when I barely knew any programming and John was still building rockets. Some of our earliest designs were honestly terrible, like one involving four drones and a cloth, each lifting one corner. Months later, as John's channel was really taking off, he spent a few months putting together the first prototype, which eventually turned into this incredible video. While the original idea of 'umbrella on drone' would probably have made for a boring video in my hands, John's filmmaking really shines through. After the video was posted, dozens of people started discussing the possibility of it following you around hands-free, including one Intel engineer who commented, "It doesn't follow you autonomously, but it would be easy enough to do." Unfortunately, we aren't PhD software engineers, so it took us a little more time to figure out.

ToFLiveTrack.mp4

BTW if ANYONE can explain why we can see that reflection of the umbrella drone in the depth view, I'd love to know. Make an issue!

Common video questions:

  • Q: Why is the tracking so drifty?
    • A: Several reasons; primarily, we implemented PID control with the camera but didn't integrate any other sensors (e.g. IMU, GPS) to perform really smooth tracking. I'd love to give it a shot sometime with a Kalman filter plus positional estimates, but keep in mind that every time we went out, I had at most 16 minutes across two batteries to test new code.
  • Q: That doesn't explain why the motion isn't smooth, though.
    • A: Wind conditions were ridiculous. Imagine running with an umbrella out; the faster you go into the wind, the more it wants to invert and give out, right? With the drone umbrella, the problem is that to move in any direction the drone must tilt, exposing more surface area to the wind. It's only completely stable when flat. The video also doesn't show how often it goes into a rocking motion, where moving forward a bit would induce it to tilt all the way back, which makes it very hard to control. I wanted to add yaw control to actually turn the drone while it was moving, but ultimately didn't have time to get it working. If anyone's up for a challenge, I'd love to see a control theory buff give it a shot!
  • Q: Why does the umbrella hover so high above you? Shouldn't it be lower?
    • A: Yes, and we thought about this for a while. First of all, those blades are fast. John refused to fly it less than a meter above our heads, so as a safety measure I hard-coded a minimum height of 3 meters above ground. Additionally, the ToF camera really doesn't have the best field of view, and flying it too close means that even a normal walking pace takes you out of the sensing area in less than one step.
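The "Kalman filter + positional estimates" idea mentioned above can be sketched as a simple constant-velocity filter over each tracked axis. To be clear, this is not what flew in the video (that was raw camera readings plus PID); the class and noise values below are illustrative:

```python
class Kalman1D:
    """Constant-velocity Kalman filter over one axis of the person's
    position, smoothing a noisy per-frame camera estimate.

    Sketch only: noise values are placeholders, and process noise is
    simplified to a diagonal addition."""

    def __init__(self, process_noise=0.05, measurement_noise=4.0):
        self.x, self.v = 0.0, 0.0  # state: position and velocity
        self.p00, self.p01, self.p10, self.p11 = 1.0, 0.0, 0.0, 1.0  # covariance
        self.q = process_noise        # how much we trust the motion model
        self.r = measurement_noise    # how jittery the camera estimate is
        self.ready = False

    def update(self, z, dt):
        if not self.ready:
            # Initialize on the first measurement
            self.x, self.ready = z, True
            return self.x
        # Predict: constant-velocity model x' = x + v*dt
        self.x += self.v * dt
        p00 = self.p00 + dt * (self.p01 + self.p10 + dt * self.p11) + self.q
        p01 = self.p01 + dt * self.p11
        p10 = self.p10 + dt * self.p11
        p11 = self.p11 + self.q
        # Correct with the camera measurement z (observation H = [1, 0])
        s = p00 + self.r
        k0, k1 = p00 / s, p10 / s
        innovation = z - self.x
        self.x += k0 * innovation
        self.v += k1 * innovation
        self.p00, self.p01 = (1.0 - k0) * p00, (1.0 - k0) * p01
        self.p10, self.p11 = p10 - k1 * p00, p11 - k1 * p01
        return self.x
```

One filter per horizontal axis would feed the PID a smoothed position (and a free velocity estimate) instead of the raw per-frame detection.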

Hardware Setup:

  • Obtain:
  • With Raspberry Pi Imager, flash Ubuntu 24.04 server (for Pi CM4)
    • Make sure to enable SSH and WiFi
  • Clone the ToF SDK: git clone https://github.com/ArduCAM/Arducam_tof_camera.git
    • cd Arducam_tof_camera and run ./Install_dependencies.sh
    • This will add dtoverlay=arducam-pivariety to /boot/firmware/config.txt
    • Reboot now
  • Run sudo raspi-config
    • Options > Serial
    • Login shell over serial? → No
    • Enable serial hardware port? → Yes
  • Edit /boot/firmware/config.txt
    • Make sure the following lines are below the first [all]:
      • dtoverlay=disable-bt
      • dtoverlay=arducam-pivariety (move the line added earlier by the install script up here)
      • enable_uart=1
    • Reboot now
  • Test that you can connect to the flight controller (FC)
    • ls -l /dev/serial* /dev/ttyAMA* /dev/ttyS*
    • You should be able to see /dev/ttyAMA0
  • (Optional) Check that camera is connected
    • sudo dmesg | grep -i arducam
    • sudo v4l2-ctl --list-devices
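With the edits above applied, the relevant block of /boot/firmware/config.txt should end up looking roughly like this (the rest of the file varies per install):

```ini
[all]
dtoverlay=disable-bt
dtoverlay=arducam-pivariety
enable_uart=1
```

disable-bt frees the Pi's full UART (/dev/ttyAMA0) from Bluetooth so it can talk to the flight controller.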

Development Setup (Docker):

  1. docker-compose up -d
  2. docker-compose exec umbrella bash
  3. colcon build --symlink-install (ALWAYS run this from the workspace root, not src/!)
  4. source install/setup.bash
  5. On subsequent starts, run steps 1-2 only (no need to rebuild or re-source)

To rebuild the docker image, run docker-compose up --build -d

(BTW, I wasn't using ROS in the video demos, but I cleaned up the repository to be somewhat more manageable, so if anyone wants to give simulation a shot in Gazebo, they're welcome to fork and try!)

Development:

  • You can directly edit the code if you already have ROS2 set up on your system, but most likely you don't.
  • Without a local ROS2 setup, you won't have autocomplete, type definitions, and the other things you probably want to develop with.
  • Instead, you can either use Dev Containers, or just 'Remote-SSH' with ssh root@localhost -p 2267. The password is set in the Dockerfile as thispasswordissecure. Either of these methods should get you the proper type definitions; if not, make sure the interpreter is set to /usr/bin/python3.
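For reference, the SSH workflow above implies the compose file maps host port 2267 onto the container's sshd. A hypothetical sketch of what that service definition might look like (the repo's real docker-compose.yml is authoritative; the service name, build context, and volume path here are assumptions):

```yaml
services:
  umbrella:
    build: .                # image with ROS2 + sshd baked in (assumption)
    ports:
      - "2267:22"           # host 2267 -> container sshd, for Remote-SSH
    volumes:
      - .:/workspace        # mount the source tree for live editing (assumption)
```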

ROS2 Architecture

This project is structured as a ROS2 workspace with three packages:

  • umbrella_bringup: Launch files and health check
  • umbrella_sensing: Camera node for ToF camera processing and person tracking
  • umbrella_control: Flight control node with PID controllers and MAVSDK integration

Dependencies

  • We use uv to manage the Python dependencies, but NOT to run the project. That is, add packages with uv add, then sync them to the system Python with ./install.sh.
  • Disregard the .venv/; ROS itself has its own types and dependencies that we need.

Execution

  • Turn on a Mobile Hotspot (set network SSID and password to one recognized by the Pi in setup)
  • SSH into the Raspberry Pi when on the same network, or use Tailscale
    • i.e. set up a Tailscale network, add the Pi to it, and enable SSH with tailscale set --ssh
    • Now you can ssh user@ip
  • Now, to set up the drone for flying:
    • Run ros2 run umbrella_bringup health_check to verify connections
    • Build the workspace with ./build.sh
    • Launch the system with ./launch.sh
    • Once the user puts the drone in OFFBOARD mode, the drone will rise to ~3 meters and hold altitude.
    • Walk under the drone. It should now be following you.
    • Tune PIDs for optimal tracking in src/umbrella_control/umbrella_control/flight_control.py
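The actual controllers live in src/umbrella_control/umbrella_control/flight_control.py; as a tuning aid, here is a minimal standalone sketch of the camera-offset-to-velocity PID loop (gains, names, and the output limit are illustrative placeholders, not the repo's values):

```python
class PID:
    """Minimal PID controller: turns the person's offset from the camera
    centre into a velocity setpoint for the flight controller.

    Sketch only -- see flight_control.py for the real implementation."""

    def __init__(self, kp, ki, kd, output_limit=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.limit = output_limit
        self.integral = 0.0
        self.prev_error = None

    def step(self, error, dt):
        self.integral += error * dt
        # No derivative kick on the very first sample
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        # Clamp so a large tracking error can't command an aggressive tilt
        return max(-self.limit, min(self.limit, out))

# One controller per horizontal axis; gains are placeholders to tune
pid_x = PID(kp=0.8, ki=0.05, kd=0.2)
pid_y = PID(kp=0.8, ki=0.05, kd=0.2)

# error = normalized offset of the detected person from the image centre
vx = pid_x.step(error=0.25, dt=0.05)
```

Given the wind behavior described earlier, the output clamp matters as much as the gains: it bounds how far the drone is allowed to tilt while chasing a large error.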

Utility Scripts

  • ./build.sh - Build the ROS2 workspace
  • ./launch.sh - Launch the umbrella system
  • ./install.sh - Sync uv dependencies to system Python
  • ./kill.sh - Kill all ROS2 processes and restart daemon

On old Ubuntu versions/Jetson

docker compose up -d
docker compose exec umbrella bash
