CodeSythe/2023-YY_ROVER_ACTIVE
Project Rover

This repo contains all the resources used in the rover project started as part of IROC-2024.

Objectives

  • We have attempted to build a lunar exploration rover, combining features like LiDAR and stereo vision with a sequential operating strategy for accurate mapping of the terrain.
  • Radio communication will be used for manual control, while autonomous commanding will be implemented on the Jetson, based on pre-trained ML models.
  • We have also included high-torque DC motors for improved stability, and a dedicated camera on the arm to ensure precision while picking samples.

Working Idea

The diagram below shows the control flow of the rover.

  • The primary processing unit (Jetson) is responsible for spatial data collection and, using stereo-vision computations, will map out the terrain.
  • The generated map will be used to send instructions to the wheel motors through the roving motor actuation stage, so the rover can navigate the terrain.
  • The Jetson will also be connected over WiFi to the Control Centre, from which the team can access and control the rover and manipulator actions.
  • Individual components (e.g. the arm-mounted camera and motors, the ultrasonic and force sensors, etc.) will be controlled via the secondary processing unit (presumably an Arduino Mega), mainly to reduce the load on the Jetson. This secondary processing unit is entirely responsible for the arm and gripper operations; the ultrasonic and force sensors ensure that the gripper collects the target object with precision.
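The Jetson-to-secondary-unit link above implies some command protocol between the two boards. The repo does not specify a wire format, so the framed codec below is purely illustrative (sync byte, command id, int16 arguments, and an XOR checksum are all assumptions):

```python
import struct

SYNC = 0xAA  # hypothetical start-of-frame marker


def encode_command(cmd_id: int, *args: int) -> bytes:
    """Pack a command as: sync byte, id, arg count, big-endian int16 args, XOR checksum."""
    frame = struct.pack(">BBB", SYNC, cmd_id, len(args))
    frame += b"".join(struct.pack(">h", a) for a in args)
    checksum = 0
    for byte in frame:
        checksum ^= byte
    return frame + bytes([checksum])


def decode_command(frame: bytes):
    """Validate the checksum and unpack (cmd_id, args); raise on a corrupt frame."""
    checksum = 0
    for byte in frame[:-1]:
        checksum ^= byte
    if frame[0] != SYNC or checksum != frame[-1]:
        raise ValueError("corrupt frame")
    cmd_id, n = frame[1], frame[2]
    args = struct.unpack(f">{n}h", frame[3:3 + 2 * n])
    return cmd_id, args
```

The checksum lets the Arduino-side firmware drop frames mangled on the serial line instead of actuating on garbage.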

Current Progress

Algorithm

❌ The primary processing unit (Jetson), based on the stereo-vision computations, will map out the local environment and implement a Simultaneous Localization and Mapping (SLAM) algorithm. Heuristic path-finding, with re-planning whenever an obstacle is encountered, will run repeatedly as the rover autonomously navigates the terrain. The algorithm will also be trained to detect hurdles in the path and will autonomously decide whether to manoeuvre past them or not.
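The heuristic path-finding step can be sketched as a standard A* search over an occupancy grid. This is a generic sketch, not the repo's planner (which is not yet implemented); the 4-connected grid and Manhattan heuristic are assumptions:

```python
import heapq


def astar(grid, start, goal):
    """A* over a 4-connected occupancy grid (0 = free, 1 = obstacle).

    Returns the path as a list of (row, col) cells, or None if unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])  # Manhattan heuristic
    frontier = [(h(start), 0, start, [start])]  # (priority, cost, cell, path)
    best_cost = {start: 0}
    while frontier:
        _, cost, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cell[0] + dr, cell[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                ncost = cost + 1
                if ncost < best_cost.get(nxt, float("inf")):
                    best_cost[nxt] = ncost
                    heapq.heappush(frontier, (ncost + h(nxt), ncost, nxt, path + [nxt]))
    return None
```

The re-planning loop then amounts to: when the sensors report a new obstacle, mark its cell occupied and re-run `astar` from the rover's current cell.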

✅ The pre-trained ML model identifies the required pickup object (in this case, a cylinder). Using the camera attached to the arm, the rover estimates the position of the target and performs pickup operations as necessary (mostly done, except for the position estimation).

Rover Control

✅ Individual software-based control modules have been built for the different mechanical parts of the rover. However, they are yet to be integrated with rover navigation.
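As a flavour of what one such module looks like, here is a minimal differential-drive mapping from a body velocity command to per-wheel speeds. The wheel radius and track width below are placeholder values, not measurements of the actual rover:

```python
from dataclasses import dataclass


@dataclass
class DriveModule:
    """Maps a body velocity command to per-side wheel speeds (differential drive)."""
    wheel_radius_m: float = 0.06   # placeholder geometry
    track_width_m: float = 0.40    # placeholder geometry

    def wheel_speeds(self, linear_mps: float, angular_rps: float):
        """Return (left, right) wheel angular velocities in rad/s."""
        v_left = linear_mps - angular_rps * self.track_width_m / 2
        v_right = linear_mps + angular_rps * self.track_width_m / 2
        return v_left / self.wheel_radius_m, v_right / self.wheel_radius_m
```

Integration with navigation then reduces to the planner emitting `(linear, angular)` commands that this module converts into motor targets.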

✅ Inverse-kinematics-based arm control has also been implemented and tested. However, gripper control and pickup are yet to be implemented. The arm control flowchart is given below.
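For a planar 2-link arm, the inverse kinematics has the familiar closed form below. This is a generic sketch; the rover's actual arm geometry, link lengths, and joint conventions may differ:

```python
import math


def two_link_ik(x, y, l1, l2):
    """Closed-form inverse kinematics for a 2-link planar arm.

    Returns (shoulder, elbow) joint angles in radians for an end-effector
    target (x, y); raises if the target is out of reach.
    """
    r2 = x * x + y * y
    cos_elbow = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)  # law of cosines
    if not -1.0 <= cos_elbow <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(y, x) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow)
    )
    return shoulder, elbow
```

A quick sanity check is to run the result back through forward kinematics and confirm the original target is recovered.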

Control Centre (aka Web Integration)

❌ The proposed web integration module will act as a dashboard from which the team can manually view and control the rover and the arm movements.
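A minimal version of such a dashboard endpoint can be built with Python's standard `http.server`. This is purely illustrative: the proposed module may well use a full web framework, and the state fields shown are hypothetical:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical rover telemetry; a real module would read this from the Jetson.
ROVER_STATE = {"mode": "idle", "battery_pct": 100}


class DashboardHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/status":
            body = json.dumps(ROVER_STATE).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # silence per-request console logging
        pass


def start_dashboard(port=0):
    """Start the dashboard on a background thread; port 0 picks a free port."""
    server = HTTPServer(("127.0.0.1", port), DashboardHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

Manual control would add POST endpoints that translate dashboard actions into the same command frames sent to the drive and arm modules.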

Directory Tree

rover
│
└───Design Models
│   (Mechanical Design Documentation)
│
└───ObjectDetect
│   │   cylinder.py (Working code)
│   │ 
│   └───Dataset (Images used for training)
│   └───runs
│       │
│       └───train2 (Working ML model)
│       
└───Sensor Integration
│   │ (Arduino-based code to control the mechanics of the rover)
│   │ 
│   └───inverse kinematics
│   │   ...

About

Rover being built by the RoboTech Club. It is a continuation of the ISRO Robotics Challenge 2024 (IROC-2024).
