A comprehensive autonomous navigation system featuring a mobile bot that explores environments using SLAM (Simultaneous Localization and Mapping) and RRT* path planning. The system ensures safe navigation by preventing movement into unknown areas.
This project implements a complete autonomous navigation stack with two main components:
- Mobile Bot (ESP32): Hardware platform that performs environment scanning and movement execution
- Central Computer (Windows): Server system that processes sensor data, runs SLAM algorithms, and plans safe paths
Key features:

- Real-time Environment Mapping: Uses an ultrasonic sensor to build 2D maps of the surroundings
- Safe Navigation: Implements RRT* path planning with strict unknown-area avoidance
- Autonomous Exploration: Frontier-based exploration for efficient environment coverage
- Live Visualization: Real-time topology mapping and bot position tracking
- Multi-module Architecture: Clean separation of concerns with well-defined interfaces
```text
┌─────────────────┐    WiFi/TCP-IP    ┌──────────────────┐
│   Mobile Bot    │◄─────────────────►│ Central Computer │
│   (ESP32/MCU)   │                   │    (Windows)     │
└─────────────────┘                   └──────────────────┘
         │                                      │
┌────────┴────────┐                  ┌──────────┴────────┐
│ Hardware Modules│                  │ Software Modules  │
│ • Ultrasonic    │                  │ • SLAM Navigator  │
│ • Servo Motor   │                  │ • RRT* Planner    │
│ • Stepper Motor │                  │ • Topology Viz    │
│ • WiFi          │                  │ • Server Comms    │
└─────────────────┘                  └───────────────────┘
```
```text
bot_side/
├── boot.py                    # Hardware initialization & system boot
├── main.py                    # Main controller & workflow orchestration
├── area_scanner.py            # Ultrasonic sensor scanning module
├── movement_controller.py     # Stepper motor control for locomotion
└── communication_handler.py   # WiFi communication with central computer
```
```text
server_side/
├── main_system.py             # Main autonomous navigation system
├── server_comms.py            # Bot communication handler
├── slam_navigator.py          # SLAM & RRT* path planning
└── topology_visualizer.py     # Environment visualization
```
- Microcontroller: ESP32 development board
- Sensors: HC-SR04 Ultrasonic distance sensor
- Actuators:
  - SG90 servo motor (for sensor scanning)
  - 28BYJ-48 stepper motor with ULN2003 driver (for locomotion)
- Power: 5V power supply for motors, 3.3V for ESP32
- Connectivity: WiFi for communication
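As background for the ultrasonic hardware above: the HC-SR04 reports distance as an echo pulse width, which converts to centimeters using the speed of sound. This is standard sensor math, not code from this project:

```python
SPEED_OF_SOUND_CM_PER_US = 0.0343  # ~343 m/s at room temperature

def pulse_to_cm(echo_us: float) -> float:
    """Convert an HC-SR04 echo pulse width (microseconds) to distance in cm.

    The pulse spans the round trip to the obstacle and back,
    hence the division by two.
    """
    return echo_us * SPEED_OF_SOUND_CM_PER_US / 2

print(round(pulse_to_cm(5830), 1))  # → 100.0 (obstacle at roughly 1 m)
```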
- OS: Windows 10/11
- Python: 3.7 or higher
- Dependencies: NumPy, Matplotlib, SciPy
Upload the MicroPython files to the ESP32 (for example with `ampy`):

```shell
ampy --port COM3 put boot.py
ampy --port COM3 put main.py
ampy --port COM3 put area_scanner.py
ampy --port COM3 put movement_controller.py
ampy --port COM3 put communication_handler.py
```

Hardware Connections:
```python
# GPIO Pin Configuration (update in boot.py)
LED_PIN = 2                 # Status LED
SERVO_PIN = 13              # Servo motor control
STEPPER_STEP_PIN = 12       # Stepper motor step
STEPPER_DIR_PIN = 14        # Stepper motor direction
ULTRASONIC_TRIGGER_PIN = 5  # Sensor trigger
ULTRASONIC_ECHO_PIN = 18    # Sensor echo
```

WiFi Configuration:
Update the credentials in `boot.py` and `communication_handler.py`:

```python
ssid = 'YOUR_WIFI_SSID'
password = 'YOUR_WIFI_PASSWORD'
server_ip = '192.168.1.100'  # Central computer IP
```

On the central computer:

```shell
# Clone the repository
git clone https://github.com/yourusername/autonomous-mobile-bot.git
cd autonomous-mobile-bot

# Install Python dependencies
pip install numpy matplotlib scipy

# Start the central server
python server_side/main_system.py
```
1. Power the Mobile Bot. The bot will automatically:
   - Initialize hardware components
   - Connect to WiFi
   - Send system status to the central computer
   - Wait for the start signal

2. Start the Central Server:

   ```shell
   python server_side/main_system.py
   ```

3. Begin Autonomous Operation:
   - Use the command interface to monitor system status
   - The system automatically begins scanning and navigating
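The bot-side workflow above can be sketched as a small controller loop. The class and method names here are illustrative assumptions, not the actual `main.py` API; the comms, scanner, and mover stubs stand in for the real hardware and network modules:

```python
# Minimal sketch of the bot-side workflow: boot, connect, wait for the
# start signal, then alternate scan / transmit / move cycles.

class BotController:
    def __init__(self, comms, scanner, mover):
        self.comms, self.scanner, self.mover = comms, scanner, mover

    def run(self, max_cycles=10):
        self.comms.send_status("ready")         # 1. report boot status
        self.comms.wait_for_start()             # 2. block until the server says go
        for _ in range(max_cycles):
            scan = self.scanner.scan_area()     # 3. 180-degree ultrasonic sweep
            self.comms.send_scan(scan)          # 4. ship readings to the server
            cmd = self.comms.receive_command()  # 5. next movement from the planner
            if cmd["type"] == "stop":
                break
            self.mover.execute(cmd)             # 6. drive the stepper motors

# In-memory stand-ins for the real modules:
class FakeComms:
    def __init__(self, commands):
        self.commands, self.log = list(commands), []
    def send_status(self, s): self.log.append(("status", s))
    def wait_for_start(self): self.log.append(("start", None))
    def send_scan(self, scan): self.log.append(("scan", scan))
    def receive_command(self): return self.commands.pop(0)

class FakeScanner:
    def scan_area(self): return [100] * 18      # 18 dummy distance readings

class FakeMover:
    def __init__(self): self.moves = []
    def execute(self, cmd): self.moves.append(cmd)

comms = FakeComms([{"type": "move", "cm": 10}, {"type": "stop"}])
mover = FakeMover()
BotController(comms, FakeScanner(), mover).run()
```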
The central server provides an interactive command interface:

```text
ANS> status     # Show system status
ANS> bots       # List connected bots
ANS> save_map   # Save current map as image
ANS> help       # Show available commands
ANS> stop       # Gracefully shut down the system
```

Bot-side movement and scanning parameters:

```python
GLOBALS['config'] = {
    'movement': {
        'steps_per_revolution': 200,
        'wheel_diameter_cm': 6.5,
        'wheel_base_cm': 15.0,
        'step_delay_ms': 2
    },
    'scanning': {
        'scan_angle_min': 0,
        'scan_angle_max': 180,
        'scan_step_angle': 10,
        'servo_speed_delay': 0.3
    }
}
```

Server-side exploration parameters:

```python
self.exploration_goal_distance = 1.0  # meters
self.safety_margin = 0.3              # meters
```

SLAM implementation:

- Occupancy Grid Mapping: Probabilistic grid-based environment representation
- Scan Matching: Simplified pose estimation using sensor data
- Ray Casting: Bresenham's algorithm for efficient free-space marking
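The ray-casting step can be sketched with a plain Bresenham line walk over the occupancy grid: cells between the bot and the detected obstacle are marked free, and the endpoint occupied. This is a simplified stand-in, not the actual `slam_navigator.py` code:

```python
import numpy as np

UNKNOWN, FREE, OCCUPIED = -1, 0, 1

def bresenham(x0, y0, x1, y1):
    """Yield the integer grid cells on the line from (x0, y0) to (x1, y1)."""
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx, sy = (1 if x0 < x1 else -1), (1 if y0 < y1 else -1)
    err = dx - dy
    while True:
        yield x0, y0
        if (x0, y0) == (x1, y1):
            return
        e2 = 2 * err
        if e2 > -dy:
            err -= dy
            x0 += sx
        if e2 < dx:
            err += dx
            y0 += sy

def integrate_ray(grid, bot, hit):
    """Mark cells between bot and hit as FREE, the hit cell as OCCUPIED."""
    cells = list(bresenham(*bot, *hit))
    for x, y in cells[:-1]:
        grid[y, x] = FREE        # free space the sound pulse traversed
    hx, hy = cells[-1]
    grid[hy, hx] = OCCUPIED      # the obstacle that produced the echo

grid = np.full((10, 10), UNKNOWN)
integrate_ray(grid, bot=(0, 0), hit=(5, 3))
```

A probabilistic implementation would accumulate log-odds per cell instead of overwriting, but the traversal itself is the same.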
RRT* path planning:

- Safety First: Strict avoidance of unknown areas
- Optimal Paths: Asymptotically optimal through tree rewiring
- Real-time Planning: Efficient sampling-based approach
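In outline, each RRT* iteration samples a point, connects it to the cheapest nearby parent, and rewires neighbors through the new node whenever that lowers their cost. The sketch below is a stripped-down, obstacle-free illustration of that rewiring step, not the project's actual planner:

```python
import math, random

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def rrt_star_step(nodes, parent, cost, sample, radius=0.5):
    """One RRT* iteration on an obstacle-free plane (steering cap omitted)."""
    new = sample
    # Choose the cheapest parent among nodes within `radius`
    # (fall back to the nearest node if none are in range).
    near = [n for n in nodes if dist(n, new) <= radius] or \
           [min(nodes, key=lambda n: dist(n, new))]
    best = min(near, key=lambda n: cost[n] + dist(n, new))
    nodes.append(new)
    parent[new] = best
    cost[new] = cost[best] + dist(best, new)
    # Rewire: route neighbors through `new` when that is cheaper.
    # (Propagating the improved cost to their descendants is omitted here.)
    for n in near:
        c = cost[new] + dist(new, n)
        if c < cost[n]:
            parent[n], cost[n] = new, c

root = (0.0, 0.0)
nodes, parent, cost = [root], {root: None}, {root: 0.0}
random.seed(1)
for _ in range(200):
    rrt_star_step(nodes, parent, cost, (random.random(), random.random()))
```

A safety-aware version would reject any sample or edge that crosses unknown or occupied grid cells before accepting the node.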
Frontier-based exploration:

- Autonomous Discovery: Automatically identifies exploration targets
- Known Area Constraint: Only moves within mapped regions
- Efficient Coverage: Maximizes new-area discovery per movement
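Frontier cells are free cells bordering unknown space, and they can be found with a simple neighborhood test. This is an illustrative sketch of the idea, not the project's exact code:

```python
import numpy as np

UNKNOWN, FREE, OCCUPIED = -1, 0, 1

def find_frontiers(grid):
    """Return (row, col) cells that are FREE and 4-adjacent to UNKNOWN."""
    frontiers = []
    rows, cols = grid.shape
    for r in range(rows):
        for c in range(cols):
            if grid[r, c] != FREE:
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr, nc] == UNKNOWN:
                    frontiers.append((r, c))
                    break
    return frontiers

grid = np.full((5, 5), UNKNOWN)
grid[0:3, 0:3] = FREE    # a mapped free patch in one corner
grid[1, 1] = OCCUPIED    # one obstacle inside it
print(find_frontiers(grid))
```

Picking the nearest (or largest cluster of) frontier cells as the next goal keeps the bot inside mapped regions while steadily expanding them.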
The system provides multiple forms of feedback:
- Real-time Topology Maps: Live updating environment visualization
- System Logs: Comprehensive logging for debugging and monitoring
- Statistics: Mapping coverage, path efficiency, and system performance
- Saved Maps: High-resolution images of explored environments
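Exporting a map image takes only a few lines with Matplotlib. The snippet below is a minimal stand-alone sketch of this kind of export, independent of the project's `topology_visualizer.py`:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend: render straight to a file
import matplotlib.pyplot as plt

def save_map(grid, path="map.png"):
    """Render an occupancy grid (-1 unknown, 0 free, 1 occupied) to an image."""
    fig, ax = plt.subplots(figsize=(4, 4), dpi=200)   # high-resolution output
    ax.imshow(grid, cmap="gray_r", vmin=-1, vmax=1, origin="lower")
    ax.set_title("Explored environment")
    fig.savefig(path, bbox_inches="tight")
    plt.close(fig)

demo = np.full((20, 20), -1)
demo[5:15, 5:15] = 0     # explored free area
demo[10, 10] = 1         # one detected obstacle
save_map(demo)
```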
1. WiFi Connection Failed
   - Verify the SSID and password in both `boot.py` and `communication_handler.py`
   - Check the central computer's firewall settings

2. Sensor Reading Errors
   - Verify the ultrasonic sensor wiring (Trigger, Echo, VCC, GND)
   - Check for obstructions in the sensor's path

3. Motor Control Issues
   - Verify the stepper motor driver connections
   - Check that the power supply is adequate for the motors

4. Communication Timeouts
   - Ensure both devices are on the same network
   - Verify the central server's IP address configuration
- Enable detailed logging in both bot and server
- Use the status LED patterns for hardware diagnostics
- Check serial monitor for ESP32 debug output
- Use the command interface for real-time system monitoring
- Mapping Accuracy: Sub-5cm resolution occupancy grids
- Path Planning: 1000 iterations for optimal path finding
- Communication: Reliable TCP with checksum verification
- Scan Speed: 18-point 180° scan in ~6 seconds
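The "TCP with checksum verification" point can be illustrated with a simple length-plus-CRC framing scheme; the actual wire format in `server_comms.py` may differ:

```python
import json, struct, zlib

def pack_message(payload: dict) -> bytes:
    """Frame a JSON payload as [4-byte length][4-byte CRC32][body]."""
    body = json.dumps(payload).encode()
    return struct.pack("!II", len(body), zlib.crc32(body)) + body

def unpack_message(frame: bytes) -> dict:
    """Verify the CRC32 and decode the body; raise on corruption."""
    length, crc = struct.unpack("!II", frame[:8])
    body = frame[8:8 + length]
    if zlib.crc32(body) != crc:
        raise ValueError("checksum mismatch: frame corrupted in transit")
    return json.loads(body)

frame = pack_message({"type": "scan", "angle": 90, "distance_cm": 42.0})
assert unpack_message(frame)["distance_cm"] == 42.0
```

The fixed-size length prefix also lets the receiver reassemble messages that arrive split across multiple TCP reads.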
- Multi-bot coordination and swarm intelligence
- Advanced SLAM algorithms (ORB-SLAM, Cartographer)
- 3D environment mapping with additional sensors
- Web-based dashboard for remote monitoring
- Machine learning for improved path planning
This project is licensed under the MIT License - see the LICENSE file for details.
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add some amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
- Hector SLAM algorithm by Stefan Kohlbrecher et al.
- RRT* path planning algorithm by Sertac Karaman and Emilio Frazzoli
- MicroPython community for embedded systems support
- Matplotlib and NumPy communities for visualization and computation tools
Note: This project is for educational and research purposes. Always ensure safe operation when deploying autonomous systems in real-world environments.
For questions and support, please open an issue on GitHub or contact the development team.