Helios - SLAM Agent & Supervisor System

Helios is a real-time SLAM (Simultaneous Localization and Mapping) system designed for autonomous vehicles, consisting of a Python-based agent for sensor processing and navigation, and a Next.js-based supervisor for 3D visualization and monitoring.

IMPORTANT NOTICE

This project is still ongoing; while it already has some capabilities, it is by no means complete.

πŸ—οΈ System Architecture

The system is divided into two main components:

1. Agent (Python)

The autonomous navigation and SLAM processing engine that runs on the vehicle hardware.

2. Supervisor (Next.js + React)

A web-based 3D visualization dashboard for real-time monitoring and point cloud visualization.


🤖 Agent System

Overview

The agent is a multi-threaded Python application that integrates sensor data, performs SLAM calculations, and controls vehicle navigation. It processes data from LiDAR, gyroscope, and optical flow sensors to build a real-time 3D map of the environment.

Core Components

📡 Sensor System (src/sensors/)

PointHandler (Argus)

  • Central sensor fusion hub that coordinates all sensors
  • Manages synchronized data collection from multiple sensors using threading barriers
  • Supports two operational modes:
    • Static Mode: For stationary platforms (LiDAR + Gyro)
    • Moving Mode: For moving platforms (LiDAR + Gyro + Optical Flow)
  • Uses multi-threading with queues for high-frequency data processing (supports up to 50,000 Hz)
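
The barrier-and-queue pattern above can be pictured with a minimal sketch; the names below are illustrative and do not correspond to the actual PointHandler (Argus) API:

import queue
import random
import threading
import time

# Minimal sketch (hypothetical names): each sensor thread blocks on a
# shared barrier so samples from one scan cycle stay aligned, then
# hands its reading to a queue drained by the point-calculation thread.
NUM_SENSORS = 2                      # static mode: LiDAR + Gyro
scan_barrier = threading.Barrier(NUM_SENSORS)
readings = queue.Queue()

def sensor_worker(name):
    for _ in range(100):             # bounded loop for the sketch
        sample = random.random()     # stand-in for a real sensor read
        scan_barrier.wait()          # wait until every sensor has sampled
        readings.put((name, sample))

for sensor in ("lidar", "gyro"):
    threading.Thread(target=sensor_worker, args=(sensor,), daemon=True).start()

time.sleep(0.1)                      # let the workers produce a few cycles
while not readings.empty():
    print(readings.get())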

Supported Sensors:

  • LidarSensor: TFmini-S distance measurements via serial (115200 baud)
  • GyroSensor: Roll, pitch, yaw orientation data via MSP protocol
  • OpticalFlowSensor: Position tracking for moving platforms

Point Calculation (calculations.py)

  • calc_point_static(): Transforms sensor readings into 3D coordinates for stationary platforms
  • calc_point_moving(): Accounts for vehicle motion during point calculation
  • Uses rotation matrices and trigonometric calculations with Numba JIT compilation for performance
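
As a rough illustration, a static point transform of this kind might look like the sketch below; the real calc_point_static() signature and angle conventions may differ, and the yaw/pitch inputs here are assumed to come from the gyro in degrees:

import math
from numba import njit

# Hypothetical sketch: convert one LiDAR range plus yaw/pitch (degrees)
# into Cartesian coordinates via a spherical-to-Cartesian transform.
@njit(cache=True)
def calc_point_static(distance, yaw_deg, pitch_deg):
    yaw = math.radians(yaw_deg)
    pitch = math.radians(pitch_deg)
    x = distance * math.cos(pitch) * math.cos(yaw)
    y = distance * math.cos(pitch) * math.sin(yaw)
    z = distance * math.sin(pitch)
    return x, y, z

print(calc_point_static(2.5, 45.0, 10.0))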

🧭 Navigation System (src/nereus/)

Nereus Navigator

  • Manages autonomous exploration and path planning
  • Implements two operational modes:
    • Auto Mode: Autonomous exploration with random movement patterns
    • Manual Mode: Interactive user control

Key Features:

  • Point Clustering: Groups nearby points using distance-based algorithms
  • 360° Scanning: Rotational scanning for comprehensive environment mapping
  • Slice Analysis: Analyzes point cloud slices for navigation decisions
  • Real-time Processing: Converts incoming sensor points to navigable clusters

Navigation Algorithms (nav_math.py)

  • Distance calculations and angular placement checking
  • Polar coordinate transformations
  • Group detection and clustering algorithms
  • Optimized with Numba for real-time performance
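
A minimal sketch of distance-based grouping in this style (not the actual nav_math.py implementation) could look like:

import numpy as np
from numba import njit

# Hypothetical sketch: walk points in scan order and start a new group
# whenever the gap to the previous point exceeds a threshold.
@njit(cache=True)
def cluster_points(points, max_gap):
    groups = np.zeros(points.shape[0], dtype=np.int64)
    group_id = 0
    for i in range(1, points.shape[0]):
        dx = points[i, 0] - points[i - 1, 0]
        dy = points[i, 1] - points[i - 1, 1]
        dz = points[i, 2] - points[i - 1, 2]
        if (dx * dx + dy * dy + dz * dz) ** 0.5 > max_gap:
            group_id += 1
        groups[i] = group_id
    return groups

pts = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [2.0, 0.0, 0.0], [2.1, 0.0, 0.0]])
print(cluster_points(pts, 0.5))   # -> [0 0 1 1]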

🚗 Vehicle Control (src/vehicle/)

Ground Vehicle (Car)

  • Dual motor control using PWM and GPIO pins
  • Motor driver integration (TB6612FNG compatible)
  • Speed control: -100 to +100 range
  • Braking and coasting modes
  • Interactive control mode for manual operation

Hardware Pins:

  • Motor A: PWM=12, IN1=17, IN2=22
  • Motor B: PWM=13, IN1=5, IN2=6
  • Standby: GPIO 27
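
Using gpiozero primitives and the pins above, a single motor channel could be sketched roughly as follows (illustrative only; the real ground.py class layout may differ, and this requires a Raspberry Pi or compatible board):

from gpiozero import DigitalOutputDevice, PWMOutputDevice

# Hypothetical sketch of one TB6612FNG channel (Motor A pins above).
pwm_a = PWMOutputDevice(12)         # speed via PWM duty cycle (0.0-1.0)
in1_a = DigitalOutputDevice(17)     # direction pin 1
in2_a = DigitalOutputDevice(22)     # direction pin 2
standby = DigitalOutputDevice(27)   # driver standby (high = enabled)

def set_speed(speed):
    """Map a -100..+100 speed value onto direction pins and PWM duty."""
    standby.on()
    if speed >= 0:
        in1_a.on()
        in2_a.off()                 # forward
    else:
        in1_a.off()
        in2_a.on()                  # reverse
    pwm_a.value = min(abs(speed), 100) / 100.0

set_speed(75)    # forward at 75% duty
set_speed(-40)   # reverse at 40% duty
set_speed(0)     # stop (coast)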

🔌 Communication (src/communicator.py)

WebSocket Server

  • Real-time bidirectional communication with supervisor
  • Streams point cloud data in binary format (big-endian floats)
  • Supports multiple client connections
  • Chunked data transmission (10 points/chunk, 10ms interval)
  • Auto-reconnection with configurable retry logic

Data Protocol:

  • Points transmitted as (x, y, z, group_id) tuples
  • Binary packed using struct.pack() for efficiency
  • JSON messages for commands and control
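
Put together, the packing and chunked streaming described above might be sketched as follows (assuming 32-bit floats and the group id packed as a float; the real communicator.py framing may differ):

import asyncio
import struct

# Hypothetical sketch: pack each point as four big-endian 32-bit floats
# (x, y, z, group_id) and send 10 points per chunk every 10 ms.
def pack_points(points):
    return b"".join(struct.pack(">ffff", x, y, z, float(g))
                    for x, y, z, g in points)

async def stream_points(websocket, points, chunk_size=10, interval=0.01):
    for i in range(0, len(points), chunk_size):
        await websocket.send(pack_points(points[i:i + chunk_size]))
        await asyncio.sleep(interval)

print(len(pack_points([(1.0, 2.0, 0.5, 3)])))   # 16 bytes per point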

📊 Visualization (src/ouranos/)

Voxel Grid System (voxel.py)

  • Efficient spatial data structure for point cloud storage
  • Chunked voxel implementation for memory optimization
  • Configurable resolution and bounds
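
One way to picture a chunked voxel grid of this kind is the sketch below (hypothetical layout, not the actual voxel.py implementation):

import numpy as np

# Hypothetical sketch: quantise points to voxel indices at a fixed
# resolution and allocate fixed-size occupancy chunks lazily.
class ChunkedVoxelGrid:
    def __init__(self, resolution=0.05, chunk_size=16):
        self.resolution = resolution    # voxel edge length (metres)
        self.chunk_size = chunk_size    # voxels per chunk edge
        self.chunks = {}                # chunk index -> occupancy array

    def add_point(self, x, y, z):
        voxel = np.floor(np.array([x, y, z]) / self.resolution).astype(np.int64)
        chunk_key = tuple(voxel // self.chunk_size)
        local = tuple(voxel % self.chunk_size)
        chunk = self.chunks.setdefault(
            chunk_key, np.zeros((self.chunk_size,) * 3, dtype=bool))
        chunk[local] = True

grid = ChunkedVoxelGrid(resolution=0.1)
grid.add_point(1.23, -0.4, 0.8)
print(len(grid.chunks))   # 1 chunk allocated so far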

Visualization Tools (visualization.py)

  • Matplotlib-based 3D plotting
  • Scatter and voxel rendering modes
  • Real-time point cloud display
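
For reference, the scatter rendering mode can be approximated with a few lines of Matplotlib (illustrative only; the real visualization.py helpers may differ):

import matplotlib.pyplot as plt

# Hypothetical sketch: scatter-plot (x, y, z) points coloured by group id.
points = [(1.0, 2.0, 0.5, 0), (1.1, 2.1, 0.5, 0), (3.0, 0.5, 0.2, 1)]
xs, ys, zs, groups = zip(*points)

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.scatter(xs, ys, zs, c=groups, cmap="tab10")
ax.set_xlabel("x")
ax.set_ylabel("y")
ax.set_zlabel("z")
plt.show()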

Configuration

All settings are managed through config.yaml:

network:
  socket_uri: "ws://0.0.0.0:8765"
  reconnect_interval: 5

navigation:
  enabled: true
  mode: "auto" # auto/manual

vehicle:
  enabled: true

sensors:
  tfmini_s: # LiDAR
    enabled: true
    port: "/dev/ttyUSB0"
    frequency: 100

  gyro:
    enabled: true
    port: "/dev/ttyACM0"
    frequency: 100

  optical_flow:
    enabled: false
    height: 3 # cm from ground

System Flow

┌─────────────┐
│   Sensors   │ (LiDAR, Gyro, OptFlow)
└──────┬──────┘
       │
       ▼
┌─────────────┐
│   Argus     │ (Point Handler)
│  Calculates │
│  3D Points  │
└──────┬──────┘
       │
       ├──────────┐
       ▼          ▼
┌──────────┐  ┌────────────┐
│ Navigator│  │WebSocket   │
│ Clusters │  │Communicator│
│  Points  │  └─────┬──────┘
└─────┬────┘        │
      │             │
      ▼             ▼
┌──────────┐  ┌────────────┐
│ Vehicle  │  │ Supervisor │
│ Control  │  │  (Web UI)  │
└──────────┘  └────────────┘

Threading Architecture

  1. Main Thread: Initialization and coordination
  2. Sensor Threads: One per enabled sensor (LiDAR, Gyro, OptFlow)
  3. Point Calculation Thread: Processes sensor data into 3D coordinates
  4. Navigation Thread: Clustering and path planning
  5. WebSocket Server Thread: Client communication
  6. Vehicle Control Thread: Motor control and movement execution

Installation & Usage

Requirements

cd agent
pip install -r requirements.txt

Key Dependencies:

  • numpy - Numerical computations
  • numba - JIT compilation for performance
  • websockets - Real-time communication
  • pyserial - Sensor communication
  • gpiozero - GPIO control for motors
  • matplotlib - Visualization
  • PyYAML - Configuration management

Running the Agent

cd agent
python -m src.main

The system will:

  1. Initialize sensors based on config.yaml
  2. Start sensor data collection
  3. Begin point cloud generation
  4. Connect to WebSocket clients
  5. Start autonomous navigation (if enabled)

🖥️ Supervisor System

Overview

A Next.js-based web application providing real-time 3D visualization of the SLAM point cloud data using React Three Fiber.

Technology Stack

  • Next.js 16: React framework with server-side rendering
  • React 19: UI library
  • React Three Fiber: Three.js React wrapper for 3D rendering
  • @react-three/drei: Helpers and abstractions for R3F
  • Three.js: 3D graphics library
  • TailwindCSS: Styling framework

Features

  • Real-time 3D Point Cloud Rendering: Displays incoming sensor data as 3D boxes
  • Instanced Rendering: Efficient rendering of thousands of points using InstancedMesh
  • Group Color Coding: Different colors for different point clusters
  • Interactive Camera: OrbitControls for navigation
  • Grid and Environment: Visual reference frame

Installation & Usage

cd supervisor
pnpm install
pnpm dev

Access the dashboard at http://localhost:3000

WebSocket Integration

The supervisor connects to the agent's WebSocket server to receive:

  • Real-time point cloud data
  • Point clustering information
  • Vehicle status updates

🚀 Quick Start

1. Start the Agent

cd agent
# Edit config.yaml for your hardware setup
python -m src.main

2. Start the Supervisor

cd supervisor
pnpm install
pnpm dev

3. Access Dashboard

Open http://localhost:3000 in your browser to see real-time 3D visualization.


🔧 Development & Testing

Test Mode

Enable test mode in config.yaml:

app:
  test_mode: true
  debug_mode: true

This uses pre-generated test data instead of reading from hardware sensors.

Debug Mode

Captures additional diagnostic data for analysis:

  • Raw sensor readings
  • Calculated point coordinates
  • Timing information

πŸ“ Project Structure

helios/
├── agent/                      # Python SLAM agent
│   ├── config.yaml            # Configuration file
│   ├── requirements.txt       # Python dependencies
│   └── src/
│       ├── main.py           # Entry point
│       ├── communicator.py   # WebSocket server
│       ├── sensors/          # Sensor drivers
│       │   ├── PointHandler.py    # Sensor fusion
│       │   ├── LidarSensor.py
│       │   ├── GyroSensor.py
│       │   ├── OpticalFlowSensor.py
│       │   └── calculations.py    # Point math
│       ├── nereus/           # Navigation
│       │   ├── navigation.py      # Main navigator
│       │   └── nav_math.py        # Math utilities
│       ├── vehicle/          # Vehicle control
│       │   ├── ground.py          # Motor control
│       │   └── vehicle.py
│       └── ouranos/          # Visualization
│           ├── voxel.py
│           └── visualization.py
│
└── supervisor/                 # Next.js dashboard
    ├── package.json
    └── src/
        └── app/
            ├── page.tsx       # Main 3D viewer
            └── layout.tsx

βš™οΈ Hardware Requirements

Sensors

  • TFmini-S LiDAR: Serial connection (USB / UART)
  • Gyroscope: MSP protocol (USB/Serial)
  • Optical Flow Sensor (optional but recommended): SPI connection

Vehicle Platform

  • Motor Driver: TB6612FNG or compatible
  • GPIO: Raspberry Pi or compatible SBC
  • Power: Appropriate voltage for motors and sensors

πŸ“ Notes

  • The system supports frequencies up to 50kHz for sensor data processing
  • Point calculation is optimized with Numba JIT compilation
  • WebSocket communication uses binary format for efficiency
  • The voxel grid system provides memory-efficient spatial storage
  • Navigation runs autonomously when enabled in auto mode
