
UTracker - BitTorrent Tracker Aggregator

Project Overview

UTracker is a full-stack BitTorrent tracker aggregator that unifies peers across multiple protocols (HTTP, UDP, and WebSocket) into a single high-performance torrent tracking service. The application consists of:

  • Backend: A FastAPI-based server implementing BEP-3 (BitTorrent Protocol) and BEP-15 (UDP Tracker Protocol)
  • Frontend: A React-based dashboard for monitoring tracker statistics and swarm activity
  • Real-time Updates: WebSocket integration for live peer/swarm monitoring

Key Features

  • Multi-protocol support (HTTP, UDP, WebSocket) for maximum client compatibility
  • Real-time peer aggregation across all protocols
  • Public tracker discovery and validation
  • Performance metrics and monitoring
  • Ultra-low latency swarm management

Tech Stack

Backend

  • Framework: FastAPI (Python 3.x)
  • Database: MongoDB (via Motor async driver)
  • Protocols: HTTP (BEP-3), UDP (BEP-15), WebSocket
  • Key Dependencies: aiohttp, bencode.py, pydantic, uvicorn

Frontend

  • Framework: React (v19) with Create React App
  • Styling: Tailwind CSS with Radix UI components
  • API Client: Axios
  • Routing: React Router DOM
  • UI Components: Radix UI primitives, Lucide icons

Project Structure

UTracker/
├── backend/
│   ├── server.py                 # Main FastAPI application with all endpoints
│   └── requirements.txt          # Python dependencies
├── frontend/
│   ├── public/                   # Static assets
│   ├── src/
│   │   ├── App.js               # Main React application component
│   │   ├── components/          # Reusable UI components
│   │   ├── hooks/               # Custom React hooks
│   │   └── lib/                 # Utility functions
│   ├── package.json             # Frontend dependencies and scripts
│   ├── craco.config.js          # CRA customization configuration
│   └── tailwind.config.js       # Tailwind CSS configuration
├── tests/                       # Test directory
├── backend_test.py              # Comprehensive backend testing suite
├── additional_tests.py          # Additional protocol compliance tests
└── README.md

Building and Running

Backend Setup

  1. Install Python dependencies:

    cd backend
    pip install -r requirements.txt
  2. Set up environment variables (create .env file):

    MONGO_URL=mongodb://localhost:27017
    DB_NAME=tracker_db
    CORS_ORIGINS=http://localhost:3000,http://localhost:3001
    
  3. Start the backend server:

    cd backend
    uvicorn server:app --host 0.0.0.0 --port 8001

    The HTTP tracker will be available at http://localhost:8001/api/announce, and the UDP tracker will automatically start on port 8002.
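To illustrate what a request to the announce endpoint above looks like, here is a minimal sketch of building a BEP-3 announce URL. The parameter names come from BEP-3 itself; the info-hash and peer ID values are placeholders, and the URL assumes the local setup described in this section.

```python
from urllib.parse import urlencode

# BEP-3 announce parameters; info_hash and peer_id are raw 20-byte
# values that must be percent-encoded on the wire.
info_hash = bytes.fromhex("aa" * 20)      # placeholder info-hash
peer_id = b"-UT0001-000000000000"         # placeholder peer id

params = {
    "info_hash": info_hash,
    "peer_id": peer_id,
    "port": 6881,
    "uploaded": 0,
    "downloaded": 0,
    "left": 0,
    "event": "started",
}
query = urlencode(params)  # percent-encodes the raw bytes values
url = f"http://localhost:8001/api/announce?{query}"
print(url)
```

The tracker replies with a bencoded dictionary containing the peer list and the re-announce interval, per BEP-3.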

Frontend Setup

  1. Install Node.js dependencies:

    cd frontend
    npm install
  2. Set up environment variables (create .env file):

    REACT_APP_BACKEND_URL=http://localhost:8001
    
  3. Start the frontend development server:

    cd frontend
    npm start

    The dashboard will be available at http://localhost:3000

Docker Setup

A multi-stage Docker build containerizes the application. The current Dockerfile fixes an earlier issue in which executables were missing from the final image.

Docker Configuration (Fixed Issues):

  1. The original Dockerfile was missing the uvicorn executable in the final image, causing container startup failures.
  2. Fixed by copying both Python packages and executables from the builder stage:
    COPY --from=backend-builder /usr/local/lib/python3.11/site-packages /usr/local/lib/python3.11/site-packages
    COPY --from=backend-builder /usr/local/bin /usr/local/bin
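A minimal multi-stage Dockerfile illustrating the fix might look like the following. Apart from the two COPY lines quoted above, the base image, stage layout, ports, and paths here are assumptions, not the repository's actual Dockerfile.

```dockerfile
# Builder stage: install Python dependencies
FROM python:3.11-slim AS backend-builder
WORKDIR /app
COPY backend/requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Final stage: copy site-packages AND the console-script executables,
# so that `uvicorn` exists on PATH in the runtime image
FROM python:3.11-slim
COPY --from=backend-builder /usr/local/lib/python3.11/site-packages /usr/local/lib/python3.11/site-packages
COPY --from=backend-builder /usr/local/bin /usr/local/bin
WORKDIR /app
COPY backend/ .
EXPOSE 8001 8002/udp
CMD ["uvicorn", "server:app", "--host", "0.0.0.0", "--port", "8001"]
```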

Docker Network and API Proxy: The Docker setup uses nginx to proxy API requests from the frontend to the backend:

  • Frontend runs on port 80
  • API requests to /api are proxied to the backend service at port 8001
  • WebSocket connections to /api/announce_ws are also proxied
  • This allows the React app to make relative API calls that are handled by nginx
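A hypothetical nginx server block matching the proxy rules above could look like this. The upstream name `backend` and file paths are assumptions based on a typical docker-compose setup, not the repository's actual configuration.

```nginx
server {
    listen 80;

    # Serve the built React app
    root /usr/share/nginx/html;
    location / {
        try_files $uri /index.html;
    }

    # Proxy REST calls to the backend service on port 8001
    location /api/ {
        proxy_pass http://backend:8001;
        proxy_set_header Host $host;
    }

    # WebSocket upgrade handling for the announce endpoint
    location /api/announce_ws {
        proxy_pass http://backend:8001;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```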

Running with Docker:

docker-compose up --build

The application will be accessible at http://localhost (port 80), with API endpoints available under http://localhost/api/...

Running Tests

Backend tests (run from the repository root, where the test suites live):

python backend_test.py
python additional_tests.py

API Endpoints

HTTP Tracker (BEP-3)

  • GET /api/announce - BitTorrent announce endpoint

UDP Tracker (BEP-15)

  • Port 8002 - UDP BitTorrent announce endpoint
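A BEP-15 session starts with a 16-byte connect request; the sketch below builds one. The magic protocol constant and field layout are fixed by BEP-15, while the target port assumes the local setup described above.

```python
import secrets
import struct

# BEP-15 connect request: 16 bytes, big-endian.
# Layout: protocol_id (8 bytes) | action (4 bytes) | transaction_id (4 bytes)
PROTOCOL_ID = 0x41727101980  # fixed magic constant from BEP-15
ACTION_CONNECT = 0

transaction_id = secrets.randbits(32)
packet = struct.pack(">QII", PROTOCOL_ID, ACTION_CONNECT, transaction_id)

# A client would send this datagram to localhost:8002 and expect a
# 16-byte connect response carrying the connection_id used in
# subsequent announce requests.
print(len(packet))  # 16
```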

WebSocket Tracker

  • WS /api/announce_ws - WebSocket announce endpoint
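The WebSocket announce message format is defined by server.py and is not documented here, so the field names in this sketch are purely illustrative assumptions; only the endpoint path comes from the source.

```python
import json

# Hypothetical announce payload for the WebSocket endpoint; the exact
# schema is defined by server.py, so these field names are assumptions.
announce = {
    "info_hash": "aa" * 20,               # hex-encoded info-hash (assumption)
    "peer_id": "-UT0001-000000000000",
    "port": 6881,
    "event": "started",
}
message = json.dumps(announce)

# With the third-party `websockets` package, a client might send it as:
#
#   import asyncio, websockets
#   async def main():
#       async with websockets.connect("ws://localhost:8001/api/announce_ws") as ws:
#           await ws.send(message)
#           reply = await ws.recv()  # peer list / interval from the tracker
#   asyncio.run(main())
print(message)
```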

Management API

  • GET /api/stats - Tracker statistics
  • GET /api/swarms - Active swarms information
  • GET /api/trackers - Public tracker list with status
  • POST /api/scrape_trackers - Refresh public tracker list
  • GET /api/metrics - Prometheus-compatible metrics

Development Conventions

Backend

  • Follow FastAPI conventions for request/response models
  • Use Pydantic models for data validation
  • Async/await pattern for all I/O operations
  • Thread-safe operations using asyncio.Lock

Frontend

  • Component-based architecture in React
  • Use functional components with hooks
  • Tailwind CSS for styling
  • Environment variables for configuration
  • Axios for API requests

Testing

  • Comprehensive backend testing with aiohttp and websockets
  • Protocol compliance tests following BEP standards
  • Performance and load testing capabilities
  • Error handling validation for malformed requests

Architecture Details

The backend implements a SwarmManager class that provides thread-safe operations for tracking BitTorrent swarms and peers. The application supports three protocols simultaneously:

  1. HTTP (BEP-3) - Standard BitTorrent announce protocol
  2. UDP (BEP-15) - Low-overhead UDP tracker protocol
  3. WebSocket - Real-time bidirectional communication for modern clients

The frontend provides a comprehensive dashboard with real-time metrics, swarm monitoring, and tracker status visualization. All three protocols contribute to the same pool of peers, allowing for cross-protocol aggregation.
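The thread-safe, cross-protocol swarm bookkeeping described above can be sketched as follows. This is a minimal illustration assuming a SwarmManager keyed by info-hash; the class name and asyncio.Lock usage come from the source, but the method signatures and data layout are simplified assumptions about server.py.

```python
import asyncio
from dataclasses import dataclass, field


@dataclass
class Swarm:
    # peer_id -> (ip, port); the real server tracks more per-peer state
    peers: dict = field(default_factory=dict)


class SwarmManager:
    """Sketch of asyncio.Lock-guarded swarm state shared by all protocols."""

    def __init__(self) -> None:
        self._swarms: dict[str, Swarm] = {}
        self._lock = asyncio.Lock()

    async def announce(self, info_hash: str, peer_id: str, ip: str, port: int) -> list:
        async with self._lock:  # serialize concurrent HTTP/UDP/WS announces
            swarm = self._swarms.setdefault(info_hash, Swarm())
            swarm.peers[peer_id] = (ip, port)
            # Return the other peers in the swarm, regardless of which
            # protocol they announced over (cross-protocol aggregation).
            return [addr for pid, addr in swarm.peers.items() if pid != peer_id]


async def demo() -> list:
    mgr = SwarmManager()
    await mgr.announce("hash1", "peerA", "10.0.0.1", 6881)
    return await mgr.announce("hash1", "peerB", "10.0.0.2", 6882)


peers = asyncio.run(demo())
print(peers)  # [('10.0.0.1', 6881)]
```

Because every protocol handler funnels through the same lock-guarded announce path, a peer that joined over UDP is visible to a client announcing over HTTP or WebSocket.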