A modern, scalable logging infrastructure built with Node.js, Kafka, and PostgreSQL.
The system consists of three main components:
- Log Producer - REST API for receiving logs from applications
- Log Consumer - Service that consumes logs from Kafka and stores them in PostgreSQL
- Logs API - REST API for searching, filtering, and analyzing logs
Prerequisites:

- Docker and Docker Compose
- Node.js 18+ (for local development)
Clone the repository:

```bash
git clone https://github.com/yourusername/spunkless.git
cd spunkless
```

Start the services:

```bash
docker-compose up -d
```

Send a test log:

```bash
curl -X POST http://localhost:8000/spunkless-producer-api/logs \
  -H "Content-Type: application/json" \
  -d '{
    "service": "test-service",
    "level": "info",
    "message": "Test log message"
  }'
```

Query the logs API:

```bash
curl "http://localhost:8002/api/logs?service=test-service"
```
The producer service exposes a REST API endpoint for log ingestion:
- POST /spunkless-producer-api/logs - Submit logs to the system
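The same submission can be made programmatically from Node.js. This is a minimal sketch: the helper name `buildLogRequest` is hypothetical, and the base URL is taken from the quick-start example; only the endpoint path comes from this README.

```javascript
// Hypothetical helper: build a fetch-compatible request for the producer's
// log-ingestion endpoint. Separating request construction from sending keeps
// the pure part easy to test without a running stack.
function buildLogRequest(baseUrl, log) {
  return {
    url: `${baseUrl}/spunkless-producer-api/logs`,
    options: {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(log),
    },
  };
}

const req = buildLogRequest('http://localhost:8000', {
  service: 'test-service',
  level: 'info',
  message: 'Test log message',
});
// To actually send it (requires the services to be running):
// await fetch(req.url, req.options);
```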
The consumer service processes logs from Kafka and stores them in PostgreSQL. It doesn't expose any external ports as it operates as a background service.
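A consumer loop of this shape could be sketched as follows. Everything here is an assumption for illustration: the client libraries (`kafkajs`, `pg`), broker address, topic name `logs`, group ID, and table layout may all differ from the real service.

```javascript
// Pure helper: turn a Kafka message value (JSON string) into a row for an
// assumed `logs` table. Kept separate so it can be tested without Kafka.
function toLogRow(value) {
  const log = JSON.parse(value);
  return [log.service, log.level, log.message];
}

// A possible consumer loop (never started here). All names are assumptions.
async function run() {
  const { Kafka } = require('kafkajs'); // assumed Kafka client
  const { Pool } = require('pg');       // assumed PostgreSQL client
  const kafka = new Kafka({ clientId: 'spunkless-consumer', brokers: ['localhost:9092'] });
  const pool = new Pool({ connectionString: process.env.DATABASE_URL });

  const consumer = kafka.consumer({ groupId: 'spunkless-consumers' });
  await consumer.connect();
  await consumer.subscribe({ topic: 'logs' });
  await consumer.run({
    eachMessage: async ({ message }) => {
      // Parse the log event and persist it.
      const row = toLogRow(message.value.toString());
      await pool.query(
        'INSERT INTO logs (service, level, message) VALUES ($1, $2, $3)',
        row
      );
    },
  });
}
// run(); // not invoked: requires Kafka and PostgreSQL to be reachable
```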
The Logs API provides endpoints for searching and analyzing logs:
- GET /api/logs - Get logs with filtering and pagination
- POST /api/logs/search - Advanced search with full-text capabilities
- GET /api/stats - Get log statistics and aggregations
- GET /api/metadata - Get available services and log levels
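Filters on `GET /api/logs` are passed as query parameters; only `service` appears in the quick-start example, so the other parameters below (`level`, `limit`) are assumptions, as is the helper itself.

```javascript
// Hypothetical client helper: build a filtered, paginated query URL for the
// Logs API using the standard URLSearchParams encoder.
function buildLogsUrl(baseUrl, filters) {
  return `${baseUrl}/api/logs?${new URLSearchParams(filters)}`;
}

const url = buildLogsUrl('http://localhost:8002', {
  service: 'test-service',
  level: 'error',
  limit: '50',
});
// Then (with the stack running): const logs = await (await fetch(url)).json();
```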
Each service has its own directory with a dedicated package.json file. To run a service in development mode:
```bash
cd <service-directory>
npm install
npm run dev
```

The system is designed to be horizontally scalable:
- Producer and Logs API can be scaled by adding more instances behind a load balancer
- Consumer can be scaled by adding more instances with the same consumer group ID
- Kafka and PostgreSQL can be clustered for higher availability
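The consumer-group point above can be illustrated with a toy model: Kafka assigns each partition to exactly one member of a consumer group, so instances sharing a group ID divide the partitions between them instead of each receiving every log. The round-robin below is a deliberate simplification of Kafka's real assignment strategies.

```javascript
// Toy model of partition assignment within one consumer group.
// Real Kafka uses pluggable strategies (range, round-robin, sticky); this
// only shows that adding instances splits the work rather than duplicating it.
function assignPartitions(partitions, members) {
  const assignment = Object.fromEntries(members.map((m) => [m, []]));
  partitions.forEach((p, i) => assignment[members[i % members.length]].push(p));
  return assignment;
}

const split = assignPartitions([0, 1, 2, 3, 4, 5], ['consumer-1', 'consumer-2']);
// Each partition lands on exactly one instance.
```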
This project is licensed under the MIT License.