oleggur/personal-expense-tracker

Personal Expense Tracker

Problem Description

Many people struggle to keep track of their personal finances: where their money comes from and where it goes. Without proper tracking, it's difficult to understand spending patterns, identify areas for savings, or maintain a budget. Existing solutions are often either too complex (full accounting software with features most individuals don't need) or too simple (basic spreadsheets that lack structure and analysis capabilities).

This application solves this problem by providing a simple, user-friendly web interface for tracking income and expenses. Users can organize transactions by custom categories, view statistics to understand their spending habits, and export their data for further analysis. Each user has their own private account, ensuring data privacy and personalization.

Features

  • User Authentication: Secure registration and login system with JWT-based authentication
  • Category Management: Create and manage custom income and expense categories
  • Transaction Tracking: Record income and expenses with descriptions, amounts, and dates
  • Filtering & Search: Filter transactions by category, date range, and type
  • Statistics Dashboard: Visual breakdown of spending by category with interactive charts
  • Time Period Analysis: View financial summaries for specific date ranges
  • CSV Export: Export all transaction data to CSV for external analysis
  • Multi-User Support: Complete data isolation between user accounts
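
The CSV export feature boils down to a small serialization step. A minimal sketch in Python (the column names here are illustrative assumptions; the real export format may differ):

```python
import csv
import io

def transactions_to_csv(transactions: list[dict]) -> str:
    """Serialize a list of transaction dicts to CSV text (illustrative columns)."""
    fieldnames = ["date", "type", "category", "description", "amount"]
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    for tx in transactions:
        writer.writerow({k: tx.get(k, "") for k in fieldnames})
    return buf.getvalue()

rows = [
    {"date": "2024-01-05", "type": "expense", "category": "Groceries",
     "description": "Weekly shop", "amount": "42.10"},
]
print(transactions_to_csv(rows))
```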

AI-Assisted Development

This project was built with an AI-assisted development workflow using Claude Code and Cursor.

Development Workflow

Phase 1: Planning

  • Created comprehensive implementation specification (prompt.md) defining all requirements, success criteria, and tech stack
  • Broke down the project into 15 self-contained implementation steps
  • Used Claude Code (Claude Sonnet 4.5) to refine the specification for logical consistency and clarity

Phase 2: Implementation

  • Fed the specification to Claude Code step-by-step
  • Claude Code generated all application code following the detailed implementation guide
  • Each step was verified before proceeding to the next

Tools Used

  • Claude Code (Claude Sonnet 4.5) - AI-powered CLI for code generation and development assistance
  • Cursor - AI-powered IDE used alongside Claude Code for interactive development and code editing
  • MCP (Model Context Protocol): IDE Diagnostics server integrated with VS Code
    • Provides real-time TypeScript/Python errors and warnings directly to Claude Code
    • Used after writing/editing code to catch type errors, linting issues, and syntax problems before running tests
    • Enables faster development iteration by surfacing IDE-level diagnostics in the AI workflow

Code Generation

Claude Code and Cursor generated 100% of the application code:

  • Backend: FastAPI application, SQLAlchemy models, authentication system, API routers
  • Frontend: React components, TypeScript interfaces, API clients, UI pages
  • Testing: pytest tests (backend), Vitest tests (frontend), Playwright E2E tests
  • Infrastructure: Dockerfiles, docker-compose configuration, nginx setup
  • CI/CD: GitHub Actions workflows for automated testing and deployment

Technologies Used

Frontend

  • React 18 with Vite - Fast, modern frontend framework and build tool
  • TypeScript - Type-safe JavaScript for better code quality
  • Axios - HTTP client for API communication with interceptors for authentication
  • React Router - Client-side routing and navigation
  • Recharts - Data visualization library for statistics charts
  • Vitest + React Testing Library - Unit and component testing

Backend

  • FastAPI - Modern Python web framework with automatic OpenAPI documentation
  • SQLAlchemy - SQL toolkit and ORM for database operations
  • Alembic - Database migration tool
  • python-jose + passlib - JWT token generation and password hashing
  • pytest - Testing framework with async support

Database

  • SQLite - Lightweight database for local development
  • PostgreSQL - Production-grade relational database (on Render)

DevOps & Infrastructure

  • Docker + docker-compose - Containerization for consistent environments
  • GitHub Actions - CI/CD pipeline for automated testing and deployment
  • Render - Cloud platform for hosting (PostgreSQL + Web Service + Static Site)
  • Playwright - End-to-end testing framework

System Architecture

The Personal Expense Tracker follows a three-tier architecture:

Component Overview

┌─────────────────┐
│  React Frontend │  (Port 5173 dev / Port 80 prod)
│   (Vite + TS)   │
└────────┬────────┘
         │ HTTP/REST API
         │ (JWT Authentication)
         ▼
┌─────────────────┐
│  FastAPI Backend│  (Port 8000)
│   (Python 3.12) │
└────────┬────────┘
         │ SQLAlchemy ORM
         ▼
┌─────────────────┐
│   Database      │
│ SQLite (dev) /  │
│PostgreSQL (prod)│
└─────────────────┘

Authentication Flow

  1. Registration/Login: User submits credentials to /api/auth/register or /api/auth/login
  2. Token Generation: Backend validates credentials and returns JWT access token
  3. Protected Routes: Frontend includes token in Authorization: Bearer <token> header
  4. Token Validation: Backend validates token on each request via get_current_user dependency
  5. Data Isolation: All queries are filtered by user_id to ensure complete data privacy
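
The steps above hinge on HS256 signing and verification. The project delegates this to python-jose, but the mechanics can be sketched with the standard library alone (a simplified illustration, not the project's code; real tokens also carry an exp expiry claim that is checked on decode):

```python
import base64
import hashlib
import hmac
import json

SECRET_KEY = "change-me"  # in the real app this comes from the environment

def _b64(data: bytes) -> str:
    """URL-safe base64 without padding, as JWT requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def encode_jwt(payload: dict) -> str:
    """Build a signed HS256 token: header.payload.signature."""
    header = _b64(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = hmac.new(SECRET_KEY.encode(), signing_input, hashlib.sha256).digest()
    return f"{header}.{body}.{_b64(sig)}"

def decode_jwt(token: str) -> dict:
    """Verify the signature, then return the payload; raises on tampering."""
    header, body, sig = token.split(".")
    signing_input = f"{header}.{body}".encode()
    expected = hmac.new(SECRET_KEY.encode(), signing_input, hashlib.sha256).digest()
    if not hmac.compare_digest(_b64(expected), sig):
        raise ValueError("invalid signature")
    padded = body + "=" * (-len(body) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))

token = encode_jwt({"sub": "user-123"})
```

A request with a tampered token fails the signature check, which is what lets the backend reject it before any query runs.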

API Contract

  • OpenAPI Specification: Automatically generated by FastAPI at /docs (Swagger UI) and /redoc
  • Contract-Driven Development: Frontend and backend development followed the OpenAPI spec as the contract
  • Type Safety: TypeScript interfaces match Pydantic schemas for end-to-end type safety
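
End-to-end type safety here means the backend response model and the frontend interface describe the same shape. As a sketch of that pairing, the snippet below uses a plain dataclass in place of the actual Pydantic model; the class name and fields are hypothetical, chosen only to illustrate the mirrored shape:

```python
from dataclasses import dataclass
from datetime import date

# Backend side: in the real app this would be a Pydantic model whose JSON
# serialization FastAPI documents in the generated OpenAPI spec.
@dataclass
class TransactionRead:
    id: int
    amount: float
    description: str
    category_id: int
    date: date

# Frontend side: the matching TypeScript interface would mirror it, e.g.
#   interface TransactionRead {
#     id: number; amount: number; description: string;
#     category_id: number; date: string;  // ISO date
#   }

tx = TransactionRead(id=1, amount=12.5, description="Coffee",
                     category_id=3, date=date(2024, 1, 5))
```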

Docker Architecture

When running with docker-compose:

  • PostgreSQL Container: Database service with persistent volume
  • Backend Container: FastAPI application with hot-reload in development
  • Frontend Container: Nginx serving static React build
  • Network: All containers communicate via expense_tracker_network

Getting Started

Prerequisites

  • Python 3.11+ (3.12 recommended)
  • Node.js 18+ (20+ recommended)
  • PostgreSQL (optional for local development, SQLite used by default)
  • Docker and docker-compose (optional, for containerized setup)

Local Setup

1. Clone the Repository

git clone <repository-url>
cd personal-expense-tracker

2. Backend Setup

cd backend

# Create virtual environment
python -m venv venv

# Activate virtual environment
# On Linux/Mac:
source venv/bin/activate
# On Windows:
# venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt

# Run database migrations (uses SQLite by default, no .env needed)
alembic upgrade head

# Start the backend server
uvicorn main:app --reload

The backend API will be available at http://localhost:8000

  • API docs: http://localhost:8000/docs
  • ReDoc: http://localhost:8000/redoc

3. Frontend Setup

Open a new terminal:

cd frontend

# Install dependencies
npm install

# Create .env file
echo "VITE_API_URL=http://localhost:8000" > .env

# Start development server
npm run dev

The frontend will be available at http://localhost:5173

Docker Setup

The easiest way to run the entire application is using Docker Compose:

# From project root
docker-compose up --build

This will:

  • Start PostgreSQL database container
  • Build and start the backend container
  • Build and start the frontend container (nginx)

Access Points:

  • Frontend: http://localhost
  • Backend API: http://localhost:8000
  • API docs: http://localhost:8000/docs

Stop the containers:

docker-compose down

View logs:

docker-compose logs -f

Testing

Backend Tests

Run all backend tests:

cd backend
python -m pytest tests/ -v

Run tests with coverage:

python -m pytest tests/ --cov=. --cov-report=term-missing

Test Results:

  • 64 tests covering all API endpoints
  • 99% code coverage
  • Tests include: authentication, CRUD operations, data isolation, filtering, validation, and statistics

Frontend Tests

Run all frontend tests:

cd frontend
npm test

End-to-End Tests

Run E2E tests with Playwright:

npm run test:e2e

Run E2E tests with UI mode (interactive):

npm run test:e2e:ui

E2E Test Coverage:

  • Authentication flow (register, login, logout)
  • Category management (create, view, delete)
  • Transaction management (create, filter, delete)
  • Statistics dashboard (view, filter dates, export CSV)

Deployment

Platform: Render

Live Application: https://personal-expense-tracker-frontend2.onrender.com

CI/CD Pipeline

Deployment is automated via GitHub Actions (.github/workflows/deploy.yml):

  1. Push to main branch triggers the deploy workflow
  2. GitHub Actions calls the Render API to trigger deployments
  3. Render builds and deploys both backend and frontend services
  4. Database migrations run automatically via the backend build command

Required GitHub Secrets:

  • RENDER_API_KEY: Get from Render Dashboard → Account Settings → API Keys
  • RENDER_BACKEND_SERVICE_ID: Found in backend service URL or settings
  • RENDER_FRONTEND_SERVICE_ID: Found in frontend service URL or settings

Render Services

The application runs on three Render services:

Service | Type | Configuration
--- | --- | ---
PostgreSQL Database | Managed Database | expense_tracker database
Backend | Web Service | Python 3, uvicorn main:app --host 0.0.0.0 --port $PORT
Frontend | Static Site | npm run build, publish dist/

Environment Variables

Backend (Render Web Service):

DATABASE_URL=<Render PostgreSQL Internal URL>
SECRET_KEY=<Generate with: openssl rand -hex 32>
ALGORITHM=HS256
CORS_ORIGINS=https://your-frontend-url.onrender.com
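
If openssl is not at hand, the same 64-character hex SECRET_KEY can be generated with Python's standard library:

```python
import secrets

# 32 random bytes rendered as 64 hex characters,
# equivalent to `openssl rand -hex 32`
secret_key = secrets.token_hex(32)
print(secret_key)
```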

Frontend (Render Static Site):

VITE_API_URL=https://your-backend-url.onrender.com
