Ibra7hi/probability--engine

Probabilistic Decision Engine

A FastAPI-based engine that computes the success probability of startup ideas using LLM feature extraction combined with rigorous, math-only weighted Bayesian scoring.

Features

  • LLM Feature Extraction: Extracts 10+ standardized features (novelty, competition, readiness, etc.) from raw natural language startup ideas.
  • Multi-Provider AI: Switch among Anthropic, OpenAI, and OpenRouter simply by updating your .env file.
  • Bayesian Scoring: Computes the success probability with purely mathematical confidence intervals grounded in established startup research data (no LLM guessing).
  • Evidence-Based Updates: Log milestones, pivots, or setbacks as "evidence" and let the engine auto-adjust the probability mathematically using Sequential Bayesian Updating.
  • High Performance: Asynchronous PostgreSQL access via SQLAlchemy, combined with Redis caching, keeps queries fast.
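
The sequential Bayesian updating described above can be sketched in log-odds space: each piece of evidence contributes a log-likelihood ratio that shifts the prior up (milestone) or down (setback). This is a minimal illustration of the technique, not the repo's actual `bayesian_updater.py` implementation; the function names and example numbers are assumptions.

```python
import math

def logit(p: float) -> float:
    """Convert a probability to log-odds."""
    return math.log(p / (1 - p))

def sigmoid(x: float) -> float:
    """Convert log-odds back to a probability."""
    return 1 / (1 + math.exp(-x))

def sequential_update(prior: float, evidence_llrs: list[float]) -> float:
    """Apply pieces of evidence one at a time in log-odds space.

    Each entry in evidence_llrs is a log-likelihood ratio:
    positive for milestones, negative for setbacks.
    (Illustrative sketch, not the engine's real update rule.)
    """
    log_odds = logit(prior)
    for llr in evidence_llrs:
        log_odds += llr
    return sigmoid(log_odds)

# A milestone (+0.8) followed by a setback (-0.3) nudges a 20% prior upward.
posterior = sequential_update(0.20, [0.8, -0.3])
```

Working in log-odds makes updates commutative and numerically stable: evidence simply adds, and the result maps back to a valid probability.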

Architecture

  1. API Layer: FastAPI endpoints for submitting ideas and tracking history.
  2. LLM Provider Factory: Connects to Claude, ChatGPT, or OpenRouter for pure data extraction.
  3. ML Engine: bayesian_updater.py and calibration tools run the math based on feature_weights.json.
  4. Storage: Postgres for persistence, Redis for score history and rapid cache lookups.
  5. Event Streaming: Redis Pub/Sub streams for async task processing.
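
The LLM Provider Factory in step 2 is a classic factory pattern: a common interface plus a registry keyed by the `LLM_PROVIDER` setting. The sketch below shows the shape of such a factory; the class names and stubbed methods are illustrative assumptions, not the repository's actual API.

```python
import os
from abc import ABC, abstractmethod

class LLMProvider(ABC):
    """Common interface every provider adapter implements (hypothetical)."""

    @abstractmethod
    def extract_features(self, idea: str) -> dict:
        """Return standardized features extracted from a raw idea."""

class AnthropicProvider(LLMProvider):
    def extract_features(self, idea: str) -> dict:
        # A real adapter would call the Anthropic API here; stubbed for illustration.
        raise NotImplementedError

class OpenAIProvider(LLMProvider):
    def extract_features(self, idea: str) -> dict:
        raise NotImplementedError

# Registry mapping the LLM_PROVIDER value to an adapter class.
_PROVIDERS: dict[str, type[LLMProvider]] = {
    "anthropic": AnthropicProvider,
    "openai": OpenAIProvider,
}

def get_provider() -> LLMProvider:
    """Pick an adapter based on the LLM_PROVIDER environment variable."""
    name = os.environ.get("LLM_PROVIDER", "anthropic")
    try:
        return _PROVIDERS[name]()
    except KeyError:
        raise ValueError(f"Unknown LLM_PROVIDER: {name!r}")
```

Because the rest of the engine only sees the `LLMProvider` interface, swapping providers is a one-line change in `.env` rather than a code change.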

Running Locally

1. Prerequisites

  • Docker & Docker Compose
  • Python 3.12+

2. Configuration

Copy the .env template:

cp .env.example .env

Add your API keys inside .env:

LLM_PROVIDER=anthropic      # Choose: anthropic, openai, openrouter
ANTHROPIC_API_KEY=your_key  # Required if using anthropic
OPENAI_API_KEY=your_key     # Required if using openai
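
At startup, settings like these are typically read from the environment and validated so that a missing key fails fast. The sketch below shows one way to do that; the function name and the `OPENROUTER_API_KEY` variable are assumptions, not confirmed by the repository.

```python
import os

VALID_PROVIDERS = {"anthropic", "openai", "openrouter"}

# Which API-key variable each provider requires (OPENROUTER_API_KEY is assumed).
_KEY_VARS = {
    "anthropic": "ANTHROPIC_API_KEY",
    "openai": "OPENAI_API_KEY",
    "openrouter": "OPENROUTER_API_KEY",
}

def load_llm_config() -> dict:
    """Read the LLM settings from the environment and validate them."""
    provider = os.environ.get("LLM_PROVIDER", "anthropic").lower()
    if provider not in VALID_PROVIDERS:
        raise ValueError(f"LLM_PROVIDER must be one of {sorted(VALID_PROVIDERS)}")
    key_var = _KEY_VARS[provider]
    api_key = os.environ.get(key_var)
    if not api_key:
        raise ValueError(f"{key_var} is required when LLM_PROVIDER={provider}")
    return {"provider": provider, "api_key": api_key}
```

Failing at import time with a clear message beats a cryptic HTTP 500 on the first request.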

3. Start Database & Cache Services

docker compose up -d postgres redis

4. Create Virtual Environment & Install

python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt

5. Run the Engine

uvicorn app.main:app --reload

The interactive API documentation will be available at: http://localhost:8000/api/v1/docs

Docker Deployment (Full)

To run the entire stack (API, database, and cache) inside Docker:

docker compose -f docker-compose.yml up --build
