Simple local (ollama) or Gemini prompt to provide context for provided BCV

balain/bible-context-server


Bible Context Server

A lightweight API server that provides contextual information for Bible verses using AI models.

Overview

Bible Context Server is a FastAPI application that offers quick Bible verse context lookup through two different AI backends:

  • Local Ollama server with Gemma 3 12B model
  • Google Gemini API (requires API key)

For any given Bible reference (book, chapter, verse), the service returns:

  • The verse content (NASB 1995 translation)
  • Author information
  • Approximate time of writing
  • Brief contextual description
  • Main themes
  • Related verses (quotations and allusions)
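
Taken together, these fields suggest a prompt along the following lines. This is only a sketch: the actual prompt wording in server.py is not shown in this README, and the template name is illustrative.

```python
CONTEXT_PROMPT = """\
For the Bible reference {bcv}, provide:
- the verse text (NASB 1995 translation)
- the author
- the approximate time of writing
- a brief contextual description
- the main themes
- related verses (quotations and allusions)
Answer in Markdown."""

def make_context_prompt(bcv: str) -> str:
    # Fill in the reference, e.g. "Gen 3:20" or "John 3:16".
    return CONTEXT_PROMPT.format(bcv=bcv)
```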

Sample Screenshots - Gen 3:20

Gemini

Bible Context Server Screenshot: Gen 3:20 / Gemini

Ollama

Bible Context Server Screenshot: Gen 3:20 / Ollama

Features

  • Fast API endpoints for Bible verse context lookup
  • Support for local AI inference via Ollama
  • Integration with Google Gemini API
  • HTML response formatting with Markdown support
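
The Markdown-to-HTML step can be sketched with the markdown package from the dependency list below; the function name and page layout here are illustrative, not taken from server.py.

```python
import markdown  # pip install markdown (listed in the install step)
from html import escape

def render_context(md_text: str, bcv: str) -> str:
    # Convert the model's Markdown answer into a minimal HTML page.
    body = markdown.markdown(md_text)
    return (
        f"<html><head><title>{escape(bcv)}</title></head>"
        f"<body>{body}</body></html>"
    )
```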

Prerequisites

  • Python 3.13+
  • Ollama server running locally with Gemma 3 12B model (for /ol/ endpoint)
  • Google Gemini API key (for /ggl/ endpoint)

Installation

  1. Clone the repository:
git clone https://github.com/balain/bible-context-server.git
cd bible-context-server
  2. Set up a virtual environment:
python -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate
  3. Install dependencies:
pip install fastapi uvicorn requests markdown google-generativeai python-dotenv
  4. Create a .env file in the project root with your Google Gemini API key:
GEMINI_API_KEY=your_api_key_here
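
Inside the server, python-dotenv makes this key available through the environment. A minimal sketch of the lookup (the helper name is illustrative, not taken from server.py):

```python
import os

def load_gemini_key(env=os.environ) -> str:
    # server.py presumably calls dotenv.load_dotenv() first, which copies
    # the .env entries into os.environ; here we just read the variable.
    key = env.get("GEMINI_API_KEY", "")
    if not key:
        raise RuntimeError("GEMINI_API_KEY missing - add it to your .env file")
    return key
```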

Usage

  1. Start the Ollama server locally (if using the /ol/ endpoint)

  2. Run the FastAPI server:

uvicorn server:app --reload
  3. Access the API endpoints:
    • Ollama endpoint: http://localhost:8000/ol/{book_chapter_verse}
    • Gemini endpoint: http://localhost:8000/ggl/{book_chapter_verse}

Example:

http://localhost:8000/ol/John 3:16
http://localhost:8000/ggl/Romans 8:28
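
The path parameter is a free-form reference like "John 3:16" (a browser will percent-encode the space, and FastAPI decodes it back). If the server needed the parts separately, a parse could look like this hypothetical helper; the server may well pass the raw string straight to the model:

```python
import re

def parse_bcv(bcv: str):
    """Split a reference like 'John 3:16' or '1 John 2:1'
    into (book, chapter, verse)."""
    m = re.fullmatch(r"(.+?)\s+(\d+):(\d+)", bcv.strip())
    if not m:
        raise ValueError(f"Unrecognized reference: {bcv!r}")
    book, chapter, verse = m.groups()
    return book, int(chapter), int(verse)
```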

API Endpoints

/ol/{bcv}

Uses a local Ollama server with the Gemma 3 12B model to provide Bible verse context.
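
Under stated assumptions (Ollama's default port 11434, the model tag gemma3:12b, and an illustrative prompt), the call behind this endpoint might look like:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default port

def build_payload(bcv: str) -> dict:
    # Model tag and prompt wording are assumptions, not read from server.py.
    return {
        "model": "gemma3:12b",
        "prompt": f"Provide context for the Bible verse {bcv} (NASB 1995).",
        "stream": False,  # return one JSON object instead of a stream
    }

def ask_ollama(bcv: str) -> str:
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(bcv)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```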

/ggl/{bcv}

Uses the Google Gemini API to provide Bible verse context. Requires a valid API key in the .env file.
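
A sketch of the Gemini side using the google-generativeai client from the dependency list; the model name and prompt wording below are assumptions, not read from server.py:

```python
def make_gemini_prompt(bcv: str) -> str:
    # Illustrative prompt; the real wording lives in server.py.
    return f"Provide context for the Bible verse {bcv} (NASB 1995 translation)."

def ask_gemini(bcv: str, api_key: str) -> str:
    import google.generativeai as genai  # installed in the dependency step
    genai.configure(api_key=api_key)
    model = genai.GenerativeModel("gemini-1.5-flash")  # model name is an assumption
    return model.generate_content(make_gemini_prompt(bcv)).text
```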

Development

The project is built with:

  • FastAPI - web framework serving the endpoints
  • Uvicorn - ASGI server
  • Requests - HTTP client for the local Ollama API
  • Markdown - converts model output to HTML
  • google-generativeai - Google Gemini API client
  • python-dotenv - loads the API key from .env

License

MIT License

Contributing

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add some amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request
