A small desktop GUI app to upload files, preview data, generate basic statistics, create embeddings (via Ollama), and ask data-driven questions using retrieval-augmented generation (RAG).
- Upload and preview files
- Basic dataset summary and numeric statistics
- Generate embeddings for efficient retrieval
- Ask questions or click "Summarize" to get concise AI-driven insights
- Switch models from the UI (e.g., gemma3:1b, llama3.2, etc.)
- Python 3.8+
- Ollama installed and available on PATH (the app will attempt to start it)
- Python packages: `customtkinter`, `requests`, `numpy` (install with pip)
- Ensure Ollama is installed and accessible (see https://ollama.ai).
- Install the Python packages:
  `pip install customtkinter requests numpy`
- Run the app:
  `python main.py`
- In the UI: Upload a file → Generate Embeddings (optional; recommended for large files) → Ask questions or click Summarize (a sketch of the underlying chat request follows these steps).
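
When you ask a question or click Summarize, the app sends your prompt to a local Ollama model. Below is a minimal sketch of such a request, assuming Ollama's standard `/api/chat` endpoint; the `ask()` helper, the prompts, and the default model are illustrative and not necessarily the exact code in `Summarizer.py`.

```python
# Minimal sketch of an "ask a question" request against Ollama's /api/chat
# endpoint. ask(), the prompts, and the default model are illustrative only.
import requests

OLLAMA_ADDRESS = "http://localhost:11434"  # Ollama's default address; adjust if needed


def ask(question: str, context_rows: str, model: str = "gemma3:1b") -> str:
    """Send retrieved (or sample) CSV rows plus the question to the model."""
    resp = requests.post(
        f"{OLLAMA_ADDRESS}/api/chat",
        json={
            "model": model,
            "messages": [
                {"role": "system", "content": "Answer using only the provided CSV rows."},
                {"role": "user", "content": f"Rows:\n{context_rows}\n\nQuestion: {question}"},
            ],
            "stream": False,
        },
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["message"]["content"]
```

Switching models in the UI would simply change the `model` field of such a request.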
- Remember to set `OLLAMA_ADDRESS` to the relevant host and port (see the snippet below).
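
For reference, a default Ollama install listens on `http://localhost:11434`. The snippet below is a hypothetical placement of the constant plus a quick connectivity check; the actual file and variable layout in this project may differ.

```python
import requests

# Hypothetical placement of the constant; localhost:11434 is Ollama's documented
# default. Point this at another host/port if your server runs elsewhere.
OLLAMA_ADDRESS = "http://localhost:11434"

# Quick connectivity check: Ollama's root endpoint replies "Ollama is running".
print(requests.get(OLLAMA_ADDRESS, timeout=2).text)
```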
- The app currently supports only CSV files; support for other formats will be added later.
- Generating embeddings enables full-data retrieval for more accurate answers; without them, the app falls back to sample rows (see the retrieval sketch below).
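
For orientation, the retrieval step could look like the sketch below: embed each CSV chunk once, embed the question, and select the most similar chunks by cosine similarity. It assumes Ollama's `/api/embeddings` endpoint and an embedding model such as `nomic-embed-text`; the helper names and chunking details are illustrative, not the exact code in `CSVParser.py`.

```python
# Sketch of embedding-based retrieval over pre-chunked CSV text; helper names,
# the embedding model, and the top-k value are illustrative assumptions.
from typing import List

import numpy as np
import requests

OLLAMA_ADDRESS = "http://localhost:11434"  # assumed default address


def embed(text: str, model: str = "nomic-embed-text") -> np.ndarray:
    """Request an embedding for one piece of text from Ollama."""
    resp = requests.post(
        f"{OLLAMA_ADDRESS}/api/embeddings",
        json={"model": model, "prompt": text},
        timeout=60,
    )
    resp.raise_for_status()
    return np.array(resp.json()["embedding"], dtype=np.float32)


def top_chunks(question: str, chunks: List[str], chunk_vecs: np.ndarray, k: int = 5) -> List[str]:
    """Rank pre-embedded CSV chunks by cosine similarity to the question."""
    q = embed(question)
    q /= np.linalg.norm(q)
    vecs = chunk_vecs / np.linalg.norm(chunk_vecs, axis=1, keepdims=True)
    scores = vecs @ q
    return [chunks[i] for i in np.argsort(scores)[::-1][:k]]
```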
- The app starts and controls an Ollama server process via `OllamaServer.py` and uses `ollama` for embeddings and chat (see the startup sketch below).
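
A rough sketch of what starting the server might involve, assuming the `ollama serve` command is on PATH (the real `OllamaServer.py` may structure this differently): launch the process, then poll the HTTP endpoint until it answers.

```python
# Sketch of starting Ollama and waiting for it to accept requests; the class
# and method layout of the real OllamaServer.py may differ.
import subprocess
import time

import requests

OLLAMA_ADDRESS = "http://localhost:11434"  # assumed default address


def start_ollama(timeout: float = 30.0) -> subprocess.Popen:
    """Launch `ollama serve` and block until the HTTP endpoint responds."""
    proc = subprocess.Popen(
        ["ollama", "serve"],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            if requests.get(OLLAMA_ADDRESS, timeout=2).ok:
                return proc
        except requests.RequestException:
            time.sleep(0.5)
    proc.terminate()
    raise RuntimeError("Ollama server did not become ready in time")
```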
Project files:
- `Summarizer.py` - GUI and core logic
- `CSVParser.py` - CSV parsing, chunking, embeddings, simple stats
- `OllamaServer.py` - manages the Ollama server process