Secure Inference, Local Operations
A privacy-focused local AI workbench for activists, journalists, and creatives. SILO brings the power of modern AI to your desktop while keeping everything completely local. Your data never leaves your machine.
```
┌─────────────────────────────────────────────────────────────┐
│ SILO                                        [STEADY]  [⚙️]  │
├─────────────────────────────────────────────────────────────┤
│                                                             │
│  ┌─────────┐  ┌─────────┐  ┌─────────┐  ┌─────────┐         │
│  │   💬    │  │   📄    │  │   🖼️    │  │   📝    │         │
│  │  Chat   │  │ Analyze │  │ Describe│  │  Write  │         │
│  │         │  │ Document│  │ Images  │  │ Content │         │
│  └─────────┘  └─────────┘  └─────────┘  └─────────┘         │
│                                                             │
│  ┌─────────────────────────────────────────────────────┐    │
│  │ 💬 Ask anything...                          [Send]  │    │
│  └─────────────────────────────────────────────────────┘    │
└─────────────────────────────────────────────────────────────┘
```
- 100% Local & Private — All AI processing happens on your device. No cloud, no telemetry, no data collection
- Hardware-Aware — Automatically detects your system capabilities and recommends appropriate models
- Pipeline System — Build custom AI workflows with a visual builder or JSON schema
- Multi-Model Support — Language models, vision models, and audio transcription
- Built-in Workflows — Pre-configured pipelines for common tasks (chat, document analysis, image description, content writing, research, summarization)
- Chat with History — Persistent conversation history with model switching
- File Processing — Analyze documents, images, and audio files locally
- Beautiful UI — Brutalist industrial design with a print-first aesthetic
- Node.js 18+
- Ollama — Install from ollama.com
```bash
# Clone the repository
git clone https://github.com/your-org/silo.git
cd silo

# Install dependencies
npm install

# Start development server
npm run dev
```
- SILO will detect your hardware and assign a performance tier
- Follow the setup wizard to install recommended AI models
- Start chatting or select a pipeline from the home screen
SILO automatically categorizes your system into performance tiers:
| Tier | VRAM | RAM | Recommended Models |
|---|---|---|---|
| LEAN | < 6GB | < 16GB | Small models (3B params) |
| STEADY | 6-12GB | 16-32GB | Medium models (7B params) |
| HEAVY | 12-24GB | 32-64GB | Large models (14B params) |
| SURPLUS | 24GB+ | 64GB+ | Very large models (70B+ params) |
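The tier logic in the table can be sketched as a pure function. This is a hypothetical illustration (the names `Tier` and `classifyTier` are not from the codebase; the real detection lives in electron/main/hardware.ts and may weigh resources differently). The sketch assumes the weaker of VRAM and RAM bounds the tier:

```typescript
// Performance tiers, ordered from weakest to strongest.
type Tier = "LEAN" | "STEADY" | "HEAVY" | "SURPLUS";

const TIER_ORDER: Tier[] = ["LEAN", "STEADY", "HEAVY", "SURPLUS"];

// Classify by each resource separately, then take the weaker result,
// mirroring the thresholds in the table above.
function classifyTier(vramGB: number, ramGB: number): Tier {
  const byVram: Tier =
    vramGB >= 24 ? "SURPLUS" : vramGB >= 12 ? "HEAVY" : vramGB >= 6 ? "STEADY" : "LEAN";
  const byRam: Tier =
    ramGB >= 64 ? "SURPLUS" : ramGB >= 32 ? "HEAVY" : ramGB >= 16 ? "STEADY" : "LEAN";
  return TIER_ORDER[Math.min(TIER_ORDER.indexOf(byVram), TIER_ORDER.indexOf(byRam))];
}
```

For example, a machine with 32 GB of VRAM but only 8 GB of RAM would still land in LEAN, since the scarcer resource constrains which models can run.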
| Pipeline | Description |
|---|---|
| 💬 Chat | Free-form conversation with AI |
| 📄 Analyze Document | Extract info and summarize documents |
| 🖼️ Describe Images | Generate detailed image descriptions |
| 📝 Write Content | Create articles, emails, and more |
| 🔍 Research | Deep-dive Q&A on any topic |
| 📊 Summarize Data | Condense documents into key points |
| 🎨 Creative Brief | Generate creative concepts |
| 🎙️ Transcribe | Audio to text (coming soon) |
SILO includes four ways to create custom AI workflows:
- A step-by-step wizard for simple single-step pipelines
- A form-based builder for complex workflows with multiple AI steps and conditions
- Plain-language generation: describe what you need, and SILO generates the pipeline for you
- Direct editing of pipeline JSON for power users
See docs/PIPELINES.md for detailed pipeline documentation.
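To give a feel for the JSON route, here is a hypothetical pipeline definition. The field names (`id`, `name`, `steps`, `type`, `model`, `prompt`, `{{input}}`) are illustrative assumptions, not the actual schema — see docs/PIPELINES.md for the real one:

```json
{
  "id": "summarize-report",
  "name": "Summarize Report",
  "steps": [
    {
      "type": "llm",
      "model": "llama3.2:3b",
      "prompt": "Summarize the following document into five key points:\n\n{{input}}"
    }
  ]
}
```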
```
silo/
├── electron/                # Electron main process
│   ├── main/
│   │   ├── index.ts         # Main entry, IPC handlers
│   │   ├── hardware.ts      # Hardware detection
│   │   ├── ollama.ts        # Ollama service wrapper
│   │   └── settings.ts      # Persistent settings
│   └── preload/
│       ├── index.ts         # Preload script
│       └── index.d.ts       # TypeScript declarations
├── src/                     # Vue renderer process
│   ├── assets/
│   │   └── main.css         # Design system & styles
│   ├── components/
│   │   ├── chat/            # Chat interface components
│   │   ├── flows/           # Pipeline builder components
│   │   ├── home/            # Home screen components
│   │   └── settings/        # Settings panel components
│   ├── composables/         # Vue composables
│   ├── lib/
│   │   └── flows/           # Pipeline schema & execution
│   ├── stores/              # Pinia state stores
│   ├── App.vue              # Root component
│   └── main.ts              # Renderer entry
├── package.json
└── electron.vite.config.ts
```
- Framework: Electron + Vue 3
- Build Tool: electron-vite
- State Management: Pinia
- Styling: Tailwind CSS v4
- AI Backend: Ollama
- Language: TypeScript
```bash
npm run dev          # Start development server
npm run build        # Build for production
npm run build:mac    # Build macOS app
npm run build:win    # Build Windows app
npm run build:linux  # Build Linux app
npm run typecheck    # Run TypeScript checks
```
Settings are stored in ~/.silo/settings.json:
```json
{
  "theme": "dark",
  "defaultLanguageModel": "llama3.2:3b",
  "defaultVisionModel": "llava:7b",
  "sendOnEnter": true,
  "showTimestamps": true,
  "maxChatHistory": 100
}
```
SILO is designed with privacy as a core principle:
- No Network Calls: Except to your local Ollama instance
- No Telemetry: Usage data is never collected
- No Cloud: All processing happens on your machine
- Local Storage: Data stored in ~/.silo/
- Open Source: Audit the code yourself
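That one exception — the local Ollama instance — talks over Ollama's HTTP API on its default port 11434. A minimal sketch of what such a request looks like (`buildGenerateRequest` is a hypothetical helper, not the actual wrapper in electron/main/ollama.ts):

```typescript
// Ollama's documented local endpoint for one-shot text generation.
const OLLAMA_URL = "http://localhost:11434/api/generate";

// Build the request payload; stream: false asks for a single JSON reply
// instead of a stream of chunks.
function buildGenerateRequest(model: string, prompt: string) {
  return {
    url: OLLAMA_URL,
    body: JSON.stringify({ model, prompt, stream: false }),
  };
}

// Usage (requires a running Ollama instance with the model pulled):
// const { url, body } = buildGenerateRequest("llama3.2:3b", "Hello");
// const res = await fetch(url, { method: "POST", body });
// const { response } = await res.json();
```

Note the destination is always localhost — nothing here addresses a remote host.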
We welcome contributions! Please see CONTRIBUTING.md for guidelines.
- Fork and clone the repository
- Install dependencies: npm install
- Start dev server: npm run dev
- Make your changes
- Run type checks: npm run typecheck
- Submit a pull request
See docs/ARCHITECTURE.md for detailed technical documentation.
- Streaming responses
- Audio transcription with Whisper
- Pipeline marketplace / sharing
- Plugin system
- Light theme
- Keyboard shortcuts
- Multi-language support
MIT License — see LICENSE for details.
- Ollama for making local LLMs accessible
- electron-ollama for Electron integration
- The open-source AI community
SILO — Your data stays with you.