docs: finalize AnythingLLM docker-compose and deployment guide#11

Open
mshahid538 wants to merge 1 commit into main from droplet-deployment-guide

Conversation

@mshahid538
Contributor

No description provided.

@mshahid538 mshahid538 self-assigned this Feb 20, 2026
Member

@romandidomizio romandidomizio left a comment

═══════════════════════════════════════════════════════════════════════════════
🏐 #ContextVolley | @rmn@shd | 2026-03-16 | W12
═══════════════════════════════════════════════════════════════════════════════

📐 CCC Metadata

| Field | Value |
|---|---|
| CCC-ID | RMN_2026-W12_035 |
| CCC Version | 3.1.3.1 |
| Context Type | 📋 TASK-BLOCKER |
| Target | @shd (DevOps) |
| Priority | 🔴 P0 |
| Handoff Protocol | SEEK:CONFIRM → BUILD → PR-UPDATE |

🎯 ISSUE SUMMARY

Your PR in weownnetwork/ai/anythingllm/docker contains ethdenver-specific config, only basic documentation, an .env.example that doesn't match our target variables, and no deployment script. This blocks team replication: we need a repeatable framework/template with generic, reusable deployment tooling for all future instances.


📦 REQUIRED FIXES

1. Remove Instance-Specific Naming

| Current | Required |
|---|---|
| `anythingllm_ethdenver` | `<CONTAINER_NAME>` (user input) |
| `/root/ethdenver_storage` | `<HOST_PATH>` (user input) |
| `ETHDenver.CCC.bot` | `<DOMAIN>` (user input) |

2. Create Unified Deploy Script (deploy.sh)

Script must handle:

# Pre-flight Checks
✓ Docker Engine installed on target droplet
✓ Docker Compose available
✓ User authenticated to correct droplet (SSH/DO API)

# Interactive Prompts
✓ Container name (no defaults)
✓ Host storage path
✓ Domain for exposure (Caddy/Traefik)
✓ LLM Provider (default: openrouter, NOT openai)
✓ LLM Model (user selection)
✓ Embedding Model (user selection)
✓ API Key (secure injection via Infisical or prompt)
✓ Port mapping

# Secure Env Handling
✓ Generate .env from prompts (no hardcoded values)
✓ Sanitized .env.example for GitHub
✓ Inject secrets at runtime (not baked into compose)

# Deployment Execution
✓ docker compose up -d
✓ SSL verification (Caddy auto-HTTPS)
✓ Health check confirmation

3. Reference Pattern

Use our AnythingLLM Helm Chart deploy script as the template logic. Adapt it for the Docker droplet context (no Kubernetes; single-node Docker).

4. File Structure

```
anythingllm/docker/
├── docker-compose.yml      # Generic, no hardcoded names
├── .env.example            # Sanitized template
├── deploy.sh               # Unified deployment script
├── README.md               # Usage instructions + pre-reqs
└── .env                    # Runtime only (NOT committed)
```
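Under that structure, the generic docker-compose.yml could look something like the following. This is a sketch: the image tag, internal port, and storage mount path are assumptions to be confirmed against the AnythingLLM image docs, and every instance-specific value is injected from the runtime-generated .env:

```yaml
# docker-compose.yml sketch -- no hardcoded instance names
services:
  anythingllm:
    image: mintplexlabs/anythingllm:latest
    container_name: ${CONTAINER_NAME}
    ports:
      - "${HOST_PORT:-3001}:3001"
    env_file:
      - .env
    volumes:
      - ${STORAGE_PATH}:/app/server/storage
    restart: unless-stopped
```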

🚫 WHAT TO REMOVE

| File | Issue | Action |
|---|---|---|
| docker-compose.yml | Hardcoded `anythingllm_ethdenver` | Replace with variable injection |
| .env.example | `LLM_PROVIDER=openai` | Change default to `openrouter` |
| README.md | ethdenver-specific docs | Generalize for any instance |
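A sanitized .env.example reflecting those changes might look like the following. The key names are illustrative and should match whatever the compose file actually consumes:

```
# .env.example -- sanitized template, safe to commit
CONTAINER_NAME=            # set by deploy.sh prompt
STORAGE_PATH=              # host path for persistent storage
DOMAIN=                    # domain exposed via Caddy/Traefik
LLM_PROVIDER=openrouter    # default provider (NOT openai)
LLM_MODEL=                 # user selection
EMBEDDING_MODEL=           # user selection
API_KEY=                   # injected at runtime; never commit real keys
```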

✅ ACCEPTANCE CRITERIA

| # | Criterion | Status |
|---|---|---|
| 1 | Script runs on any DO Docker droplet | ☐ |
| 2 | No hardcoded instance names in config | ☐ |
| 3 | Prompts for all required variables | ☐ |
| 4 | Secure API key handling (Infisical-ready) | ☐ |
| 5 | OpenRouter as default provider | ☐ |
| 6 | README covers pre-reqs + usage | ☐ |
| 7 | Single script (no fragmented commands) | ☐ |
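Criterion 2 can be spot-checked mechanically before review. A hedged sketch (the file list and search string are assumptions; extend both as needed):

```shell
#!/bin/bash
# Sketch: flag instance-specific strings left in tracked config files.
set -euo pipefail

check_files() {
  # Returns non-zero if any given file still contains the pattern
  local found=0 f
  for f in "$@"; do
    if [ -f "$f" ] && grep -qi 'ethdenver' "$f"; then
      echo "hardcoded reference found in $f" >&2
      found=1
    fi
  done
  return "$found"
}

check_files docker-compose.yml .env.example README.md \
  && echo "OK: no instance-specific names found" \
  || echo "FAIL: remove instance-specific names" >&2
```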

📝 README.md ADDENDUM — SYSTEM UPDATES & CONFIG

Purpose: Standardize update procedures, resource requirements, and extension configs (MCP/Env).
Location: Append to README.md or create docs/DEPLOYMENT.md.


🔄 System Updates & Configuration

1. Automated Update Script

To ensure all services are pulled and restarted cleanly, use the provided update script.

```shell
#!/bin/bash
# ./scripts/update.sh
set -euo pipefail  # stop on the first failed step

echo "🔄 Pulling latest changes..."
git pull origin main
echo "🐳 Restarting containers..."
docker compose down
docker compose up -d --pull always
echo "✅ Update complete."
```

Usage:

```shell
chmod +x scripts/update.sh
./scripts/update.sh
```

2. AnythingLLM (Docker Self-Hosted)

We use AnythingLLM for document retrieval and agent context, with inference and embedding offloaded to external providers. Ensure your host meets the minimum requirements before deployment.

| Resource | Minimum | Recommended |
|---|---|---|
| RAM | 2 GB | 4 GB+ |
| CPU | 1 core | 2 cores+ |
| Storage | 10 GB | 50 GB+ (SSD) |
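These minimums could be enforced by the deploy script's pre-flight stage. A Linux-only sketch, with thresholds taken from the table above (warn-only behavior is an assumption; a real script might abort instead):

```shell
#!/bin/bash
# Sketch: compare host resources against the documented minimums.
# Linux-specific (/proc/meminfo, nproc, GNU df).
set -euo pipefail

ram_mb=$(awk '/MemTotal/ {print int($2/1024)}' /proc/meminfo)
cpu_cores=$(nproc)
disk_gb=$(df -BG --output=avail / | tail -n 1 | tr -dc '0-9')

[ "$ram_mb" -ge 2048 ]  || echo "warning: ${ram_mb} MB RAM < 2 GB minimum" >&2
[ "$cpu_cores" -ge 1 ]  || echo "warning: ${cpu_cores} cores < 1 core minimum" >&2
[ "$disk_gb" -ge 10 ]   || echo "warning: ${disk_gb} GB free < 10 GB minimum" >&2

echo "resource check done: ${ram_mb} MB RAM, ${cpu_cores} core(s), ${disk_gb} GB free"
```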

📖 Official Docs: AnythingLLM Docker Requirements

3. Environment Variables

Configure core system behavior via .env. Key variables include, for example:

```shell
# Community Hub Configuration
# Enable agent skill imports from AnythingLLM Hub
# "1" = Allow verified/private items only (recommended for enterprise)
# "allow_all" = Allow all items including unverified (not recommended)
COMMUNITY_HUB_BUNDLE_DOWNLOADS_ENABLED="1"

# MCP Configuration
MCP_SERVER_ENABLED=true
MCP_CONFIG_PATH=/etc/mcp/servers.json
```

4. Custom MCP Servers

To add custom Model Context Protocol (MCP) servers, edit the MCP configuration file, for example:

File: /etc/mcp/servers.json

```json
{
  "mcpServers": {
    "custom-tool": {
      "command": "node",
      "args": ["/app/tools/custom-server.js"],
      "env": {
        "API_KEY": "${YOUR_API_KEY}"
      }
    }
  }
}
```

Restart Required: After modifying MCP configs, restart the agent service:

```shell
docker compose restart agent
```
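Because a malformed servers.json only surfaces after the restart, it is worth validating the file first. A sketch using python3's stdlib JSON checker (the config path matches the example above; wiring this into the restart flow is an assumption):

```shell
#!/bin/bash
# Sketch: validate MCP config JSON before restarting the agent service.
set -euo pipefail

validate_mcp_config() {
  # Returns non-zero if the file is not valid JSON
  python3 -m json.tool "$1" >/dev/null 2>&1
}

CONFIG="${MCP_CONFIG_PATH:-/etc/mcp/servers.json}"
if [ ! -f "$CONFIG" ]; then
  echo "no config at $CONFIG; nothing to validate"
elif validate_mcp_config "$CONFIG"; then
  echo "config OK -- safe to run: docker compose restart agent"
else
  echo "invalid JSON in $CONFIG -- fix before restarting" >&2
fi
```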

📬 HANDOFF

Action Required:

  1. Review this volley
  2. Update PR with generic config + deploy.sh
  3. Remove ethdenver-specific references
  4. Test against fresh droplet (not ethdenver instance)
  5. Tag @rmn for review before merge

Blocker Status: 🔴 PR cannot merge until corrected

═══════════════════════════════════════════════════════════════════════════════
