# Wiki Forge

*Your codebase remembers. Even when your team forgets.*

Local-first · Works offline · Your code stays yours

Wiki Forge is a decision provenance engine. It compiles your source code, git history, and PRs into a living knowledge base — not just docs, but the why behind every engineering decision. When code changes, it detects drift and rewrites only what changed.
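The drift detection described above can be sketched as follows. This is a minimal illustration of the idea, not Wiki Forge's actual implementation: assume each compiled doc records a content hash of the source files it was compiled from, and a doc has "drifted" when any recorded hash no longer matches.

```python
import hashlib

def file_hash(content: str) -> str:
    """Fingerprint a source file's content."""
    return hashlib.sha256(content.encode()).hexdigest()

def drifted_docs(doc_map: dict, sources: dict) -> list:
    """Return docs whose recorded source hashes no longer match.

    doc_map: doc name -> {source path: hash recorded at compile time}
    sources: source path -> current file content
    """
    stale = []
    for doc, recorded in doc_map.items():
        for path, old_hash in recorded.items():
            if file_hash(sources.get(path, "")) != old_hash:
                stale.append(doc)
                break  # one changed source is enough to mark the doc
    return stale

doc_map = {
    "ARCHITECTURE.md": {"src/app.py": file_hash("v1")},
    "PAYMENTS.md": {"src/pay.py": file_hash("fees")},
}
sources = {"src/app.py": "v2", "src/pay.py": "fees"}
print(drifted_docs(doc_map, sources))  # ['ARCHITECTURE.md']
```

Only the docs flagged as drifted are recompiled, which is what keeps incremental runs cheap.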

*Wiki Forge architecture — inputs, two-pass compilation, three output layers*


## Why

Software engineering has 23-25% annual turnover. Each departure costs 4-8 weeks of delivery time — not because the code is lost, but because the context behind it is. Why was this module built this way? What incident prompted the retry logic? Who knows how the payment flow actually works?

Wiki Forge treats institutional memory as a compiled artifact. The code + git history is the source of truth, the LLM is the compiler, and three types of output serve three audiences:

| Output | Audience | What it contains |
|---|---|---|
| Wiki pages | PMs, designers, new engineers | Business rules, architecture, decision context |
| AI context files | Claude Code, Cursor, Copilot | CLAUDE.md, AGENTS.md, llms.txt |
| Knowledge risk reports | Engineering managers | Bus factor per module, onboarding readiness |
Compared with manual docs, a RAG chatbot, or Google Code Wiki, Wiki Forge gives you:

- Output you own
- Version-controlled docs
- Works offline / air-gapped
- Decision archaeology
- Auto-updates
- Changes reviewable as a PR
- Cost per compile (vs. ongoing time for manual docs, per-query pricing for a RAG chatbot, freemium for Google Code Wiki)

## Your code stays yours

With `--provider local`, Wiki Forge pipes prompts through your local `claude` or `ollama` CLI. Nothing leaves your machine.

| Provider | Where your code goes |
|---|---|
| `gemini` / `claude` / `openai` | API call to vendor |
| `local` | Stays on your machine, always |

## Three ways to use it

### 1. GitHub Action (for CI/CD)

Automatically compiles docs on every push to `main`:

```yaml
# .github/workflows/docs-sync.yml
name: Docs Sync

on:
  push:
    branches: [main]
    paths-ignore: ["docs/**"]

permissions:
  contents: write
  pull-requests: write

jobs:
  compile:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0

      - uses: thevrus/wiki-forge@v1
        id: forge
        with:
          api_key: ${{ secrets.GEMINI_API_KEY }}

      - uses: peter-evans/create-pull-request@v7
        if: steps.forge.outputs.docs_changed == 'true'
        with:
          title: "docs: sync with codebase changes"
          body: |
            **Updated:** ${{ steps.forge.outputs.updated_docs }}

            ${{ steps.forge.outputs.diff_summary }}
          branch: docs/auto-sync
          delete-branch: true
```

### 2. Claude Code slash commands

```shell
npx wiki-forge install-commands
```

That's it. No cloning, no config. Then, in any project opened in Claude Code:

```shell
/wf-init                     # interview + scan → creates .doc-map.json
/wf-compile --force          # compile all docs from scratch
/wf-compile                  # incremental — only recompile drifted docs
/wf-check                    # preview what drifted (read-only)
/wf-health                   # check human-written docs for contradictions
/wf-query "how do fees work" # ask questions, save answers as wiki pages
/wf-brief                    # weekly shipping brief for leadership
```

No API key needed — Claude Code is the LLM.

### 3. MCP server (for AI assistants)

Give Claude direct access to your compiled wiki. One server reads whatever repos it's pointed at:

```shell
# Local repo (engineer with git checkout)
claude mcp add wiki-forge -- wiki-forge-mcp --repo ./docs

# Remote repo (PM, no git needed)
claude mcp add wiki-forge -- wiki-forge-mcp --github acme/platform

# Multi-repo
claude mcp add wiki-forge -- wiki-forge-mcp --repo /path/to/repo1/docs --repo /path/to/repo2/docs
```
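For Claude Desktop, the same server can be registered in `claude_desktop_config.json` using the standard `mcpServers` format. The command and flags below mirror the CLI examples above; treat the path as a placeholder for your own checkout:

```json
{
  "mcpServers": {
    "wiki-forge": {
      "command": "wiki-forge-mcp",
      "args": ["--repo", "/path/to/repo/docs"]
    }
  }
}
```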

Claude gets four tools:

| Tool | What it does |
|---|---|
| `wiki_forge_why` | "Why is this file this way?" — maps a source file to its wiki page, returns decision context |
| `wiki_forge_who` | "Who has context?" — returns ranked contributors with ownership % and bus factor |
| `wiki_forge_search` | Search the compiled wiki by keyword — returns matching pages with excerpts |
| `wiki_forge_status` | Brain health dashboard — coverage metrics, knowledge risk, action items |

No LLM calls, no database, no accounts. The MCP server reads markdown files — the compiled wiki is the product, the server is just the read layer. Works across all Claude surfaces: Claude Code, Cowork, Desktop.
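As an illustration of the kind of ownership math behind `wiki_forge_who`, here is a minimal sketch. The approach is an assumption, not the tool's actual algorithm: ownership is each contributor's share of lines touched in a module, and the bus factor is the smallest number of contributors whose combined share covers a majority of it.

```python
from collections import Counter

def ownership(commits: list) -> dict:
    """Share of lines touched per author. commits: list of (author, lines_changed)."""
    totals = Counter()
    for author, lines in commits:
        totals[author] += lines
    grand = sum(totals.values())
    return {author: n / grand for author, n in totals.items()}

def bus_factor(commits: list, threshold: float = 0.5) -> int:
    """Smallest number of top contributors whose combined share exceeds threshold."""
    shares = sorted(ownership(commits).values(), reverse=True)
    covered, count = 0.0, 0
    for share in shares:
        covered += share
        count += 1
        if covered > threshold:
            break
    return count

commits = [("alice", 700), ("bob", 200), ("carol", 100)]
print(ownership(commits)["alice"])  # 0.7
print(bus_factor(commits))          # 1, since alice alone covers >50%
```

A bus factor of 1 means a single departure takes most of the module's context with it, which is exactly the risk the reports surface.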


## What it produces

```
docs/
  .doc-map.json           # config — maps docs to source directories
  INDEX.md                # master index with summaries
  _status.md              # brain health dashboard
  ARCHITECTURE.md         # compiled docs (user-defined)
  entities/               # auto-extracted services, APIs, models
  concepts/               # auto-extracted cross-cutting themes
  _reports/               # weekly engineering digests
```
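The `.doc-map.json` schema is covered in the configuration reference; as a rough illustration, a minimal map pairing each doc with the source directories it is compiled from might look like this (the field names here are assumptions, not the actual schema):

```json
{
  "docs": {
    "ARCHITECTURE.md": { "sources": ["src/", "infra/"] },
    "entities/payments.md": { "sources": ["src/payments/"] }
  }
}
```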

Each compiled doc includes YAML frontmatter, Mermaid diagrams, and inline citations, and is written for PMs — not engineers.
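The frontmatter keys below are illustrative (the actual fields may differ); a compiled page might open with something like:

```yaml
---
title: Payments Flow
sources: [src/payments/]
compiled_at: 2025-01-15
drift_status: clean
---
```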

- How it works — init interview, two-pass compilation, structured output, queries, health checks
- Configuration & CLI reference — doc map schema, LLM providers, all commands and flags


## Examples

See the `examples/` directory.


## Used by

Using Wiki Forge? Open a PR to add your project here.


## Contributing

See `CONTRIBUTING.md` for guidelines.

## License

MIT
