goosecode

A Claude Code-like thingy, vibed by GPT-5.4 in C, just for fun.

goosecode is a local AI coding agent with a Go-based TUI (terminal user interface) and a C backend. It provides an OpenAI-compatible API client, a tool loop, slash commands, sessions, subagents, MCP support, and a terminal-first workflow.

It works with:

  • local OpenAI-compatible servers such as Ollama, vLLM, llama.cpp, ik-llama, LM Studio, text-generation-webui proxies, or custom gateways
  • hosted OpenAI-compatible providers such as OpenAI, Together, Fireworks, Groq, or self-hosted proxies

What It Can Do

  • TUI Mode (default): Interactive terminal UI with bubbletea
    • Color-coded Plan/Build mode toggle (Tab key)
    • Scrollable chat history
    • Visible cursor with mode-specific colors
  • REPL Mode (fallback): Legacy interactive REPL with multiline input
  • One-shot prompt mode from the shell
  • File editing and shell execution
  • Task tracking and plan mode
  • Resumable subagents and optional git worktrees
  • MCP resource listing/reading
  • LSP queries
  • Local git workflow commands like /branch, /commit, and /review
  • Provider presets and first-run provider setup

Current Surface Area

  • Tools: 29
  • Slash commands: 17

Main tools include:

  • bash
  • read_file, write_file, edit_file
  • glob_search, grep_search
  • web_fetch, web_search
  • todo_write, task_create, task_get, task_list, task_update
  • ask_user_question
  • enter_plan_mode, exit_plan_mode
  • agent
  • list_mcp_resources, read_mcp_resource
  • lsp
  • repl, powershell

Main slash commands include:

  • /help
  • /model
  • /provider
  • /session
  • /compact
  • /plan
  • /config
  • /tasks
  • /branch
  • /commit
  • /review
  • /subagents
  • /permissions
  • /tools
  • /exit

Build

Requirements:

  • gcc with C11 support
  • libcurl
  • pthread support from libc

The project vendors cJSON, so you do not need to install it separately.

Debian / Ubuntu

sudo apt update
sudo apt install build-essential libcurl4-openssl-dev

macOS

brew install curl

If Homebrew curl is not on your default compiler path, set the include/library flags yourself before building.

Build Commands

make        # Build goosecode (TUI) and goosecode-backend
make tui    # Build only the TUI
make clean  # Clean build artifacts
make install    # Install to the user-local path (see below)
make uninstall  # Remove the installed binaries

Build output:

  • ./goosecode - TUI launcher (symlink to goosecode-tui)
  • ./goosecode-tui - TUI binary (Go)
  • ./goosecode-backend - C backend

User-local install path by default:

~/.local/bin/goosecode

If ~/.local/bin is on your PATH, you can then start goosecode from anywhere with:

goosecode

Custom install path example:

make install INSTALL_BINDIR=/usr/local/bin

Usage

Interactive TUI (default)

./goosecode

TUI features:

  • Press Tab to toggle between PLAN and BUILD mode
  • Cursor color changes: yellow (PLAN), green (BUILD)
  • Separator line color matches mode
  • Type messages or use slash commands (e.g., /help, /exit)

Legacy REPL mode

./goosecode --repl

One-shot Prompt

./goosecode "explain the architecture of this project"

Common CLI Flags

--provider <provider>   # provider preset (e.g. ollama, vllm)
--model <model>         # model name to request
--base-url <url>        # OpenAI-compatible base URL
--permission <mode>     # permission mode (see Permissions below)
--max-turns <n>         # cap on agent turns per prompt
--session <id>          # resume a saved session
--help

Example Invocations

# interactive
./goosecode

# one-shot
./goosecode "write a fibonacci function in C"

# override model for one run
./goosecode --model gpt-4o-mini "summarize this repository"

# choose a provider preset for one run
./goosecode --provider ollama

# resume a saved session
./goosecode --session 1775092052_390207107

# set permissive mode for a local sandbox session
./goosecode --permission allow

Connecting To Providers

goosecode talks to any server that exposes an OpenAI-compatible /v1 API.

Built-in Provider Presets

You can switch providers directly inside the REPL:

/provider list
/provider set openai
/provider set ollama
/provider set vllm
/provider set llama.cpp
/provider set ik-llama
/provider test
/model list
/model set <name>

/provider set <provider> prompts for:

  • base URL
  • model name
  • API key (when required; optional for many local servers)

Provider settings are saved per provider, so switching back to a previous provider restores its last saved base_url, model, and api_key instead of overwriting everything.

First-run behavior:

  • if no environment variables or settings files are present, goosecode opens a guided provider setup flow before the REPL starts
  • the REPL banner also shows the active provider, model, and base URL so it is obvious what you are connected to

Environment Variables

export OPENAI_BASE_URL=...
export OPENAI_MODEL=...
export OPENAI_API_KEY=...

OPENAI_API_KEY is optional for many local servers.
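For illustration, here is a minimal sketch of how a backend might read these variables with a fallback. This assumes plain getenv lookups; env_or is a hypothetical helper, not part of goosecode's actual config loader:

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical helper: return the environment variable's value if it is
 * set and non-empty, otherwise a caller-supplied fallback. A loader
 * would call this once per OPENAI_* variable. */
const char *env_or(const char *name, const char *fallback) {
    const char *v = getenv(name);
    return (v && v[0] != '\0') ? v : fallback;
}
```

Because OPENAI_API_KEY is optional, the fallback for it can simply be NULL or an empty string.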

Local Providers

Use this when your model server is on the same machine or LAN.

Examples:

# local gateway on port 8083
export OPENAI_BASE_URL=http://localhost:8083/v1
export OPENAI_MODEL=cyankiwi/Qwen3.5-122B-A10B-AWQ-8bit
./goosecode

# Ollama
export OPENAI_BASE_URL=http://localhost:11434/v1
export OPENAI_MODEL=llama3
./goosecode

# vLLM
export OPENAI_BASE_URL=http://localhost:8000/v1
export OPENAI_MODEL=your-model-name
./goosecode

# llama.cpp or ik-llama
export OPENAI_BASE_URL=http://localhost:8080/v1
export OPENAI_MODEL=your-model-name
./goosecode

# or use the built-in provider presets interactively
./goosecode
# then run /provider set ollama or /provider set vllm

Notes:

  • many local servers ignore OPENAI_API_KEY
  • the base URL should usually end in /v1
  • model names must match what your server exposes
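The "/v1 suffix" convention from the notes above amounts to a simple string check. This is an illustrative sketch; ends_with_v1 is a hypothetical helper, and goosecode itself may normalize base URLs differently:

```c
#include <string.h>

/* Hypothetical check: does the configured base URL end in "/v1"?
 * Useful as a sanity warning before making the first request. */
int ends_with_v1(const char *base_url) {
    size_t n = strlen(base_url);
    return n >= 3 && strcmp(base_url + n - 3, "/v1") == 0;
}
```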

External Providers

Use this when talking to hosted services.

Examples:

# OpenAI
export OPENAI_BASE_URL=https://api.openai.com/v1
export OPENAI_API_KEY=sk-...
export OPENAI_MODEL=gpt-4o
./goosecode

# Together / Fireworks / Groq / any compatible host
export OPENAI_BASE_URL=https://your-provider.example/v1
export OPENAI_API_KEY=...
export OPENAI_MODEL=provider-model-name
./goosecode

Notes:

  • if requests fail, first confirm the endpoint is OpenAI-compatible
  • if streaming behaves oddly, test a simple non-tool prompt first
  • hosted providers usually require OPENAI_API_KEY

Settings Files

Configuration is loaded from:

  • ~/.goosecode/settings.json
  • .goosecode/settings.json

Project settings override user settings where applicable.

Example project config:

{
  "provider": "vllm",
  "base_url": "http://localhost:8083/v1",
  "model": "cyankiwi/Qwen3.5-122B-A10B-AWQ-8bit",
  "permission_mode": "allow",
  "max_turns": 64
}

User settings also keep a provider_profiles map internally so each provider can remember its own last-used values.
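The override behavior can be sketched as a per-key precedence check: project value, then user value, then a built-in default. This is an illustrative simplification, not goosecode's actual loader:

```c
#include <stddef.h>
#include <string.h>

/* Hypothetical merge rule for one settings key: .goosecode/settings.json
 * (project) beats ~/.goosecode/settings.json (user), which beats the
 * compiled-in default. NULL means "not set in that file". */
const char *pick_setting(const char *project_val,
                         const char *user_val,
                         const char *built_in) {
    if (project_val) return project_val;
    if (user_val) return user_val;
    return built_in;
}
```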

Permissions

Supported permission modes:

  • read-only
  • workspace-write
  • danger-full-access
  • prompt
  • allow

Examples:

./goosecode --permission read-only
./goosecode --permission allow

Environment override:

export GOOSECODE_PERMS=allow
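One plausible resolution order for the permission mode, combining the knobs documented here (the --permission flag, the GOOSECODE_PERMS environment variable, and the settings-file permission_mode). The exact precedence and the "prompt" default are assumptions, not confirmed goosecode behavior:

```c
#include <stdlib.h>
#include <string.h>

/* Illustrative sketch: CLI flag wins, then the environment override,
 * then the settings file, then an assumed default of "prompt". */
const char *resolve_permission(const char *cli_flag,
                               const char *settings_mode) {
    if (cli_flag) return cli_flag;               /* --permission <mode> */
    const char *env = getenv("GOOSECODE_PERMS"); /* environment override */
    if (env && env[0] != '\0') return env;
    if (settings_mode) return settings_mode;     /* settings.json */
    return "prompt";                             /* assumed default */
}
```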

Useful REPL Notes

Editor controls:

  • Left / Right: move cursor
  • Up / Down: history recall
  • Tab: complete slash commands
  • Ctrl+A: start of line
  • Ctrl+E: end of line
  • Ctrl+J: insert newline into the current prompt

Examples:

/provider list
/provider set ollama
/provider test
/model list
/model set llama3
/tasks create investigate parser failure
/plan set
1. Reproduce
2. Fix
.
/review

Bash Tool Timeout

The bash tool supports a configurable timeout in seconds.

That matters for commands like:

  • make
  • cargo build
  • docker build
  • long-running test suites

Example tool call shape:

{
  "command": "docker build -t app .",
  "timeout": 1800
}

Notes:

  • default timeout: 120 seconds
  • maximum timeout: 7200 seconds
  • both numeric and numeric-string timeout values are accepted
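The rules above can be sketched as a small normalization function. This is illustrative only, assuming the timeout has already been pulled out of the JSON tool call as a string (which also covers the numeric-string case); the real bash tool's parsing lives in the C backend:

```c
#include <stdlib.h>

enum { TIMEOUT_DEFAULT = 120, TIMEOUT_MAX = 7200 };

/* Hypothetical normalizer: parse a decimal timeout in seconds, fall back
 * to the 120 s default on missing/invalid input, and cap at 7200 s. */
int normalize_timeout(const char *raw) {
    if (raw == NULL || raw[0] == '\0') return TIMEOUT_DEFAULT;
    char *end = NULL;
    long v = strtol(raw, &end, 10);
    if (end == raw || *end != '\0' || v <= 0) return TIMEOUT_DEFAULT;
    return v > TIMEOUT_MAX ? TIMEOUT_MAX : (int)v;
}
```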

Sessions, Tasks, and Subagents

Examples:

/session
/tasks
/tasks create add logging around API failures
/subagents

Stored state lives under:

  • ~/.goosecode/sessions
  • ~/.goosecode/subagents
  • ~/.goosecode/worktrees
  • ~/.goosecode/todos.json

MCP and LSP

MCP

Configure MCP servers in settings:

{
  "mcp_servers": [
    {
      "name": "test",
      "command": "/usr/bin/python3",
      "args": ["/path/to/mcp_server.py"]
    }
  ]
}

Supported MCP tools today:

  • list_mcp_resources
  • read_mcp_resource

LSP

Supported LSP actions today:

  • hover
  • definition
  • document_symbols

The lsp tool can use:

  • default server selection for supported file types
  • explicit server_command and server_args

Architecture

src/
├── main.c              # Entry point and CLI setup
├── agent.c/h           # REPL + turn loop + tool execution
├── api.c/h             # OpenAI-compatible API client
├── config.c/h          # Env + settings file loading
├── session.c/h         # Session persistence
├── permissions.c/h     # Permission checks
├── prompt.c/h          # System prompt assembly
├── commands/           # Slash commands
├── tools/              # Tool implementations
└── util/               # JSON, SSE, terminal, markdown, buffers, HTTP

Development Notes

Helpful commands:

make test
./goosecode --help
./goosecode --permission allow

When testing interactively, prefer a sandbox working directory instead of the main source tree.

License

MIT
