24 changes: 22 additions & 2 deletions .env.example
@@ -1,4 +1,24 @@
-# LLM enrichment for extract (optional)
-# Required only when using `git mem extract --enrich`
+# LLM enrichment for extract and intent extraction (optional)
+# git-mem auto-detects the provider from whichever key is set.
+# Only one provider is needed.
 
+# Anthropic (Claude) — default provider
+# Get your key at: https://console.anthropic.com/
 ANTHROPIC_API_KEY=
+
+# OpenAI (GPT) — requires: npm install openai
+# Get your key at: https://platform.openai.com/api-keys
+OPENAI_API_KEY=
+
+# Google Gemini — requires: npm install @google/generative-ai
+# Get your key at: https://aistudio.google.com/apikey
+GOOGLE_API_KEY=
+# Alternative env var name:
+# GEMINI_API_KEY=
+
+# Ollama (local) — no API key needed, no extra package
+# Just set the host if not using the default localhost:11434
+# OLLAMA_HOST=http://localhost:11434
+
+# Force a specific provider (overrides auto-detection)
+# GIT_MEM_LLM_PROVIDER=anthropic
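The detection order these comments describe can be sketched as a small TypeScript helper. This is purely illustrative: `detectProvider` is a hypothetical function, not git-mem's actual code, and the real priority order may differ.

```typescript
type Provider = 'anthropic' | 'openai' | 'gemini' | 'ollama';

// Hypothetical sketch of key-based provider auto-detection; the real
// git-mem implementation may order or default these differently.
function detectProvider(env: Record<string, string | undefined>): Provider | null {
  const forced = env.GIT_MEM_LLM_PROVIDER;
  if (forced === 'anthropic' || forced === 'openai' || forced === 'gemini' || forced === 'ollama') {
    return forced; // explicit override wins over key detection
  }
  if (env.ANTHROPIC_API_KEY) return 'anthropic'; // default provider
  if (env.OPENAI_API_KEY) return 'openai';
  if (env.GOOGLE_API_KEY || env.GEMINI_API_KEY) return 'gemini';
  if (env.OLLAMA_HOST) return 'ollama'; // local, no key needed
  return null; // no provider configured
}
```

With only one key set, the matching provider is chosen; with none set, callers fall back to heuristic extraction.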
18 changes: 17 additions & 1 deletion CLAUDE.md
@@ -65,7 +65,23 @@ Uses **`node:test`** (native Node.js test runner) with **`tsx`** for TypeScript,
 
 ## Environment Variables
 
-- `ANTHROPIC_API_KEY` — Required only for `git mem extract --enrich` (LLM enrichment). Without it, `--enrich` falls back to heuristic extraction with a warning. See `.env.example`.
+- `ANTHROPIC_API_KEY` — Anthropic (Claude) API key for LLM enrichment and intent extraction.
+- `OPENAI_API_KEY` — OpenAI API key (requires `npm install openai`).
+- `GOOGLE_API_KEY` / `GEMINI_API_KEY` — Google Gemini API key (requires `npm install @google/generative-ai`).
+- `OLLAMA_HOST` — Ollama server URL (default: `http://localhost:11434`). No extra package needed.
+- `GIT_MEM_LLM_PROVIDER` — Force a specific provider: `anthropic`, `openai`, `gemini`, or `ollama`. Auto-detected from API keys if omitted.
+
+Only one provider is needed. Without any LLM key, `--enrich` falls back to heuristic extraction with a warning. See `.env.example`.
+
+**LLM config in `.git-mem/.git-mem.yaml`:**
+
+```yaml
+llm:
+  provider: openai        # auto-detected if omitted
+  model: gpt-4o           # provider default if omitted
+  intentModel: gpt-4o-mini
+  baseUrl: http://localhost:11434   # for ollama
+```
 
 ## Key Technical Details
 
4 changes: 2 additions & 2 deletions package-lock.json

Some generated files are not rendered by default.

8 changes: 8 additions & 0 deletions package.json
@@ -49,6 +49,14 @@
     "prompts": "^2.4.2",
     "yaml": "^2.8.2"
   },
+  "peerDependencies": {
+    "openai": ">=4.0.0",
+    "@google/generative-ai": ">=0.21.0"
+  },
+  "peerDependenciesMeta": {
+    "openai": { "optional": true },
+    "@google/generative-ai": { "optional": true }
+  },
   "devDependencies": {
     "@semantic-release/changelog": "^6.0.3",
     "@semantic-release/git": "^10.0.1",
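Marking `openai` and `@google/generative-ai` as optional in `peerDependenciesMeta` means npm will not fail installs when they are absent; the consuming code is expected to load them lazily. A hedged sketch of that consumption pattern (the helper name `loadOptional` is illustrative, not from this PR):

```typescript
// Illustrative pattern for consuming an optional peer dependency:
// attempt a dynamic import and fall back to null when the package
// is not installed, so callers can degrade gracefully.
async function loadOptional(name: string): Promise<unknown | null> {
  try {
    return await import(name);
  } catch {
    return null; // peer dep absent; caller picks another provider
  }
}
```

Usage would look like `const openai = await loadOptional('openai');` followed by a null check before constructing a client.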
17 changes: 16 additions & 1 deletion src/domain/interfaces/IHookConfig.ts
@@ -55,7 +55,7 @@ export interface ICommitMsgConfig {
   readonly requireType: boolean;
   /** Default memory lifecycle. */
   readonly defaultLifecycle: 'permanent' | 'project' | 'session';
-  /** Enable LLM enrichment for richer trailer content. Requires ANTHROPIC_API_KEY. */
+  /** Enable LLM enrichment for richer trailer content. Requires a configured LLM provider (e.g., ANTHROPIC_API_KEY, OPENAI_API_KEY, GOOGLE_API_KEY, or OLLAMA_HOST). */
   readonly enrich: boolean;
   /** Timeout in ms for LLM enrichment call. Default: 8000. Must be under hook timeout (10s). */
   readonly enrichTimeout: number;
@@ -70,6 +70,21 @@ export interface IHooksConfig {
   readonly commitMsg: ICommitMsgConfig;
 }
 
+/** Supported LLM providers. */
+export type LLMProvider = 'anthropic' | 'openai' | 'gemini' | 'ollama';
+
+export interface ILLMConfig {
+  /** Explicit provider selection. Auto-detected from env if omitted. */
+  readonly provider?: LLMProvider;
+  /** Model for enrichment. Provider default if omitted. */
+  readonly model?: string;
+  /** Lighter model for intent extraction. Reserved — not yet wired to handler. */
+  readonly intentModel?: string;
+  /** Base URL override (e.g., for Ollama). */
+  readonly baseUrl?: string;
+}
+
 export interface IHookConfig {
   readonly hooks: IHooksConfig;
+  readonly llm?: ILLMConfig;
 }
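Because every `ILLMConfig` field is optional, partial YAML configs type-check cleanly. A brief sketch (types restated from the diff so the snippet is self-contained):

```typescript
type LLMProvider = 'anthropic' | 'openai' | 'gemini' | 'ollama';

interface ILLMConfig {
  readonly provider?: LLMProvider;
  readonly model?: string;
  readonly intentModel?: string;
  readonly baseUrl?: string;
}

// A minimal Ollama override: every other field falls back to defaults.
const ollamaConfig: ILLMConfig = {
  provider: 'ollama',
  baseUrl: 'http://localhost:11434',
};

// An empty object is also valid: the provider is then auto-detected from env.
const autoConfig: ILLMConfig = {};
```

The `LLMProvider` union also means a typo like `provider: 'gtp'` is rejected at compile time rather than at runtime.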
25 changes: 25 additions & 0 deletions src/domain/interfaces/ILLMCaller.ts
@@ -0,0 +1,25 @@
+/**
+ * ILLMCaller
+ *
+ * Generic LLM completion interface used by services that need simple
+ * text-in/text-out LLM calls (e.g., IntentExtractor).
+ * Implemented by BaseLLMClient so any provider can serve as a caller.
+ */
+
+export interface ILLMCallerOptions {
+  /** System prompt for the LLM. */
+  readonly system: string;
+  /** User message to send. */
+  readonly userMessage: string;
+  /** Maximum tokens for the response. */
+  readonly maxTokens: number;
+}
+
+export interface ILLMCaller {
+  /**
+   * Send a simple completion request to the LLM.
+   * @param options - System prompt, user message, and token limit.
+   * @returns The LLM's text response.
+   */
+  complete(options: ILLMCallerOptions): Promise<string>;
+}
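To make the contract concrete, here is a hedged sketch of a provider implementing `ILLMCaller` against Ollama's `/api/chat` endpoint. The class `OllamaCaller` is hypothetical; the PR's actual `BaseLLMClient` implementations will differ.

```typescript
interface ILLMCallerOptions {
  readonly system: string;
  readonly userMessage: string;
  readonly maxTokens: number;
}

interface ILLMCaller {
  complete(options: ILLMCallerOptions): Promise<string>;
}

// Hypothetical minimal caller using Ollama's non-streaming chat API.
// num_predict is Ollama's knob for capping response tokens.
class OllamaCaller implements ILLMCaller {
  constructor(
    private readonly host = 'http://localhost:11434',
    private readonly model = 'llama3',
  ) {}

  async complete({ system, userMessage, maxTokens }: ILLMCallerOptions): Promise<string> {
    const res = await fetch(`${this.host}/api/chat`, {
      method: 'POST',
      headers: { 'content-type': 'application/json' },
      body: JSON.stringify({
        model: this.model,
        stream: false,
        options: { num_predict: maxTokens },
        messages: [
          { role: 'system', content: system },
          { role: 'user', content: userMessage },
        ],
      }),
    });
    if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
    const data = (await res.json()) as { message?: { content?: string } };
    return data.message?.content ?? '';
  }
}
```

Because the interface is just text-in/text-out, a trivial stub can stand in for any provider in tests.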
4 changes: 3 additions & 1 deletion src/domain/interfaces/ILLMClient.ts
@@ -7,6 +7,7 @@
 
 import type { MemoryType } from '../entities/IMemoryEntity';
 import type { ConfidenceLevel } from '../types/IMemoryQuality';
+import type { ILLMCaller } from './ILLMCaller';
 
@@ -53,8 +54,9 @@ export interface ILLMEnrichmentResult {
 
 /**
  * Provider-agnostic LLM client interface.
+ * Extends ILLMCaller for simple text completions (used by IntentExtractor).
  */
-export interface ILLMClient {
+export interface ILLMClient extends ILLMCaller {
   /**
    * Extract structured memories from a commit's message and diff.
    * @param input - Commit data and diff to analyze.
27 changes: 25 additions & 2 deletions src/hooks/utils/config.ts
@@ -9,7 +9,7 @@
 import { existsSync, readFileSync } from 'fs';
 import { join } from 'path';
 import { parse as parseYaml } from 'yaml';
-import type { IHookConfig, IHooksConfig } from '../../domain/interfaces/IHookConfig';
+import type { IHookConfig, IHooksConfig, ILLMConfig, LLMProvider } from '../../domain/interfaces/IHookConfig';
 
 /** Directory containing git-mem configuration */
 export const CONFIG_DIR = '.git-mem';
@@ -78,7 +78,7 @@ export function loadHookConfig(cwd?: string): IHookConfig {
 
   const rawStop = (rawHooks.sessionStop ?? {}) as Record<string, unknown>;
 
-  return {
+  const result: IHookConfig = {
     hooks: {
       enabled: rawHooks.enabled ?? DEFAULTS.hooks.enabled,
       sessionStart: {
@@ -103,7 +103,30 @@
       },
     },
   };
+
+  // Parse llm section if present
+  const rawLlm = raw.llm as Record<string, unknown> | undefined;
+  if (rawLlm && typeof rawLlm === 'object') {
+    const llmConfig: ILLMConfig = {
+      ...(isValidProvider(rawLlm.provider) ? { provider: rawLlm.provider } : {}),
+      ...(typeof rawLlm.model === 'string' ? { model: rawLlm.model } : {}),
+      ...(typeof rawLlm.intentModel === 'string' ? { intentModel: rawLlm.intentModel } : {}),
+      ...(typeof rawLlm.baseUrl === 'string' ? { baseUrl: rawLlm.baseUrl } : {}),
+    };
+    // Only attach if at least one field was parsed
+    if (Object.keys(llmConfig).length > 0) {
+      return { ...result, llm: llmConfig };
+    }
+  }
+
+  return result;
   } catch {
     return DEFAULTS;
   }
 }
+
+const VALID_PROVIDERS: readonly LLMProvider[] = ['anthropic', 'openai', 'gemini', 'ollama'];
+
+function isValidProvider(value: unknown): value is LLMProvider {
+  return typeof value === 'string' && VALID_PROVIDERS.includes(value as LLMProvider);
+}
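The conditional-spread parsing above can be exercised standalone. This sketch restates the diff's pattern in runnable form (the `parseLlm` wrapper is illustrative; the real code builds the full `ILLMConfig`):

```typescript
type LLMProvider = 'anthropic' | 'openai' | 'gemini' | 'ollama';

const VALID_PROVIDERS: readonly LLMProvider[] = ['anthropic', 'openai', 'gemini', 'ollama'];

function isValidProvider(value: unknown): value is LLMProvider {
  return typeof value === 'string' && VALID_PROVIDERS.includes(value as LLMProvider);
}

// Mirrors the conditional-spread pattern from the diff: invalid or
// missing fields never appear on the resulting object, so a later
// Object.keys(...).length check can tell "nothing parsed" from "partial".
function parseLlm(raw: Record<string, unknown>): { provider?: LLMProvider; model?: string } {
  return {
    ...(isValidProvider(raw.provider) ? { provider: raw.provider } : {}),
    ...(typeof raw.model === 'string' ? { model: raw.model } : {}),
  };
}
```

A bogus provider string or a non-string model value is silently dropped rather than propagated into the config.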
11 changes: 6 additions & 5 deletions src/infrastructure/di/container.ts
@@ -110,18 +110,19 @@ export function createContainer(options?: IContainerOptions): AwilixContainer<ICradle> {
     if (options?.enrich === false) {
       return null;
     }
-    return createLLMClient() ?? null;
+    return createLLMClient(options?.llm) ?? null;
   }).singleton(),
 
   intentExtractor: asFunction(() => {
-    // Intent extraction requires an API key. Return null for graceful degradation.
-    const apiKey = process.env.ANTHROPIC_API_KEY;
-    if (!apiKey) {
+    // Intent extraction uses the LLM client as ILLMCaller.
+    // If no LLM client is available, return null for graceful degradation.
+    const client = container.cradle.llmClient;
+    if (!client) {
       return null;
     }
     try {
       return new IntentExtractor({
-        apiKey,
+        caller: client,
         logger: container.cradle.logger,
       });
     } catch {
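The container's graceful-degradation pattern (return `null` instead of throwing when no provider is configured) can be sketched outside Awilix. `buildIntentExtractor` here is hypothetical, not the PR's code:

```typescript
interface ILLMCaller {
  complete(opts: { system: string; userMessage: string; maxTokens: number }): Promise<string>;
}

// Factories return null rather than throwing when a dependency is absent,
// so downstream features switch off quietly instead of failing git hooks.
function buildIntentExtractor(
  caller: ILLMCaller | null,
): { extract(message: string): Promise<string> } | null {
  if (!caller) return null; // no LLM configured: feature disabled
  return {
    extract: (message) =>
      caller.complete({ system: 'Extract the intent.', userMessage: message, maxTokens: 128 }),
  };
}
```

Consumers then null-check the resolved extractor, exactly as the container's `intentExtractor` factory null-checks `llmClient`.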
3 changes: 3 additions & 0 deletions src/infrastructure/di/types.ts
@@ -28,6 +28,7 @@ import type { ITrailerService } from '../../domain/interfaces/ITrailerService';
 import type { IAgentResolver } from '../../domain/interfaces/IAgentResolver';
 import type { IHookConfigLoader } from '../../domain/interfaces/IHookConfigLoader';
 import type { IIntentExtractor } from '../../domain/interfaces/IIntentExtractor';
+import type { ILLMConfig } from '../../domain/interfaces/IHookConfig';
 
 export interface ICradle {
   // Infrastructure
@@ -64,4 +65,6 @@
   enrich?: boolean;
   /** Scope label for child logger (e.g., 'remember', 'mcp:recall'). */
   scope?: string;
+  /** LLM provider configuration from hook config or CLI options. */
+  llm?: ILLMConfig;
 }