## Problem

Temperature is currently hardcoded to `0.0` in `LLMClient`:

```ts
this.config = {
  temperature: 0.0, // hardcoded!
  maxTokens: 2000,
  timeout: 120000,
  ...config,
};
```

While `0.0` is good for deterministic extraction, some use cases might benefit from slightly higher values for:
- Creative interpretation of ambiguous text
- Fuzzy matching
- Handling typos and variations
## Proposed Solution

Allow temperature override at multiple levels:

- Global config (already works, but overridden):

```ts
const client = new LLMClient({
  baseURL: '...',
  model: '...',
  temperature: 0.2, // custom default
});
```

- Per-extraction override:

```ts
await client.extract({
  schema,
  input,
  temperature: 0.1, // one-off override
});
```

- Schema-level default (via metadata):
```json
{
  "metadata": {
    "temperature": 0.2
  }
}
```

## Implementation

Fix the `LLMClient` constructor to respect a passed temperature:
```ts
this.config = {
  ...config,
  temperature: config.temperature ?? 0.0,
  maxTokens: config.maxTokens ?? 2000,
  timeout: config.timeout ?? 120000,
};
```

Add `temperature` to `ExtractionOptions`:
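The `??` form matters here: unlike relying on spread order, it fills in the default only when the caller omitted the key (or passed `undefined`), while still honoring any explicitly passed value. A minimal runnable sketch of the proposed defaulting; `resolveConfig` and the config interface are illustrative names, not existing API:

```typescript
// Illustrative shape mirroring the issue's constructor snippet.
interface LLMClientConfig {
  baseURL?: string;
  model?: string;
  temperature?: number;
  maxTokens?: number;
  timeout?: number;
}

// Hypothetical helper: applies defaults only for missing/undefined fields.
function resolveConfig(config: LLMClientConfig): LLMClientConfig {
  return {
    ...config,
    temperature: config.temperature ?? 0.0, // default stays 0.0 when omitted
    maxTokens: config.maxTokens ?? 2000,
    timeout: config.timeout ?? 120000,
  };
}
```

With this, `resolveConfig({}).temperature` is `0.0` and `resolveConfig({ temperature: 0.2 }).temperature` is `0.2`, which is exactly the "config-level temperature is respected" behavior the issue asks for.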
```ts
interface ExtractionOptions {
  schema: Schema;
  input: string;
  systemPrompt?: string;
  temperature?: number; // add this
}
```

## Acceptance Criteria
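With both hooks in place, the three levels need a single resolution point inside `extract()`. A sketch of the precedence (per-call > schema metadata > client config > library default); the `resolveTemperature` helper and the `Schema.metadata` shape are assumptions restated here so the example runs standalone:

```typescript
// Assumed shapes, restated for a self-contained sketch.
interface Schema {
  metadata?: { temperature?: number };
}

interface ExtractionOptions {
  schema: Schema;
  input: string;
  systemPrompt?: string;
  temperature?: number;
}

// Hypothetical helper: picks the most specific temperature available.
function resolveTemperature(
  options: ExtractionOptions,
  clientDefault?: number,
): number {
  return (
    options.temperature ??                   // per-extraction override
    options.schema.metadata?.temperature ??  // schema-level default
    clientDefault ??                         // client config
    0.0                                      // library default: deterministic
  );
}
```

Using `??` at every step keeps an explicit `0` meaningful at any level, so callers can force determinism even when a schema or client sets a higher default.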
- Config-level temperature is respected
- Per-extraction temperature override works
- Tests verify configuration precedence
- Documentation explains when to adjust temperature
- Default remains 0.0 for determinism
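The precedence criterion above lends itself to table-style tests. A standalone sketch (the helper is duplicated inline so the example runs by itself; real tests would import the actual resolver from the client module):

```typescript
// Simplified stand-in for the proposed resolver, for test illustration only.
function resolveTemperature(
  perCall: number | undefined,
  schemaDefault: number | undefined,
  clientDefault: number | undefined,
): number {
  return perCall ?? schemaDefault ?? clientDefault ?? 0.0;
}

// [perCall, schemaDefault, clientDefault, expected]
const cases: Array<[number | undefined, number | undefined, number | undefined, number]> = [
  [undefined, undefined, undefined, 0.0], // default stays deterministic
  [undefined, undefined, 0.2, 0.2],       // client config is respected
  [undefined, 0.3, 0.2, 0.3],             // schema beats client config
  [0.1, 0.3, 0.2, 0.1],                   // per-call beats everything
];

for (const [perCall, schema, client, expected] of cases) {
  const got = resolveTemperature(perCall, schema, client);
  if (got !== expected) throw new Error(`expected ${expected}, got ${got}`);
}
```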
## Related
Parent: #30