Commit ec6d0fd

docs: link to OpenRouter docs and remove duplicated content
- Replace local docs link with https://openrouter.ai/docs/features/prompt-caching
- Remove duplicated Effect AI usage examples and notes
- Keep README minimal to avoid content going stale
- Code examples are in the actual .ts files
1 parent bd8d00f commit ec6d0fd

File tree

  • typescript/effect-ai/src/prompt-caching

1 file changed: +1 −38 lines

typescript/effect-ai/src/prompt-caching/README.md

Lines changed: 1 addition & 38 deletions
````diff
@@ -5,45 +5,8 @@ Examples demonstrating prompt caching with @effect/ai and @effect/ai-openrouter.
 ## Documentation
 
 For full prompt caching documentation including all providers, pricing, and configuration details, see:
-- **[Prompt Caching Guide](../../../../docs/prompt-caching.md)**
+- **[OpenRouter Prompt Caching Guide](https://openrouter.ai/docs/features/prompt-caching)**
 
 ## Examples in This Directory
 
 See the TypeScript files in this directory for specific examples.
-
-## Effect AI Usage
-
-```typescript
-import * as OpenRouterLanguageModel from '@effect/ai-openrouter/OpenRouterLanguageModel';
-
-const OpenRouterModelLayer = OpenRouterLanguageModel.layer({
-  model: 'anthropic/claude-3.5-sonnet',
-  config: {
-    stream_options: { include_usage: true }, // Required for cache metrics
-  },
-});
-
-const program = Effect.gen(function* () {
-  const response = yield* LanguageModel.generateText({
-    prompt: Prompt.make([{
-      role: 'user',
-      content: [{
-        type: 'text',
-        text: 'Large context...',
-        options: {
-          openrouter: { cacheControl: { type: 'ephemeral' } }
-        }
-      }]
-    }])
-  });
-
-  // Check cache metrics
-  const cached = response.usage.cachedInputTokens ?? 0;
-});
-```
-
-## Effect-Specific Notes
-
-- Use layer-based dependency injection for client and model configuration
-- `stream_options.include_usage` must be set in the model layer config
-- Cache metrics appear in `response.usage.cachedInputTokens`
````
0 commit comments
