Commit 32b5074

Remove code snippet with model name from README
Copilot feedback flagged an inconsistency in the model name ('claude-3-5-sonnet' vs 'claude-3.5-sonnet'). Solution: remove the 'AI SDK v5 Usage' section entirely, since the complete code lives in the example files. This keeps the docs from going stale when models or APIs change.
1 parent 19a7009 · commit 32b5074

1 file changed (+1, −29)

typescript/ai-sdk-v5/src/prompt-caching/README.md

Lines changed: 1 addition & 29 deletions
````diff
@@ -20,32 +20,4 @@ For full prompt caching documentation including all providers, pricing, and conf
 bun run typescript/ai-sdk-v5/src/prompt-caching/anthropic-user-message-cache.ts
 ```
 
-## AI SDK v5 Usage
-
-```typescript
-import { createOpenRouter } from '@openrouter/ai-sdk-provider';
-
-const openrouter = createOpenRouter({
-  extraBody: {
-    stream_options: { include_usage: true }, // Required for cache metrics
-  },
-});
-
-// Use providerOptions.openrouter.cacheControl on content items
-const result = await generateText({
-  model: openrouter('anthropic/claude-3-5-sonnet'),
-  messages: [{
-    role: 'user',
-    content: [{
-      type: 'text',
-      text: 'Large context...',
-      providerOptions: {
-        openrouter: { cacheControl: { type: 'ephemeral' } }
-      }
-    }]
-  }]
-});
-
-// Check cache metrics
-const cached = result.providerMetadata?.openrouter?.usage?.promptTokensDetails?.cachedTokens ?? 0;
-```
+See the example files for complete working code with current models and configuration.
````
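Since the snippet no longer appears in the README, here is a minimal self-contained sketch of the same prompt-caching pattern for reference. It assumes `generateText` is imported from the `ai` package (the removed snippet used it without an import), reuses the removed snippet's model slug purely as a placeholder, and reads cache metrics from the same `providerMetadata` shape the snippet relied on; the example files remain the source of truth for current models and configuration.

```typescript
import { generateText } from 'ai';
import { createOpenRouter } from '@openrouter/ai-sdk-provider';

// include_usage asks OpenRouter to report token usage, which is where the
// cache metrics come back from.
const openrouter = createOpenRouter({
  extraBody: {
    stream_options: { include_usage: true },
  },
});

const result = await generateText({
  // Placeholder model slug carried over from the removed snippet; check the
  // example files in this directory for the models currently in use.
  model: openrouter('anthropic/claude-3-5-sonnet'),
  messages: [
    {
      role: 'user',
      content: [
        {
          type: 'text',
          text: 'Large context...',
          // Mark this content part as cacheable on the provider side.
          providerOptions: {
            openrouter: { cacheControl: { type: 'ephemeral' } },
          },
        },
      ],
    },
  ],
});

// providerMetadata is loosely typed JSON, so drill into it defensively.
// The field path below is taken from the removed snippet.
const usage = result.providerMetadata?.openrouter?.usage as
  | { promptTokensDetails?: { cachedTokens?: number } }
  | undefined;
const cached = usage?.promptTokensDetails?.cachedTokens ?? 0;
console.log(`Cached prompt tokens: ${cached}`);
```

Apart from the added import and the defensive cast when reading usage, this mirrors the removed snippet one-to-one; run it the same way as the example files, e.g. with `bun run`.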
