Releases: agentuity/llmproxy

v0.0.11 (05 May 23:13)

Changelog

  • 235565c Fix AutoRouter upstream response encoding (#13)

v0.0.10 (04 May 01:30)

Changelog

  • 5dacc0a Fix Anthropic streaming and disable response encoding for SSE (#11)
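
The fix above concerns the interplay between SSE and response compression. A minimal Go sketch of the idea, with a hypothetical handler name and upstream URL (not the project's actual code): compression from the upstream has to be turned off so events can be flushed to the client line by line as they arrive.

```go
package proxy

import (
	"bufio"
	"io"
	"net/http"
)

// proxySSE forwards a streaming completion request to an upstream URL, asks
// the upstream not to compress, and flushes each SSE line to the client.
func proxySSE(w http.ResponseWriter, r *http.Request, upstreamURL string) error {
	req, err := http.NewRequestWithContext(r.Context(), http.MethodPost, upstreamURL, r.Body)
	if err != nil {
		return err
	}
	req.Header = r.Header.Clone()
	// A gzip-encoded event stream cannot be relayed incrementally, so request
	// an identity (uncompressed) response from the upstream.
	req.Header.Set("Accept-Encoding", "identity")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return err
	}
	defer resp.Body.Close()

	w.Header().Set("Content-Type", "text/event-stream")
	w.Header().Set("Cache-Control", "no-cache")
	w.WriteHeader(resp.StatusCode)

	flusher, _ := w.(http.Flusher)
	scanner := bufio.NewScanner(resp.Body)
	for scanner.Scan() {
		if _, err := io.WriteString(w, scanner.Text()+"\n"); err != nil {
			return err
		}
		if flusher != nil {
			flusher.Flush() // push each event to the client immediately
		}
	}
	return scanner.Err()
}
```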

v0.0.9 (03 May 21:55)

Changelog

  • b5bd117 Fix provider-prefixed model routing (#10)
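
For context, provider-prefixed routing means the model field carries both a provider key and the model name. A hedged Go sketch of the general idea, with a hypothetical helper name not taken from the repository:

```go
package proxy

import "strings"

// splitProviderModel splits a prefixed model name such as
// "anthropic/claude-3-5-sonnet" into the provider key used to choose the
// upstream and the bare model name forwarded to it. A name without a prefix
// returns an empty provider, leaving the choice to provider detection.
func splitProviderModel(model string) (provider, name string) {
	if i := strings.Index(model, "/"); i >= 0 {
		return model[:i], model[i+1:]
	}
	return "", model
}
```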

v0.0.8 (16 Apr 03:07)

Changelog

  • 1bd7320 feat: Responses API streaming, WebSocket mode, and reasoning token support (#9)

v0.0.7 (16 Apr 01:47)

Changelog

  • 94d029e fix: handle multimodal content arrays and preserve non-standard message fields (#8)
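
The multimodal fix deals with the OpenAI-style content field, which may be either a plain string or an array of typed parts. A rough, hypothetical sketch of handling both forms with json.RawMessage, which also lets the original bytes be forwarded untouched so non-standard fields survive; the type and function names here are illustrative, not the proxy's real ones.

```go
package proxy

import "encoding/json"

// contentPart is one entry of a multimodal content array.
type contentPart struct {
	Type string `json:"type"`
	Text string `json:"text,omitempty"`
}

// contentText extracts plain text whether content is a bare string or an
// array of parts; the raw bytes themselves can still be forwarded unchanged.
func contentText(raw json.RawMessage) (string, error) {
	var s string
	if err := json.Unmarshal(raw, &s); err == nil {
		return s, nil
	}
	var parts []contentPart
	if err := json.Unmarshal(raw, &parts); err != nil {
		return "", err
	}
	var out string
	for _, p := range parts {
		if p.Type == "text" {
			out += p.Text
		}
	}
	return out, nil
}
```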

v0.0.6 (14 Apr 05:12)

Changelog

  • b1ae20e feat: add AutoRouter, Responses API support, and provider detection (#6)
  • b213390 feat: add SSE streaming support with billing extraction (#7)
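
A rough sketch of the billing-extraction side of the second entry, under the assumption that the final SSE data chunk carries an OpenAI-style usage object; the field and function names are illustrative, not the proxy's actual API.

```go
package proxy

import (
	"bufio"
	"encoding/json"
	"io"
	"strings"
)

// usage mirrors the token counts reported in a usage payload.
type usage struct {
	PromptTokens     int `json:"prompt_tokens"`
	CompletionTokens int `json:"completion_tokens"`
}

// extractUsage scans SSE "data:" events and keeps the last usage block seen,
// which providers typically send in the final chunk before [DONE].
func extractUsage(body io.Reader) (usage, error) {
	var u usage
	scanner := bufio.NewScanner(body)
	for scanner.Scan() {
		line := strings.TrimSpace(scanner.Text())
		if !strings.HasPrefix(line, "data:") {
			continue
		}
		payload := strings.TrimSpace(strings.TrimPrefix(line, "data:"))
		if payload == "[DONE]" {
			break
		}
		var chunk struct {
			Usage *usage `json:"usage"`
		}
		if err := json.Unmarshal([]byte(payload), &chunk); err == nil && chunk.Usage != nil {
			u = *chunk.Usage
		}
	}
	return u, scanner.Err()
}
```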

v0.0.5 (13 Apr 04:25)

Changelog

  • e1eebce fix: handle Anthropic-style token reporting in cached billing (#5)

v0.0.4 (13 Apr 04:17)

Changelog

  • 12ae01a feat: split billing for cached vs non-cached prompt tokens (#4)
  • 0d87e7c fix: add omitempty to Temperature field in models.dev adapter
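
An illustrative sketch of the billing split in the first entry, assuming the provider reports a total prompt-token count plus a cached-token count, as OpenAI- and Anthropic-style usage payloads do; the names are hypothetical. (The second entry's omitempty tag is the standard Go struct-tag fix that drops a zero-valued field, here Temperature, from the serialized JSON.)

```go
package proxy

// promptBill is a hypothetical split of prompt-token billing.
type promptBill struct {
	CachedTokens    int
	NonCachedTokens int
}

// splitPromptTokens separates cached prompt tokens from freshly processed
// ones so each portion can be priced at its own rate.
func splitPromptTokens(promptTokens, cachedTokens int) promptBill {
	if cachedTokens > promptTokens {
		cachedTokens = promptTokens
	}
	return promptBill{
		CachedTokens:    cachedTokens,
		NonCachedTokens: promptTokens - cachedTokens,
	}
}
```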

v0.0.3 (13 Apr 02:35)

Changelog

  • 681061e feat: add prompt caching interceptor for 6 LLM providers (#3)

v0.0.2 (13 Apr 01:19)

Changelog