Commit ba744d7

fix: failure in responses during construct metrics (#4157)
# What does this PR do?

Without this fix, the server logs show:

```
RuntimeError: OpenAI response failed: InferenceRouter._construct_metrics() got an unexpected keyword argument 'model_id'
```

It seems the method signature was updated, but this call site was not.

## Test Plan

CI, plus testing with Sabre (agent framework integration).
1 parent a82b79c

1 file changed (+1, −1)

src/llama_stack/core/routers/inference.py

Lines changed: 1 addition & 1 deletion
@@ -417,7 +417,7 @@ async def stream_tokens_and_compute_metrics_openai_chat(
         prompt_tokens=chunk.usage.prompt_tokens,
         completion_tokens=chunk.usage.completion_tokens,
         total_tokens=chunk.usage.total_tokens,
-        model_id=fully_qualified_model_id,
+        fully_qualified_model_id=fully_qualified_model_id,
         provider_id=provider_id,
     )
     for metric in metrics:
