
fix(chat ui): fixed llm inference chat with openclaw to work with new LiteLLM gateway #47

Triggered via: pull request, March 9, 2026 06:25
Status: Success
Total duration: 2m 59s
Artifacts: 3

build-sandboxes.yml (on: pull_request)

Jobs:
- Detect changed sandboxes — 6s
- Build base — 0s
- Matrix: build
- Publish to ECR — 0s
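The job graph above suggests a workflow shaped roughly like the following. This is a hedged sketch only: the job names come from this run, but the file contents, path filters, image contexts, and action choices (`dorny/paths-filter`, `docker/build-push-action`, `aws-actions/amazon-ecr-login`) are assumptions, not the repository's actual `build-sandboxes.yml`.

```yaml
# Hypothetical reconstruction of build-sandboxes.yml from the job names
# in this run; filters, contexts, and action versions are guesses.
name: build-sandboxes
on: pull_request

jobs:
  detect:
    name: Detect changed sandboxes
    runs-on: ubuntu-latest
    outputs:
      sandboxes: ${{ steps.filter.outputs.changes }}
    steps:
      - uses: actions/checkout@v4
      - id: filter
        uses: dorny/paths-filter@v3   # assumed change-detection action
        with:
          filters: |
            base: 'sandboxes/base/**'

  build-base:
    name: Build base
    needs: detect
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: docker/build-push-action@v6
        with:
          context: sandboxes/base    # assumed path
          push: false

  build:
    # One matrix job per changed sandbox detected above.
    name: Build ${{ matrix.sandbox }}
    needs: build-base
    runs-on: ubuntu-latest
    strategy:
      matrix:
        sandbox: ${{ fromJSON(needs.detect.outputs.sandboxes) }}
    steps:
      - uses: actions/checkout@v4
      - uses: docker/build-push-action@v6
        with:
          context: sandboxes/${{ matrix.sandbox }}
          push: false

  publish:
    name: Publish to ECR
    needs: build
    runs-on: ubuntu-latest
    steps:
      - uses: aws-actions/amazon-ecr-login@v2
      # Push steps omitted; registry URL and tags depend on repo configuration.
```

The "Publish to ECR" job showing 0s in this run is consistent with it being skipped or gated (e.g. push only on the default branch), but that behavior is not confirmed by the run summary.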

Artifacts

Produced during runtime
Name | Size | Digest
NVIDIA~NemoClaw-Community~P67F4I.dockerbuild | 35.8 KB | sha256:6cd3dcc8fdb40cb216bbbd3299876357d4585a8463b9318c81a9bc73cf3aecd6
NVIDIA~NemoClaw-Community~W2ETGB.dockerbuild | 37.9 KB | sha256:16092ee32f4968e4520bb8cc83fa399d32555ee5e9d15a042d92b7d28d49ef9d
NVIDIA~NemoClaw-Community~YXMHE9.dockerbuild | 46.5 KB | sha256:a2db44d3ea4a3dec0229b79c8c26cf816feb63e9659f5adf17aa07b3cbd8aa82