fix(chat ui): make LLM inference chat with openclaw work with the new LiteLLM gateway #47
build-sandboxes.yml
on: pull_request
Artifacts
Produced during runtime
| Name | Size | Digest |
|---|---|---|
| NVIDIA~NemoClaw-Community~P67F4I.dockerbuild | 35.8 KB | sha256:6cd3dcc8fdb40cb216bbbd3299876357d4585a8463b9318c81a9bc73cf3aecd6 |
| NVIDIA~NemoClaw-Community~W2ETGB.dockerbuild | 37.9 KB | sha256:16092ee32f4968e4520bb8cc83fa399d32555ee5e9d15a042d92b7d28d49ef9d |
| NVIDIA~NemoClaw-Community~YXMHE9.dockerbuild | 46.5 KB | sha256:a2db44d3ea4a3dec0229b79c8c26cf816feb63e9659f5adf17aa07b3cbd8aa82 |
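After downloading an artifact, its integrity can be checked against the digest column above. This is a minimal sketch, assuming the artifact file sits in the current directory; `verify_digest` is a hypothetical helper name, and the demo runs on a temporary file since the `.dockerbuild` artifacts themselves are not present here.

```shell
# Sketch: compare a file's SHA-256 hash against an expected digest
# (the expected values come from the artifacts table above).
verify_digest() {
  # $1 = path to file, $2 = expected hex digest (without the "sha256:" prefix)
  actual="$(sha256sum "$1" | awk '{print $1}')"
  if [ "$actual" = "$2" ]; then
    echo "OK"
  else
    echo "MISMATCH"
  fi
}

# Demo on a temporary file, since the real artifacts are not available locally.
printf 'hello' > /tmp/demo.bin
verify_digest /tmp/demo.bin "$(sha256sum /tmp/demo.bin | awk '{print $1}')"
```

Usage against a real artifact would look like `verify_digest NVIDIA~NemoClaw-Community~P67F4I.dockerbuild 6cd3dcc8…` with the full digest pasted from the table.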