chore: update client package references from llama-stack-client to ogx-client#5635
Draft
cdoern wants to merge 2 commits into ogx-ai:main from
Conversation
…x-client

Update package dependencies and workflow tests to use the renamed ogx-client package instead of llama-stack-client.

Changes:
- Update pyproject.toml client and type_checking dependencies to ogx-client>=0.7.1
- Update Python import test to use the ogx_client module
- Update npm package test to use the ogx-client package name

Note: This PR is blocked on Stainless regenerating the client SDKs with the new package names. CI may fail until ogx-client is published.

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
Signed-off-by: Charlie Doern <cdoern@redhat.com>
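As a reference, the dependency change described in this commit might look like the fragment below. This is a sketch: the commit message names `client` and `type_checking` dependency groups and the `ogx-client>=0.7.1` pin, but the exact table layout in the project's pyproject.toml is an assumption.

```toml
# pyproject.toml (illustrative fragment, layout assumed)
[project.optional-dependencies]
client = [
    "ogx-client>=0.7.1",  # was: llama-stack-client
]
type_checking = [
    "ogx-client>=0.7.1",  # was: llama-stack-client
]
```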
Update all references from llama-stack-client to ogx-client:
- Python module: llama_stack_client → ogx_client
- PyPI package: llama-stack-client → ogx-client
- npm package: llama-stack-client → ogx-client
- Repository URLs: meta-llama/llama-stack-client-* → ogx-ai/ogx-client-*

Files updated:
- All Python source files (src/ogx/core/library_client.py, src/ogx/testing/api_recorder.py)
- All test files (tests/integration/**/*)
- All documentation (docs/**/*.md, docs/**/*.mdx)
- README.md, CONTRIBUTING.md
- Workflow file (.github/workflows/pypi.yml)
- pyproject.toml dependencies

Note: uv.lock cannot be updated yet because ogx-client packages don't exist on PyPI/npm. Lockfile will be regenerated after Stainless publishes the new clients.

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
Signed-off-by: Charlie Doern <cdoern@redhat.com>
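The bulk rename above is mechanical, so it can be summarized as an ordered substitution map. The helper below is illustrative only (it is not code from the PR); note that the repository-URL pattern must be rewritten before the bare package name, or the package-name rule would consume part of the URL first.

```python
# Illustrative helper sketching the rename performed by this commit.
# Order matters: the repo-URL prefix contains "llama-stack-client", so it
# must be rewritten before the bare package-name rule runs.
RENAMES = [
    ("meta-llama/llama-stack-client-", "ogx-ai/ogx-client-"),  # repository URLs
    ("llama_stack_client", "ogx_client"),                      # Python module
    ("llama-stack-client", "ogx-client"),                      # PyPI/npm package
]

def rename_refs(text: str) -> str:
    """Rewrite llama-stack-client references in `text` to their ogx-client names."""
    for old, new in RENAMES:
        text = text.replace(old, new)
    return text
```

For example, `rename_refs("import llama_stack_client")` yields `"import ogx_client"`, and a `meta-llama/llama-stack-client-python` URL becomes `ogx-ai/ogx-client-python`.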
Collaborator
Author
For this to merge, we need:
This workflow as-is can be used to publish these packages, but we need to update the clients to actually be
Summary

Updates package dependencies and workflow tests to use the renamed `ogx-client` package instead of `llama-stack-client`.

Changes

- Update `pyproject.toml` `client` and `type_checking` dependencies to `ogx-client>=0.7.1`
- Update the Python import test to use the `ogx_client` module
- Update the npm package test to use the `ogx-client` package name

Context

Part of the rename from `llama-stack` to `ogx`. The Stainless config in `client-sdks/stainless/config.yml` is already updated to generate clients with the new names:
- `ogx-client` (module: `ogx_client`)
- `ogx-client`

Blockers

- `ogx-ai/ogx-client-python`
- `ogx-ai/ogx-client-typescript`

CI may fail until `ogx-client` is published to PyPI and npm.

Related

- `client-sdks/stainless/config.yml` (lines 16-27)
- `.github/workflows/stainless-builds.yml` (lines 47, 50)
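Since CI may fail until the renamed client is published, a quick probe like the sketch below (not part of the PR) can report whether the new module is importable in a given environment:

```python
import importlib.util

def client_status(module: str = "ogx_client") -> str:
    """Return 'installed' if `module` is importable, else 'not yet published'."""
    # find_spec returns None for an absent top-level module instead of raising.
    return "installed" if importlib.util.find_spec(module) else "not yet published"

print(client_status())
```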