
Update vLLM config #74

Open
sampagon wants to merge 2 commits into bytedance:main from sampagon:update-vllm-config

Conversation

sampagon commented Mar 1, 2025

When setting up UI-TARS with the existing vLLM config, I ran into issues with some of the flags, specifically --limit-mm-per-prompt. I resolved them by switching to the vllm serve command. Since the vLLM documentation recommends vllm serve for spinning up an OpenAI-compatible server, I thought updating the README accordingly would be helpful to others.
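For reference, a minimal sketch of the kind of vllm serve invocation described above. The model path, image limit, and port below are illustrative placeholders, not values taken from this PR:

```shell
# Spin up an OpenAI-compatible server via vllm serve
# (model path, limit, and port are illustrative placeholders)
vllm serve bytedance-research/UI-TARS-7B-SFT \
  --limit-mm-per-prompt image=5 \
  --port 8000
```

With vllm serve, engine options such as --limit-mm-per-prompt are passed directly as CLI flags, which avoids the flag-parsing issues mentioned above.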


CLAassistant commented Mar 1, 2025

CLA assistant check
All committers have signed the CLA.

