
Commit 5a49010

Update readme (#137)

* add known issue
* update mineru usage
* update_readme

1 parent fbd8072 · commit 5a49010

File tree: 2 files changed, +7 −6 lines

vllm/KNOWN_ISSUES.md

Lines changed: 2 additions & 2 deletions
````diff
@@ -12,9 +12,9 @@ Workaround: Change the PCIe slot configuration in BIOS from Auto/x16 to x8/x8.
 With this change, over 40 GB/s bi-directional P2P bandwidth can be achieved.
 Root cause analysis is still in progress.

-# 03. Container OOM killed by using `--enable-auto-tool-choice` and starting container not by /bin/bash and not run `source /opt/intel/oneapi/setvars.sh`
+# 03. Container OOM killed (and vLLM performance drop) when the container is started neither via /bin/bash nor with `source /opt/intel/oneapi/setvars.sh`

-When using `--enable-auto-tool-choice` and deploying the container via docker-compose without `source /opt/intel/oneapi/setvars.sh`, the LD_LIBRARY_PATH will differ and cause the container to be OOM killed. It can be reproduced with these two commands:
+When using `--enable-auto-tool-choice` and deploying the container via docker-compose without `source /opt/intel/oneapi/setvars.sh`, the LD_LIBRARY_PATH will differ and cause the container to be OOM killed (or its performance to drop). It can be reproduced with these two commands:

 ```bash
 docker run --rm --entrypoint "/bin/bash" --name=test intel/llm-scaler-vllm:latest -c env | grep LD_LIBRARY_PATH
````
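The known issue above says the failure mode appears when the container is started by docker-compose without going through `/bin/bash` and sourcing `setvars.sh`. A minimal docker-compose sketch of the implied workaround follows; it is an assumption, not part of the commit, and the image tag and serve arguments are placeholders:

```yaml
# Hypothetical compose fragment: route the container through /bin/bash so
# /opt/intel/oneapi/setvars.sh is sourced (setting LD_LIBRARY_PATH) before
# the vLLM server starts. Image tag and serve command are placeholders.
services:
  vllm:
    image: intel/llm-scaler-vllm:latest
    entrypoint: ["/bin/bash", "-c"]
    command: >
      source /opt/intel/oneapi/setvars.sh &&
      python3 -m vllm.entrypoints.openai.api_server --model <model_path>
```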

vllm/README.md

Lines changed: 5 additions & 4 deletions
````diff
@@ -2177,6 +2177,8 @@ curl http://localhost:8000/v1/chat/completions \
     "max_tokens": 128
   }'
 ```
+
+If you want to process an image stored locally on the server, you can use `"url": "file:/llm/models/test/1.jpg"` to test.
 ---

 ### 2.4.1 Audio Model Support [Deprecated]
````
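The `file:` URL added in the hunk above slots into the same chat-completions payload shown earlier in the README. A sketch of the request body follows; the model name is a placeholder, and note that vLLM typically must be permitted to read local files (e.g. via its `--allowed-local-media-path` option) for a `file:` URL to resolve:

```json
{
  "model": "<model_path>",
  "messages": [
    {
      "role": "user",
      "content": [
        {"type": "text", "text": "Describe this image."},
        {"type": "image_url", "image_url": {"url": "file:/llm/models/test/1.jpg"}}
      ]
    }
  ],
  "max_tokens": 128
}
```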
````diff
@@ -2319,12 +2321,11 @@ python3 -m vllm.entrypoints.openai.api_server \


 #### Run the demo
-To verify your setup, clone the official MinerU repository and run the demo script:
+To verify MinerU, run:

 ```bash
-git clone https://github.com/opendatalab/MinerU.git
-cd MinerU/demo
-python3 demo.py
+# mineru -p <input_path> -o <output_path> -b vlm-http-client -u http://127.0.0.1:8000
+mineru -p /llm/MinerU/demo/pdfs/small_ocr.pdf -o ./ -b vlm-http-client -u http://127.0.0.1:8000
 ```

 ---
````

0 commit comments