Inference is extremely slow #182
Labels: bug
I have a large corpus (30M docs) and a pretrained, inference-only tomotopy model. I want to find the argmax topic for each document in the corpus, and benchmarking (see script here) shows that list-based inference is faster than corpus-based inference by a factor of ~2. Even so, with the defaults on a 40-core machine, inference over the full corpus is projected to take 125 days. This seems extremely slow, considering that training the model took 3 hours on a 10M-document corpus.

My inference script is as follows:
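The script itself is not shown here, so below is a minimal sketch of the list-based inference path described above, assuming a pretrained LDA model and pre-tokenized documents. The model path, batch size, and worker count are placeholders, not the reporter's actual values.

```python
# Minimal sketch of list-based tomotopy inference; the model path, batch
# size, worker count, and pre-tokenized input are assumptions.
import numpy as np
import tomotopy as tp

mdl = tp.LDAModel.load("model.bin")  # placeholder path to the pretrained model

def argmax_topics(token_lists, batch_size=10_000, workers=40):
    """Yield the highest-probability topic for each pre-tokenized document."""
    for start in range(0, len(token_lists), batch_size):
        # make_doc() wraps unseen documents; infer() samples topic
        # distributions for the whole batch in parallel.
        batch = [mdl.make_doc(words) for words in token_lists[start:start + batch_size]]
        dists, _log_ll = mdl.infer(batch, workers=workers)
        for dist in dists:
            yield int(np.argmax(dist))
```

Batching the `make_doc`/`infer` calls keeps memory bounded on a 30M-document corpus, while `workers` controls how many threads `infer` uses.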
Comments
Hi, I'm running into the same problem: memory usage is 100 GB with 40 cores enabled, and inference on texts under 5,000 characters runs at about 2 documents/s.

Any luck on this?

Very slow here as well, roughly 2 s per instance: 105/60933 [03:33<28:29:52, 1.69s/it]