How to use T-MAC in a model #11
Unanswered
eightfeetfish asked this question in Q&A
Replies: 2 comments
- llama.cpp and whisper.cpp are both built upon ggml. However, their code is evolving too fast, with a lot of refactoring. It seems the best option is to first apply our integration onto the latest ggml.
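For context on the reply above: T-MAC accelerates low-bit models by replacing multiply-accumulates with table lookups inside the GEMM kernels, which is why its integration has to happen at the ggml level. Below is a minimal, illustrative C sketch of that lookup-table idea only — not T-MAC's actual kernels, which use SIMD table-lookup instructions and tiled weight layouts — and all sizes and values are toy choices.

```c
#include <stdio.h>
#include <stdint.h>

#define K 8          // reduction length of the toy dot product
#define G 4          // activations per lookup group
#define NGROUPS (K / G)

// Precompute, for one group of G activations, the partial sum of every
// possible G-bit pattern (bit i set -> activation i is included in the sum).
static void build_lut(const float *act_group, float *lut /* size 1 << G */) {
    for (int pattern = 0; pattern < (1 << G); ++pattern) {
        float s = 0.0f;
        for (int i = 0; i < G; ++i)
            if (pattern & (1 << i)) s += act_group[i];
        lut[pattern] = s;
    }
}

int main(void) {
    // Toy inputs: float activations and 2-bit unsigned weights (values 0..3).
    float   act[K] = {0.5f, -1.0f, 2.0f, 0.25f, 1.5f, -0.75f, 0.0f, 3.0f};
    uint8_t w[K]   = {3, 1, 0, 2, 1, 1, 2, 3};

    // Build one LUT per activation group. In a real GEMM this is done once per
    // activation vector and reused across every weight row.
    float lut[NGROUPS][1 << G];
    for (int g = 0; g < NGROUPS; ++g)
        build_lut(&act[g * G], lut[g]);

    // Bit-serial accumulation: weight bit-plane b contributes lut[pattern] * 2^b.
    float dot_lut = 0.0f;
    for (int b = 0; b < 2; ++b) {                       // 2-bit weights -> 2 planes
        for (int g = 0; g < NGROUPS; ++g) {
            int pattern = 0;
            for (int i = 0; i < G; ++i)
                pattern |= ((w[g * G + i] >> b) & 1) << i;
            dot_lut += lut[g][pattern] * (float)(1 << b);
        }
    }

    // Reference multiply-accumulate for comparison.
    float dot_ref = 0.0f;
    for (int i = 0; i < K; ++i) dot_ref += act[i] * (float)w[i];

    printf("lookup-table dot = %f, reference dot = %f\n", dot_lut, dot_ref);
    return 0;
}
```

The saving comes from building each per-group table once per activation vector and reusing it across all weight rows, so every group of low-bit weights costs a single lookup instead of several multiply-adds.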
- Why is your PR to upstream llama.cpp still pending? Is it because that branch has conflicts with upstream?
- (Original question) I want to use T-MAC to accelerate OpenAI Whisper model inference. Does anyone have a good idea?
Thanks
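A practical starting point while the ggml-level T-MAC integration is pending, using only whisper.cpp's public tooling and C API (T-MAC is not called directly here; its kernels would take over the low-bit matrix multiplications inside whisper_full() once the integration lands): quantize a ggml Whisper model with whisper.cpp's quantize tool, e.g. `./quantize models/ggml-base.en.bin models/ggml-base.en-q4_0.bin q4_0`, then drive it through the C API as sketched below. The model path and the silent PCM buffer are placeholders; build against libwhisper.

```c
#include <stdio.h>
#include <stdlib.h>
#include <stdbool.h>
#include "whisper.h"

int main(void) {
    // Placeholder path: a model produced by whisper.cpp's quantize tool.
    const char *model_path = "models/ggml-base.en-q4_0.bin";

    // Older whisper.cpp releases use whisper_init_from_file(model_path) instead.
    struct whisper_context *ctx =
        whisper_init_from_file_with_params(model_path, whisper_context_default_params());
    if (!ctx) {
        fprintf(stderr, "failed to load model: %s\n", model_path);
        return 1;
    }

    // whisper_full() expects 16 kHz mono float PCM; a real application would
    // decode a WAV file here. Five seconds of silence keeps the sketch self-contained.
    const int n_samples = 16000 * 5;
    float *pcm = calloc(n_samples, sizeof(float));

    struct whisper_full_params params = whisper_full_default_params(WHISPER_SAMPLING_GREEDY);
    params.print_progress = false;

    if (whisper_full(ctx, params, pcm, n_samples) == 0) {
        const int n_segments = whisper_full_n_segments(ctx);
        for (int i = 0; i < n_segments; ++i) {
            printf("%s\n", whisper_full_get_segment_text(ctx, i));
        }
    } else {
        fprintf(stderr, "transcription failed\n");
    }

    free(pcm);
    whisper_free(ctx);
    return 0;
}
```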