Latest development release. Most HF Transformers tests work with this now. #14
unixwzrd announced in Announcements
This is ready to use, but you may experience some issues we have left to address. Look in test_automation/reports for a summary of the complete HuggingFace Transformers library regression test results.
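If you want to browse those summaries, something like the following works from the repo root (the report filenames are not given here, so the glob below is just an assumption):

```bash
# List the regression-test summaries referenced above
ls test_automation/reports/
# Open one of them; the *.md pattern is an assumption about the report format
less test_automation/reports/*.md
```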
Feel free to fix one or two of those if you like, or integrate the bridge into your CUDA code simply by adding:
import TorchDevice
This assumes you have TorchDevice installed. Clone the repo, cd into the repo directory, and run:
pip install -e .
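Spelled out as a shell session (the clone URL below is an assumption, not taken from the release notes; substitute the repository's actual URL):

```bash
# Assumed clone URL -- replace with the real repository URL if it differs
git clone https://github.com/unixwzrd/TorchDevice.git
cd TorchDevice
pip install -e .   # editable install of the bridge
```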
That's it: you can now run CUDA code on Apple Silicon with MPS; a minimal usage sketch follows. You may also want to use the compile script for NumPy, which links it against the Accelerate framework as well; that will give about an 8-10x improvement too.
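Here is a minimal sketch of what that looks like (the tensor code is illustrative only and assumes that importing TorchDevice redirects "cuda" device requests to MPS):

```python
import TorchDevice  # noqa: F401 -- the import alone activates the CUDA-to-MPS bridge
import torch

# Unmodified CUDA-style code: on Apple Silicon the "cuda" device is mapped to MPS
device = torch.device("cuda")
x = torch.randn(4, 4, device=device)
y = x @ x.T
print(y.device)  # expected to report the MPS backend
```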