Hello! Firstly, thanks for the amazing work on enabling WebAssembly for ML 🤗!
I'm fairly new to using WebAssembly for machine learning, but I'm particularly interested in compiling non-neural-network models to WebAssembly for inference. Can I clarify whether the listed backend support for wasi-nn:
Tensorflow, ONNX, OpenVINO, etc.
and specifically the ONNX backend, implies that wasi-nn should also work with, for example, XGBoost and scikit-learn models/pipelines, as described in this documentation? The idea is that if I have an existing XGBoost model I'd like to deploy for inference, I would first need to convert it to ONNX format and then write the wasi-nn bindings (AssemblyScript or Rust) to execute the model?
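For context, here is a rough sketch of the inference side I have in mind, using the Rust wasi-nn bindings. This is just a guess on my part: I'm assuming the builder-style API of the wasi-nn crate, an ONNX-capable backend in the host runtime, and made-up file names, shapes, and output sizes.

```rust
// Sketch only: assumes the wasi-nn crate's builder-style API and a host runtime
// whose wasi-nn backend can load ONNX graphs. File names and shapes are placeholders.
use wasi_nn::{ExecutionTarget, GraphBuilder, GraphEncoding, TensorType};

fn main() {
    // ONNX model previously converted from XGBoost (e.g. with onnxmltools),
    // bundled into the Wasm module at build time. Hypothetical file name.
    let model: &[u8] = include_bytes!("xgboost_model.onnx");

    // Load the graph through the ONNX backend, targeting the CPU.
    let graph = GraphBuilder::new(GraphEncoding::Onnx, ExecutionTarget::CPU)
        .build_from_bytes([model])
        .expect("failed to load ONNX graph");
    let mut ctx = graph
        .init_execution_context()
        .expect("failed to create execution context");

    // One row of four f32 features; shape and dtype must match the converted model's input.
    let features: Vec<f32> = vec![0.1, 0.2, 0.3, 0.4];
    ctx.set_input(0, TensorType::F32, &[1, features.len()], &features)
        .expect("failed to set input tensor");

    // Run inference and read back the (assumed single-value) output tensor.
    ctx.compute().expect("inference failed");
    let mut output = vec![0f32; 1];
    ctx.get_output(0, &mut output).expect("failed to read output");
    println!("prediction: {:?}", output);
}
```

On the conversion side, my understanding is that onnxmltools can convert XGBoost models and skl2onnx can convert scikit-learn pipelines to ONNX, so the part I'm unsure about is whether the ONNX backend behind wasi-nn accepts the operators those converters produce.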
Thanks in advance!