## Feature

Support for FP16 and INT8 models on the CPU / MPS.

## Reason

Faster inference. ONNX Runtime currently does not support FP16 on CPU or MPS: [Issue](https://github.com/microsoft/onnxruntime/issues/22242)
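
As a rough illustration of the INT8 half of this request (a minimal sketch, not part of the request itself; the file names are placeholders and an existing FP32 ONNX model is assumed), dynamic INT8 quantization via ONNX Runtime's own tooling already works on the CPU execution provider, whereas an FP16 path would need the support tracked in the issue above:

```python
# Sketch: dynamic INT8 quantization of an FP32 ONNX model, then CPU inference.
# Paths such as "model_fp32.onnx" / "model_int8.onnx" are hypothetical.
import onnxruntime as ort
from onnxruntime.quantization import QuantType, quantize_dynamic

# Convert FP32 weights to INT8 (activations are quantized dynamically at runtime).
quantize_dynamic(
    model_input="model_fp32.onnx",
    model_output="model_int8.onnx",
    weight_type=QuantType.QInt8,
)

# Run the quantized model on the CPU execution provider.
session = ort.InferenceSession(
    "model_int8.onnx",
    providers=["CPUExecutionProvider"],
)
```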