This repository was archived by the owner on Mar 14, 2025. It is now read-only.

Is the inference for int8 the same as for fp16? #36

Open
WEIZHIHONG720 opened this issue Mar 30, 2022 · 0 comments

Comments

@WEIZHIHONG720

Hello, thank you for your work!

[attached screenshot of the int8 engine inference code]

Is there a problem with running int8-engine inference this way? I loaded the engine file directly without using the calibration table, and I'm a bit unsure whether that's correct.
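For context: in TensorRT, the calibration table is consumed only when the int8 engine is *built*; calibration chooses the quantization scales, which are then baked into the serialized engine. Loading and running an already-built int8 engine therefore looks the same as running an fp16 engine. Below is a minimal, library-free sketch (plain Python, hypothetical helper names, not TensorRT API) illustrating why the scale is a build-time artifact:

```python
# Sketch of symmetric per-tensor int8 quantization.
# "Calibration" happens once, up front; afterwards the chosen
# scale is fixed and inference never needs the calibration data again.

def calibrate(samples):
    """Pick a symmetric per-tensor scale from calibration data."""
    max_abs = max(abs(x) for x in samples)
    return max_abs / 127.0  # map [-max_abs, max_abs] onto [-127, 127]

def quantize(x, scale):
    """Round to the nearest int8 step and clip to the int8 range."""
    q = round(x / scale)
    return max(-127, min(127, q))

def dequantize(q, scale):
    return q * scale

# "Build time": calibration fixes the scale.
scale = calibrate([0.5, -1.2, 3.4, -2.8])

# "Inference time": only the baked-in scale is used.
x = 1.7
x_hat = dequantize(quantize(x, scale), scale)
```

So as long as the calibration table was supplied when the engine was serialized, deserializing the `.engine` file and running it exactly as you would an fp16 engine is the expected workflow.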
