Stage 3: TensorRT Engine Conversion/Inference
- Investigate using TensorRT Engine Plugins to bake the pre/post processing into the engine file itself (with amy)
- Repair the comparison scripts under /python_wip and /conversion_tools and report on the results of the different methods
- Run each of the verify functions on a few random inputs (after a warmup pass) and average the values
- Copy the logic for comparing prediction consistency and confidence independently, using valery's functions in ONNX_verify (a rough sketch of both checks follows this list)
- MNIST Inference Example (a TensorRT inference sketch follows this list)
- QuickStart Guide
- Take a look at the Google Drive for Nvidia Examples
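
A minimal sketch of what the averaged verification could look like for the two items above. The model path, the number of random inputs, and the `run_trt` helper (a thin wrapper around engine inference, e.g. the sketch further down) are assumptions, and the softmax assumes the model outputs raw logits:

```python
import numpy as np
import onnxruntime as ort

# Hypothetical helper that runs the TensorRT engine on one input and returns
# its raw output as a numpy array. Module and function name are placeholders.
from trt_runner import run_trt

def softmax(z):
    z = z - z.max()  # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

sess = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
inp = sess.get_inputs()[0]
shape = [d if isinstance(d, int) else 1 for d in inp.shape]  # resolve dynamic dims

# Warmup: a couple of throwaway runs so lazy initialization doesn't skew results.
for _ in range(2):
    x = np.random.rand(*shape).astype(np.float32)
    sess.run(None, {inp.name: x})
    run_trt(x)

max_diffs, agree, conf_deltas = [], [], []
for _ in range(10):  # average over a few random inputs
    x = np.random.rand(*shape).astype(np.float32)
    onnx_out = sess.run(None, {inp.name: x})[0].ravel()
    trt_out = np.asarray(run_trt(x)).ravel()
    max_diffs.append(np.abs(onnx_out - trt_out).max())
    # Prediction consistency: do both backends pick the same class?
    agree.append(int(np.argmax(onnx_out) == np.argmax(trt_out)))
    # Confidence: compare the top softmax probabilities independently.
    conf_deltas.append(abs(softmax(onnx_out).max() - softmax(trt_out).max()))

print(f"mean max abs diff:     {np.mean(max_diffs):.6f}")
print(f"prediction agreement:  {np.mean(agree):.0%}")
print(f"mean confidence delta: {np.mean(conf_deltas):.6f}")
```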
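For the MNIST inference example, a rough outline of loading a serialized engine and running one input, assuming the TensorRT 8.x Python API with pycuda; the engine path, binding order, and shapes are placeholders:

```python
import numpy as np
import pycuda.autoinit  # noqa: F401 -- creates a CUDA context on import
import pycuda.driver as cuda
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

# Deserialize an engine produced by the conversion tools (path is a placeholder).
with open("mnist.engine", "rb") as f, trt.Runtime(TRT_LOGGER) as runtime:
    engine = runtime.deserialize_cuda_engine(f.read())
context = engine.create_execution_context()

# Assumes one input binding (index 0) and one output binding (index 1).
in_shape = tuple(engine.get_binding_shape(0))   # e.g. (1, 1, 28, 28)
out_shape = tuple(engine.get_binding_shape(1))  # e.g. (1, 10)

h_input = np.random.rand(*in_shape).astype(np.float32)  # stand-in for a real digit
h_output = np.empty(out_shape, dtype=np.float32)
d_input = cuda.mem_alloc(h_input.nbytes)
d_output = cuda.mem_alloc(h_output.nbytes)

# Host -> device copy, synchronous execution, device -> host copy.
cuda.memcpy_htod(d_input, h_input)
context.execute_v2([int(d_input), int(d_output)])
cuda.memcpy_dtoh(h_output, d_output)

print("logits:", h_output)
print("predicted digit:", int(np.argmax(h_output)))
```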
Specific Engine Plugins/Extras for Viewing in Drive:
- crop and resize
- efficientNMS
- flattenConcat
- NMS plugin
- nvpluginfasterRCNN
- resizeNearest
- polygraphy
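
Since polygraphy is already on the list, a one-liner like the following can cross-check a converted engine against ONNX Runtime before digging into the custom comparison scripts (model path and tolerance are placeholders):

```bash
polygraphy run model.onnx --trt --onnxrt --atol 1e-4
```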