
Model Optimization Tasks: Stage 3 #39

@Ishaan-Datta

Description

Stage 3: TensorRT Engine Conversion/Inference

  • Investigate using TensorRT engine plugins to bake the pre/post-processing into the engine file itself (with Amy)
  • Repair the comparison scripts under /python_wip and /conversion_tools and report on how the different methods compare
  • Run each verify function on a few random inputs after a warmup pass and average the results (see the sketch after this list)
  • Copy the logic for comparing prediction consistency and confidence independently, using Valery's functions in ONNX_verify
  • MNIST Inference Example
  • QuickStart Guide
  • Take a look at the Google Drive for the NVIDIA examples
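
Below is a minimal sketch of the warmup-and-average verification loop described above. It assumes ONNX Runtime as the reference backend; `trt_infer` stands in for whatever TensorRT engine wrapper the /conversion_tools scripts expose, and the MNIST-like input shape, run counts, and seed are illustrative assumptions, not values from the repo.

```python
# Sketch of the warmup-and-average verification loop.  "trt_infer" is a
# placeholder for a TensorRT engine wrapper (e.g. from /conversion_tools);
# the input shape, run counts, and RNG seed are illustrative assumptions.
import numpy as np
import onnxruntime as ort


def verify(onnx_path, trt_infer, input_shape=(1, 1, 28, 28),
           warmup_runs=5, test_runs=10, seed=0):
    rng = np.random.default_rng(seed)
    sess = ort.InferenceSession(onnx_path, providers=["CPUExecutionProvider"])
    input_name = sess.get_inputs()[0].name

    def onnx_infer(x):
        return sess.run(None, {input_name: x})[0]

    # Warmup: discard the first few runs so lazy initialization and
    # autotuning do not skew the averaged numbers.
    for _ in range(warmup_runs):
        x = rng.standard_normal(input_shape).astype(np.float32)
        onnx_infer(x)
        trt_infer(x)

    diffs, agreement, conf_gaps = [], [], []
    for _ in range(test_runs):
        x = rng.standard_normal(input_shape).astype(np.float32)
        ref = onnx_infer(x)
        out = np.asarray(trt_infer(x)).reshape(ref.shape)
        diffs.append(float(np.max(np.abs(ref - out))))
        # Prediction consistency: do both backends pick the same class?
        agreement.append(int(np.argmax(ref) == np.argmax(out)))
        # Confidence: gap between each backend's top score.
        conf_gaps.append(abs(float(np.max(ref)) - float(np.max(out))))

    print(f"avg max |diff|       : {np.mean(diffs):.6f}")
    print(f"prediction agreement : {np.mean(agreement):.2%}")
    print(f"avg confidence gap   : {np.mean(conf_gaps):.6f}")
```

Once a TensorRT wrapper is available, this would be called as `verify("model.onnx", trt_infer)`; the same loop could be folded into the existing verify functions in /python_wip.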

Specific Engine Plugins/Extras for Viewing in Drive:

  • crop and resize
  • efficientNMS
  • flattenConcat
  • NMS plugin
  • nvpluginfasterRCNN
  • resizeNearest
  • Polygraphy (see the comparison sketch after this list)
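
Since Polygraphy is on the list above, here is a minimal sketch of using its Python API to compare TensorRT and ONNX Runtime outputs for the same ONNX model. The model path and tolerances are placeholders, and the exact API surface may vary with the Polygraphy version.

```python
# Sketch of an ONNX Runtime vs. TensorRT output comparison using Polygraphy's
# Python API.  "model.onnx" and the tolerances are placeholders.
from polygraphy.backend.onnxrt import OnnxrtRunner, SessionFromOnnx
from polygraphy.backend.trt import EngineFromNetwork, NetworkFromOnnxPath, TrtRunner
from polygraphy.comparator import Comparator, CompareFunc

onnx_path = "model.onnx"  # placeholder

runners = [
    TrtRunner(EngineFromNetwork(NetworkFromOnnxPath(onnx_path))),
    OnnxrtRunner(SessionFromOnnx(onnx_path)),
]

# Run both backends on the same generated inputs, then compare outputs.
results = Comparator.run(runners)
passed = bool(Comparator.compare_accuracy(
    results, compare_func=CompareFunc.simple(atol=1e-3, rtol=1e-3)))
print("TensorRT vs. ONNX Runtime outputs match:", passed)
```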
