Inference: onnx2torch crashes with models using same padding #1028

@trivoldus28

Description

Severity is high because 'same' padding is used in many models. However, it is possible to sidestep the problem by using other checkpointing mechanisms.
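For illustration, a minimal sketch of a reproduction, assuming onnx, numpy, and onnx2torch are installed: it builds a one-node ONNX graph whose Conv carries auto_pad="SAME_UPPER" (the attribute many converters emit for 'same' padding) and then asks onnx2torch to convert it. The tensor names, shapes, and opset below are hypothetical, not taken from the failing model.

```python
import numpy as np
import onnx
from onnx import helper, numpy_helper, TensorProto
from onnx2torch import convert

# Conv node that relies on auto_pad instead of explicit pads.
conv = helper.make_node(
    'Conv',
    inputs=['x', 'w'],
    outputs=['y'],
    kernel_shape=[3, 3],
    strides=[1, 1],
    auto_pad='SAME_UPPER',
)

# Dummy weights: 8 output channels, 3 input channels, 3x3 kernel.
weight = numpy_helper.from_array(
    np.zeros((8, 3, 3, 3), dtype=np.float32), name='w'
)

graph = helper.make_graph(
    [conv],
    'same_upper_repro',
    inputs=[helper.make_tensor_value_info('x', TensorProto.FLOAT, [1, 3, 32, 32])],
    outputs=[helper.make_tensor_value_info('y', TensorProto.FLOAT, [1, 8, 32, 32])],
    initializer=[weight],
)
model = helper.make_model(graph, opset_imports=[helper.make_opsetid('', 13)])
onnx.checker.check_model(model)

# Raises NotImplementedError: "SAME_UPPER" auto_pad is not implemented
convert(model)
```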

The error looks something like this:

"/home/tri/zetta_local/venv/python312/lib/python3.12/site-packages/onnx2torch/converter.py",

line 110, in convert

torch_module, onnx_mapping = converter(onnx_node, onnx_graph)

^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

File

"/home/tri/zetta_local/venv/python312/lib/python3.12/site-packages/onnx2torch/node_converters/c

onv.py", line 48, in _

padding, input_padding_module = onnx_auto_pad_to_torch_padding(

^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

File

"/home/tri/zetta_local/venv/python312/lib/python3.12/site-packages/onnx2torch/utils/padding.py"

, line 33, in onnx_auto_pad_to_torch_padding

raise NotImplementedError(f'"{auto_pad}" auto_pad is not implemented')

NotImplementedError: "SAME_UPPER" auto_pad is not implemented
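One possible workaround, beyond switching checkpointing mechanisms, is to rewrite SAME_UPPER Conv nodes into explicit pads before handing the model to onnx2torch. The sketch below is not from the report: the function name is hypothetical, and it only covers stride-1 convolutions, where 'same' padding does not depend on the input size (strided SAME convolutions need the actual spatial dimensions to resolve the pads).

```python
import onnx
from onnx import helper

def same_upper_to_explicit_pads(model: onnx.ModelProto) -> onnx.ModelProto:
    """Replace auto_pad="SAME_UPPER" on Conv nodes with explicit pads (stride-1 only)."""
    for node in model.graph.node:
        if node.op_type != 'Conv':
            continue
        attrs = {a.name: a for a in node.attribute}
        auto_pad = attrs.get('auto_pad')
        if auto_pad is None or auto_pad.s != b'SAME_UPPER':
            continue
        kernel = list(attrs['kernel_shape'].ints)
        strides = list(attrs['strides'].ints) if 'strides' in attrs else [1] * len(kernel)
        dilations = list(attrs['dilations'].ints) if 'dilations' in attrs else [1] * len(kernel)
        if any(s != 1 for s in strides):
            raise ValueError(f'{node.name}: strided SAME conv needs input shapes to resolve pads')

        begins, ends = [], []
        for k, d in zip(kernel, dilations):
            total = d * (k - 1)            # total padding per axis for stride 1
            begins.append(total // 2)      # SAME_UPPER puts the extra pixel at the end
            ends.append(total - total // 2)

        # Drop auto_pad and attach explicit pads
        # (ONNX layout: [x1_begin, x2_begin, ..., x1_end, x2_end, ...]).
        kept = [a for a in node.attribute if a.name != 'auto_pad']
        del node.attribute[:]
        node.attribute.extend(kept)
        node.attribute.append(helper.make_attribute('pads', begins + ends))
    return model
```

The rewritten model can then be passed to onnx2torch.convert as usual.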
