Conversation

@iysheng commented Dec 5, 2023

After building TVM from source code following the guide, I ran a test on my PC with the following command:

python3.8 ~/Projects/ai_light_brain/npu_releate/tvm/thead/hhb_cli.py -S --model-file mobilenetv2-12.onnx --data-scale 0.017 --data-mean "124 117 104" --board x86_ref --input-name "input" --output-name "output" --input-shape "1 3 224 224" --postprocess save_and_top5 --simulate-data persian_cat.jpg --quantization-scheme float16
[2023-12-05 21:01:42] (HHB LOG): Start import model.
Traceback (most recent call last):
  File "/home/red/Projects/ai_light_brain/npu_releate/tvm/thead/hhb_cli.py", line 21, in <module>
    main()
  File "/home/red/Projects/ai_light_brain/npu_releate/tvm/thead/hhb/main.py", line 559, in main
    sys.exit(_main(argv))
  File "/home/red/Projects/ai_light_brain/npu_releate/tvm/thead/hhb/main.py", line 553, in _main
    arg_manage.run_command(args_filter, curr_command_type)
  File "/home/red/Projects/ai_light_brain/npu_releate/tvm/thead/hhb/core/arguments_manage.py", line 1515, in run_command
    return args_filter.filtered_args.func(args_filter)
  File "/home/red/Projects/ai_light_brain/npu_releate/tvm/thead/hhb/core/main_command_manage.py", line 292, in driver_main_command
    mod, params = import_model(
  File "/home/red/Projects/ai_light_brain/npu_releate/tvm/thead/hhb/core/frontend_manage.py", line 417, in import_model
    mod, params = frontend.load(path, input_name, input_shape, output_name)
  File "/home/red/Projects/ai_light_brain/npu_releate/tvm/thead/hhb/core/frontend_manage.py", line 114, in load
    e = onnx.onnx.utils.Extractor(onnx_model)
AttributeError: module 'onnx' has no attribute 'onnx'

This PR fixes that issue.
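
For context, a minimal sketch of what the change in hhb/core/frontend_manage.py presumably looks like, based only on the failing line in the traceback: the onnx package has no top-level onnx attribute, so the Extractor has to be referenced as onnx.utils.Extractor. The model path and the tensor-name lists below are placeholders taken from the command above, not the actual HHB code.

import onnx
import onnx.utils  # Extractor lives in onnx.utils (available since onnx 1.8)

# Load the model as the frontend would (path comes from --model-file).
onnx_model = onnx.load("mobilenetv2-12.onnx")

# Before: onnx.onnx.utils.Extractor(onnx_model)  -> AttributeError
# After: reference Extractor directly under onnx.utils.
extractor = onnx.utils.Extractor(onnx_model)

# Extract the subgraph between the requested input and output tensors
# (names correspond to --input-name / --output-name).
extracted = extractor.extract_model(["input"], ["output"])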

@CLAassistant commented Dec 5, 2023

CLA assistant check
All committers have signed the CLA.

