
When loading the model, the ".to is not supported for 4-bit or 8-bit bitsandbytes models" error appears again #15

@linwankeji

Description

    git config --global --add safe.directory E:/SD/ComfyUI/ComfyUI

Failed to get ComfyUI version: Command '['git', 'describe', '--tags']' returned non-zero exit status 128.
got prompt
vision_config is None, using default vision config
Unused kwargs: ['_load_in_4bit', '_load_in_8bit', 'quant_method']. These kwargs are not used in <class 'transformers.utils.quantization_config.BitsAndBytesConfig'>.
low_cpu_mem_usage was None, now set to True since model is quantized.
Loading checkpoint shards: 100%|█████████████████████████████████████████████████████████| 2/2 [00:09<00:00, 4.52s/it]
!!! Exception during processing !!! .to is not supported for 4-bit or 8-bit bitsandbytes models. Please use the model as it is, since the model has already been set to the correct devices and casted to the correct dtype.
Traceback (most recent call last):
  File "E:\SD\ComfyUI\ComfyUI\execution.py", line 324, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
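The error in the log comes from transformers itself: models loaded through a `BitsAndBytesConfig` are already placed on the correct device and dtype by accelerate, and calling `.to()` on them raises this exact `ValueError`. A minimal sketch of a defensive fix, assuming the node code calls `model.to(device)` unconditionally (the helper name `safe_to` and the dummy class are illustrative, not from this repository's source):

```python
def safe_to(model, device):
    """Move `model` to `device`, skipping bitsandbytes-quantized models.

    Transformers sets `is_loaded_in_4bit` / `is_loaded_in_8bit` on models
    loaded with a BitsAndBytesConfig; calling .to() on them raises
    ValueError, so we leave them as-is.
    """
    if getattr(model, "is_loaded_in_4bit", False) or getattr(
        model, "is_loaded_in_8bit", False
    ):
        return model  # already on the correct device and cast to the right dtype
    return model.to(device)


# Stand-in object to illustrate the guard without loading a real model:
class _DummyQuantModel:
    is_loaded_in_4bit = True

    def to(self, device):  # mimics transformers' behavior for 4-bit models
        raise ValueError(".to is not supported for 4-bit models")


m = _DummyQuantModel()
assert safe_to(m, "cuda") is m  # guard returns the model without calling .to()
```

An alternative is to pass `device_map` at load time (e.g. `from_pretrained(..., device_map="auto", quantization_config=...)`) so no later `.to()` call is needed at all.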
