
FIX: Unpin Xformers to fix Linux install #20

Open
ThomasEricB wants to merge 2 commits into pinokiofactory:main from ThomasEricB:Linux-Fix

Conversation


@ThomasEricB ThomasEricB commented Feb 13, 2026

A simple fix that removes the pinned Xformers requirement. The pinned build of xformers requires Flash-Attention >=2.7.1,<=2.7.4, which conflicts with the 2.8.3 installed on Linux and makes every diffusers import fail (see the traceback below).
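To illustrate what "unpinning" changes, here is a minimal sketch using `packaging` (the library pip itself uses for version matching). The requirement strings are hypothetical examples, not the PR's actual diff, which is not shown on this page:

```python
from packaging.requirements import Requirement
from packaging.version import Version

# Hypothetical requirement lines -- illustrative only, not the real pin.
pinned = Requirement("xformers==0.0.27")  # a hard pin rejects every other build
unpinned = Requirement("xformers")        # after the fix: resolver may pick any compatible build

# The pin only admits the exact version...
assert Version("0.0.27") in pinned.specifier
assert Version("0.0.28") not in pinned.specifier

# ...while the unpinned requirement lets pip choose a build whose own
# Flash-Attention range matches what is already installed.
assert Version("0.0.28") in unpinned.specifier
```

With the pin removed, pip's resolver is free to select an xformers build compatible with the installed Flash-Attention instead of failing at import time.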

[Screenshot attached: Screenshot_20260213_001016]

Error:

<<PINOKIO_SHELL>>eval "$(conda shell.bash hook)" ; conda deactivate ; conda deactivate ; conda deactivate ; conda activate base ; source /home/dragon/pinokio/api/wan.git/app/env/bin/activate /home/dragon/pinokio/api/wan.git/app/env && python wgp.py --multiple-images 
Traceback (most recent call last):
  File "/home/dragon/pinokio/api/wan.git/app/env/lib/python3.10/site-packages/diffusers/utils/import_utils.py", line 1016, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/home/dragon/pinokio/bin/miniconda/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/home/dragon/pinokio/api/wan.git/app/env/lib/python3.10/site-packages/diffusers/loaders/peft.py", line 25, in <module>
    from ..hooks.group_offloading import _maybe_remove_and_reapply_group_offloading
  File "/home/dragon/pinokio/api/wan.git/app/env/lib/python3.10/site-packages/diffusers/hooks/__init__.py", line 20, in <module>
    from .faster_cache import FasterCacheConfig, apply_faster_cache
  File "/home/dragon/pinokio/api/wan.git/app/env/lib/python3.10/site-packages/diffusers/hooks/faster_cache.py", line 21, in <module>
    from ..models.attention import AttentionModuleMixin
  File "/home/dragon/pinokio/api/wan.git/app/env/lib/python3.10/site-packages/diffusers/models/attention.py", line 25, in <module>
    from .attention_processor import Attention, AttentionProcessor, JointAttnProcessor2_0
  File "/home/dragon/pinokio/api/wan.git/app/env/lib/python3.10/site-packages/diffusers/models/attention_processor.py", line 35, in <module>
    import xformers.ops
  File "/home/dragon/pinokio/api/wan.git/app/env/lib/python3.10/site-packages/xformers/ops/__init__.py", line 9, in <module>
    from .fmha import (
  File "/home/dragon/pinokio/api/wan.git/app/env/lib/python3.10/site-packages/xformers/ops/fmha/__init__.py", line 10, in <module>
    from . import (
  File "/home/dragon/pinokio/api/wan.git/app/env/lib/python3.10/site-packages/xformers/ops/fmha/flash.py", line 78, in <module>
    raise ImportError(
ImportError: Requires Flash-Attention version >=2.7.1,<=2.7.4 but got 2.8.3.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/dragon/pinokio/api/wan.git/app/env/lib/python3.10/site-packages/diffusers/utils/import_utils.py", line 1016, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/home/dragon/pinokio/bin/miniconda/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 992, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/home/dragon/pinokio/api/wan.git/app/env/lib/python3.10/site-packages/diffusers/models/transformers/__init__.py", line 5, in <module>
    from .auraflow_transformer_2d import AuraFlowTransformer2DModel
  File "/home/dragon/pinokio/api/wan.git/app/env/lib/python3.10/site-packages/diffusers/models/transformers/auraflow_transformer_2d.py", line 23, in <module>
    from ...loaders import FromOriginalModelMixin, PeftAdapterMixin
  File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
  File "/home/dragon/pinokio/api/wan.git/app/env/lib/python3.10/site-packages/diffusers/utils/import_utils.py", line 1006, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/home/dragon/pinokio/api/wan.git/app/env/lib/python3.10/site-packages/diffusers/utils/import_utils.py", line 1018, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import diffusers.loaders.peft because of the following error (look up to see its traceback):
Requires Flash-Attention version >=2.7.1,<=2.7.4 but got 2.8.3.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/dragon/pinokio/api/wan.git/app/wgp.py", line 25, in <module>
    from mmgp import offload, safetensors2, profile_type , quant_router
  File "/home/dragon/pinokio/api/wan.git/app/env/lib/python3.10/site-packages/mmgp/offload.py", line 74, in <module>
    from .quant_router import (
  File "/home/dragon/pinokio/api/wan.git/app/env/lib/python3.10/site-packages/mmgp/quant_router.py", line 7, in <module>
    from optimum.quanto import QModuleMixin, register_qmodule
  File "/home/dragon/pinokio/api/wan.git/app/env/lib/python3.10/site-packages/optimum/quanto/__init__.py", line 19, in <module>
    from .models import *
  File "/home/dragon/pinokio/api/wan.git/app/env/lib/python3.10/site-packages/optimum/quanto/models/__init__.py", line 34, in <module>
    from .diffusers_models import *
  File "/home/dragon/pinokio/api/wan.git/app/env/lib/python3.10/site-packages/optimum/quanto/models/diffusers_models.py", line 30, in <module>
    from diffusers import PixArtTransformer2DModel
  File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
  File "/home/dragon/pinokio/api/wan.git/app/env/lib/python3.10/site-packages/diffusers/utils/import_utils.py", line 1007, in __getattr__
    value = getattr(module, name)
  File "/home/dragon/pinokio/api/wan.git/app/env/lib/python3.10/site-packages/diffusers/utils/import_utils.py", line 1006, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/home/dragon/pinokio/api/wan.git/app/env/lib/python3.10/site-packages/diffusers/utils/import_utils.py", line 1018, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import diffusers.models.transformers.pixart_transformer_2d because of the following error (look up to see its traceback):
Failed to import diffusers.loaders.peft because of the following error (look up to see its traceback):
Requires Flash-Attention version >=2.7.1,<=2.7.4 but got 2.8.3.
(base) <<PINOKIO_SHELL>>
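The root failure is the last line of the first traceback: xformers gates its flash-attention backend behind the range `>=2.7.1,<=2.7.4`, and 2.8.3 falls outside it. A short sketch of that version check, using `packaging` (the check inside xformers itself may differ in detail):

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# The range from the ImportError message, and the version actually installed.
required = SpecifierSet(">=2.7.1,<=2.7.4")
installed = Version("2.8.3")

# 2.8.3 is newer than the upper bound, so the gate rejects it,
# even though a newer Flash-Attention may well work in practice.
assert installed not in required
assert Version("2.7.4") in required  # the newest version the pin accepts
```

Because the upper bound is inclusive at 2.7.4, any Flash-Attention released after that date trips the `ImportError`, which then cascades through diffusers, optimum.quanto, and mmgp as shown above.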

