I have no issues with Flash Attention on my ComfyUI install. It loads and works fine with other things, but while the frames are being generated I get the error `Flash Attention failed, using default SDPA`. Does anyone have a solution for this? Thank you.
```
Checkpoint files will always be loaded safely.
Total VRAM 24576 MB, total RAM 32457 MB
pytorch version: 2.6.0+cu126
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 3090 : cudaMallocAsync
Using Flash Attention
Python version: 3.12.7 (tags/v3.12.7:0b05ead, Oct 1 2024, 03:06:41) [MSC v.1941 64 bit (AMD64)]
ComfyUI version: 0.3.29
ComfyUI frontend version: 1.17.11
```
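In case it helps narrow this down: the fallback to SDPA usually means the flash-attn kernel rejected the tensors it was handed during sampling, most commonly fp32 inputs or an unsupported head dimension, though I can't confirm that's what is happening in this particular workflow. Below is a minimal sketch, assuming the standalone `flash_attn` package is installed and a CUDA device is available, that probes which dtypes and head sizes the installed build actually accepts. `try_flash` and the shape values are placeholders for illustration, not taken from ComfyUI or the failing model.

```python
# Minimal probe: does flash-attn itself accept a given dtype / head_dim?
# Shapes below are placeholders, not the ones used by the failing workflow.
import torch
from flash_attn import flash_attn_func

def try_flash(batch=1, seq_len=4096, n_heads=16, head_dim=64, dtype=torch.float16):
    q = torch.randn(batch, seq_len, n_heads, head_dim, device="cuda", dtype=dtype)
    k = torch.randn_like(q)
    v = torch.randn_like(q)
    try:
        out = flash_attn_func(q, k, v)  # expects (batch, seq_len, n_heads, head_dim)
        print(f"OK:     dtype={dtype}, head_dim={head_dim}, out shape={tuple(out.shape)}")
    except Exception as e:
        print(f"FAILED: dtype={dtype}, head_dim={head_dim}: {e}")

# flash-attn only supports fp16/bf16 inputs; fp32 is a common cause of the fallback.
try_flash(dtype=torch.float16)
try_flash(dtype=torch.bfloat16)
try_flash(dtype=torch.float32)   # expected to fail
try_flash(head_dim=512)          # head dims above the kernel's limit (typically 256) also fail
```

If the fp16/bf16 cases pass here but the workflow still falls back, the issue is more likely in how the model feeds attention (dtype casts, masks, or head sizes) than in the flash-attn install itself.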