I have 6 GB of VRAM, and here is what I get running txt2img:
RuntimeError: CUDA out of memory. Tried to allocate 1.50 GiB (GPU 0; 5.80 GiB total capacity; 4.12 GiB already allocated; 682.94 MiB free; 4.24 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
This is with only the options --prompt "tower" and --plms.
Is there a way to reduce the quality or resolution so it fits in memory?
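The error message itself points at one mitigation: setting `max_split_size_mb` via `PYTORCH_CUDA_ALLOC_CONF` to reduce allocator fragmentation, and peak VRAM can also be cut by lowering the output resolution and sample count. A minimal sketch — the 128 MB split cap and the 384×384 resolution are illustrative values, not tuned recommendations, and the flags assume the CompVis `txt2img.py` script run from the repo root:

```shell
# Hint the PyTorch CUDA allocator to cap split block size (reduces fragmentation).
# max_split_size_mb:128 is an illustrative value, not a tuned recommendation.
export PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:128

# Rerun at a lower resolution with a single sample to cut peak VRAM
# (flags are from the CompVis txt2img script; guard lets this run outside the repo).
if [ -f scripts/txt2img.py ]; then
  python scripts/txt2img.py --prompt "tower" --plms --H 384 --W 384 --n_samples 1
fi
```

Whether 128 is a good cap depends on the model's allocation pattern; smaller caps fight fragmentation harder at some speed cost.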