Could you please tell me how much VRAM is required?

#4
by allen666 - opened

Could you please tell me how much VRAM is required to run this application locally? Thank you.

At 1024×1024, 8 steps, bfloat16:
➡ 12-16GB or more is required.
With less than 12GB, keep the image size at 512×512 or below.

16GB+ VRAM: comfortable (e.g., RTX 3090, A6000, 4080, 4090, A100 40GB)
12GB VRAM: very close to the limit (e.g., RTX 3060 12GB, 4070 Ti)
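The thresholds above can be encoded in a small helper (a sketch: the 12 GB cutoff is this thread's rule of thumb, not a measured limit, and `max_side_for_vram` is a hypothetical name):

```python
def max_side_for_vram(free_vram_gib: float) -> int:
    """Pick an image side length from free VRAM, per the thread's rule
    of thumb: 1024x1024 (8 steps, bfloat16) wants ~12-16 GB or more;
    below 12 GB, stay at 512x512 or smaller."""
    return 1024 if free_vram_gib >= 12 else 512

print(max_side_for_vram(24))  # e.g. RTX 3090/4090 -> 1024
print(max_side_for_vram(8))   # e.g. an 8 GB card  -> 512
```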

Optimization: warm up the pipeline at a smaller resolution:

optimize_pipeline_(pipe, image=Image.new("RGB", (1024, 1024)), prompt="prompt")
👇
optimize_pipeline_(pipe, image=Image.new("RGB", (512, 512)), prompt="prompt")

Then add this:

pipe.enable_xformers_memory_efficient_attention()

If you're using an NVIDIA GPU, switch the dtype:

bfloat16
👇
float16
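The dtype swap can be wrapped in a tiny helper (a sketch; `pick_dtype_name` is a hypothetical name, the float16-on-NVIDIA rule is this thread's suggestion, and dtype names are returned as strings so the snippet stands alone without torch installed):

```python
def pick_dtype_name(device_is_nvidia: bool) -> str:
    """Per the thread: prefer float16 on NVIDIA GPUs (fp16 kernel and
    xformers support is broadest there, while bfloat16 needs Ampere or
    newer); otherwise keep the pipeline's default bfloat16."""
    return "float16" if device_is_nvidia else "bfloat16"

# getattr(torch, pick_dtype_name(True)) would then give the torch dtype
# to pass as torch_dtype when loading the pipeline.
print(pick_dtype_name(True))   # -> float16
```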


I still get a "CUDA out of memory" error even with these modifications on my NVIDIA L40S GPU (48GB).

Add this:

import torch

# Free cached allocations and check how much VRAM is actually available
torch.cuda.empty_cache()
print(torch.cuda.mem_get_info())  # (free_bytes, total_bytes)

# Attention slicing trades a little speed for lower peak memory
if hasattr(pipe, "enable_attention_slicing"):
    pipe.enable_attention_slicing()

# xformers memory-efficient attention, if it's installed
if hasattr(pipe, "enable_xformers_memory_efficient_attention"):
    try:
        pipe.enable_xformers_memory_efficient_attention()
        print("xformers memory efficient attention enabled")
    except Exception as e:
        print("Failed to enable xformers:", e)
else:
    print("xformers not available on this pipeline / not installed")

Beyond that, I'm out of ideas.
