
    Linux is not required, certainly not if you’re running Nvidia hardware.

    VRAM is going to be very tight. It looks like an RTX 2080 has 8GB. Stable Diffusion is very demanding of VRAM.

    https://www.tomshardware.com/news/stable-diffusion-gpu-benchmarks

    These are benchmarks from early in the year. They’re running an older version of Stable Diffusion on a Windows machine. An RTX 2080 isn’t blisteringly fast – they put it at the top of the “legacy GPU box” – but it does run it.

    Note that they’re using an older Stable Diffusion version and model trained on 512x512 images, while the current crop of models is trained on 1024x1024 images. It’s generally preferable to generate images at the size the model was trained at, and larger models consume more VRAM.

    Video memory is a limiting factor in the size of the image that you can generate.

    *googles*

    https://www.reddit.com/r/StableDiffusion/comments/wtd4e1/if_i_want_to_generate_large_images_say_1024x1024/

    Look further on that subreddit, but from this thread, it sounds like people cap out below 1024x1024 at 8GB.
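    If you want to see where your own card tops out rather than taking Reddit’s word for it, here’s a minimal sketch using the Hugging Face diffusers library (the checkpoint name and prompt are placeholders): bump height/width until it either runs out of memory or the peak figure approaches the card’s total.

    ```python
    import torch
    from diffusers import StableDiffusionPipeline

    # Placeholder SD 1.5 checkpoint; swap in whatever model you're actually using.
    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
    ).to("cuda")

    torch.cuda.reset_peak_memory_stats()
    image = pipe("a test prompt", height=512, width=512).images[0]

    peak = torch.cuda.max_memory_allocated() / 1024**3
    total = torch.cuda.get_device_properties(0).total_memory / 1024**3
    print(f"peak VRAM: {peak:.1f} GiB of {total:.1f} GiB")
    ```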

    It should be possible to use an older Stable Diffusion 1.5 model trained on 512x512 images to generate 512x512 images, which was the norm until earlier this year, when the larger SDXL models came out. That’s half the resolution in each dimension of the images I’ve been posting. The lower-resolution models are definitely usable, but they have more trouble with fingers and toes and…well, you can look at images generated from SD 1.5 models on civitai.com. You can then upscale the output a chunk at a time, as in the sketch below, so it’s possible to wind up with high-resolution output images.
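    Roughly, that workflow looks like this with the diffusers library (a sketch, not a polished script: model names and the prompt are placeholders, and real tiled upscalers overlap the tiles to hide seams, which this doesn’t bother with):

    ```python
    import torch
    from PIL import Image
    from diffusers import StableDiffusionPipeline, StableDiffusionUpscalePipeline

    prompt = "a mountain landscape"  # placeholder prompt

    # Generate at SD 1.5's native 512x512.
    base = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
    ).to("cuda")
    image = base(prompt, height=512, width=512).images[0]
    del base
    torch.cuda.empty_cache()  # free the base model before loading the upscaler

    # Upscale a chunk at a time: 128x128 tiles -> 512x512 each, pasted into
    # a 2048x2048 canvas, so the upscaler never holds the full image in VRAM.
    upscaler = StableDiffusionUpscalePipeline.from_pretrained(
        "stabilityai/stable-diffusion-x4-upscaler", torch_dtype=torch.float16
    ).to("cuda")

    result = Image.new("RGB", (2048, 2048))
    for y in range(0, 512, 128):
        for x in range(0, 512, 128):
            tile = image.crop((x, y, x + 128, y + 128))
            big = upscaler(prompt=prompt, image=tile).images[0]
            result.paste(big, (x * 4, y * 4))
    result.save("upscaled.png")
    ```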

    I don’t know if it’s possible to generate lower-resolution output using an SDXL model on 8GB.

    There are various SD memory optimizations (lowvram mode and half precision, off the top of my head, but I haven’t been looking at them for a couple of months and am out of date) that you’ll probably want to try out, since they’ll let you squeeze more out of the memory.
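    For what it’s worth, in the diffusers library those knobs look roughly like this (a sketch; SDXL base as an example checkpoint, and I can’t promise this fits in 8GB). In the AUTOMATIC1111 webui, the equivalents are launch flags like --medvram and --lowvram.

    ```python
    import torch
    from diffusers import StableDiffusionXLPipeline

    pipe = StableDiffusionXLPipeline.from_pretrained(
        "stabilityai/stable-diffusion-xl-base-1.0",
        torch_dtype=torch.float16,  # half precision: roughly halves model memory
        variant="fp16",
    )
    pipe.enable_attention_slicing()  # compute attention in slices; slower, smaller
    pipe.enable_vae_tiling()         # decode the final image in tiles
    pipe.enable_model_cpu_offload()  # park idle submodels in system RAM
    # Note: don't also call pipe.to("cuda"); the offload hook manages device moves.

    image = pipe("a test prompt", height=1024, width=1024).images[0]
    ```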

    Sorry. I know that it’d be nice to give a definitive “this is what card X can do”, but this is all pretty bleeding edge. Two years ago, none of this was possible.