I have a 3080 10GB, but I recently experimented with LLMs and they eat up VRAM. Looking at dual 3090s or 4090s for 48GB of VRAM.
When is SDXL releasing for people to train?
Welcome to the Stable Diffusion community, dedicated to the exploration and discussion of the open source deep learning model known as Stable Diffusion.
Introduced in 2022, Stable Diffusion uses a latent diffusion model to generate detailed images based on text descriptions and can also be applied to other tasks such as inpainting, outpainting, and generating image-to-image translations guided by text prompts. The model was developed by the startup Stability AI, in collaboration with a number of academic researchers and non-profit organizations, marking a significant shift from previous proprietary models that were accessible only via cloud services.
NVIDIA is going to be faster and easier to get working with SD.
VRAM is going to be your friend, especially if you start working with Deforum and video.
Hmm, not a big fan of running Nvidia on Linux... What's the minimum, 12GB?
Why not?
I'm currently running an NVIDIA GPU on Debian with SD and I haven't had any issues.
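If you want to confirm the card is actually visible to your SD install, a quick sanity check through PyTorch works on any distro. This is just a minimal sketch, assuming PyTorch is installed; it degrades gracefully if it isn't.

```python
def describe_gpu():
    """Report the first CUDA GPU and its VRAM, or explain why none is visible."""
    try:
        import torch  # assumes a PyTorch install, as SD itself requires
    except ImportError:
        return "PyTorch not installed"
    if not torch.cuda.is_available():
        return "No CUDA GPU visible"
    props = torch.cuda.get_device_properties(0)
    # total_memory is in bytes; convert to GiB for a readable figure
    return f"{props.name}: {props.total_memory / 1024**3:.1f} GB VRAM"

print(describe_gpu())
```

If this prints your card's name and the expected VRAM, driver and CUDA setup is fine and SD should see the GPU too.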
It's more their attitude toward the FOSS community. They're kind of like Apple in that they don't really contribute back.