I'd love to hear some informed opinions.
My rig is five years old now. I still have an RTX 2070 Super with 8 GB of VRAM. It's not ideal for SD, but it could be worse.
I've also been experimenting with locally run LLMs/chatbots recently for the first time. I'm still having a blast with all the possibilities, but it's obvious that my 8 GB is hopelessly inadequate for LLMs and their full potential.
I was gonna wait for the next RTX generation to be released and then maybe get one of the 4000 series once prices drop, but it looks like that may be another year from now, and I'm getting itchy fingers...
Even if I got a new PC with a 16 GB card now, it would basically be insta-obsolete for LLMs and maybe other AI-driven applications, too - and even more so in another year or two.
I've got this feeling that high-VRAM cards will become more common and more affordable in the years to come, now that demand is growing with AI developments.
So... Apart from the itchy fingers, it does seem like a bad time to spend a lot of money. Maybe there's an upgradable, future-proof solution? I'm not so up to date with current hardware and developments.
My budget would be something between 1500 and 2000€ for the whole system.
Thanks for all replies. The more, the better.