Topic: generating feral art on pc

Posted under General

Hey, as the title says, I want to generate feral art, and it has to be on my PC since most websites block feral content. I have quite a good PC, so I don't think I'd have a problem generating stuff, but I'm very new to this and have no idea where to start on my own machine. Could you guys please help me?

alecma said:
Hey, as the title says, I want to generate feral art, and it has to be on my PC since most websites block feral content. I have quite a good PC, so I don't think I'd have a problem generating stuff, but I'm very new to this and have no idea where to start on my own machine. Could you guys please help me?

I strongly suggest you join the Furry Diffusion Discord server. This forum is not ideal for getting help.
The invite link is here on the site in the "Discord" tab.
But if you really want support here in the forum the first thing you would need to say is which video card you have. Otherwise it's not possible to help.

silvicultor said:
I strongly suggest you join the Furry Diffusion Discord server. This forum is not ideal for getting help.
The invite link is here on the site in the "Discord" tab.
But if you really want support here in the forum the first thing you would need to say is which video card you have. Otherwise it's not possible to help.

Thank you, I'll try the Discord. Also, my GPU is a 3070 Ti with 8 GB of VRAM, and I'm thinking of upgrading at the end of this year to something with 16 GB of VRAM, be it a 5070 Ti or a 9070 XT.

alecma said:
Thank you, I'll try the Discord. Also, my GPU is a 3070 Ti with 8 GB of VRAM, and I'm thinking of upgrading at the end of this year to something with 16 GB of VRAM, be it a 5070 Ti or a 9070 XT.

Your current GPU is OK but not ideal. But don't buy AMD, never ever for AI! No 9070 XT!
I know NVIDIA is overpriced but there is no viable replacement for NVIDIA's CUDA for AI.
Since you already have an NVIDIA GPU that can run SD, I can recommend this guide: https://rentry.co/fluffscaler-install



silvicultor said:
Your current GPU is OK but not ideal. But don't buy AMD, never ever for AI! No 9070 XT!
I know NVIDIA is overpriced but there is no viable replacement for NVIDIA's CUDA for AI.
Since you already have an NVIDIA GPU that can run SD, I can recommend this guide: https://rentry.co/fluffscaler-install

AMD has ZLUDA as a drop-in replacement for CUDA; it gens just fine.

pilk said:
AMD has ZLUDA as a drop-in replacement for CUDA; it gens just fine.

It's slower; there's no reason to buy AMD GPUs for AI unless it's some kind of AI MAX PRO with 128 GB of shared memory.


ayokeito said:
It's slower; there's no reason to buy AMD GPUs for AI unless it's some kind of AI MAX PRO with 128 GB of shared memory.

I agree. NVIDIA is the better option for AI.

I'm just adding that AMD is also viable if you happen to have it for any reason; you're not SOL and can still use all of the usual CUDA frameworks.
For context, on a 7900 XTX it takes about a minute to do NoobAI with 25 Euler iterations at 2048x2048 base resolution, or about the same time for 20 iterations at 1200x800 + hires fix. You don't even need all that for good results; I sketch with 10-15 steps and it takes seconds to see the changes.

tldr - don't be discouraged, anyone can gen
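A rough sanity check on why those two runs land in the same ballpark: a diffusion step's cost scales roughly with pixel count, so you can compare total pixel-steps. The hires pass size below assumes a 1.5x upscale, just for illustration (not my exact settings):

```python
# Compare total "pixel-steps" for the two runs mentioned above.
# Step time scales roughly with pixel count, so this is a crude
# proxy for total work per image.

def pixel_steps(width, height, steps):
    return width * height * steps

base = pixel_steps(2048, 2048, 25)       # big single pass
lowres = pixel_steps(1200, 800, 20)      # first pass
hires = pixel_steps(1800, 1200, 20)      # assumed 1.5x hires fix pass

print(base, lowres + hires, round(base / (lowres + hires), 2))
```

Not exact (attention cost grows faster than linearly with resolution), but close enough to see why the timings end up similar.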


pilk said:
I agree. NVIDIA is the better option for AI.

I'm just adding that AMD is also viable if you happen to have it for any reason; you're not SOL and can still use all of the usual CUDA frameworks.
For context, on a 7900 XTX it takes about a minute to do NoobAI with 25 Euler iterations at 2048x2048 base resolution, or about the same time for 20 iterations at 1200x800 + hires fix. You don't even need all that for good results; I sketch with 10-15 steps and it takes seconds to see the changes.

tldr - don't be discouraged, anyone can gen

I know it's possible to gen with AMD. But it's still not recommended.
It just doesn't make sense to buy AMD when you want to use it for AI. My 5060 Ti cost only half of what a 7900 XTX would, and it gens an SDXL image at base resolution in a few seconds.
And we're only talking about the most basic inference of SDXL here, an old model with low requirements.
Can you run the Wan14b-720p model on your GPU? What about Flux? You will have big trouble with those, while I can do all of that without issues on my much cheaper video card.
Have you ever trained a LoRA on your GPU? Despite the high VRAM, it will most likely go OOM; AMD's memory management is awful.
AMD might appear cheaper at first glance for seemingly better specs, but it's a trap: a GB of AMD VRAM is only worth half a GB of NVIDIA VRAM (though that's not actually related to the VRAM itself).

So my advice stays the same: Don't buy AMD for AI.
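To put rough numbers on that: the prices below are placeholders (they vary a lot by region, so plug in your own), and the 0.5 factor is just my rule of thumb from above, not a benchmark.

```python
# Effective VRAM per dollar, using placeholder prices and the rough
# "1 GB of AMD VRAM ~ 0.5 GB of NVIDIA VRAM for AI work" rule of thumb.

def effective_vram_per_dollar(vram_gb, price, amd=False, amd_factor=0.5):
    effective_gb = vram_gb * (amd_factor if amd else 1.0)
    return effective_gb / price

nvidia_16gb = effective_vram_per_dollar(16, 450)         # e.g. a 16 GB 5060 Ti
amd_24gb = effective_vram_per_dollar(24, 900, amd=True)  # e.g. a 7900 XTX

print(round(nvidia_16gb, 4), round(amd_24gb, 4))
```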


silvicultor said:
I know it's possible to gen with AMD. But it's still not recommended.
It just doesn't make sense to buy AMD when you want to use it for AI. My 5060 Ti cost only half of what a 7900 XTX would, and it gens an SDXL image at base resolution in a few seconds.
And we're only talking about the most basic inference of SDXL here, an old model with low requirements.
Can you run the Wan14b-720p model on your GPU? What about Flux? You will have big trouble with those, while I can do all of that without issues on my much cheaper video card.
Have you ever trained a LoRA on your GPU? Despite the high VRAM, it will most likely go OOM; AMD's memory management is awful.
AMD might appear cheaper at first glance for seemingly better specs, but it's a trap: a GB of AMD VRAM is only worth half a GB of NVIDIA VRAM (though that's not actually related to the VRAM itself).

So my advice stays the same: Don't buy AMD for AI.

On a side note, it seems a 768 px minimum size is now enforced at the upload level, so there's no such thing as too much VRAM for making videos :^)

b1techienne said:
On a side note, it seems a 768 px minimum size is now enforced at the upload level, so there's no such thing as too much VRAM for making videos :^)

Huh? The minimum resolution thing was always about images; videos are judged somewhat outside the normal quality guidelines, AFAIK.
I see recently approved videos smaller than 768, for example: https://e6ai.net/posts/120641
But regardless, "too much VRAM" has never existed when it comes to AI. It was always "not enough".

silvicultor said:
Huh? The minimum resolution thing was always about images; videos are judged somewhat outside the normal quality guidelines, AFAIK.
I see recently approved videos smaller than 768, for example: https://e6ai.net/posts/120641
But regardless, "too much VRAM" has never existed when it comes to AI. It was always "not enough".

It happened very recently, maybe in the last 3 days: contributions with <768 sizes are prevented from uploading at all. A bit annoying, since video models such as WAN recommend 720p, and the yield rate was already low enough at the recommended resolutions.

b1techienne said:
It happened very recently, maybe in the last 3 days: contributions with <768 sizes are prevented from uploading at all. A bit annoying, since video models such as WAN recommend 720p, and the yield rate was already low enough at the recommended resolutions.

Yes, and many people currently can't locally generate even 480p, let alone run the much more taxing 720p model (whose output still isn't large enough for the new 768 requirement). You're now forced to upscale the video, which doesn't always turn out well and introduces artifacting. I feel they should've waited a few months for this change, until the tech improves and higher resolutions become feasible on current hardware.
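To make the constraint concrete, here's what the upscale works out to for common WAN output sizes. The 768 threshold is just what we're seeing at upload; the even-dimension rounding is my assumption (most video encoders require even sizes):

```python
# If the shorter side of a video is under 768 px, compute the scale factor
# (and resulting size) an upscaler would need to hit the upload floor.

def upscale_to_minimum(width, height, minimum=768):
    short_side = min(width, height)
    if short_side >= minimum:
        return width, height, 1.0          # already compliant
    scale = minimum / short_side
    new_w = round(width * scale / 2) * 2   # keep dimensions even
    new_h = round(height * scale / 2) * 2
    return new_w, new_h, scale

print(upscale_to_minimum(832, 480))    # WAN's 480p size needs a 1.6x upscale
print(upscale_to_minimum(1280, 720))   # even WAN's 720p size falls short
```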

ocillet said:
Yes, and many people currently can't locally generate even 480p, let alone run the much more taxing 720p model (whose output still isn't large enough for the new 768 requirement). You're now forced to upscale the video, which doesn't always turn out well and introduces artifacting. I feel they should've waited a few months for this change, until the tech improves and higher resolutions become feasible on current hardware.

I'm currently trying to generate 1024x768 videos with WAN without upscaling (it only seems to recommend 832x480 and 1280x720 resolutions), and the results are less disastrous than expected, but the rules on resolution and quality standards are a great hindrance to the community's effort to take video generation back from paid, online, censored services (x_x)
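For anyone else trying direct generation, here's a small helper for picking sizes like that 1024x768: short side pinned at 768 and both dimensions multiples of 16 (the /16 constraint is my assumption; many video models want it, but check your model's docs):

```python
import math

# Pick a width/height pair near a target aspect ratio with the short side
# at the 768 floor and both dimensions a multiple of 16.

def compliant_size(aspect_w, aspect_h, minimum=768, multiple=16):
    ratio = aspect_w / aspect_h
    if ratio >= 1:   # landscape: height is the short side
        h = math.ceil(minimum / multiple) * multiple
        w = round(h * ratio / multiple) * multiple
    else:            # portrait: width is the short side
        w = math.ceil(minimum / multiple) * multiple
        h = round(w / ratio / multiple) * multiple
    return w, h

print(compliant_size(4, 3))     # the 1024x768 size tried above
print(compliant_size(16, 9))    # widescreen at the new floor
```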