What is the best graphics card for renders?

GuyFreely

Active Member
May 2, 2018
663
2,121
I came here to see if people were talking about the 30 series. I got frustrated by the constant fiddling, auto switching to CPU, scene optimization, etc. If the 3090 can save me from most of that, I may have to plunk down the cash. I really don't have the time to spend on this sort of stuff, so if I can get more done in less time, that's a big deal for me. I avoided the first wave, because I generally do, but maybe I will be in on the next one. The card based denoising is an added bonus. If the card can do it without making things look mushy, I'm in. It's down to final render time and the difference it makes I guess.
 

I'm Not Thea Lundgren!

AKA: TotesNotThea
Donor
Jun 21, 2017
6,583
18,945
The 3000 series isn't currently officially supported by Daz (if that's what you use). There is a private beta, but no date set for public release. Having said that, by the time there is stock again, the new Daz release will probably be out too.
 

khumak

Engaged Member
Oct 2, 2017
3,629
3,661
Might also be worth waiting to see how Nvidia and/or the board partners resolve the issue of some cards crashing when the boost algorithm kicks in (or maybe it's only when overclocking, not sure). The latest I've heard is that it's probably because some board partners cheaped out a bit on the power delivery, using low-end "noisy" capacitors instead of better ones. Some partners didn't do that; in fact some went the other way and used higher-quality components than Nvidia put on the Founders Edition. Nvidia used a mix of both on the FE, and that version doesn't seem to have the issue.

Basically, the issue arises when the cards boost up to around 2 GHz or higher. I have a feeling most partners, or maybe Nvidia themselves, will just issue a firmware/driver update that scales the boost algorithm back a bit for those cards. Maybe add a -50 MHz offset or something for the affected cards.
 

SAMASH112

.
Donor
Nov 19, 2018
1,035
2,463
Yeah, the 30xx series... We'll be able to create renders if they ever let us buy one. Here I am, waiting to be able to buy... I've never had to beg a store to take my money...
 

Madmanator99

Member
May 1, 2018
225
455
From my experience, CUDA cores and GPU RAM mean faster/bigger renders, up to a point; the system CPU and RAM are important as well, as is the main drive (preferably an SSD).
So if everything else is not a bottleneck, CUDA cores are what you want.

To be clearer: a scene can take over 28 GB of system RAM/virtual memory (on a 16 GB RAM setup) yet less than 8 GB of GPU memory (thanks to NVIDIA's texture compression), so it will still render on the GPU. If your main drive is an SSD, it will be used for virtual memory and the render will start quickly. If you have the actual RAM for it (32 GB or more), it will start even sooner. But! Once it starts rendering, it's all down to your CUDA cores. The CPU and system RAM don't matter anymore. Only the CUDA cores.
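The loading logic above can be sketched roughly like this. It's only an illustration of the idea, not Iray's actual internals: the function names, the thresholds, and the `compression_ratio` are my own assumptions.

```python
# Rough sketch of the GPU-vs-CPU decision described above: a scene may need
# far more system RAM while loading than it needs VRAM once textures are
# compressed. All names and numbers here are illustrative assumptions,
# not a real renderer API.

def fits_on_gpu(scene_textures_gb: float, vram_gb: float,
                compression_ratio: float = 0.25) -> bool:
    """Return True if the compressed scene fits in GPU memory."""
    compressed_gb = scene_textures_gb * compression_ratio
    return compressed_gb < vram_gb

def pick_render_device(scene_textures_gb: float, vram_gb: float) -> str:
    # If the compressed scene fits in VRAM, render on the GPU (CUDA cores);
    # otherwise the renderer silently falls back to the much slower CPU.
    return "GPU" if fits_on_gpu(scene_textures_gb, vram_gb) else "CPU"

# A 28 GB scene compressed ~4:1 needs ~7 GB of VRAM, so an 8 GB card copes:
print(pick_render_device(28.0, 8.0))  # GPU
print(pick_render_device(28.0, 4.0))  # CPU
```

The point of the sketch is the asymmetry: system RAM/SSD only affect how fast the scene loads, while the GPU-or-CPU outcome depends on whether the compressed scene fits in VRAM.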

The architecture comes into play at that point: between NVIDIA generations, we can't directly compare the CUDA core count of a 10-series with a 20-series with a 30-series, because they differ at a very basic (chip) level. If we are to believe what they tell us, each generation is better than the one before it, yet the 1080 Ti is stronger than many 20-series cards, including my 2070 Super, so...
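A crude back-of-the-envelope comparison makes the point: peak FP32 throughput is roughly 2 ops (a fused multiply-add) per CUDA core per clock. This deliberately ignores architectural differences (RT cores, memory bandwidth, per-core efficiency), which is exactly why raw core counts alone can mislead. The specs below are the published reference numbers as I understand them; boost clocks are approximate.

```python
# Crude peak-FP32 estimate: 2 ops (FMA) per CUDA core per clock cycle.
# This ignores everything architecture-specific, which is the point:
# raw numbers don't tell the whole rendering story across generations.

def peak_fp32_tflops(cuda_cores: int, boost_clock_ghz: float) -> float:
    return 2 * cuda_cores * boost_clock_ghz / 1000.0

# Published reference specs (boost clocks approximate):
gtx_1080_ti = peak_fp32_tflops(3584, 1.582)     # ~11.3 TFLOPS
rtx_2070_super = peak_fp32_tflops(2560, 1.770)  # ~9.1 TFLOPS
print(f"1080 Ti:    {gtx_1080_ti:.1f} TFLOPS")
print(f"2070 Super: {rtx_2070_super:.1f} TFLOPS")
```

Even with the newer card's higher clock, the 1080 Ti's extra cores give it a higher raw number, which matches the experience above; whether that translates to faster Iray renders still depends on the architecture.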

Nevertheless, you don't have to overthink it. Keep things simple: more CUDA cores / a newer generation = faster renders.