What is the best graphics card for renders?

bcsjkdfjksh

Newbie
Mar 22, 2019
66
21
What is the best graphics card for renders? Specifically for DAZ with an NVIDIA card. If you could also factor in cost efficiency, that would be cool too.
 

megaplayboy10k

Well-Known Member
Apr 16, 2018
1,523
2,028
On a budget, probably a 2060 or a 2070 Super; it has to have ray-tracing capability.
The 2080 Super is a great buy: pricey, but not ridiculously so like the 2080 Ti.

By the time you're spending thousands on a Titan RTX, you might as well buy a Quadro.

Buy extra RAM too.
 
Jul 1, 2018
57
985
You're best off looking at the Daz3d forums for this kind of information, this thread in particular.

Don't get a Quadro. They're great cards for CAD-type rendering and insanely massive scenes, but they're some of the worst dollar-to-performance you can get if you're only going to be using Daz3D. If you want to focus on cost efficiency, you're better off buying multiple 2070 Supers or 2080 Tis plus an NVLink bridge so you can pool the VRAM. Daz only supports texture VRAM pooling over NVLink right now, but soon it'll support geometry too; Blender now supports NVLink as well, which is a bonus, and many more programs either do or soon will. The absolute best dollar-to-performance is the 2070 Super, which is why I went with it, and it's the minimum card you need for NVLink: two of them give you 16GB of VRAM and 5120 CUDA cores. The 2060 Super and below don't have NVLink capability.
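Quick back-of-envelope in Python if anyone wants to sanity-check the math. The VRAM and CUDA core counts are the published specs for each card; the USD prices are rough street prices from memory, so treat them as illustrative only. It assumes cores always add up across cards, while VRAM only adds up with NVLink pooling (without a bridge, each card holds its own full copy of the scene):

```python
# Rough dollar-to-performance comparison for the cards discussed.
# Specs are the published VRAM / CUDA core counts; prices are rough
# street prices and only illustrative.

CARDS = {
    "RTX 2070 Super": {"vram_gb": 8,  "cuda_cores": 2560, "usd": 500},
    "RTX 2080 Ti":    {"vram_gb": 11, "cuda_cores": 4352, "usd": 1200},
    "Titan RTX":      {"vram_gb": 24, "cuda_cores": 4608, "usd": 2500},
}

def multi_gpu(model, count=2, nvlink=True):
    """Totals for `count` identical cards. Cores add up across cards;
    VRAM only pools with an NVLink bridge (and engine support)."""
    c = CARDS[model]
    vram = c["vram_gb"] * count if nvlink else c["vram_gb"]
    return {"vram_gb": vram,
            "cuda_cores": c["cuda_cores"] * count,
            "usd": c["usd"] * count}

print(multi_gpu("RTX 2070 Super"))
# -> {'vram_gb': 16, 'cuda_cores': 5120, 'usd': 1000}
```

Two pooled 2070 Supers beat a single 2080 Ti on both VRAM and core count, for a fraction of Titan money.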

The new RTX 30xx series is rumored to be coming out very soon, with some saying the 3090 (maybe named something else on release) will be a beast: 24GB VRAM and 5248 CUDA cores for $1400, basically a new Titan. But the actual confirmed specs could change at release for all we know.
 

Fliptoynk

Member
Nov 9, 2018
384
324
These high-end video cards mentioned here are really expensive! I'm no expert in 3D, just curious: do you really need ultra-high-poly models in your animations? I mean, there could be ways to achieve close-to-realism quality on an average-Joe card, even with just a high-poly face + mid-poly body + 2D background, if you use just the right combination of low/med/high lighting, shaders, filters, ambient occlusion, post-processing, anti-aliasing, and ray tracing.
 
Jul 1, 2018
57
985
Whether they're "needed" depends on many things. A lower-priced GPU means less VRAM, so you need more compositing because you can only render so much at once; lower texture and final render resolutions if you want certain scenes to fit on your GPU; and slower overall render times, which may not be such a bad thing if you're only doing a few images for yourself, but is a HUGE thing if you're trying to produce comics, games or animations. Also, as you mentioned, a lot more fucking around with settings to try to find a compromise, instead of having the freedom to spend that time actually creating what you want.
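To put some rough numbers on the VRAM side, here's a sketch assuming uncompressed textures; real usage varies with texture compression, geometry, and the engine's own overhead:

```python
# Rough VRAM estimate for uncompressed textures. The 4/3 factor
# approximates a full mipmap chain; actual usage depends on
# compression and the render engine's overhead.

def texture_mb(width, height, channels=4, bytes_per_channel=1,
               mipmaps=True):
    size = width * height * channels * bytes_per_channel
    if mipmaps:
        size = size * 4 // 3
    return size / 2**20  # bytes -> MiB

# One Daz character can easily carry a dozen 4K maps (diffuse,
# normal, specular... split across face, torso and limbs):
per_map = texture_mb(4096, 4096)
print(f"{per_map:.0f} MB per 4K map, "
      f"{12 * per_map / 1024:.1f} GB for 12 of them")
# -> 85 MB per 4K map, 1.0 GB for 12 of them
```

A couple of characters plus environment maps and the geometry itself, and a 4GB card is already forcing you to render in layers.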

I was using a GTX 1050 for a while. I'm pretty decent with Photoshop and video editing software, so I could make any image or animation I wanted as long as I planned around the 4GB VRAM, slower render times and so on, but the jump to a 2070 Super was like night and day in every area. It's not just about faster renders and bigger scenes; you also save a lot of time on test renders to check that your lighting is right and everything else is in place, and that time can accumulate to a massive amount on a lower-end GPU. The saved time alone lets me render more, which has improved my skills and render quality, since I'm now spending 90% of my available time on the creative side rather than the 50%+ I used to spend on optimizing and workarounds because of my GPU.
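Just to show how that test-render time compounds (the per-render timings below are made-up placeholders, not benchmarks of either card):

```python
# Illustrative only: made-up per-test-render timings, not benchmarks.
test_renders_per_scene = 15
scenes_per_month = 20

for card, mins_per_test in [("GTX 1050", 10), ("RTX 2070 Super", 2)]:
    hours = test_renders_per_scene * scenes_per_month * mins_per_test / 60
    print(f"{card}: {hours:.0f} hours/month just on test renders")
# -> GTX 1050: 50 hours/month just on test renders
# -> RTX 2070 Super: 10 hours/month just on test renders
```

Even with generous assumptions, that's a full work-week of waiting recovered every month.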

So to answer your question: whether you "need" it depends on your 3D style (realistic high detail vs. cel-shaded or similar), your output volume (personal hobby vs. making games, comics or animations for paying supporters), your target quality and fidelity, how much time you have, and how well you know the 3D programs you're using (whether you know what all the settings do and can change them on the fly to optimize). If I were still only making comics I could still get away with the old GTX 1050 (while pulling my hair out and eventually giving myself a coronary), but now that I'm making a game packed with animations, there's no way I'd do it on the old GTX; it'd just take way too long for the volume I'm doing.
 
  • Like
Reactions: bcsjkdfjksh

Fliptoynk

Member
Nov 9, 2018
384
324
Yeah... and then there's the time pressure of script debugging, revising plot lines (if needed), and releasing updates.

Anyway, have you ever thought of using avatarsdk instead of buying Daz3D assets? Cost cutting also helps.
Imagine a Liana Liberato look-alike in your game, of course... mmmmmm
 

huliolopez

Newbie
Apr 22, 2019
18
8
I'm using a 2070 Super. Some rendering engines benefit from RTX and, for example, I don't see many people using AMD cards with V-Ray. But for use with DAZ, just buy what you can afford; a couple of minutes in render time isn't much of a difference when you spend hours setting up a scene.
 

DS23G

Member
Game Developer
Jul 24, 2019
202
758
The new RTX 30xx series is rumored to be coming out very soon, with some saying the 3090 (maybe named something else on release) will be a beast: 24GB VRAM and 5248 CUDA cores for $1400, basically a new Titan. But the actual confirmed specs could change at release for all we know.
I'm still very sceptical about that supposed $1400 price point for the 3090. I mean, according to some leaks, it may be significantly faster than the Titan RTX while having the same amount of VRAM. And yet it'll cost about half of what a Titan costs right now? We're talking about Nvidia here; consumer-friendly pricing isn't exactly their strong point.
 

Synx

Member
Jul 30, 2018
488
469
I'm using a 2070 Super. Some rendering engines benefit from RTX and, for example, I don't see many people using AMD cards with V-Ray. But for use with DAZ, just buy what you can afford; a couple of minutes in render time isn't much of a difference when you spend hours setting up a scene.
V-Ray's GPU renderer is built on Nvidia's CUDA and doesn't work with AMD cards, and Daz's Iray engine is Nvidia-only too; that's why nobody uses AMD cards for this kind of rendering :p. Rendering on AMD in general isn't amazing; even their own render engine (ProRender) apparently works better with an Nvidia card.
 
  • Like
Reactions: Domiek
Jul 1, 2018
57
985
I'm still very sceptical about that supposed $1400 price point for the 3090. I mean, according to some leaks, it may be significantly faster than the Titan RTX while having the same amount of VRAM. And yet it'll cost about half of what a Titan costs right now? We're talking about Nvidia here; consumer-friendly pricing isn't exactly their strong point.
Yeah, that's why I said "rumored", "some saying", and "could change at release for all we know". I'm not convinced either, nor should anyone be until the actual release :)
 
  • Like
Reactions: DS23G

DS23G

Member
Game Developer
Jul 24, 2019
202
758
I've read that the 3090 is priced about the same as the Titan RTX but is around 40% faster.
Yeah, that "40% faster" stat is also what I've read, that's why the 1400 bucks price point that's been floating around seems too fantastical. The again, the 3090 seems to be in a really weird place, in that it's kinda taking the place of both the xx80ti and titan series, so in that regard the price probably makes sense. The people in the market for a high-end gaming gpu now have to pay alot more compared to former gen cards, while the people in the market for a titan gpu (which ain't that many, in comparison) probably have gotten lucky for once. Of course, that's all speculation, but I want to believe.
 

I'm Not Thea Lundgren!

AKA: TotesNotThea
Donor
Jun 21, 2017
6,583
18,945
Yeah, that "40% faster" stat is also what I've read, that's why the 1400 bucks price point that's been floating around seems too fantastical. The again, the 3090 seems to be in a really weird place, in that it's kinda taking the place of both the xx80ti and titan series, so in that regard the price probably makes sense. The people in the market for a high-end gaming gpu now have to pay alot more compared to former gen cards, while the people in the market for a titan gpu (which ain't that many, in comparison) probably have gotten lucky for once. Of course, that's all speculation, but I want to believe.
I really don't see them selling the 3090 for what is effectively the same price as the 2080 Ti. That would be a massive loss for Nvidia: they would have to cut the price of existing Titan stock, and all of the vendors that have paid to sell Titans would take a massive loss. It's a PR nightmare waiting to happen.
 
  • Like
Reactions: DS23G

DS23G

Member
Game Developer
Jul 24, 2019
202
758
I really don't see them selling the 3090 for what is effectively the same price as the 2080 Ti. That would be a massive loss for Nvidia: they would have to cut the price of existing Titan stock, and all of the vendors that have paid to sell Titans would take a massive loss. It's a PR nightmare waiting to happen.
Jesus, I wasn't even aware that the 2080 Ti is still that expensive. I have a 1080 Ti, so I never really looked too much into the 2080 Ti because it just didn't seem worth it. But again, that tells you how fantastical that $1400 pricing would be. I still want to believe, but it just seems very un-Nvidia.
 
Jul 1, 2018
57
985
Even if the 3090 does end up being $1400, I still won't consider it. As I already own a 2070 Super, the sweet spot for me is buying a second one with an NVLink bridge: that's 16GB VRAM, 5GB more than a 2080 Ti, plus slightly faster render times, and all up it's still cheaper than a single 2080 Ti. I've only gone over the 2070's 8GB a few times in Daz, but Blender is more of a VRAM hog (though much faster to render with E-Cycles), so the extra VRAM will be great there. I definitely don't need 24GB of VRAM, and the 2070 Super is plenty fast as is for my needs.
 

khumak

Engaged Member
Oct 2, 2017
3,629
3,661
I found both good and bad news in the announcement. The bad news is that they seem to have lowballed the VRAM on the 3080. The good news is that they basically cut the price of the Titan tier by $1000. I was hoping for a 3080 with 16GB of VRAM; I might actually spring for the 3090 instead, though.

I find that my most demanding renders need 12-16GB of memory, so unless the memory compression feature I've heard rumors about for the 30 series both exists and is REALLY good, I suspect I'll find myself wanting more memory if I get the 3080.

I've heard rumors about doubled-memory variants of the 3070 and 3080 as well, but there was no mention of that today. A 16GB 3070 or a 20GB 3080 would be VERY appealing for rendering IMO.
 
  • Like
Reactions: bcsjkdfjksh