GPU for Daz?

D-Droid

Newbie
Nov 3, 2018
59
33
Planning on getting a new configuration to replace my old computer soon. My question is: which would be the better choice of GPU for rendering in Daz, a GTX 1660 Ti or an Nvidia Quadro PNY K1200, at around the same price? And how good a rendering experience could I expect from either of those cards? Sorry if this is posted in the wrong part of the forum.
 

Estreiher

Newbie
Jan 16, 2020
84
94
Of those two, I'd go for the one that has more memory onboard. If possible, I wouldn't go lower than a 1070/1080 nowadays. I have a standard GeForce GTX 1080 with 8 GB of RAM, and my regular renders (full scene, 2-3 lights) take about 2 hours to complete.
 

Saki_Sliz

Well-Known Member
May 3, 2018
1,403
1,011
GTX 1660 Ti
Not even a fair comparison. The K1200 is even considered a legacy device by its own manufacturer.

The GTX 1660 Ti (by Nvidia's default specs; other manufacturers' cards tend to be faster than these numbers) has

150% the memory and 360% the memory bandwidth (6 GB at 288 GB/s vs 4 GB at 80 GB/s; click "full spec text" on the Nvidia page)
when compared to the K1200 Quadro.
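To make those percentages concrete, here is the arithmetic spelled out (a trivial sketch; the only inputs are the spec numbers quoted above):

```python
# Sanity-checking the ratios quoted above:
# GTX 1660 Ti: 6 GB VRAM at 288 GB/s; Quadro K1200: 4 GB VRAM at 80 GB/s.
gtx_1660_ti = {"vram_gb": 6, "bandwidth_gb_s": 288}
quadro_k1200 = {"vram_gb": 4, "bandwidth_gb_s": 80}

vram_ratio = gtx_1660_ti["vram_gb"] / quadro_k1200["vram_gb"]              # 1.5
bw_ratio = gtx_1660_ti["bandwidth_gb_s"] / quadro_k1200["bandwidth_gb_s"]  # 3.6

print(f"VRAM: {vram_ratio:.0%} of the K1200's, bandwidth: {bw_ratio:.0%}")
# -> VRAM: 150% of the K1200's, bandwidth: 360%
```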

Render quality will be the same for both; it shouldn't depend on the card. However, renders will finish much faster on the GTX 1660 Ti, and if it's faster, that means you can either put more things in your renders, or crank up the quality/settings for slower renders but better images.

I also recommend looking into how to control virtual VRAM. Graphics cards like these don't really have enough memory and will simply fail to render, render forever, or fall back to CPU mode (which is really slow), so often you have to use something like virtual memory to let your graphics card render an entire scene. That is assuming you are using Daz. Blender, for example, is what I use, and it virtualizes memory automatically; with Daz, however, you need a special program to do this, but I don't know what it would be called.
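As a rough way to judge whether a scene will even fit, you can ballpark the texture footprint yourself (a minimal sketch with made-up scene numbers; real usage also includes geometry, framebuffers, and render working memory, so treat this as a floor):

```python
# Rough lower-bound estimate of the texture VRAM a scene needs.
# Assumes uncompressed RGBA (4 bytes per pixel); geometry and
# framebuffers come on top, so the real figure will be higher.
def texture_bytes(width: int, height: int, bytes_per_pixel: int = 4) -> int:
    return width * height * bytes_per_pixel

# Hypothetical scene: a few characters with lots of 4K maps, plus props.
textures = [(4096, 4096)] * 40 + [(2048, 2048)] * 20
total_gb = sum(texture_bytes(w, h) for w, h in textures) / 1024**3

vram_gb = 6  # e.g. a GTX 1660 Ti
print(f"~{total_gb:.1f} GB of textures vs {vram_gb} GB of VRAM")
```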
 

Deleted member 1121028

Well-Known Member
Dec 28, 2018
1,716
3,308
Planning on getting a new configuration to replace my old computer soon. My question is: which would be the better choice of GPU for rendering in Daz, a GTX 1660 Ti or an Nvidia Quadro PNY K1200, at around the same price? And how good a rendering experience could I expect from either of those cards? Sorry if this is posted in the wrong part of the forum.
Short answer: GTX 1660 Ti, but it's not gonna perform well with Iray. If you're about to buy a card, you should aim for any RTX card with ~8 GB of VRAM. (Nearly all Iray updates done in Daz are RTX ones.)
 

wurg

Active Member
Modder
Apr 19, 2018
705
1,654
I wouldn't recommend anything less than a 2060 Super. You could get a 2070 with the same VRAM, but in the comparisons I've seen the 2060 Super is faster. Also, from what reviewers have said, the GPU die is better on the 2060 Super than on the 2070.

If you're willing to wait a little while longer, Nvidia should be coming out with new cards later this year, and you can probably save some money that way, or maybe get the new generation for the same price. If AMD comes out with comparable cards, it might drive costs down some depending on what they do; either way, I think the 20 series will drop in price when the new generation comes out. That's what I'm waiting for to upgrade.
 

KiaAzad

Member
Feb 27, 2019
291
214
As far as I know, Daz can't split a scene between multiple graphics cards, so adding a second card to speed up renders isn't an option. Try to get a card with enough RAM for the scenes you want to render, and generally more CUDA cores should mean a faster render.
 

wurg

Active Member
Modder
Apr 19, 2018
705
1,654
As far as I know, Daz can't split a scene between multiple graphics cards, so adding a second card to speed up renders isn't an option. Try to get a card with enough RAM for the scenes you want to render, and generally more CUDA cores should mean a faster render.
As far as I know it doesn't split the scene, but it will use the combined GPUs to render it, speeding up the process. You will be limited to the VRAM of the card that has the least, though; it doesn't combine the VRAM. E.g. with a 1060 and a 1070, you will be limited to 6 GB of VRAM for rendering.
 

Deleted member 1121028

Well-Known Member
Dec 28, 2018
1,716
3,308
As far as I know it doesn't split the scene, but it will use the combined GPUs to render it, speeding up the process. You will be limited to the VRAM of the card that has the least, though; it doesn't combine the VRAM. E.g. with a 1060 and a 1070, you will be limited to 6 GB of VRAM for rendering.
If your scene is 2 GB and you have a 3 GB card and an 8 GB card, it will render with both cards.
If your scene is 6 GB and you have a 3 GB card and an 8 GB card, it will render only with the 8 GB one.

The only way to share VRAM is via NVLink, but it's still limited to textures.
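That selection rule is simple enough to state in code (a minimal sketch of the behavior described above, not Iray's actual implementation):

```python
# Which GPUs participate in a render, per the rule described above:
# every card whose VRAM can hold the whole scene joins in; the rest drop out.
def cards_used(scene_gb: float, cards_gb: list[float]) -> list[float]:
    return [vram for vram in cards_gb if vram >= scene_gb]

print(cards_used(2.0, [3.0, 8.0]))  # [3.0, 8.0] -> both cards render
print(cards_used(6.0, [3.0, 8.0]))  # [8.0]      -> only the 8 GB card renders
```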

 

cooperdk

Engaged Member
Jul 23, 2017
3,504
5,163
I have a 1660 Ti and I am very happy with the render time. I don't know if getting a 2060 would be worth the extra money; it depends on lots of things. Do remember that it is possible to install an add-on so you can queue your renders.
 

khumak

Engaged Member
Oct 2, 2017
3,826
3,862
IMO the jump from a 6 GB to an 8 GB card for rendering would be quite noticeable and absolutely worth it. I would say that until you get to 8 GB, the amount of VRAM matters more than speed. I've been delaying doing much more rendering until I can upgrade as well, but I plan to wait for the 30 series cards to come out. Still kind of hoping for a 12 GB card, but I guess we'll see.

Multi-GPU really isn't worth it IMO; you're better off buying a single faster card than using two slower ones. Multi-GPU setups tend to scale very poorly (if at all). In some cases you actually get worse performance with two of the same card than you do with just one, although I think for rendering you're more likely to see about 50% scaling for the second card (and progressively less for each additional one).
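To put rough numbers on that scaling claim (illustrative only; the 50% figure is the guess from the paragraph above, not a benchmark):

```python
# Effective speedup if each extra card contributes half as much as the
# one before it (1 + 0.5 + 0.25 + ...), per the rough guess above.
def effective_speedup(num_cards: int, first_extra_scaling: float = 0.5) -> float:
    speedup, contribution = 1.0, first_extra_scaling
    for _ in range(num_cards - 1):
        speedup += contribution
        contribution /= 2
    return speedup

for n in range(1, 5):
    print(f"{n} card(s): ~{effective_speedup(n):.2f}x")
# 1 card(s): ~1.00x, 2: ~1.50x, 3: ~1.75x, 4: ~1.88x
```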

Scene Optimizer is your friend for big renders unless you have a Titan or something.
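For anyone curious what a texture-reduction pass boils down to, it's essentially downscaling the big maps before rendering. A rough standalone sketch with Pillow (the folder names are hypothetical, and Scene Optimizer itself works inside Daz and does considerably more than this):

```python
# Downscale every texture in a folder to at most 2048px on the long side.
# Smaller maps mean a much smaller VRAM footprint, at some cost in
# close-up detail.
from pathlib import Path
from PIL import Image  # pip install Pillow

MAX_SIDE = 2048
src = Path("textures_4k")        # hypothetical input folder
dst = Path("textures_reduced")   # hypothetical output folder
dst.mkdir(exist_ok=True)

for tex in src.glob("*.jpg"):
    img = Image.open(tex)
    img.thumbnail((MAX_SIDE, MAX_SIDE))  # keeps aspect ratio, only shrinks
    img.save(dst / tex.name, quality=92)
```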
 

Porcus Dev

Engaged Member
Game Developer
Oct 12, 2017
2,582
4,705
If you are not in a hurry to buy, the new 30xx series will launch in the coming months. The problem is that, if you are looking for a 3060, it's rumored this will be the last one to launch (first quarter of 2021, I think); the first ones will be the 3080 Ti and 3080. By the way, the 3080 Ti promises to be a real beast, although I guess its price will be too, lol
[Image: rumored specifications for the GeForce RTX 3080 Ti, RTX 3080, RTX 3070, and RTX 3060]

In some cases you actually get worse performance with two of the same card than you do with just one, although I think for rendering you're more likely to see about 50% scaling for the second card (and progressively less for each additional one).
Better performance with one card than with two? I don't think so. When I use the two cards I have, they render in half the time compared to when I use only one :p
 

khumak

Engaged Member
Oct 2, 2017
3,826
3,862
From all the info I've seen on both RDNA2 and Ampere, it sounds like both AMD and NVIDIA are going to be pulling out all the stops with their next-gen cards. I really think waiting for the 30 series for Daz makes a lot of sense. Turing was NVIDIA being lazy and milking a huge tech lead; Ampere will (I think) be NVIDIA showing us what they can really do when they're worried about the competition.
 

TessSadist

Well-Known Member
Donor
Game Developer
Aug 4, 2019
1,298
5,562
IMO the jump from a 6 GB to an 8 GB card for rendering would be quite noticeable and absolutely worth it. I would say that until you get to 8 GB, the amount of VRAM matters more than speed. I've been delaying doing much more rendering until I can upgrade as well, but I plan to wait for the 30 series cards to come out. Still kind of hoping for a 12 GB card, but I guess we'll see.

Multi-GPU really isn't worth it IMO; you're better off buying a single faster card than using two slower ones. Multi-GPU setups tend to scale very poorly (if at all). In some cases you actually get worse performance with two of the same card than you do with just one, although I think for rendering you're more likely to see about 50% scaling for the second card (and progressively less for each additional one).

Scene Optimizer is your friend for big renders unless you have a Titan or something.
I think my plan is to go from 6 GB to at least 8 GB when I can, as 6 was very difficult for me with my first release. I was hitting a CPU render over and over, and using Scene Optimizer to tweak down specific objects piece by piece to get just under 6 GB. That was the typical experience for a majority of the renders.
 

khumak

Engaged Member
Oct 2, 2017
3,826
3,862
I think my plan is to go from 6 GB to at least 8 GB when I can, as 6 was very difficult for me with my first release. I was hitting a CPU render over and over, and using Scene Optimizer to tweak down specific objects piece by piece to get just under 6 GB. That was the typical experience for a majority of the renders.
Yeah, that's the same issue I'm having when modding, except my card only has 4 GB, so it's even worse. I decided to put rendering on hold until I upgrade to Ampere, hopefully later this year. From the memory utilization I've seen on some of the renders I've tried, it would be nice to have a card with 16 GB of VRAM, since some of my renders have gone that high (before optimizations), but I think 8-12 GB would work for most of them. I can usually get my memory under 4 GB with Scene Optimizer, but sometimes the image degradation is what I would consider unacceptable for anything other than modding.

One little tidbit I've heard about Ampere is that it sounds like there is a new and improved form of lossless compression they're going to use, so that might help a bit. No idea if it will be backwards compatible with their older hardware. I'm still leaning towards getting the 80 Ti though, for the extra memory.

I've also heard that ray tracing is getting a massive boost with Ampere, and some of that might apply to Iray rendering. It sounds like the 60-series cards will be on par with the 2080 Ti for ray tracing (not for rasterization, though). DLSS is supposedly also going to 3.0 and won't require specific per-app coding to work; it'll just work with everything (although I'm not sure if it'll require Ampere or if it's software compatible with older cards). It would be interesting to see if DLSS gets to the point where we could render at a lower resolution and then upscale with DLSS and get good results.

Ironically, my most recent info on Ampere comes from a guy who is a bit of an AMD fanboy, and even he says that yeah, Ampere is going to be a monster.
 

D-Droid

Newbie
Nov 3, 2018
59
33
Found a great deal on an HP Omen with an i7-9750H, 16 GB of RAM, and an RTX 2070, but I'm not sure if a laptop would be a good option for rendering. Maybe someone with more experience could help me decide?
 

Deleted member 1121028

Well-Known Member
Dec 28, 2018
1,716
3,308
Found a great deal on an HP Omen with an i7-9750H, 16 GB of RAM, and an RTX 2070, but I'm not sure if a laptop would be a good option for rendering. Maybe someone with more experience could help me decide?
Would say it's quite good.

>laptop
But why do you want to render on a laptop?
When you render, your GPU is in 100% melting mode; you'll reach ~78-80°C within a minute.
Imagine hours of that, every day, on a laptop. :geek:
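If you do go the laptop route, it's worth keeping an eye on the GPU temperature during long renders. A minimal sketch that polls nvidia-smi (the query flags are standard nvidia-smi options; the 83°C warning threshold is just an example, not a spec):

```python
# Poll the GPU temperature every few seconds during a render and warn
# if it climbs into throttling territory. Stop with Ctrl+C.
import subprocess
import time

WARN_AT_C = 83  # example threshold; check your card's rated limits

while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True,
    )
    temp = int(out.stdout.strip().splitlines()[0])  # first GPU
    print(f"GPU temperature: {temp}°C"
          + ("  <-- running hot!" if temp >= WARN_AT_C else ""))
    time.sleep(5)
```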