video card for iray render info

Cohibozz

Member
Aug 14, 2018
126
27
I have a problem rendering in Iray: I have an ATI GPU, so everything renders on the CPU. For 400-500 iterations on a full 3840x2160 scene it takes 8 to 10 hours.
That's too long to release updates for a game on a reasonable schedule.

I'll have to buy an Nvidia GPU... but what should I look for in a GPU for Iray rendering? Can I buy a gaming GPU like a 1070/1080, or do I have to get a Quadro?
Is the amount of RAM (on the GPU) important?
 

W22N

Member
Jan 5, 2018
186
653
You should try Octane (at least the preview) if your current graphics card is decent; I believe they were compatible a while back.
Most people here use high-end GeForce cards, and I believe you can put the money to much better use with them. For Daz you should care about CUDA cores (they determine rendering speed) and VRAM (Daz loads the full scene into it no matter what you're doing, and even if you go with SLI, each card has to be able to hold the whole scene).
Now if you look at a 1080 Ti, it offers 11GB of VRAM and 3584 CUDA cores for 600-700 bucks, while in the Quadro range you have to look at a P6000 for a similar number of cores, and that goes for 5k retail.
If you have the money to spare, a 2080 Ti would probably be the best; if not, a 1080 Ti, then a 1080...
 

Cohibozz

Member
Aug 14, 2018
126
27
I've seen that among the cheaper GPUs with gaming performance, the 1060 is fine.
This article says it's a good compromise for saving money.

I found this on Amazon...

A 6GB card for the price of the 4GB ones...

Should I go ahead with this purchase?
 

Cohibozz

Member
Aug 14, 2018
126
27
I've looked at the Octane site, and it says it's compatible with Nvidia CUDA cards... I have an ATI card, so it's the same problem as with Daz3D for me.
 

Cohibozz

Member
Aug 14, 2018
126
27
Online I've read that with my i5-3570K I should be able to render faster on the CPU... I think I need to configure something better. If I have more than one figure, the time goes up to 10 hours for 400 iterations.
 

muttdoggy

Dogerator
Staff member
Moderator
Aug 6, 2016
7,793
44,797
I've looked at the Octane site, and it says it's compatible with Nvidia CUDA cards... I have an ATI card, so it's the same problem as with Daz3D for me.
Yup. Both need Nvidia cards. For Iray, you need at least 4GB of VRAM and the highest CUDA core count you can get for the money you're willing to spend.
If you can't get the Nvidia card, you could try LuxCoreRender. It's free.
 

Deleted member 416612

The 'landlord'
Donor
Game Developer
Feb 2, 2018
923
3,925
I have a problem rendering in Iray: I have an ATI GPU, so everything renders on the CPU. For 400-500 iterations on a full 3840x2160 scene it takes 8 to 10 hours.
That's too long to release updates for a game on a reasonable schedule.

I'll have to buy an Nvidia GPU... but what should I look for in a GPU for Iray rendering? Can I buy a gaming GPU like a 1070/1080, or do I have to get a Quadro?
Is the amount of RAM (on the GPU) important?
Just asking, but why would you render at that kind of resolution and not at, let's say, 1920x1080? I'm asking because I'm developing a game and I want to know if I missed something.
 

W22N

Member
Jan 5, 2018
186
653
Just asking, but why would you render at that kind of resolution and not at, let's say, 1920x1080? I'm asking because I'm developing a game and I want to know if I missed something.
The standard approach is to render at around 4K for a smaller number of iterations and then downscale to 1080p; crunching it down covers up the noisy pixels, making it look better than a 1080p render run for the same number of iterations. Overall it's about optimizing render times. There are some tutorials on the forum; they'll probably explain everything better than I could.
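The downscale step described above can be done with any image tool; here's a minimal sketch in Python with Pillow (the filename is a placeholder, and a blank image stands in for the actual render so the snippet is self-contained):

```python
from PIL import Image

# A 3840x2160 render crunched down to 1920x1080. Lanczos resampling
# averages the extra pixels, which is what hides the leftover noise.
src = Image.new("RGB", (3840, 2160))  # stand-in for Image.open("render_4k.png")
dst = src.resize((1920, 1080), Image.LANCZOS)
print(dst.size)  # (1920, 1080)
```

In practice you'd open the saved Iray render instead of a blank image and write the result back out with `dst.save(...)`.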
 

xht_002

Member
Sep 25, 2018
342
353
The best option would be an RTX; they outperform all the Pascal GPUs, 1080 Ti and below.

RTX is the "future" and can do real-time ray tracing in games, and the RTX Ti is faster than a $68,000 ray-tracing server.
 

Cohibozz

Member
Aug 14, 2018
126
27
The best option would be an RTX; they outperform all the Pascal GPUs, 1080 Ti and below.

RTX is the "future" and can do real-time ray tracing in games, and the RTX Ti is faster than a $68,000 ray-tracing server.
OK... I'll release my game, find patrons, and buy an RTX card! OK!!!
 

MaxCarna

Member
Game Developer
Jun 13, 2017
383
442
Put RAM size above CUDA count; if the scene doesn't fit in the GPU, you won't be able to use any of the CUDA cores.

I used to have a 4GB card, and I had great difficulty rendering any indoor scene with multiple light sources. Now with 8GB I can make almost any kind of scene; I only have problems with 8 characters in the same scene. Sometimes I have to use Scene Optimizer on those scenes to make them fit on the GPU.

You would probably have the same problem even with a 2080, which has the same size, 8GB. On the other hand, there are older Quadro cards with 24GB; they'll be slower than an RTX, but they can probably hold any size of scene. You have to consider what kind of scenes you want to do.

Just an addendum: considering that the first 1060/1050 models had only 3GB/2GB respectively, and they later released 6GB/4GB models, I still hope they will launch new RTX models with more RAM. There was a lot of speculation about 16GB models; maybe they postponed that to next year.
 

xht_002

Member
Sep 25, 2018
342
353
OK... I'll release my game, find patrons, and buy an RTX card! OK!!!
All RTX cards are under $1000; they are replacing the 1070s and 1080s. You'll have to watch the boring Nvidia presentation to find out why they are so cheap and yet faster than a full server.

They are nothing special, it's just a typical graphics card for the next gen.
 

W22N

Member
Jan 5, 2018
186
653
All RTX cards are under $1000; they are replacing the 1070s and 1080s. You'll have to watch the boring Nvidia presentation to find out why they are so cheap and yet faster than a full server.

They are nothing special, it's just a typical graphics card for the next gen.
That's an easy answer: they aren't. If you look at benchmarks, they barely have an edge over the 10xx series. By the time their new technology starts being used, you'll probably still have six months to a year of waiting for Nvidia to make drivers compatible with Iray/Octane. Paying twice as much for something you'll have to wait so long to take advantage of? Doesn't seem worth it.
 

xht_002

Member
Sep 25, 2018
342
353
That's an easy answer: they aren't. If you look at benchmarks, they barely have an edge over the 10xx series. By the time their new technology starts being used, you'll probably still have six months to a year of waiting for Nvidia to make drivers compatible with Iray/Octane. Paying twice as much for something you'll have to wait so long to take advantage of? Doesn't seem worth it.
They are 20 FPS faster on average in 4K, which makes 4K gaming on a single card a reality, as most games will run at 60 FPS.

And they will also cut development time by two-thirds when you're rendering 1,000 or more images for a game at 1080p and above.
 

Cohibozz

Member
Aug 14, 2018
126
27
OK... I bought the 1060 6GB on Amazon for 250€.

It's fantastic! I've only done one test so far, but I ran the same 400 iterations for a set:

In CPU mode: 10-12 hours
With the 1060 6GB: 10 minutes!

Awesome!
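Those numbers work out to a huge speedup; quick arithmetic on the times quoted (nothing assumed beyond the 10-12 hour and 10 minute figures):

```python
# Same 400 iterations: CPU took 10-12 hours, the 1060 took 10 minutes.
cpu_minutes = (10 * 60, 12 * 60)
gpu_minutes = 10
speedups = tuple(t // gpu_minutes for t in cpu_minutes)
print(speedups)  # (60, 72): roughly a 60-70x speedup
```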
 

W22N

Member
Jan 5, 2018
186
653
They are 20 FPS faster on average in 4K, which makes 4K gaming on a single card a reality, as most games will run at 60 FPS.

And they will also cut development time by two-thirds when you're rendering 1,000 or more images for a game at 1080p and above.

Right... Not like it's only a 30% improvement for double the price...

OK... I bought the 1060 6GB on Amazon for 250€.

It's fantastic! I've only done one test so far, but I ran the same 400 iterations for a set:

In CPU mode: 10-12 hours
With the 1060 6GB: 10 minutes!

Awesome!
That's great. The VRAM isn't important for render times, but it could be a limiting factor if you want to do big scenes; if you ever run into that, use a scene optimizer like MaxCarna suggested.
Good luck with that!
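To get a feel for why VRAM fills up so fast in big scenes, a rough back-of-the-envelope for texture memory helps (this is a generic estimate, not Iray's exact accounting; it assumes uncompressed 8-bit RGBA maps):

```python
# Uncompressed texture cost in MiB: width x height x channels x bytes/channel
def texture_mib(width, height, channels=4, bytes_per_channel=1):
    return width * height * channels * bytes_per_channel / 2**20

one_map = texture_mib(4096, 4096)
print(one_map)  # 64.0 MiB per 4K RGBA map
# A figure carrying ~15 maps that size is already near 1 GB before any
# geometry, which is why scene optimizers shrink textures first.
```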
 

Cohibozz

Member
Aug 14, 2018
126
27
Yeah, I need it for a VN game; 90% of the time I don't need big scenes. I'm happy with this for the moment.
 
Jun 29, 2018
145
132
I built my current system about 8 months ago, just before getting into 3D. Video cards were still crazy expensive then so I paid a bit less for the GTX 1060 6GB card as well.

Overall I like it but I've run into problems doing a large scene. Not a major issue but I just have to not get too crazy setting up scenes.

I am getting ready to upgrade to a GTX 1080 TI but it's getting close to Black Friday so I'm trying to wait and see if I can save a bit of money. I'm interested in the Zotac Amp Extreme card but it's about $850 now.

I'd normally expect a good deal on a previous-gen card, but with the RTX not getting rave reviews and the possible new trade tariffs being put in place (did that get approved yet?), prices may stay high.
 

OhWee

Forum Fanatic
Modder
Game Developer
Jun 17, 2017
5,883
29,894
Just a note on the RTX 20xx cards...

They DO work with Daz and have been getting very decent render times compared to their 10xx counterparts (say 50%-100% faster), BUT you need to use the latest public beta build of Daz. The 'official' version apparently doesn't support the new cards yet, but the beta version does.


Of course, at the end of the day it all comes down to budget, so the 2080 Ti might be more than you are willing to spend in the first place, then there's the question of whether getting 2 1080 Ti's is a better choice than a single 2080 Ti...

Looks like the cheapest 1080 Ti's are hovering at around $700 US at Newegg currently. So waiting for Black Friday might not be a bad strategy.
 
Jun 29, 2018
145
132
It's good to hear the RTX cards are working well with Daz 4.11. I'm not crazy about the idea of spending $1200 on a video card, but there's also the issue of them being in short supply. If they're available around Black Friday, I could be tempted. :)