Tech Talk Thread: Hardware for Rendering

Max Headroom

pauper commilito CtSH
Sep 27, 2017
1,499
7,665
I started this question about hardware in the -section and some people felt it didn't belong there.
So I collected that tech talk and am posting it here as a starter, in quotes.


Nevertheless, I've collected this talk and am posting it here for other newbies like me.
I hope you find it as useful as I did.
 

micmitja

Forum Fanatic
Aug 6, 2017
5,731
6,068
I started this question about hardware in the -section and some people felt it didn't belong there.
So I collected that tech talk and am posting it here as a starter, in quotes.
Nevertheless, I've collected this talk and am posting it here for other newbies like me.
I hope you find it as useful as I did.
You worry too much; polywog gave you a good answer to that. There's a good chance this thread will get deleted anyway. Some people are just lazy and don't want to read.
 

Max Headroom

pauper commilito CtSH
Sep 27, 2017
1,499
7,665
You worry too much; polywog gave you a good answer to that. There's a good chance this thread will get deleted anyway. Some people are just lazy and don't want to read.
Deleted? Why? Too many quotes in one post, or what?
It took me half an hour to filter them out... :eek:
 

Techn0magier

Well-Known Member
Jul 2, 2017
1,191
4,228
oO Don't worry. So let's talk tech.
Do you prefer a dedicated secondary card for rendering, or do you use the same card that drives your monitor? And since Iray supports multiple NVIDIA cards, does anyone have experience with that?
 
  • Like
Reactions: Max Headroom

Gomly1980

Forum Fanatic
Jul 4, 2017
4,479
7,087
Threads do tend to get deleted if they don't have a purpose, and while this one may serve you, it's not a thread that's needed.

"Clutter," the mods call it.

The only thing you need to know about rendering images is that the older your hardware, the slower it will be, and there's a chance you'll burn out your rig.

It's happened to folks before.
 

Max Headroom

pauper commilito CtSH
Sep 27, 2017
1,499
7,665
Threads do tend to get deleted if they don't have a purpose, and while this one may serve you, it's not a thread that's needed.

"Clutter," the mods call it.

The only thing you need to know about rendering images is that the older your hardware, the slower it will be, and there's a chance you'll burn out your rig.

It's happened to folks before.
The only thing I've learned about computers in roughly 100 years is ONE thing:

The newer the OS, the slower the machine!

New OSes are dreck: bloated with features no one really needs that just drag the system down.
Ohhh, I wish I could run DOS 6.0 on a modern machine!!! Small, tiny, clean...

But we're going off topic in the first posts...
 

Rich

Old Fart
Modder
Donor
Respected User
Game Developer
Jun 25, 2017
2,491
7,036
My setup has dual GTX 1080's. Yes, I splurged. Seriously so. But it renders really well. One of them has the monitor plugged into it, so there's a small performance degradation to the rendering, but it's not a big deal. Since the renders run in the GPU, there isn't a significant effect on the rest of the machine's performance - I can browse, etc., etc. without noticing it. But I usually run my renders in a batch overnight anyway. Does rather warm the room up, of course... LOL
 
  • Like
Reactions: Max Headroom

gerept

New Member
May 16, 2017
12
0
Well, I don't know... but I think you can pass from one program to another... in some alchemical way, of course...
 

NoesisAndNoema

Member
Game Developer
Oct 3, 2017
282
680
I use Daz3D 4.9, rendering with one "Titan X" (12GB, non-Pascal) video card. {I have a LOT of large scenes, and Daz is rather bloated when it comes time to render. Most 12GB scenes actually get reduced to about 4-8GB on the video card, but if Daz thinks the scene won't fit in 12GB, it simply falls back to CPU rendering without even trying the GPU, unless you set up a render batch.}
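The fallback behavior described above can be sketched as a toy decision function. Everything here is hypothetical (the function name, the numbers, and the exact rule are illustrations, not Daz's actual code); the point is that the check is made on the *estimated* scene size, so a scene that would compress to 4-8GB on the card can still get bounced to the CPU:

```python
# Toy sketch of a GPU-vs-CPU fallback decision based on an estimated
# scene size. Names and thresholds are hypothetical, not Daz's logic.

def pick_render_device(estimated_scene_gb: float, vram_gb: float) -> str:
    """Return 'GPU' if the scene is estimated to fit in VRAM, else 'CPU'."""
    if estimated_scene_gb <= vram_gb:
        return "GPU"
    # No attempt is made to load the scene and see if it actually fits.
    return "CPU"

print(pick_render_device(8.0, 12.0))   # fits on a 12GB Titan X -> GPU
print(pick_render_device(14.0, 12.0))  # estimate too big -> CPU fallback
```

A render batch, as mentioned, is one way around an over-pessimistic estimate.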

Time to render a 4K scene with just one model, adequate lighting, and minimal reflections and atmosphere: anywhere from 2 minutes to 24 hours. It's all about how much quality you want in the end. Poor lighting leads to grainy shadows, but rendering at 2x the size, if possible, nearly eliminates that. Well, after you do a noise reduction and then resize back to half size, which is actually 100% of what you originally wanted.
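The render-at-2x-then-downsize trick works because averaging pixels cuts noise; a minimal NumPy sketch with synthetic grain (not an actual render) shows the effect:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a render done at 2x the target resolution: a flat gray
# image plus per-pixel noise (the "grain").
big = 0.5 + rng.normal(0.0, 0.1, size=(512, 512))

# Downsample to the intended size by averaging each 2x2 block.
# Averaging 4 samples cuts the noise standard deviation roughly in half.
small = big.reshape(256, 2, 256, 2).mean(axis=(1, 3))

print(round(float(big.std()), 3), round(float(small.std()), 3))
```

A real pipeline would add a denoising pass before the resize, as described above, but the averaging alone already accounts for much of the improvement.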

I use dual monitors. (I had three, but got tired of moving my mouse across all three screens.) One uses the CPU's Intel video and the other uses the same Titan X that I render with. I would comment about rendering and "using the computer at the same time," except I don't often use the computer while rendering. Daz is the hog when rendering, not the rendering itself. (Rendering is threaded and independent of CPU actions when it happens on a GPU.)

However, Daz likes to monitor the whole process, which is what those threads are doing when rendering on the GPU. Total overkill. (There are ways to reduce that bloated overhead.)

Even when rendering on the CPU, you can simply remove one processor (virtual core) from Daz's grasp, and your computer runs nearly normally. As normally as it would run with only one core doing everything, anyway. (Computers spend 98% of the time doing nothing... fast!)
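The "remove one core from Daz's grasp" idea can be sketched with the standard-library affinity calls. These are Linux-only; on Windows (where Daz actually runs) the equivalent is Task Manager's "Set affinity" on the Daz process, or `start /affinity`:

```python
import os

# Reserve one core by pinning this process off it.
# os.sched_getaffinity / os.sched_setaffinity are Linux-only stdlib calls.
all_cpus = os.sched_getaffinity(0)            # cores this process may use
if len(all_cpus) > 1:                         # keep at least one core
    spare = max(all_cpus)                     # the core to leave free
    os.sched_setaffinity(0, all_cpus - {spare})

print(sorted(os.sched_getaffinity(0)))        # remaining allowed cores
```

Applied to a render process, this leaves the spare core free for the rest of the system, at the cost of one core's worth of render speed.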
 
  • Like
Reactions: Max Headroom

fasteradov

New Member
May 14, 2017
4
0
My setup has dual GTX 1080's. Yes, I splurged. Seriously so. But it renders really well. One of them has the monitor plugged into it, so there's a small performance degradation to the rendering, but it's not a big deal. Since the renders run in the GPU, there isn't a significant effect on the rest of the machine's performance - I can browse, etc., etc. without noticing it. But I usually run my renders in a batch overnight anyway. Does rather warm the room up, of course... LOL
People say that video cards lose their quality after several months of rendering images. Is what they say true or not? And which is better for rendering: dual GTX 1080s for $2,000, or a Ryzen TR4 (Threadripper) for $2,000? Thank you, and excuse my English, it's Google.
 

Dhal9

Newbie
Jul 13, 2017
28
28
People say that video cards lose their quality after several months of rendering images. Is what they say true or not? And which is better for rendering: dual GTX 1080s for $2,000, or a Ryzen TR4 (Threadripper) for $2,000? Thank you, and excuse my English, it's Google.
I believe you're asking whether video cards degrade over time. Technically yes, but by an imperceptibly small amount: the maximum overclock will drop a touch due to circuit degradation, but that's about 1% or less over a year or two, and most "losses" are really the card being outpaced by newer models. As for what's better, two 1080 Tis or two Vega 56/64s: rendering and compute loads are where the Vegas shine. The only problem is that you'll want either aftermarket-cooled cards or a hybrid build a la Gamers Nexus. Keep in mind your system will need a 1000W+ PSU, as these things are thirsty when overclocked under load. As for CPU rendering, I believe a high-end Threadripper will render a bit better on quality; it just depends on your program.
 
  • Like
Reactions: fasteradov

Sam

Sysadmin
Staff member
Administrator
Dec 22, 2016
2,615
18,102
Threads rarely get deleted. The only time we delete posts is when they're derailing the discussion, e.g. discussing sport in a game thread.

However, the "Programming and Development" category is a better fit for this discussion, as rendering isn't exclusively limited to BB. Moving it there.
 
  • Like
Reactions: Not Me

Rich

Old Fart
Modder
Donor
Respected User
Game Developer
Jun 25, 2017
2,491
7,036
I believe you're asking whether video cards degrade over time. Technically yes, but by an imperceptibly small amount: the maximum overclock will drop a touch due to circuit degradation, but that's about 1% or less over a year or two, and most "losses" are really the card being outpaced by newer models. As for what's better, two 1080 Tis or two Vega 56/64s: rendering and compute loads are where the Vegas shine. The only problem is that you'll want either aftermarket-cooled cards or a hybrid build a la Gamers Nexus. Keep in mind your system will need a 1000W+ PSU, as these things are thirsty when overclocked under load.
Part of doing this "properly" is making sure that your cards aren't being stressed out thermally too badly while you're rendering. When I started this, the first few times I kept a close eye on them with GPU-Z. With the on-board fans and the case cooling, they were staying well within their temperature specs. And the ones I bought were the Founders Edition from Geforce - they have a good rep for building cards that don't crap out.

But yes, GPU's used this way are current hogs, so you have to make sure you have enough power supply.

As for CPU rendering, I believe a high-end Threadripper will render a bit better on quality; it just depends on your program.
That may very well vary by program. In the case of Daz Studio, my understanding is that the Iray implementation uses the exact same algorithms whether you're rendering with the GPU or the CPU, so there should be no quality difference if the two are rendered to the same endpoint. That may not be the case with all 3D programs...
 
  • Like
Reactions: fasteradov

Max Headroom

pauper commilito CtSH
Sep 27, 2017
1,499
7,665
Again a dumb question; I'm really a noob in this matter:
I have (as I said before) an Intel Core 2 Quad Q6600 at 2.40GHz with 4GB RAM, 64-bit. If I upgrade the RAM, why shouldn't this computer be sufficient to render some pics on the CPU?
OK, the graphics card is lousy, I know, so don't talk about that.
I don't want to render a lot, only some pics to test and learn with for a start.
 

WBWB

Active Member
Jul 8, 2017
763
12,996
Haha I was already looking at GPU-Z on my 1060GTX 6 GB this morning before you wrote about it, Rich. I started a render and then took a look at the sensor info in GPU-Z. Good news: the temp slowly ramped up until it stabilized at 64C, fan speed likewise until 40%, power remained stable at 65% of TDP, and perfcap is due to VRel which is normal. Everything is nominal. I'm satisfied it's not going to burn out and now I'm not going to worry about it.
 

WBWB

Active Member
Jul 8, 2017
763
12,996
Again a dumb question; I'm really a noob in this matter:
I have (as I said before) an Intel Core 2 Quad Q6600 at 2.40GHz with 4GB RAM, 64-bit. If I upgrade the RAM, why shouldn't this computer be sufficient to render some pics on the CPU?
OK, the graphics card is lousy, I know, so don't talk about that.
I don't want to render a lot, only some pics to test and learn with for a start.
The CPU will render fine, but slowly. It's true that Iray is agnostic, meaning it's the same on GPU or CPU, so the quality of the final image should be the same, provided the max time setting in Daz is long enough for the CPU to reach the target convergence that the GPU would hit much sooner.
 

Max Headroom

pauper commilito CtSH
Sep 27, 2017
1,499
7,665
So it's ONLY a question of how FAST I want the picture, 1 hour or 1 day?
BTW, how long does a render at BB quality take on average on a faster computer (like Rich's, for example)?
 

NoesisAndNoema

Member
Game Developer
Oct 3, 2017
282
680
When it comes to Daz3D and nVidia-IRay, in relation to CPU and GPU...

For the GPU, you will hardly be "stressed out." My card barely runs above 50% of its rated wattage, even overclocked, while processing renders. Iray uses just the CUDA cores, which is what makes it faster than CPU rendering. (Graphics cards only get fully taxed when playing games, because games use EVERY component in the GPU for DirectX processing; in games, CUDA is mostly used for PhysX processing, making 3D things collide. When rendering rays, the card is calculating where beams land on a surface, not just lighting up a pre-baked graphic the way video games do as a high-FPS speed trick.)

When it comes to CPU rendering, having insufficient RAM will result in a failure to render and Daz3D crashing. (You'll only ever see this if a single item is larger than your maximum available RAM, or you have hit your virtual memory limit. Keep in mind that 64-bit builds tend to consume noticeably more memory than 32-bit ones for the same scene, since pointers and data structures are larger.)

For CPU rendering, the number of cores/threads is the largest determining factor in render speed. (Parallel processing.)

A 1GHz CPU with 32 cores is roughly 10x faster than a 3GHz CPU with only one core for rendering (32 cores × 1GHz versus 1 core × 3GHz of total throughput). The math is complex, but not that complex... It is just long math, which finishes faster than the time it takes to read the data from RAM. This is ALSO why GPUs running GDDR5 beat regulated, core-speed-locked system RAM running at slower DDR3 speeds.
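The cores-times-clock comparison above can be sanity-checked with a quick calculation. This is an idealized model (a hypothetical function, assuming rendering scales perfectly with total core-GHz and ignoring memory bottlenecks); real scaling lands somewhat below it:

```python
def relative_render_throughput(cores: int, ghz: float) -> float:
    # Idealized: an embarrassingly parallel renderer scales with
    # total core-GHz. Real-world scaling is a bit below this.
    return cores * ghz

many_slow = relative_render_throughput(32, 1.0)  # 32 cores @ 1 GHz
one_fast = relative_render_throughput(1, 3.0)    # 1 core @ 3 GHz

print(many_slow / one_fast)  # roughly 10.7x, not 32x
```

The same model explains why thousands of CUDA cores, even at lower clocks, dwarf any consumer CPU for this workload.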

When it comes to degradation, your CPU will die about 75% faster, even with the best cooling, compared to the comparatively untaxed GPU when rendering. Your CPU will run at 100%, throttling itself to sustain safe levels of heat dissipation, whether you have 1 core or 60.

By the way, a 60-core processor doesn't come close to matching the 3000+ CUDA cores in one Titan X for rendering speed.

However, if you render with something other than NVIDIA Iray... the tables may turn. Iray is NOT the only rendering engine out there; it is just what Daz3D bundles for free alongside the older engine, 3Delight, which is still included with Daz3D. 3Delight is CPU-only; it doesn't use CUDA cores.

Plus, Titan Xs are a LOT cheaper than 60-core CPUs and the motherboards needed to run them.

NOTE: The Titan X is the only 12GB video card (of its era) whose memory is one solid chunk. The other 12GB cards are actually dual-GPU: (6GB for one core) + (6GB for the second core). Thus, they have a 6GB scene limit for GPU rendering; they canNOT share that memory as one large chunk. And, stupidly, nVidia refuses to simply use system RAM as a swap unless you pay big bucks for them to release that forced limitation. (They want you to buy the super-expensive 24GB cards instead.)

nVidia Tesla P40 24GB video-cards... $11,000.00+
It only has 3,840 CUDA cores, not much more than the Titan X, which sells used for about $400.00 each. (That $11,000 would buy about 27 Titan Xs, roughly 81,000 CUDA cores, for the same price as one Tesla P40 with only 3,840 CUDA cores.)
 
Last edited: