Tech Talk Thread: Hardware for Rendering

Max Headroom

pauper commilito CtSH
Sep 27, 2017
1,499
7,665
So far so good. I'm configuring my new machine and decided to use a GTX 1080, but which brand? Any suggestions? There are so many different types. I thought about AMD or MSI, they were good brands in the past, but there are still a lot of models to choose from.
 

NoesisAndNoema

Member
Game Developer
Oct 3, 2017
282
680
I use a RAM-drive for my internet browser cache, for the ONE game I play (whichever is the current speed-hog), and for my art program's temp memory. (Otherwise it does its horrible UNDO storage on the slow HD.)

For Daz, I can't imagine any real gains, unless you have a small library of objects. Even then, that would only speed up how long it takes Daz to load the scene, a little. Daz has huge database overhead and horrible pre-processing, which is the real time-killer.

For rendering, it would not speed things up at all. Daz spoon-feeds images to the video card. If it simply told the video card where the image was on the drive, and then let the card load it, we would see gains. However, Daz reads the images from RAM/swap once they are loaded. Then, when you hit render, they go through a "limits and quality" redraw. After that, Daz feeds one image at a time to the video card, waiting for it to reply with an "Okay, it fit" before feeding it another one. (It also checks remaining space, because nVidia reformats the image into something it can use, similar to a bitmap. PNG and JPG are not formats actually used internally, since they have to be decoded to produce an image, and that is too slow for multi-pass layered rendering of shaders.)
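
To make the "spoon-feeding" idea concrete, here is a toy sketch in Python. It is purely my own illustration of the behavior described above; the texture names, sizes, and the one-at-a-time handshake are assumptions, not Daz or Iray code:

```python
# Toy model of a sequential "feed one texture, wait for the card to confirm it fit"
# upload loop, as described above. NOT Daz/Iray code -- just an illustration.

VRAM_BYTES = 12 * 1024**3      # assume a 12 GB card
BYTES_PER_PIXEL = 4            # images get expanded to a raw, bitmap-like format

# Hypothetical scene textures: (name, width, height)
textures = [
    ("skin_diffuse", 4096, 4096),
    ("skin_normal",  4096, 4096),
    ("environment",  8192, 4096),
]

def upload_sequentially(textures, vram_bytes=VRAM_BYTES):
    """Feed textures one at a time, checking remaining space before each upload."""
    used = 0
    for name, w, h in textures:
        raw_size = w * h * BYTES_PER_PIXEL  # PNG/JPG size is irrelevant; the raw size is what counts
        if used + raw_size > vram_bytes:
            print(f"{name}: does not fit ({vram_bytes - used} bytes left) -> drop to CPU rendering")
            return False
        used += raw_size                    # "Okay, it fit" -- move on to the next one
        print(f"{name}: uploaded {raw_size / 2**20:.0f} MiB, {used / 2**20:.0f} MiB total")
    return True

upload_sequentially(textures)
```

The point is just that each image costs its decompressed size on the card, and the one-at-a-time handshake is part of why feeding a big scene to the GPU is slow.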

P.S. There is a trick to rendering scenes faster in Daz too...
1: Render something once, and cancel it... BUT... Leave the rendered image up.
2: Make your changes to the scene, as desired...
3: Render again... (The scene is still loaded on the video card. It should just start rendering with the new settings.)

Not sure if that works when you add things to a scene. (It may just push the new items, or it may decide to reload the whole thing again.) However, keeping that first rendered image up keeps those things in memory. It assumes you will be rendering the scene again, with changes... like an animation or just another "tweak". If you close that render window, it assumes you are just "done", and it unloads the card memory for whatever "new" thing you are about to feed it.
 

NoesisAndNoema

Member
Game Developer
Oct 3, 2017
282
680
So far so good. I'm configuring my new machine and decided to use a GTX 1080, but which brand?
I would suggest looking at actual benchmark scores (GPU marks, CUDA-related ones especially), as well as user reviews of "troubles" with various brands.

Honestly, a potato is a potato, unless it is sliced into strips, sheets, shredded or diced into small blocks... Just don't forget to add salt and ketchup.

Seriously though, the only variations are the RAM type/speed, clock speeds, and cooling styles.

Don't overclock; it doesn't bring any real gains for rendering. (It is sort-of critical for FPS games.)
RAM type/speed is marginally significant for renders. (Critical for FPS games.)
Cooling style is not really important for CUDA rendering. (All of them are sufficient; noise will be your only issue. Blowers are noisy and hard to clean. Open-air fans are quiet and tend to collect less dirt; they have lower static pressure and usually a wider cooling surface area. However, they heat up your computer-case internals, due to poor venting.)

As for the rest... they all have to follow the nVidia specs to run. Most are made in the same exact place and just re-branded. The only things the vendors can actually change are the quality of the other components, and the mountings. Paying 50% more for components that are 10% better than spec, and which will outlive the GPU itself anyway, is not worth the added cost: the card does the same work at the end of the day, and what will kill it is natural chip degradation after running for x hours at constant load.

Also, the looks of the card and the colorful packaging should not count... or the marketing... :p

At the end of the day, it is all about the numbers... The numbers of the output, in relation to the cost of the card.

Life-span: my "coin miners" ran at 100%, 24/7, for three solid years, and never died or degraded with any noticeable "benchmarked decay" or "visual decay". They were various brands, but each card, even among the same brand, ran at its own unique speed and with its own unique limits. All were overclocked to the maximum and ran at full heat-throttle. (MSI, ZOTAC, ASUS, EVGA, and GIGABYTE.) * I mined various alt-coins, not actual bitcoins. I had 1.6-terahash miners for bitcoins.

Now I don't have a use for my 1600-watt PSUs... Sold all the cards. :p
 
Last edited:

OhWee

Forum Fanatic
Modder
Game Developer
Jun 17, 2017
5,731
29,111
OK, so I tried something with Daz Studio. I assigned the temporary files, DSON Cache Files, and CMS Cluster Directory (Edit>Preferences) to a Ramdisk. It seems to be helping, but I'm not quite sure.

What I think I'm seeing is faster browsing through the library/content/product folders, and I think that my project files are loading a bit faster (the .duf files are still on my SSD, but perhaps it's finding the locations of the various parts a bit faster). Essentially the 'scene' seems to appear in the viewport a bit more quickly once it is fully loaded, but that may just be my imagination.

The CMS Directory for my install was 1.27 GB, the two other directories were much smaller.
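
If you want to check how big your own cache folders are before picking a ramdisk size, a quick Python sketch like this works. (The paths are only examples of where these directories often live on Windows; point them at your actual temp, DSON cache, and CMS directories.)

```python
import os

# Example paths only -- replace with your own temp / DSON cache / CMS cluster directories.
CACHE_DIRS = [
    r"C:\Users\YourName\AppData\Roaming\DAZ 3D\dson\cache",
    r"C:\Users\YourName\AppData\Roaming\DAZ 3D\cms",
]

def dir_size_bytes(path):
    """Walk a directory tree and sum the size of every file in it."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            try:
                total += os.path.getsize(os.path.join(root, name))
            except OSError:
                pass  # skip files that vanish or can't be read
    return total

for d in CACHE_DIRS:
    print(f"{d}: {dir_size_bytes(d) / 2**30:.2f} GiB")
```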

Anyways, if you are bored, might be worth a look.

Also, IMDisk is available for free via SourceForge, and supposedly doesn't cap the ramdisk size.

I'm working with the 4 GB Radeon RAMDisk at the moment, so I haven't played with IMDisk as of yet.

Something else that I finally did was to 'map out' the My Library folder into categories. This sorts the My Library folder items, similar to how the default folder is sorted under Categories, under its own heading, making the stuff that you load into My Library (as opposed to My Daz 3D Library) easier to find.

This is actually pretty easy to do if you haven't done it already.
Right Click on the My Library Folder,
then choose Create Category from > Selected Folder and Subfolders.

It'll take a few minutes to do the sort, and Daz may seem to freeze while it's doing this, but once it's done, voila!
 
Last edited:

Max Headroom

pauper commilito CtSH
Sep 27, 2017
1,499
7,665
@NoesisAndNoema and all others: I'm very thankful for your tips, and maybe you read that I've been out of that business for over 10 years. I'm like a newborn calf at the moment and have to learn a lot. I'm just overwhelmed by this mass of information and opportunities.
I looked at @Cirro84's links and most of the time it looks like Chinese to me. :D

I think it will take days or weeks to decide which card, then the MB, GPU and all the other stuff... x'D
 
Last edited:
  • Like
Reactions: Cirro84

bacchusplateau

Well-Known Member
May 23, 2017
1,183
1,089
@NoesisAndNoema and all others: I'm very thankful for your tips, and maybe you read that I've been out of that business for over 10 years. I'm like a newborn calf at the moment and have to learn a lot. I'm just overwhelmed by this mass of information and opportunities.
I looked at @Cirro84's links and most of the time it looks like Chinese to me. :D

I think it will take days or weeks to decide which card, then the MB, GPU and all the other stuff... x'D
Been out of the loop for 10+ years?

Your avatar and name say it all, my friend. I'm sure you were around when phones were screwed onto the wall and you could only go as far as the cord would let you. ;)

Prove the world wrong and teach this old dog some new tricks!
 

Max Headroom

pauper commilito CtSH
Sep 27, 2017
1,499
7,665
I'm sure you were around when phones were screwed onto the wall and you could only go as far as the cord would let you.
No phone... drums!:closedeyesmile:
@bacchusplateau but your avatar is much older. What should I think about you? ;)
Yes, I bought my currently running computer about 10 years ago and stopped dealing with this stuff after that. So my knowledge is slightly "out of date". :closedeyesmile:
 

NoesisAndNoema

Member
Game Developer
Oct 3, 2017
282
680
Just google "CUDA cores", and the top (reasonably-affordable), video-cards you are thinking about...

1: That is the bottom line for speed {with Iray}. (Every second counts, when seconds turn into hours/days/weeks.)
2: RAM is only the bottom line for scene complexity. (There are ways to make scenes less top-heavy, and in a pinch you can render things individually. Thus the primary importance of #1. A rough VRAM sketch follows below.)
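
As a rough way to see where scene complexity eats VRAM, here is a small back-of-the-envelope sketch of my own (the texture counts are made up, and it uses the "everything becomes a raw bitmap" assumption from earlier in the thread):

```python
# Rough VRAM estimate for textures once they are expanded to raw bitmaps on the card.
BYTES_PER_PIXEL = 4  # RGBA, 8 bits per channel

def texture_mib(width, height):
    return width * height * BYTES_PER_PIXEL / 2**20

# Made-up example: one character with a handful of 4K maps plus an 8K environment.
scene = {
    "4K character maps (x6)": 6 * texture_mib(4096, 4096),
    "8K HDRI environment":    texture_mib(8192, 4096),
}

for name, mib in scene.items():
    print(f"{name}: {mib:.0f} MiB")
print(f"Total textures: {sum(scene.values()) / 1024:.2f} GiB (before geometry and working buffers)")
```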


 
Last edited:

OhWee

Forum Fanatic
Modder
Game Developer
Jun 17, 2017
5,731
29,111
So I switched over to IMDisk, and am currently running with an 8 GB drive. Seems to be going well so far.

As I said in my previous post, I switched the Daz Studio caches over to the ramdisk. I think it's helping more than a bit with product browsing response times. And of course my internet temp files are now on the ramdisk.
 

OhWee

Forum Fanatic
Modder
Game Developer
Jun 17, 2017
5,731
29,111
Have you tried moving the actual database itself?
I assigned the temporary files, DSON Cache Files, and CMS Cluster Directory (Edit>Preferences) to a Ramdisk.

Those are the only ones I saw in the Preferences section.
 

Max Headroom

pauper commilito CtSH
Sep 27, 2017
1,499
7,665
I'm still struggling with the graphics card:
sure, a 1080, but Ti or not? Is the Ti worth the higher price (~350 bucks more)?

Has somebody compared both head to head?
 

OhWee

Forum Fanatic
Modder
Game Developer
Jun 17, 2017
5,731
29,111
I'm still struggling with the graphics card:
sure, a 1080, but Ti or not? Is the Ti worth the higher price (~350 bucks more)?

Has somebody compared both head to head?
Jumping from a 1080 to a 1080 Ti? Yes. The increase in performance is significant, and you'll get the extra VRAM too. Looks like they are starting around $750 on Newegg, and I spotted this liquid/air hybrid (liquid cooling for the GPU, air cooling for the VRAM) for $799.

You probably don't want to overclock, since rendering tends to punish GPUs. Keep that in mind when picking the card (i.e. you might not want to pay extra for an overclocked version).

The 1080s aren't too shabby, but if you are serious about doing this rendering thing, you might as well get the better card now. The Titan Xps are another story - almost double the price for only single-digit gains in performance is probably not worth it...
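
If you want to put rough numbers on that trade-off, a tiny sketch like this helps. Treat it as a template: only the prices follow this thread (the ~$400 for a plain 1080 is inferred from the ~$350 gap mentioned above), and the scores are placeholders to be filled in from whatever Iray benchmark you trust.

```python
# Price/performance template. Prices follow the figures discussed in this thread;
# the "score" values are PLACEHOLDERS -- replace them with real benchmark numbers.
cards = {
    #              (price_usd, benchmark_score)
    "GTX 1080":    (400,  1.00),   # placeholder score
    "GTX 1080 Ti": (750,  1.00),   # placeholder -- fill in
    "Titan Xp":    (1400, 1.00),   # placeholder -- fill in
}

for name, (price, score) in sorted(cards.items(), key=lambda kv: kv[1][1] / kv[1][0], reverse=True):
    print(f"{name}: {score / price * 1000:.2f} benchmark points per $1000")
```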
 
Last edited:
  • Like
Reactions: Max Headroom

Cirro84

Resident Evil-doer, part-time Candyman
Dec 24, 2016
1,435
1,461
... new minimal specs for the dForce Simulation of DAZ 4.10?
According to links:
...at least a 'Fermi' GPU (GeForce 400/500) is necessary, or newer chips ('Kepler', GeForce 600-800), plus newer / newest drivers (if possible).

The rest should still be 'common specs' for DAZ.
 
  • Like
Reactions: Techn0magier

NoesisAndNoema

Member
Game Developer
Oct 3, 2017
282
680
When it comes to card selection... I went back to the Daz3D forums for the benchmarks.


The 1080 Ti is just a fraction faster than the Titan-X Maxwell. Faster times were only seen with crazy CPU core speeds and core counts. (Some people had 6-, 8-, 12-, 20- and 44-core consumer/server CPUs for the benchmark tests.)

My simple 4-core (8-thread) CPU renders the benchmark scene in 5 minutes 6.73 seconds,
while my Titan-X Maxwell renders it in 2 minutes 49.69 seconds (without overclocking the card), or 1 minute 25.2 seconds when pushed to thermal max.

However, in real scenes, my card is about 5x to 12x faster than my CPU for render-times. (Not that half-assed benchmark demo of old Daz3D crap and shaders that no-one actually uses anymore.)

The Titan-X Pascal was actually slower than both of those, at around the same speed as a 1070 or 1080 GTX.

Ultimately, the 1080 Ti gains its speed over the Titan-X only from higher clock rates, a few more CUDA cores, and higher voltage settings. I could push my Titan-X to the exact same render times with a simple overclocking tweak. However, by default and by design, the 1080 Ti also consumes about 30-50 more watts per render, with only a ~2% gain in speed. (That is 210 watts for the Titan-X Maxwell versus 240-260 watts when rendering with a 1080 Ti, measured at the wall, not after the PSU converts it to 12 V.)
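
Using the wall-power figures above, here is a quick sketch of what the extra watts cost per render. The render time and electricity rate are my own example assumptions, not measurements from this thread.

```python
# Energy-per-render comparison using the ~210 W vs ~250 W wall figures quoted above.
# Render time and electricity price are EXAMPLE assumptions.
RENDER_HOURS = 2.0       # assume a 2-hour render
PRICE_PER_KWH = 0.12     # assumed $/kWh

def cost_per_render(watts, hours=RENDER_HOURS, rate=PRICE_PER_KWH):
    kwh = watts * hours / 1000
    return kwh, kwh * rate

for card, watts in [("Titan-X Maxwell", 210), ("1080 Ti", 250)]:
    kwh, usd = cost_per_render(watts)
    print(f"{card}: {kwh:.2f} kWh per render (~${usd:.3f})")
```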

Also, a Titan-X Maxwell with 12GB of VRAM can be found for $500, while 1080 Ti cards are still around $800 and have only 11GB of VRAM.

Rendering 1000+ HD images, that translates into paying 20-35% more on your electric bill, which will hit $300+ a month just for rendering. So, both long-term and on initial purchase, including the hardware needed to run the cards in volume... the Titan-X Maxwell is still the better all-around choice, until they actually make a new generation that offers the same CUDA core count with less power consumption, for a reasonable price.

People seem to forget, or ignore, that a 1200-watt PSU operating at capacity actually pulls around 1400 watts at the wall. Running batches of renders, especially for video, sustains a constant load. Not to mention the fact that your home AC must now remove 1400 watts of heat from the air. That is 1.4 kWh every hour, plus whatever it costs your AC to run - call it 4.4 kWh in all once the AC has to remove the 1400 watts of generated heat, plus your body heat and the day's heat. (That is actually a savings in winter, sort of... like running a 1400-watt electric heater.)
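
For the wall-draw math, here is a minimal sketch. The 1400 W sustained figure is the one from the post; the electricity rate and the AC efficiency (COP) are my own assumptions, so plug in local values.

```python
# Monthly electricity estimate for a sustained 1400 W wall draw, plus the AC energy
# needed to pump that heat back out of the room. Rate and COP are assumptions.
WALL_WATTS = 1400
HOURS_PER_MONTH = 24 * 30
PRICE_PER_KWH = 0.12     # assumed $/kWh
AC_COP = 3.0             # assumed: the AC moves ~3 W of heat per 1 W of electricity

render_kwh = WALL_WATTS * HOURS_PER_MONTH / 1000   # energy the render box draws
ac_kwh = render_kwh / AC_COP                       # extra energy the AC burns removing that heat
total_kwh = render_kwh + ac_kwh

print(f"Render load: {render_kwh:.0f} kWh, AC overhead: {ac_kwh:.0f} kWh")
print(f"Total: {total_kwh:.0f} kWh -> about ${total_kwh * PRICE_PER_KWH:.0f} per month at this rate")
```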

Coming from a bitcoin mining background, I am ALL TOO FAMILIAR with this. I actually moved all my computers outside, where it is more humid. Though it was 98F (37C), it was a constant 98F, which computers don't mind. Plus, since computers can't "perspire" to lose heat, the moist air helps maintain a constant, manageable temperature. The humidity in the air will not condense on hot components, and it helps absorb the radiated heat and carry it away from the cooling fins better than cold, dry air does. No water-cooling was ever needed, running 8x Radeon 7970s, overclocked and running 24/7 in the summer heat in Florida.

With all of that said, I figured I would re-post my current planned builds for a personal "render farm" computer. (This can be done a LOT cheaper for just a remote render farm: less RAM, cheaper CPUs, cheaper motherboards, ghetto cases, minimal hard-drive space... But I need to use this computer as a full computer too, for setting up renders. Thus, it is multi-purpose: render and creation.)

-------------------------------------------

Computer #1 (THE WET DREAM MACHINE)
- [$130] RAIJINTEK ASTERION SILVER CLASSIC, an Alu. E-ATX case
- - * {Because this is one of the cheapest and tested cases which fits the E-ATX motherboard and 4x video-cards.}

- [$160] 1600 watt PSU
- - * {Needs 12v for at-least (4x 6-pin), (5x 8-pin), (1x 4-pin), and (1x 4-pin-flat-connector/adapter)}

- [$410] MSI "X299 XPOWER GAMING AC" motherboard
- - * {Needs a CPU with 44 lanes to get PCIe x8/x8/x16/x8, for 4x "Titan-X Maxwells" to run.}

- [$2090] Intel Core i9-7980XE Skylake-X 18-Core 36-Threads
- - * {This has the 44 lanes required for running 4x Titan-X.}

- [$150] CPU liquid-cooler, 3x 120mm fans, 360mm x 120mm radiator.
- - * {Because a liquid-cooler is a requirement for these i9 2066 CPU's}

- [$660] Crucial Technology Ballistix Tactical 64GB (4x 16GB) 3000 MT/S DDR4 SDRAM (PC4-24000)
- - * {Because, why the fuck not!}

- [$2000] 4x Titan-X Maxwells, 12GB each of unified GDDR5 VRAM, non-SLI mode
- - * {For a crazy total of 12288 Cuda-Cores, for rendering. Nothing-else matters but the core-count and VRAM.}

- [$440] Samsung SSD 960 EVO NVMe M.2 1TB
- - * {This is for the greater majority of my 3D content, which I use frequently, and the OS.}

- [$150] 4TB HDD {No specific desired brand. Anything reliable.}
- - * {This is for archives of additional graphics and 3D content that I use a lot less frequently, and for swap files.}

TOTAL DAMAGE: $6190 USD
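
(For anyone who wants to swap parts in or out, here is the same list as a quick tally sketch you can edit and re-run; the prices are the ones listed above.)

```python
# Quick tally of the "WET DREAM MACHINE" parts list above -- edit and re-run to re-total.
build_1 = {
    "RAIJINTEK ASTERION E-ATX case":      130,
    "1600 W PSU":                         160,
    "MSI X299 XPOWER GAMING AC":          410,
    "Intel Core i9-7980XE":              2090,
    "360 mm CPU liquid cooler":           150,
    "64 GB DDR4-3000 (4x 16 GB)":         660,
    "4x Titan-X Maxwell 12 GB":          2000,
    "Samsung 960 EVO NVMe 1 TB":          440,
    "4 TB HDD":                           150,
}

print(f"TOTAL DAMAGE: ${sum(build_1.values())} USD")   # -> $6190
```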

-------------------------------------------

Computer #2 (THE DAY DREAM MACHINE)
- [$100] Thermaltake Suppressor F51 Power Cover Edition E-ATX Mid Tower
- - * {Because this is one of the cheapest tested cases which fits the E-ATX motherboard and 4x video-cards.}

- [$160] 1600 watt PSU
- - * {Needs 12v for at-least (4x 6-pin), (5x 8-pin), (1x 4-pin), and (1x 4-pin-flat-connector/adapter)}

- [$200] MSI "X99A-GODLIKE-GAMING" motherboard
- - * {Needs a CPU with 40 lanes to get PCIe x8/x8/x0/x16/x8, for 4x "Titan-X Maxwells" to run.}

- [$435] Intel Core i7-6850K Broadwell-E 6-Core 12-Threads
- - * {This has the 40 lanes required for running 4x Titan-X.}

- [$110] CPU liquid-cooler, 2x 120mm fans, 240mm x 120mm radiator.
- - * {A liquid-cooler is a slight luxury here.}

- [$330] Crucial Technology Ballistix Tactical 32GB (2x 16GB) 3000 MT/S DDR4 SDRAM (PC4-24000)
- - * {Because, why the fuck not!}

- [$2000] 4x Titan-X Maxwells, 12GB each of unified GDDR5 VRAM, non-SLI mode
- - * {For a crazy total of 12288 Cuda-Cores, for rendering. Nothing-else matters but the core-count and VRAM.}

- [$150] 4TB HDD {No specific desired brand. Anything reliable.}
- - * {This is for archives of additional graphics and 3D content that I use a lot less frequently, and for swap files.}

TOTAL DAMAGE: $3485 USD

Honestly, I am not even sure if having 8x PCIe lanes is an actual requirement for rendering. The specs usually cite "gaming minimums" for "running in SLI" as if they were some kind of actual minimum requirement. I know all cards can "run" with only 1x lane. I know, because I ran all eight of my Radeon 7970 OC cards on just a 1x PCIe extender and this crappy i7-4790, for my bitcoin miners. (That CPU only has 16 lanes.) However, I wasn't pushing 12GB of images into each card. I know that 16x is overkill, because the 8x lanes never even get "saturated" (used completely, filled with actual data being transferred).

Better to have them all available, just in-case!
 
Last edited:

Cirro84

Resident Evil-doer, part-time Candyman
Dec 24, 2016
1,435
1,461
When it comes to card selection... I went back to the Daz3D forums for the benchmarks. ...
Honestly, I am not even sure if having 8x PCIe lanes is an actual requirement for rendering. The specs usually dictate "gaming minimums" for "running in SLI", as being some kind of actual minimum requirement. ... I know, because I ran all eight of my Radeon 7970oc cards on just a 1x PCIe extender and this crappy "i7 4790", for my bitcoin miners. ...
I know that 16x is overkill, because the 8x lanes never even get "saturated", (used completely, filled with actual data being transferred). Better to have them all available, just in-case!
Oh, coin mining you did. I see where you come from. Suffered much?? XD

Exactly, more than 8 lanes per card really are not necessary; with such bundles you should be fine. Better to use a bundle for fast M.2 SSDs, I'd say. The system needs to update progress too, and transfer data from/to the cards (which is done with processing support from the CPU). It's most important to 'feed' the cards well, I guess. Each x8 PCIe 3.x bundle gives an aggregate of about 7.88 GB/s in theory (8 GT/s per lane, which after encoding works out to about 0.985 GB/s per lane); that should suffice for even top-tier cards.
As a matter of fact, connections through the 'chipset'/'controller hub' are of roughly the same bandwidth - Intel's DMI link, with ca. 7.86 GB/s in total (in theory, throughput for both directions), gives an aggregate of about 3.93 GB/s for one transfer direction on its four lanes. Not that unusual, given that the links between CPU and chipset logic derive from techniques first introduced with PCIe, from 2004 onwards IIRC. Half an eternity. Chipsets for Intel P4/EE/Core Duo/Quad (socket LGA 775) and Athlon 64/X2/FX boards (sockets 939 and 754) featured those new PCIe connects back then...
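
To make those figures reproducible, here is a small sketch of the PCIe 3.0 arithmetic. The per-lane rate and 128b/130b encoding are the published PCIe 3.0 numbers; the 12 GB payload is just the Titan-X VRAM size discussed above, used as a worst case.

```python
# PCIe 3.0 bandwidth arithmetic: 8 GT/s per lane with 128b/130b encoding.
TRANSFERS_PER_LANE = 8e9                 # 8 GT/s per lane
ENCODING = 128 / 130                     # 128b/130b line-code efficiency
BYTES_PER_LANE = TRANSFERS_PER_LANE * ENCODING / 8   # ~0.985 GB/s per lane, per direction

PAYLOAD_GB = 12                          # worst case: refill a whole 12 GB card

for lanes in (1, 4, 8, 16):
    gb_per_s = lanes * BYTES_PER_LANE / 1e9
    seconds = PAYLOAD_GB / gb_per_s
    print(f"x{lanes:<2}: {gb_per_s:5.2f} GB/s  -> ~{seconds:4.1f} s to move {PAYLOAD_GB} GB")
```

The x4 row is also roughly the chipset/DMI-class figure mentioned above.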

Also, you should mention what you plan to do about noise. More than 2 cards plus a high-performance CPU produce an amount of heat and noise nobody should underestimate; as a miner you pretty much know about these difficulties, I guess? How will you handle that problem??
Heat is one thing; you perhaps can't escape it. But I personally cannot imagine 3 or 4 video cards working together with a 100+ watt (minimum) 8-core CPU without going up the wall after a while, no matter how 'quiet' I can build the system. If the components all went full force on renders, I'd probably want to settle down next door. A bit of space all around is a must, and I can't place the workstation somewhere enclosed. :/
 

NoesisAndNoema

Member
Game Developer
Oct 3, 2017
282
680
Oh, coin mining you did. I see where you come from. Suffered much?? XD
I made several thousand dollars... Didn't suffer a bit. I lucked out and spent most of it before the "Gox crash/theft".

My first set of bitcoin miners was 4x Radeon 7970 OC edition "reference-cooler" style cards. (Mini dust-busters.) They ran overclocked to the maximum that they could handle. The damn thing sounded like a super dust-buster, on steroids.

Then, I managed to get 4x more of those same cards with the raised super-coolers. (Two silent fans with tuned air-flow blades, blowing across a quad-piped, dual-surface heat-exchanger. They ran soooo much quieter!)

I will have no such luck with the Titans... They are all built with the reference design, or water-cooling blocks. Later, I may get the water-coolers. However, rendering full-force, the fans only kick up to half-speed, even with my highly-tuned fan profile. I set the fans to a more realistic curve for rendering: they turn on faster and stay on longer.

You are right, I am thinking about how to dampen the sound, in my design.

I want to build the custom case inside a window-mount AC unit (gutted of the AC components), for noise reasons as well as heat reasons. 2000 watts is a lot of heat for any home AC unit to remove when running for a long time.

I ran my miner in the garage, for the winter. It helped to lower our heating/electric bill, a little... (Pre-heating the hot-water lines and tank. Pre-heating the air for the dryer intake. Keeping the garage-walls and attic air warm and dry against the house. Etc...)

For the summer, it sat out on my back porch, behind a hard-wood bar. It enjoyed the humidity, which helped the cards disperse heat better. The cards and the motherboards. (The units were open-air design, built in milk-crates.)

When it comes to the bus lanes... I honestly "need" 8x only for real-time updates in Daz3D... Like this video, but faster! (Keep in mind, he is rendering live while recording HD video captures.)

For comparison, this is 2x Titan-X's...

For 1x Titan-X, I can confirm that it is half as fast as the video above for instant displays. I used that same model setup. It hangs a LOT after making changes, locking up the UI and struggling to refresh. It was tolerable, but 4x will surely be more tolerable. :p
 
Last edited: