When it comes to card selection... I went back to the Daz3D forums for the benchmarks.
The 1080 Ti is only a fraction faster than the Titan-X Maxwell. Faster times were only seen with crazy CPU core speeds and core counts. (Some people ran 6-, 8-, 12-, 20- and 44-core consumer/server CPUs for the benchmark tests.)
My simple 4-core (8-thread) CPU renders the scene in 5 minutes 6.73 seconds.
My Titan-X Maxwell renders it in 2 minutes 49.69 seconds without overclocking, and in 1 minute 25.2 seconds when pushed to thermal max.
However, in real scenes, my card is about 5x to 12x faster than my CPU on render times. (Not that half-assed benchmark demo of old Daz3D crap and shaders that no one actually uses anymore.)
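For the curious, the benchmark times above work out to the following speedups (a quick sanity-check in Python; the times are the ones I quoted):

```python
# Sanity check on the benchmark times quoted above.

def to_seconds(minutes, seconds):
    return minutes * 60 + seconds

cpu = to_seconds(5, 6.73)         # 4-core/8-thread CPU
gpu_stock = to_seconds(2, 49.69)  # Titan-X Maxwell, stock clocks
gpu_oc = to_seconds(1, 25.2)      # Titan-X Maxwell at thermal max

print(f"Stock GPU vs CPU: {cpu / gpu_stock:.1f}x faster")  # ~1.8x
print(f"OC GPU vs CPU:    {cpu / gpu_oc:.1f}x faster")     # ~3.6x
```

That ~1.8x is exactly why I call the benchmark half-assed; real scenes show 5x to 12x.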
The Titan-X Pascal was actually slower than both of those, coming in at around the same speed as a GTX 1070 or 1080.
Ultimately, the 1080 Ti gains its speed only from higher clock rates, a few more CUDA cores, and higher voltage settings than the Titan-X. I could push my Titan-X to the same exact render times with a simple overclocking tweak. However, by default and by design, the 1080 Ti also consumes about 30-50 more watts per render, for only a ~2% gain in speed. (That is 210 watts for the Titan-X Maxwell versus 240-260 watts when rendering with a 1080 Ti. Measured at the wall, not after the PSU converts it to 12v.)
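To put those wall wattages in perspective, here is a rough energy-per-render comparison. The 210W and 240-260W figures are my measurements from above; the 10-minute baseline render time is just a made-up example:

```python
# Rough energy-per-render comparison, using the wall wattages measured above.
# The 10-minute baseline render is a hypothetical example figure.

titan_watts = 210
ti_watts = 250                   # midpoint of the 240-260W range
titan_time_s = 10 * 60           # hypothetical 10-minute render
ti_time_s = titan_time_s / 1.02  # the 1080 Ti's ~2% speed advantage

titan_wh = titan_watts * titan_time_s / 3600
ti_wh = ti_watts * ti_time_s / 3600

print(f"Titan-X Maxwell: {titan_wh:.0f} Wh per render")  # ~35 Wh
print(f"1080 Ti:         {ti_wh:.0f} Wh per render")     # ~41 Wh
print(f"1080 Ti burns {ti_wh / titan_wh - 1:.0%} more energy per render")
```

Roughly 17% more energy for 2% less time, and that ratio holds no matter how long the render actually takes.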
Also, a Titan-X Maxwell with 12GB of VRAM can be found for $500, while 1080 Ti cards are still around $800 and have only 11GB of VRAM.
Rendering 1000+ HD images, that translates into paying 20-35% more on your electric bill, which can hit $300+ a month just for rendering. So, both long-term and on the initial purchase, including the hardware needed to run the cards in volume... the Titan-X Maxwell is still the better all-around choice. At least until they actually make a new card that runs the same CUDA core count with less power consumption, for a reasonable price.
People seem to forget, or ignore, that a 1200 watt PSU operating at capacity actually pulls around 1400 watts at the wall, thanks to conversion losses. Running batches of renders, especially for video, will sustain a constant load. Not to mention that your home AC must now remove those 1400 watts of heat from the air. That is 1.4 kWh every hour for the computer, plus whatever electricity your AC burns pulling the generated heat back out, on top of your body heat and the day's heat. (That is actually a savings in winter. Sort of... like running a 1400 watt electric heater.)
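Ballpark math on what that sustained load costs, assuming a $0.12/kWh electric rate and an AC coefficient of performance (COP) around 3; both are assumptions that vary a lot by region and unit:

```python
# Ballpark monthly cost of a sustained 1400W wall draw, plus AC removal.
# Assumptions: $0.12/kWh rate, AC COP of ~3. Adjust for your own area.

wall_kw = 1.4        # 1200W PSU at capacity pulls ~1400W at the wall
hours = 24 * 30      # sustained batch/video rendering, all month
rate = 0.12          # $/kWh, assumed

render_kwh = wall_kw * hours
ac_kwh = render_kwh / 3  # AC moves ~3 units of heat per unit of electricity

print(f"Rendering:  {render_kwh:.0f} kWh (${render_kwh * rate:.0f})")
print(f"AC removal: {ac_kwh:.0f} kWh (${ac_kwh * rate:.0f})")
print(f"Total:      ${(render_kwh + ac_kwh) * rate:.0f} per month")
```

That is roughly $161/month for one box at a cheap rate; at higher regional rates, or with more than one machine rendering, the $300+/month figure arrives fast.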
Coming from a bitcoin mining background, I am ALL TOO FAMILIAR with this. I actually moved all my computers outside, where it is more humid. Though it was 98F (37C), it was a constant 98F, which computers don't mind. Plus, since computers can't "perspire" to lose heat, the moisture helps maintain a constant, manageable temperature. The humidity in the air will not condense on hot components, and it absorbs and carries heat away from the cooling fins better than cold, dry air does. No water cooling was ever needed running 8x overclocked Radeon 7970s, 24/7, in the Florida summer heat.
With all of that said, I figured I would re-post my current planned builds for a personal "render farm" computer. (This can be done a LOT cheaper for just a remote render farm: less RAM, cheaper CPUs, cheaper motherboards, ghetto cases, minimal hard-drive space... But I need to use this computer as a full workstation too, for setting up renders. Thus, it is multi-purpose: rendering and creation.)
-------------------------------------------
Computer #1 (THE WET DREAM MACHINE)
- [$130] RAIJINTEK ASTERION SILVER CLASSIC, an aluminum E-ATX case
- - * {Because this is one of the cheapest tested cases that fits the E-ATX motherboard and 4x video cards.}
- [$160] 1600 watt PSU
- - * {Needs 12v for at least (4x 6-pin), (5x 8-pin), (1x 4-pin), and (1x 4-pin-flat-connector/adapter)}
- [$410] MSI "X299 XPOWER GAMING AC" motherboard
- - * {Needs a CPU with 44 lanes to get PCIe x8/x8/x16/x8, for 4x "Titan-X Maxwells" to run.}
- [$2090] Intel Core i9-7980XE Skylake-X 18-Core 36-Threads
- - * {This has the 44 lanes required for running 4x Titan-X.}
- [$150] CPU liquid-cooler, 3x 120mm fans, 360mm x 120mm radiator.
- - * {Because a liquid cooler is a requirement for these LGA 2066 i9 CPUs}
- [$660] Crucial Technology Ballistix Tactical 64GB (4x 16GB) 3000 MT/s DDR4 SDRAM (PC4-24000)
- - * {Because, why the fuck not!}
- [$2000] 4x Titan-X Maxwells, 12GB of GDDR5 VRAM each, non-SLI mode
- - * {For a crazy total of 12288 CUDA cores for rendering. Nothing else matters but the core count and VRAM.}
- [$440] Samsung SSD 960 EVO NVMe M.2 1TB
- - * {This is for the great majority of my 3D content, which I use frequently, and the OS.}
- [$150] 4TB HDD {No specific desired brand. Anything reliable.}
- - * {This is for archives of additional graphics and 3D content, which I use a lot less frequently, and swap files.}
TOTAL DAMAGE: $6190 USD
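For anyone eyeballing the 1600 watt PSU choice, here is a rough power budget. The 250W per Titan-X Maxwell and 165W for the i9-7980XE are the published TDPs; the 150W "everything else" allowance is my own guess:

```python
# Rough power budget for Computer #1, from published TDPs.
# The 150W "everything else" figure is a guess, not a spec number.

gpus = 4 * 250  # 4x Titan-X Maxwell at 250W TDP each
cpu = 165       # i9-7980XE TDP (real draw can exceed this under heavy AVX)
rest = 150      # motherboard, RAM, drives, fans, pump

total = gpus + cpu + rest
psu = 1600

print(f"Estimated DC load: {total}W of {psu}W ({total / psu:.0%} of capacity)")
# ~1315W, about 82% - workable headroom for a sustained rendering load.
```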
-------------------------------------------
Computer #2 (THE DAY DREAM MACHINE)
- [$100] Thermaltake Suppressor F51 Power Cover Edition E-ATX Mid Tower
- - * {Because this is the cheapest tested case that fits the E-ATX motherboard and 4x video cards.}
- [$160] 1600 watt PSU
- - * {Needs 12v for at least (4x 6-pin), (5x 8-pin), (1x 4-pin), and (1x 4-pin-flat-connector/adapter)}
- [$200] MSI "X99A-GODLIKE-GAMING" motherboard
- - * {Needs a CPU with 40 lanes to get PCIe x8/x8/x0/x16/x8, for 4x "Titan-X Maxwells" to run.}
- [$435] Intel Core i7-6850K Broadwell-E 6-Core 12-Threads
- - * {This has the 40 lanes required for running 4x Titan-X.}
- [$110] CPU liquid-cooler, 2x 120mm fans, 240mm x 120mm radiator.
- - * {A liquid cooler is a slight luxury here.}
- [$330] Crucial Technology Ballistix Tactical 32GB (2x 16GB) 3000 MT/s DDR4 SDRAM (PC4-24000)
- - * {Because, why the fuck not!}
- [$2000] 4x Titan-X Maxwells, 12GB of GDDR5 VRAM each, non-SLI mode
- - * {For a crazy total of 12288 CUDA cores for rendering. Nothing else matters but the core count and VRAM.}
- [$150] 4TB HDD {No specific desired brand. Anything reliable.}
- - * {This is for archives of additional graphics and 3D content, which I use a lot less frequently, and swap files.}
TOTAL DAMAGE: $3485 USD
Honestly, I am not even sure that having 8x PCIe lanes is an actual requirement for rendering. The specs usually quote "gaming minimums" for "running in SLI" as if they were some kind of actual minimum requirement. I know all cards can "run" on just 1x lanes; I ran all eight of my overclocked Radeon 7970s on 1x PCIe extenders off a crappy i7-4790 for my bitcoin miners. (That CPU only has 16 lanes.) However, I wasn't pushing 12GB of scene data into each card. I also know that 16x is overkill, because the 8x lanes never even get "saturated" (used completely, filled with actual data being transferred).
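For a feel of what lane count actually costs you, here is the transfer math. PCIe 3.0 moves roughly 0.985 GB/s of usable data per lane (the spec figure; real throughput runs a bit lower):

```python
# Time to push a full 12GB scene into a card's VRAM over PCIe 3.0,
# at different lane counts. 0.985 GB/s per lane is the spec figure;
# real-world throughput runs a bit lower.

scene_gb = 12
gb_per_s_per_lane = 0.985

for lanes in (1, 4, 8, 16):
    seconds = scene_gb / (lanes * gb_per_s_per_lane)
    print(f"x{lanes:<2}: {seconds:5.1f} s to load {scene_gb}GB")
# x1: ~12.2s, x4: ~3.0s, x8: ~1.5s, x16: ~0.8s
```

That load happens once per render, so even x1 is survivable for long renders; during the render itself the bus sits nearly idle, which is why the 8x lanes never saturate.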
Better to have them all available, just in case!