PC specs

Siramar

Member
Donor
Game Developer
Sep 13, 2017
146
208
Hi guys,
I'm looking at getting a new tower and the one I have my eye on has these specs:

Crystal Intel Core i7K processor, GeForce RTX 2070 graphics, 16GB RAM, 2TB HDD, 256GB SSD.



Anyone have any advice or knowledge willing to share?

Thanks
 

Porcus Dev

Engaged Member
Game Developer
Oct 12, 2017
2,582
4,692
Hi guys,
I'm looking at getting a new tower and the one I have my eye on has these specs:

Crystal Intel Core i7K processor, GeForce RTX 2070 graphics, 16GB RAM, 2TB HDD, 256GB SSD.



Anyone have any advice or knowledge willing to share?

Thanks
Just my opinion :p

- If you're not going to overclock, don't buy the K series.

- For Iray rendering the processor isn't that important; you could save a bit by going with an i5 (8th generation recommended) or a Ryzen, and put that money toward a better graphics card or more RAM.

- Go for 32GB of RAM if you can; if you start with 16GB, leave free slots so you can expand later.

- Buy a motherboard with two full PCI-Express x16 slots so it can take two graphics cards; it doesn't add much to the price and may come in handy later.

- And what I think matters most... some DAZ scenes can occupy a lot of VRAM. If you can find a good offer, a 1080Ti with its 11GB will serve you better than a 2070; DAZ still doesn't take full advantage of the new RTX cards, and the 2080Ti's price is sky-high (you can buy two 1080Ti for the price of one 2080Ti, LOL). A quick way to check how much VRAM a scene actually uses is sketched right below.
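(Rough sketch of that VRAM check, if you're curious: it just polls nvidia-smi once per second while a render runs, so you can see whether the scene still fits on the card. It assumes the standard NVIDIA driver is installed so nvidia-smi is on the PATH; nothing DAZ-specific here.)

Code:
# Polls per-card VRAM usage once per second; run it alongside a render.
# Assumes nvidia-smi is on the PATH (standard NVIDIA driver install).
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=index,name,memory.used,memory.total",
         "--format=csv,noheader"]

while True:
    result = subprocess.run(QUERY, capture_output=True, text=True, check=True)
    print(result.stdout.strip())  # e.g. "0, GeForce GTX 1080 Ti, 9843 MiB, 11264 MiB"
    time.sleep(1)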

Hope that helps.
 

xht_002

Member
Sep 25, 2018
342
352
also, don't go with Littlewoods; pick a prebuilt or custom PC from

V12 finance is handed out to anyone
 

Siramar

Member
Donor
Game Developer
Sep 13, 2017
146
208
Thanks for your help, guys. Looks like I'll just be buying a 1080 and 32GB of RAM and updating my existing tower for just over a grand. Cheers!
 

gamersglory

Xpression Games
Donor
Game Developer
Aug 23, 2017
1,356
3,558
Thanks for your help, guys. Looks like I'll just be buying a 1080 and 32GB of RAM and updating my existing tower for just over a grand. Cheers!
You will want a 1080Ti for the 11GB of VRAM when rendering. As for an RTX card, my 2080Ti is quite a bit faster than my 1080Ti on renders. DAZ at some point will work fully with RTX cards, as OptiX has been updated to fully support RT cores. Tensor cores are already supported by the latest Iray. Octane Render will be fully compatible with RTX in a few months.
 

xht_002

Member
Sep 25, 2018
342
352
It would be faster to get 2x 1070 for less money and use SLI.



with 2 identical cards, SLI with Iray is 2x faster than a single card
 

gamersglory

Xpression Games
Donor
Game Developer
Aug 23, 2017
1,356
3,558
It would be faster to get 2x 1070 for less money and use SLI.



with 2 identical cards, SLI with Iray is 50% faster than a single card
For rendering you do not want to use SLI; you want to run each card separately, unless you have two cards with NVLink, and 10-series cards don't have NVLink.
 

xht_002

Member
Sep 25, 2018
342
352
For rendering you do not want to use SLI; you want to run each card separately, unless you have two cards with NVLink, and 10-series cards don't have NVLink.
You want to use SLI: the math is more precise if CUDA is used correctly, which is why CUDA is used for scientific simulations instead of having to hire a full supercomputer. With multiple cores doing the same math, the process is less likely to time out and skip on the floating-point X.000000000000 value, which is better if a renderer is using simulated photons with physics for light.

SLI with Iray is a lot faster, especially with volumetric lights and atmospheric effects.
 

Rich

Old Fart
Modder
Donor
Respected User
Game Developer
Jun 25, 2017
2,490
7,035
You want to use SLI: the math is more precise if CUDA is used correctly, which is why CUDA is used for scientific simulations instead of having to hire a full supercomputer. With multiple cores doing the same math, the process is less likely to time out and skip on the floating-point X.000000000000 value, which is better if a renderer is using simulated photons with physics for light.

SLI with Iray is a lot faster, especially with volumetric lights and atmospheric effects.
I don't know where you're getting these ideas from. GPU math doesn't change based on whether you have SLI enabled or not. The math is what the math is. The precision used depends on how the software is written, not whether you turn SLI on or off.
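(A minimal illustration of that point, assuming you have Python and NumPy handy: the rounding error you get is fixed by the floating-point type the software chooses, 32-bit vs 64-bit here, and no amount of SLI changes it.)

Code:
# Precision comes from the data type the code picks, not from the GPU setup.
import numpy as np

def accumulate(value, n):
    """Add `value` to an accumulator n times, in whatever precision `value` uses."""
    total = type(value)(0)
    for _ in range(n):
        total = total + value
    return total

print(accumulate(np.float32(0.1), 1_000_000))  # drifts well away from 100000.0
print(accumulate(np.float64(0.1), 1_000_000))  # stays very close to 100000.0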

In addition, NVIDIA specifically recommends against using SLI for Iray, since it incurs a performance penalty.

This has been stated any number of times. Here's one example:



This has also been backed up by benchmark data. With SLI off, Iray will use each card independently perfectly well. SLI is basically for gaming, where one frame is rendered on one card while the next frame is rendered on the other. Iray uses each card independently, and to its full capacity; the synchronization that SLI tries to enforce gets in the way of that.
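(If anyone wants to check this on their own box: with SLI off, poll per-GPU utilization while a render runs and you should see every card loaded on its own. Same assumption as the sketch above, that nvidia-smi is on the PATH; this is only a rough check, nothing Iray-specific.)

Code:
# Prints per-GPU load and memory every few seconds; run it during a render.
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=index,utilization.gpu,memory.used",
         "--format=csv,noheader"]

for _ in range(12):  # roughly one minute of samples
    print(subprocess.run(QUERY, capture_output=True, text=True).stdout.strip())
    time.sleep(5)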
 

xht_002

Member
Sep 25, 2018
342
352
I don't know where you're getting these ideas from. GPU math doesn't change based on whether you have SLI enabled or not. The math is what the math is. The precision used depends on how the software is written, not whether you turn SLI on or off.

In addition, NVIDIA specifically recommends against using SLI for Iray, since it incurs a performance penalty.

This has been stated any number of times. Here's one example:



This has also been backed up by benchmark data. With SLI off, Iray will use each card independently perfectly well. SLI is basically for gaming, where one frame is rendered on one card while the next frame is rendered on the other. Iray uses each card independently, and to its full capacity; the synchronization that SLI tries to enforce gets in the way of that.
Good for NVIDIA. SLI is always twice as fast on my PC; NVIDIA shouldn't use Dells.

I don't need to get information from anywhere. Any renderer will use the same tricks when using SLI; it doesn't matter if it's a game or ray tracing, CUDA processors do nothing but math. The same goes for AMD stream processors, though you need a Vega GPU to take the same kind of full control over how they're used, which is why AMD has a CUDA bridge on its dev site to convert CUDA to OpenCL for Vega cards.

Doing nothing more than floating-point math for physics and lighting vectors is the whole point of CUDA/stream processors: more precision, and taking load off the CPU, which is still better for real-world physics, which is where quantum computers are slower and why they aren't fully complete and in use.
 

gamersglory

Xpression Games
Donor
Game Developer
Aug 23, 2017
1,356
3,558
Good for NVIDIA. SLI is always twice as fast on my PC; NVIDIA shouldn't use Dells.

I don't need to get information from anywhere. Any renderer will use the same tricks when using SLI; it doesn't matter if it's a game or ray tracing, CUDA processors do nothing but math. The same goes for AMD stream processors, though you need a Vega GPU to take the same kind of full control over how they're used, which is why AMD has a CUDA bridge on its dev site to convert CUDA to OpenCL for Vega cards.

Doing nothing more than floating-point math for physics and lighting vectors is the whole point of CUDA/stream processors: more precision, and taking load off the CPU, which is still better for real-world physics, which is where quantum computers are slower and why they aren't fully complete and in use.
Rendering with Octane vs. Iray you may get a better performance increase with SLI, but Iray is a lite version of the whole NVIDIA OptiX solution. Also, you may be confusing Iray with Mental Ray/Iray+, which is used in paid programs like 3ds Max. Octane uses the CUDA cores better than Iray does.
 

xht_002

Member
Sep 25, 2018
342
352
Rendering with Octane vs. Iray you may get a better performance increase with SLI, but Iray is a lite version of the whole NVIDIA OptiX solution. Also, you may be confusing Iray with Mental Ray/Iray+, which is used in paid programs like 3ds Max. Octane uses the CUDA cores better than Iray does.
Rendering with the Iray that's bundled with the free DAZ3D 4.10 is twice as fast when using SLI.

As far as I remember, people started moaning 5 or 6 years ago about being ripped off when GeForce cards started costing $600+, so SLI support was added to all versions of Iray.
 

Joraell

Betrayed
Donor
Game Developer
Jul 4, 2017
2,472
8,772
Rendering with the Iray that's bundled with the free DAZ3D 4.10 is twice as fast when using SLI.

As far as I remember, people started moaning 5 or 6 years ago about being ripped off when GeForce cards started costing $600+, so SLI support was added to all versions of Iray.
Man.
2x 1080Ti without an SLI bridge is as fast as or faster than 2x 1080Ti with the SLI bridge plugged in.
If you get 2x more performance with the SLI bridge than with two separate cards, you have bad settings in DAZ. Nothing more to say.
 
  • Like
Reactions: Porcus Dev

xht_002

Member
Sep 25, 2018
342
352
Man.
2x 1080Ti without an SLI bridge is as fast as or faster than 2x 1080Ti with the SLI bridge plugged in.
If you get 2x more performance with the SLI bridge than with two separate cards, you have bad settings in DAZ. Nothing more to say.
I have good settings in DAZ; I don't have OptiX acceleration enabled, it works and it doesn't:

NVIDIA OptiX

For Rapidly Building GPU Accelerated Ray Tracing Applications

OptiX does the “heavy lifting”:
traversal, intersection, acceleration, and (optionally) shading

OptiX handles the GPU aspects:
load balancing, parallelism, scaling, GPU optimization,
VCA client/server
page 8 >

OptiX Prime, most of the time, is no faster than just enabling SLI in the NVIDIA Control Panel.
 

Joraell

Betrayed
Donor
Game Developer
Jul 4, 2017
2,472
8,772
I have good settings in DAZ; I don't have OptiX acceleration enabled, it works and it doesn't:



page 8 >

OptiX Prime, most of the time, is no faster than just enabling SLI in the NVIDIA Control Panel.
There is nothing about SLI :) on page 8
 

xht_002

Member
Sep 25, 2018
342
352
There is nothing about SLI :) on page 8
OptiX Prime manages the SLI load and what happens where and on which card, but it's obviously aimed at servers more than desktops, as it makes no difference compared to just enabling SLI in the control panel, which manages loads automatically if a program hasn't coded its own pipelines.
 

pavar

Member
Jul 3, 2017
166
273
If you do photo or video editing, buy a bigger SSD; pointing the Adobe programs' cache at the SSD enormously reduces the time needed to apply filters.
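(Crude way to see the gap the scratch drive makes, if you want numbers: time a large sequential read from each drive. The two paths below are placeholders for a big file on the HDD and one on the SSD; results will be inflated if the file is already in the OS cache, so use a freshly written file.)

Code:
# Measures sequential read throughput of one file; point it at a large file
# on the HDD and another on the SSD (paths below are placeholders).
import time

def read_throughput_mb_s(path, chunk=8 * 1024 * 1024):
    start = time.perf_counter()
    total = 0
    with open(path, "rb") as f:
        while True:
            data = f.read(chunk)
            if not data:
                break
            total += len(data)
    return total / (time.perf_counter() - start) / 1e6

print("HDD:", read_throughput_mb_s("D:/scratch/big_test_file.bin"), "MB/s")
print("SSD:", read_throughput_mb_s("C:/scratch/big_test_file.bin"), "MB/s")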
 

Siramar

Member
Donor
Game Developer
Sep 13, 2017
146
208
Just my opinion :p

- If you're not going to overclock, don't buy the K series.

- For Iray rendering the processor isn't that important; you could save a bit by going with an i5 (8th generation recommended) or a Ryzen, and put that money toward a better graphics card or more RAM.

- Go for 32GB of RAM if you can; if you start with 16GB, leave free slots so you can expand later.

- Buy a motherboard with two full PCI-Express x16 slots so it can take two graphics cards; it doesn't add much to the price and may come in handy later.

- And what I think matters most... some DAZ scenes can occupy a lot of VRAM. If you can find a good offer, a 1080Ti with its 11GB will serve you better than a 2070; DAZ still doesn't take full advantage of the new RTX cards, and the 2080Ti's price is sky-high (you can buy two 1080Ti for the price of one 2080Ti, LOL).

Hope that helps.
Picking your brains a bit more if you don't mind: I'm having a PC built and my tech guy recommends the
NVIDIA Quadro cards over the 2080Ti (a P5000, I think he said?). Has anyone any experience using these?

Thanks for your help
 

Porcus Dev

Engaged Member
Game Developer
Oct 12, 2017
2,582
4,692
Picking your brains a bit more if you don't mind: I'm having a PC built and my tech guy recommends the
NVIDIA Quadro cards over the 2080Ti (a P5000, I think he said?). Has anyone any experience using these?

Thanks for your help
Uhmmm... well, the NVIDIA Quadros are good cards of course, and the P5000 has 16GB of VRAM (compared to the 11GB of the 1080Ti/2080Ti), but bear this in mind:
- The P5000 is from the old series; the new one would be the Quadro RTX 6000, whose price is very high.
- For the price of a P5000 I got 3x 1080Ti, and I'm sure I'll render faster with those three cards :p
- Currently DAZ does not take advantage of the RT technology of the latest cards, but it has been announced that support will come this year; perhaps there will be a big increase in performance, but it remains to be seen.

What to do? Well, that's very personal, but I would say: if you think the new RT technology is worth it (maybe someone can give you more technical details), don't buy a P5000, buy a 2080Ti or higher (although the RTX 6000 is very expensive!); and if you think RT isn't worth it or will take a long time to pay off, don't buy a P5000, buy two or three 1080Ti.
You see, in either case, I wouldn't buy the P5000.
 

Siramar

Member
Donor
Game Developer
Sep 13, 2017
146
208
Uhmmm... well, the NVIDIA Quadros are good cards of course, and the P5000 has 16GB of VRAM (compared to the 11GB of the 1080Ti/2080Ti), but bear this in mind:
- The P5000 is from the old series; the new one would be the Quadro RTX 6000, whose price is very high.
- For the price of a P5000 I got 3x 1080Ti, and I'm sure I'll render faster with those three cards :p
- Currently DAZ does not take advantage of the RT technology of the latest cards, but it has been announced that support will come this year; perhaps there will be a big increase in performance, but it remains to be seen.

What to do? Well, that's very personal, but I would say: if you think the new RT technology is worth it (maybe someone can give you more technical details), don't buy a P5000, buy a 2080Ti or higher (although the RTX 6000 is very expensive!); and if you think RT isn't worth it or will take a long time to pay off, don't buy a P5000, buy two or three 1080Ti.
You see, in either case, I wouldn't buy the P5000.
Thanks for the input. I originally asked him for 2x RTX 2080Ti, but when he came back with that suggestion I thought I'd ask around first.
 
  • Like
Reactions: migoypedro