RX 2080 Ti Crashing

MaxCarna

Member
Game Developer
Jun 13, 2017
383
442
Has anyone here experienced this problem? Have you heard anything about it?



At the current price, it's possible to buy two 1080 Tis for the same money; now imagine that on top of this risk.
 

W22N

Member
Jan 5, 2018
186
653
It's not just the 2080 Ti; it's basically every 20xx FE card out there. No one knows what's causing it, so I won't pretend to either.
One thing of note, though: if your card dies they will replace it (you know, warranties and all that). On top of that, 1080 Tis are burning through stock as it is, so I don't think the crashes should affect long-term planning.
 

xht_002

Member
Sep 25, 2018
342
353
It's the same old news: Founders Editions are made for OEMs like Dell, which don't overclock them and run internal case temperatures on the edge of melting the alloy. EVGA had the same problem with the 1080, but those actually caught fire because the VRM was in the wrong place on the PCB.
 
Jun 29, 2018
145
132
I just upgraded my video card this week, and I picked a horrible time to do it. The new 2080 Ti is way overpriced, hard to find, and there's a lot of bad news about problems. The 1080 Ti has stopped being produced, so quantity is limited and prices are jumping.

I feel lucky to have found a Gigabyte Aorus 1080 Ti for about $770 on Newegg, so I snatched it up. I've been seeing a number of 1080 Tis priced around $1K now!

My old card was a GTX 1060 (6 GB), and I'm shocked at how much better the 1080 Ti is! Not only are my render times about half of what they were, but content/scenes load so much quicker. It makes using Daz Studio so much nicer.

If you need to upgrade and can't wait for more positive news about the 2080s, I'd say go for a 1080 Ti ASAP, as I don't think they are going to get any cheaper.
 

Porcus Dev

Engaged Member
Game Developer
Oct 12, 2017
2,582
4,705
I just upgraded my video card this week, and I picked a horrible time to do it. The new 2080 Ti is way overpriced, hard to find, and there's a lot of bad news about problems. The 1080 Ti has stopped being produced, so quantity is limited and prices are jumping.

I feel lucky to have found a Gigabyte Aorus 1080 Ti for about $770 on Newegg, so I snatched it up. I've been seeing a number of 1080 Tis priced around $1K now!

My old card was a GTX 1060 (6 GB), and I'm shocked at how much better the 1080 Ti is! Not only are my render times about half of what they were, but content/scenes load so much quicker. It makes using Daz Studio so much nicer.

If you need to upgrade and can't wait for more positive news about the 2080s, I'd say go for a 1080 Ti ASAP, as I don't think they are going to get any cheaper.
I totally agree; I also upgraded from a 1060 6GB to a new 1080 Ti and it's fantastic. :heartcoveredeyes:
As you said, renders are faster, but its 11GB of VRAM is the best part! (More of it is usable if you use W7; W10 takes 2GB for its own use.)
And I'm also happy because I still have the old 1060 connected, so when the scene doesn't take up much memory they both work at the same time.

I found a good offer and got my 1080 Ti for 632€ o_O

You'd be better off just getting two 1070s for less on eBay and running them in SLI.
DAZ doesn't need SLI, but it really appreciates 11GB of VRAM instead of 8GB (and more CUDA cores) :oops:
 

xht_002

Member
Sep 25, 2018
342
353
DAZ doesn't need SLI, but it really appreciates 11GB of VRAM instead of 8GB :oops:
Iray needs SLI; it makes rendering 2x faster by using all the CUDA cores for floating-point math on light vectors and texture pixel shading.

If you have two Titan X cards in SLI you can render in real time.

 

Porcus Dev

Engaged Member
Game Developer
Oct 12, 2017
2,582
4,705
Iray needs SLI; it makes rendering 2x faster by using all the CUDA cores for floating-point math on light vectors and texture pixel shading.

If you have two Titan X cards in SLI you can render in real time.

I'm not going to say a categorical no, but...

As far as I've read, it's recommended not to have SLI, or to disable it, for Iray to work better.

What I am totally sure of is that it's possible to render using two cards without SLI; right now I'm rendering a 90-frame animation with a 1080 Ti + 1060 (GPU-Z shows both cards working at 99%).
 

xht_002

Member
Sep 25, 2018
342
353
I'm not going to say a categorical no, but...

As far as I've read, it's recommended not to have SLI, or to disable it, for Iray to work better.

What I am totally sure of is that it's possible to render using two cards without SLI; right now I'm rendering a 90-frame animation with a 1080 Ti + 1060 (GPU-Z shows both cards working at 99%).
I know, but SLI is still better with the way it manages data. If you buy another 1080 Ti or 1060, you can download the CUDA SDK and monitor the CUDA usage.
 

MaxCarna

Member
Game Developer
Jun 13, 2017
383
442
I was planning to upgrade my 2x 1070 to 2x 1080 Ti, but then I saw a 2080 Ti taking half the time of a 1080 Ti to render the same scene. In that case, power consumption would make the difference: 440W (2x 220W) vs. 250W.
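Using the wattage figures above, a rough comparison can be sketched. This assumes (hypothetically) that both setups finish the same job in the same wall-clock time, since two 1080 Tis and one 2080 Ti were each described as roughly halving a single 1080 Ti's render time:

```python
# Rough energy comparison for the same render job (a sketch; the render
# time and the assumption of equal completion time are hypothetical).
def render_energy_wh(power_watts, hours):
    """Energy in watt-hours for a render running at a given total board power."""
    return power_watts * hours

hours = 2.0  # hypothetical wall-clock time for the job on either setup

dual_1080ti = render_energy_wh(2 * 220, hours)  # 440 W total
single_2080ti = render_energy_wh(250, hours)

print(dual_1080ti, single_2080ti)   # 880.0 500.0
print(dual_1080ti - single_2080ti)  # 380.0 Wh saved per 2-hour render
```

So if the equal-time assumption holds, the single 2080 Ti draws about 43% less energy per render.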

I also feel that new drivers will get much more out of the floating-point hardware, and that ray tracing together with OptiX could really cut render times.

But it's a huge investment to take this risk. I would have to ship the card to another country; it would be torture to keep sending it back to get a new one.

I also find the pricing very abusive.
 

OhWee

Forum Fanatic
Modder
Game Developer
Jun 17, 2017
5,889
29,926
Actually, the recommendation on the Daz forum is to disable SLI when using multiple cards.

I regularly render with dual 1080s, and yes, it essentially cuts my GPU render times in half. And that's without SLI. To be honest, I'm not seeing a difference between SLI enabled or disabled, at least with rendering. But with SLI disabled, I lose half the pixels on my screen if the computer puts the screen to sleep and then re-awakens it; essentially it's expecting SLI on resume, but of course I disabled that, so everything looks blocky/blurry...

But yeah, I've seen some crazy combinations on the Daz forum r.e. multiple cards for Iray rendering. Say a 1070 + a 1080 Ti + a Titan V... they DO play nice together, and collectively can drop render times a LOT - as long as the scene can fit in the card with the least GPU memory that is.

Now, if you want to 'combine' your GPU memory into one larger block, that requires NVLink, which is a completely different technology (although it's kinda the same). Most 'gaming' cards from the 10xx generation do not support NVLink. Not sure which of the 20xx cards have an 'active' NVLink connector thingie on the card for the NVLink widget.
 

Rich

Old Fart
Modder
Donor
Respected User
Game Developer
Jun 25, 2017
2,566
7,382
The 20xx cards do NOT support "memory pooling." Although some of them have an NVLink connector, it's not really the same NVLink used by the very high-end cards that do support memory pooling.
 
Jun 29, 2018
145
132
I knew upgrading to the 1080 Ti would really improve my 'final' render times; it did, and I'm really loving that. What I didn't expect was how much everything else I do in Daz also improved. Doing quick spot renders to check lighting, and even the speed at which scenes load, has been great. Having my scenes load so much faster (and having 11GB vs 6GB) makes using Daz throughout the day so much more enjoyable.

I'm sure I'll get used to it in a week or so and eventually want a second 1080 Ti, but I don't plan on spending that sort of cash yet. Let's hope they work out the issues with the 20xx cards and that we see prices drop as production increases.
 

xht_002

Member
Sep 25, 2018
342
353
Actually, the recommendation on the Daz forum is to disable SLI when using multiple cards.

I regularly render with dual 1080s, and yes, it essentially cuts my GPU render times in half. And that's without SLI. To be honest, I'm not seeing a difference between SLI enabled or disabled, at least with rendering. But with SLI disabled, I lose half the pixels on my screen if the computer puts the screen to sleep and then re-awakens it; essentially it's expecting SLI on resume, but of course I disabled that, so everything looks blocky/blurry...

But yeah, I've seen some crazy combinations on the Daz forum r.e. multiple cards for Iray rendering. Say a 1070 + a 1080 Ti + a Titan V... they DO play nice together, and collectively can drop render times a LOT - as long as the scene can fit in the card with the least GPU memory that is.

Now, if you want to 'combine' your GPU memory into one larger block, that requires NVLink, which is a completely different technology (although it's kinda the same). Most 'gaming' cards from the 10xx generation do not support NVLink. Not sure which of the 20xx cards have an 'active' NVLink connector thingie on the card for the NVLink widget.
You will find VRAM doesn't get used for rendering; it's mainly used for raw image sizes and frame buffering when playing games at 120 FPS or more. With ray tracing, the models and textures will be stored in VRAM, while a lot of DDR RAM will be used for rendering.

You can monitor VRAM use with FCAT.

Without Iray assets, which use CUDA, a lot of rendering is done on the CPU, as CPUs are better for floating-point math and actual real-world physics. CUDA and AMD stream processors accelerate this by having each processor do a single job, which allows for 1,000+ processors on a single GPU card.
 

Porcus Dev

Engaged Member
Game Developer
Oct 12, 2017
2,582
4,705
Can I use two different GPUs, like a 2080 Ti and a 980 Ti, in Daz?
Of course, but DAZ stores the scene in the VRAM of both graphics cards, so if your scene is, for example, 7GB, it will be rendered only by your 2080 Ti because it doesn't fit in the 980 Ti's VRAM; but if your scene is under 6GB, both cards will work on the render.
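The rule described above can be sketched as a tiny fit check (illustrative only, not the actual Iray API; the card list mirrors the 2080 Ti / 980 Ti example): a card joins the render only if the whole scene fits in its VRAM.

```python
# Minimal sketch of Iray's per-device behavior as described above:
# the full scene is copied to each card, so a card with too little
# VRAM simply drops out of the render.
def cards_that_can_render(scene_gb, cards):
    """Return the names of cards whose VRAM can hold the full scene."""
    return [name for name, vram_gb in cards if scene_gb <= vram_gb]

cards = [("RTX 2080 Ti", 11), ("GTX 980 Ti", 6)]

print(cards_that_can_render(7, cards))  # ['RTX 2080 Ti']
print(cards_that_can_render(5, cards))  # ['RTX 2080 Ti', 'GTX 980 Ti']
```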
 

xht_002

Member
Sep 25, 2018
342
353
Of course, but DAZ stores the scene in the VRAM of both graphics cards, so if your scene is, for example, 7GB, it will be rendered only by your 2080 Ti because it doesn't fit in the 980 Ti's VRAM; but if your scene is under 6GB, both cards will work on the render.
So you're better off having SLI so you always have multiple cards working.
 
S

SteelRazer

Guest
Guest
Memory pooling has been possible since Pascal, but hardly anyone here would manage that (maybe some). SLI is dogshit, as its bandwidth is very low compared to NVLink, which differs between the RTX 2080 Ti and 2080 (the 2080's bandwidth is half the 2080 Ti's), but is still far superior to SLI. I saw somewhere on a developer thread that it's possible to pool Turing VRAM only on Linux (I don't remember if with or without NVLink). The 2080 Ti is currently not worth it, as OptiX Prime in Daz is 5.0.1 and Turing needs OptiX 5.2 or higher. Right now it's faster to render without OptiX Prime enabled, speaking from my own experience, but that's still no boost compared to the 1080 Ti, which works very well with OptiX Prime. Maybe when OptiX Prime supports RTX it will be 2x faster than a 1080 Ti.
 

Porcus Dev

Engaged Member
Game Developer
Oct 12, 2017
2,582
4,705
You will find VRAM doesn't get used for rendering
In terms of the Iray render engine, the CUDA cores take care of the rendering process, but you need enough VRAM for the scene to load into it; if not, the scene will be rendered with the CPU instead of the GPU.

For example, today I rendered a scene with 7 characters occupying 10GB of VRAM; my 1080 Ti took care of the rendering, while my 1060, with its 6GB insufficient for this task, did not work on it.

So the amount of VRAM is one of the most important things to keep in mind when you want to render with Iray.
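The device selection described in this post, including the CPU fallback when no GPU can hold the scene, can be sketched like this (illustrative only, not the actual Iray implementation; the GPU list mirrors the 1080 Ti + 1060 example):

```python
# Sketch of Iray-style device selection: each GPU joins the render
# only if the whole scene fits in its VRAM; if no GPU qualifies,
# the render falls back to the CPU.
def pick_render_devices(scene_gb, gpus):
    """Return the devices that would work on a scene of the given size."""
    usable = [name for name, vram_gb in gpus if scene_gb <= vram_gb]
    return usable if usable else ["CPU"]

gpus = [("GTX 1080 Ti", 11), ("GTX 1060", 6)]

print(pick_render_devices(10, gpus))  # ['GTX 1080 Ti']
print(pick_render_devices(5, gpus))   # ['GTX 1080 Ti', 'GTX 1060']
print(pick_render_devices(12, gpus))  # ['CPU']
```

Note how the 10GB scene from the example above lands on the 1080 Ti alone, exactly as described.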