2080 Super or 2080 Ti

CulayTL

Sneaky Bastard
Donor
Game Developer
Jan 31, 2018
4,385
35,622
Thinking of upgrading my GPU from a 1080 to a 2080 Super or a 2080 Ti.

So the price difference in the UK for these GPUs starts from £300-£400: around £700 for the Super and £1,100 for a decent Ti.


Is the extra £300-£500 worth it for the extra memory and CUDA cores?


£1,200 in the UK because, you know... 1 dollar = 1 euro = 1 pound sterling.
 

Walg

Visual art is my magnet. Currently inactive
GFX Designer
Donor
Oct 5, 2018
1,394
3,851
Thinking of upgrading my GPU from a 1080 to a 2080 Super or a 2080 Ti.

So the price difference in the UK for these GPUs starts from £300-£400: around £700 for the Super and £1,100 for a decent Ti.


Is the extra £300-£500 worth it for the extra memory and CUDA cores?


£1,200 in the UK because, you know... 1 dollar = 1 euro = 1 pound sterling.
Back it up a bit first: what do you think you'll be using your GPU for? I mean, if you're solely gaming, then there's a pretty big argument for saving the £ for something else...
 

CulayTL

Sneaky Bastard
Donor
Game Developer
Jan 31, 2018
4,385
35,622
Back it up a bit first: what do you think you'll be using your GPU for? I mean, if you're solely gaming, then there's a pretty big argument for saving the £ for something else...
85% rendering and the rest gaming. For gaming I'm still happy with the 1080 for the games I'm playing atm, mostly Battlefield with all settings on low and mesh on Ultra for better visibility. Not a fan of motion blur and the other effects; same settings from BF3 to BF5.
Getting 120 fps at 1440p, more than happy with that. As for ray tracing, I don't want to hear about it again; it looks good, but not in a shooter, from my POV.
 

Yustu

Member
May 22, 2018
233
290
For Daz Studio, the difference between a 1080 Ti and a 2080 Ti is around 80-95%, because Iray supports RTX.
Blender also plans to support RTX (if it doesn't already), so the performance gains will be there over time. I think other 3D renderers will follow this trend too.

For gaming, IMO, RTX is not worth the extra $.
 

CulayTL

Sneaky Bastard
Donor
Game Developer
Jan 31, 2018
4,385
35,622
Yeah... my bad, I didn't mention that, because I asked this in the f95zone "development and art" section... sry!
 

Walg

Visual art is my magnet. Currently inactive
GFX Designer
Donor
Oct 5, 2018
1,394
3,851
85% rendering and the rest gaming. For gaming I'm still happy with the 1080 for the games I'm playing atm, mostly Battlefield with all settings on low and mesh on Ultra for better visibility. Not a fan of motion blur and the other effects; same settings from BF3 to BF5.
Getting 120 fps at 1440p, more than happy with that. As for ray tracing, I don't want to hear about it again; it looks good, but not in a shooter, from my POV.
I guess it depends on how much you value the extra VRAM. That would be the biggest thing you'd want (the extra cores too, but they're not as valuable as the VRAM), as additional ray-tracing support for Blender/Octane is coming out eventually. Value is subjective, imho...

Don't forget there have been a few stories of people melting their 2080s with DS rendering, so that's another thing to consider (let alone 2080 Tis).
 

Yustu

Member
May 22, 2018
233
290
Update: Blender does support RTX.

What I'd like to see is how well that hardware denoiser works in Blender (does it improve image quality over the existing one?)
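For anyone who wants to try it themselves, here is a minimal sketch of switching Cycles to the hardware (OptiX) denoiser from Blender's Python console. The property names below are from 2.9x/3.x-era builds and are an assumption for older versions, which exposed denoising per view layer:

```python
# Run inside Blender (bpy only exists there): enable GPU rendering
# with the OptiX hardware denoiser on an RTX card.
import bpy

prefs = bpy.context.preferences.addons['cycles'].preferences
prefs.compute_device_type = 'OPTIX'   # render on the RTX hardware

scene = bpy.context.scene
scene.render.engine = 'CYCLES'
scene.cycles.device = 'GPU'
scene.cycles.use_denoising = True     # denoise the final render...
scene.cycles.denoiser = 'OPTIX'       # ...with the hardware denoiser
```

Rendering the same frame with this and with the built-in denoiser is probably the quickest way to answer the image-quality question.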

As for melting GPUs, that's a known issue with the initial batch of cards; it also happened while gaming, etc. Dunno what the current RMA rate is.

There's also another route... wait for the next gen / Ampere? The reason is pricing: it's yet to be seen whether the current $1.2k mark is here to stay or whether it's a new tech tax, but personally I doubt NV will lower prices...
 
  • Like
Reactions: CulayTL

Walg

Visual art is my magnet. Currently inactive
GFX Designer
Donor
Oct 5, 2018
1,394
3,851
Update: Blender does support RTX.

What I'd like to see is how well that hardware denoiser works in Blender (does it improve image quality over the existing one?)

As for melting GPUs, that's a known issue with the initial batch of cards; it also happened while gaming, etc. Dunno what the current RMA rate is.

There's also another route... wait for the next gen / Ampere? The reason is pricing: it's yet to be seen whether the current $1.2k mark is here to stay or whether it's a new tech tax, but personally I doubt NV will lower prices...
Better to denoise in post rather than letting Blender/DS do it.

Well, it's still quite high, since under load a 2080 at 100% voltage runs at 84°C at stock, so without Afterburner etc. there's still a chance of a 2080 melting.

Depends on whether CulayTL wants to wait or buy now.
 

Walg

Visual art is my magnet. Currently inactive
GFX Designer
Donor
Oct 5, 2018
1,394
3,851
I'm not in a hurry, and mostly I'm still a newb learning stuff atm.
If it were me: if you're doing this as a hobby and not to become a dev in the future, and you have nothing else to do with the £300-£500 difference, then sure, go ahead. But that's a good few months' worth of Tesco/Sainsbury's/ASDA/Iceland/Waitrose/M&S groceries, so...

If you're willing to wait, and willing to use Blender rather than just DS/Octane, you'll open up more options later on, but you'll need to wait a while, since ProRender doesn't have as good a reputation as DS/Octane...
 
  • Like
Reactions: CulayTL

khumak

Engaged Member
Oct 2, 2017
3,832
3,869
To me it would be a choice between the 2080 Ti and the 2070 Super. The 2080 Super just seems like a very minimal bump in performance for a large price increase. Either bite the bullet and get the 2080 Ti, or, if you want to save some money, get the 2070 Super, IMO.
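That value argument is easy to put rough numbers on. A quick sketch in Python: the £700/£1,100 figures come from the thread, while the 2070 Super price and the CUDA core counts are my own ballpark assumptions, so treat the output as illustrative only:

```python
# Rough price-per-CUDA-core comparison: {card: (price_gbp, cuda_cores)}.
# Super/Ti prices are from the thread; the 2070 Super price and all
# core counts are assumed, not quoted.
cards = {
    "2070 Super": (450, 2560),
    "2080 Super": (700, 3072),
    "2080 Ti":    (1100, 4352),
}

for name, (gbp, cores) in cards.items():
    print(f"{name}: £{gbp / cores * 1000:.0f} per 1,000 CUDA cores")
```

On these numbers the 2080 Super works out to roughly £228 per thousand cores against about £176 for the 2070 Super, which is the "minimal bump at a large price increase" in concrete terms.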
 

Yustu

Member
May 22, 2018
233
290
Better to denoise in post rather than letting Blender/DS do it.
...
What I meant was mostly the interactive denoiser for previews, but I don't know whether interactive denoising will be supported.
IMO the rule of thumb is to always output "raw" material from renders and refine it later, especially for post-processing effects, so indeed, for final renders, turning denoising off is the way to go.

CulayTL

IMO, if you can afford a 2080 Ti then go for it, but as stated above, not for the gaming side: for the RTX support in render engines.
 
  • Like
Reactions: CulayTL

HopesGaming

The Godfather
Game Developer
Dec 21, 2017
1,705
15,377
VRAM (for me personally) far outweighs anything else.
So I would take even a 1080 Ti over the 2080 Super.
 
  • Like
Reactions: Porcus Dev

MaxCarna

Member
Game Developer
Jun 13, 2017
383
442
I don't believe the difference in CUDA/Tensor core count is worth the price difference; you're paying for the additional 3GB of VRAM. In my case I was using 2x 1070, and sometimes 8GB wasn't enough to fit my entire scene, so I picked up a 2080 Ti.

With Daz 4.11 we still don't have the extra RTX power working, only in the 4.12 beta. Users are reporting a 40% improvement in render time between 4.11 and 4.12 thanks to this change. I'm at a critical point in my project right now, so I haven't been able to test the beta to confirm that.

And it's true, the heat from the 2080 Ti really bothers me. I have a Galax dual-fan model and 7 fans in the case, 3 cheap ones and 4 Riotoros; I set the GPU fans to 92% before rendering and the card still sits at around 84°C. If I could, I would get a water-cooled model.
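A small way to keep an eye on that during long renders is to poll the card's temperature with nvidia-smi (assumed to be on PATH wherever the NVIDIA driver is installed). The 84°C threshold below just mirrors the figure above; it isn't an official limit:

```python
# Poll GPU core temperatures via nvidia-smi and flag cards running hot.
import subprocess

THROTTLE_WARN_C = 84  # illustrative threshold, matching the temp above

def read_gpu_temps():
    """Return a list of GPU core temperatures in deg C, one per card."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return [int(line) for line in out.splitlines() if line.strip()]

def too_hot(temps, limit=THROTTLE_WARN_C):
    """True if any GPU is at or above the limit."""
    return any(t >= limit for t in temps)

if __name__ == "__main__":
    try:
        temps = read_gpu_temps()
        print("GPU temps:", temps, "-> HOT!" if too_hot(temps) else "-> ok")
    except (FileNotFoundError, subprocess.CalledProcessError):
        print("nvidia-smi not available on this machine")
```

Run on a timer (cron, Task Scheduler), this makes a handy overnight-render babysitter.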
 

MovieMike

Member
Aug 4, 2017
431
1,662
I'd echo much of what people are saying, but I render on a 1080 and a 1070 and I'm generally happy with my performance. Like you, I'm mainly doing this as a hobby and working towards making content, but I'm not quite there yet. I'm skipping these cards for now because the new Ampere cards are rumored for mid-2020 or so, and I plan on upgrading then, not just for the massive render speedup, but because I want to run Cyberpunk 2077 on decent settings at a nice frame rate, and I imagine you'll need a pretty powerful card if you're hoping to play it in 4K.
 

recreation

pure evil!
Respected User
Game Developer
Jun 10, 2018
6,327
22,777
Honestly, if you have the money for these cards, you could just as well buy a Titan RTX...
 

pat11

Well-Known Member
Jan 1, 2019
1,365
7,530
The only thing you need to check for Daz is CUDA cores: more cores, faster renders. That's all.
 

Porcus Dev

Engaged Member
Game Developer
Oct 12, 2017
2,582
4,705
I don't believe the difference in CUDA/Tensor core count is worth the price difference; you're paying for the additional 3GB of VRAM. In my case I was using 2x 1070, and sometimes 8GB wasn't enough to fit my entire scene, so I picked up a 2080 Ti.

With Daz 4.11 we still don't have the extra RTX power working, only in the 4.12 beta. Users are reporting a 40% improvement in render time between 4.11 and 4.12 thanks to this change. I'm at a critical point in my project right now, so I haven't been able to test the beta to confirm that.

And it's true, the heat from the 2080 Ti really bothers me. I have a Galax dual-fan model and 7 fans in the case, 3 cheap ones and 4 Riotoros; I set the GPU fans to 92% before rendering and the card still sits at around 84°C. If I could, I would get a water-cooled model.
Regarding temperatures, the brand and cooler design of the card can also make a difference.
In one of my computers I have two 1080 Tis. The MSI Armor model reaches 84°C within seconds at maximum load and can hit 90°C (I should also say it's the card sitting on top of the other, so there's only a small gap to the card below, which can be a factor too). The other card, a Palit GameRock, sits quietly at 64°C after hours at maximum load... the difference is that the Palit is twice the height of the MSI, since it has a huge heatsink.
(Note: it would be impossible to fit two Palits in my computer connected directly; the card is so tall that if I put one in the first PCI-E slot, the second would be covered. For those who want to connect several GPUs and keep them well cooled with space between them, there are solutions like: )

The only thing you need to check for Daz is CUDA cores: more cores, faster renders. That's all.
That's right! But VRAM is just as important, because if the scene can't be loaded into it, it won't be GPU-rendered; so the more, the better.
Although the models with the most VRAM are the top-of-the-range ones, so they will also be the ones with the most CUDA cores :)
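That "fits in VRAM or it falls back to CPU" check can be scripted before kicking off a long render. A hedged sketch: nvidia-smi is assumed to be on PATH, and the 10% headroom figure is my own guess, not a documented Iray requirement:

```python
# Check whether a scene's estimated footprint fits in free VRAM on any card.
import subprocess

def free_vram_mb():
    """Free VRAM in MB for each installed NVIDIA GPU."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.free",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return [int(line) for line in out.splitlines() if line.strip()]

def scene_fits(scene_mb, free_mb, headroom=0.10):
    """True if the scene plus headroom fits on at least one card."""
    needed = scene_mb * (1 + headroom)
    return any(f >= needed for f in free_mb)
```

For example, an estimated 8GB scene passes against a 1080 Ti's ~11GB but fails against a 2080 Super's 8GB, which is the whole argument for the bigger card.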

But let's imagine for a moment that the new 20xx-series models were optimized enough to make the 2080 Super (with its 8GB) render faster than the 1080 Ti (with its 11GB)... Which would I prefer? Well, I would keep the 1080 Ti, because I wouldn't have to sweat over whether the scene fits in its 11GB, and because I render at night and don't care if it takes a little longer :p (either that or a 2080 Ti... but having had 11GB, coming from a 6GB 1060, I'm not going back; nothing under 11GB :ROFLMAO:)