Thanks. No AMD, then, and at least 12GB of VRAM with 32GB of RAM. Would a 16GB 4060 be better than the 12GB 3060?

You need an Nvidia card, no way around it.
The more VRAM, the better. I started out with a 2GB GTX 1050 and yeah, you can make it work... but I wouldn't wish that on anyone. You want brute force: more VRAM means you can work on heavier scenes. I'm on a 12GB RTX 3060 now and it runs beautifully. It's a bit outdated already, but still solid. Below 8GB? You're basically gambling, and it also means learning hardcore optimization just to survive.
System RAM? 32GB is ideal. I managed with 16GB for a while — doable, but tight. I even used 8GB when I started. It worked, barely. But again, don't put yourself through that.
Might want to wait a minute. The 40-series Nvidia GPUs have pretty much been discontinued, and their replacements, the 50-series cards, do not work with DAZ and Iray yet. There are rumors of a patch or a new version of DAZ soon that will let you use those new 5000 cards.
But basically, what you buy will be related to what your goals are. If you're just playing with DAZ as a hobby and not looking to do this professionally, then you can get away with an older GTX or RTX Nvidia card. I started with a 4GB 1050 Ti and it was useless for 90% of my renders. VRAM is king with Daz, and Iray only works with Nvidia and its CUDA cores.
Like AllNatural just posted, a 12GB card is a nice start (and honestly, it's pretty much the bare minimum to render more than one Genesis character these days). The vast majority of my production renders need ~11GB of VRAM, but I sometimes nudge up to my 16GB max on my 4060 Ti.
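If you're not sure how close your own scenes get to that limit, you can watch VRAM usage while a render runs. Here's a rough sketch that polls Nvidia's nvidia-smi tool from Python; it assumes nvidia-smi is on your PATH, only reads the first GPU, and the 5-second interval is arbitrary:

```python
# Rough VRAM monitor: polls nvidia-smi while a render runs and tracks the peak.
# Assumes the Nvidia driver's nvidia-smi is on the PATH; stop it with Ctrl+C.
import subprocess
import time

peak_mb = 0
try:
    while True:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=memory.used,memory.total",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout.strip().splitlines()[0]  # first GPU only
        used_mb, total_mb = (int(x) for x in out.split(", "))
        peak_mb = max(peak_mb, used_mb)
        print(f"VRAM: {used_mb} / {total_mb} MiB (peak {peak_mb} MiB)")
        time.sleep(5)
except KeyboardInterrupt:
    print(f"Peak VRAM this session: {peak_mb} MiB")
```

If the used number parks at the card's total and the render suddenly slows to a crawl, Iray has most likely dropped back to the CPU.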
Good luck finding any card over 12GB now, unless you search for used 3080s, 3090s, etc. The 16GB 4060 Ti I have, I bought new a year ago for $499 USD. If (and it's a big if) you can even find one still in stock somewhere, they are now running $1,000 and up.
You can get away with CPU-only rendering, but be prepared for each render to take 2+ hours. Or you can work hard to split your renders up into multiple parts to render with an 8-12GB card and then stitch them together in post (Photoshop, etc.).
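If you'd rather script that stitching step than do it by hand in Photoshop, a few lines of Pillow will do it. This is only a sketch: the filenames and the clean 50/50 vertical split are made-up examples, and it assumes both halves were rendered at the same height.

```python
# Minimal stitch of two half-frame renders into one image using Pillow.
# The filenames and the simple left/right split are placeholders for the example.
from PIL import Image

left = Image.open("render_left_half.png")    # e.g. 960x1080
right = Image.open("render_right_half.png")  # e.g. 960x1080

final = Image.new("RGB", (left.width + right.width, left.height))
final.paste(left, (0, 0))
final.paste(right, (left.width, 0))
final.save("render_stitched.png")
```

In practice you'd render each part with a little overlap so the seam is easier to hide, and the same idea covers compositing characters rendered on a transparent background over a premade backdrop.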
Personally, I am gearing up for a new rendering PC, and I am going to wait for DAZ to get the fix out for the 5000 cards first. Then I'll probably buy a 5070 with 16GB or higher.
One thing to mention - and it doesn't get said often enough, IMO - is that you don't have to be focused on photo-realistic 3D renders and thus be chained to Nvidia GPUs. For instance, there are new Filament toon-style shaders in DAZ that let you do anime-like 2D-style renders virtually instantly, with any GPU.
You can also focus on learning Blender instead. It renders fine with AMD cards, but Nvidia stuff is faster.
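To make that concrete: in Blender, Cycles lets you pick the compute backend, so the Nvidia-only constraint disappears (OptiX or CUDA for Nvidia, HIP for AMD). Here is a minimal sketch of the relevant settings from Blender's Python console; the attribute paths are from recent Blender releases, so double-check them against your version:

```python
# Run inside Blender's Python console: point Cycles at the GPU.
# "OPTIX" suits Nvidia RTX cards; AMD users would pick "HIP" instead.
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "OPTIX"   # or "CUDA" / "HIP"
prefs.get_devices()                   # refresh the detected device list
for device in prefs.devices:
    device.use = True                 # enable every detected device

bpy.context.scene.cycles.device = "GPU"
```

You can do the same thing from the Preferences UI; the script is just those toggles in code.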
As for the rest of your PC... 32GB RAM minimum and a newer-generation CPU. I'm still on an older 10th-generation i3, and my viewport gets a bit laggy when I have a complicated scene. But basically anything 10th gen or newer with at least 4 cores works fine. You also want a good SSD, at minimum. No spinners. DAZ deals with huge files, and SSDs are the only way to go. The best option is an M.2 NVMe drive.
Yes, absolutely. I have the 4060 Ti with 16GB, and it handles everything I can throw at it. It's just not as fast as the 4080s and 4090s. But it's plenty faster than the older 30- and 20-series cards. The problem is finding one these days, as I mentioned above.
We're looking at the days of 32GB being 'more than enough' getting close to an end. If you're using a 16GB GPU, just go with 64GB (especially if you're looking at DDR4). You'll thank yourself later when you're multitasking while rendering. Hell, some days 96GB of RAM doesn't even feel like enough for me. CPU is kind of whatever; my last system was using a 10700 and was fine. I'm sure an equivalent Ryzen would do the trick.
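If you want to see how much headroom your current box actually has while a render is going, a quick check with the third-party psutil package (pip install psutil) will show it:

```python
# Quick look at system RAM headroom; run it while Daz is rendering.
# Requires the third-party psutil package (pip install psutil).
import psutil

mem = psutil.virtual_memory()
gib = 1024 ** 3
print(f"Total RAM:     {mem.total / gib:.1f} GiB")
print(f"In use:        {mem.used / gib:.1f} GiB ({mem.percent}%)")
print(f"Available:     {mem.available / gib:.1f} GiB")
print(f"Logical cores: {psutil.cpu_count()}")
```

If 'Available' keeps dropping toward zero while you render and browse at the same time, that's the sign you'll feel the jump from 32GB to 64GB.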
I would rather use a 24GB 3090 than a 16GB 4060. The convenience of not having to mess with optimizing trumps render speed for me. For the quality of your renders, the hardware doesn't matter. The hardware is all about render speed and how much you can stuff into a scene. With characters, clothes, hairs, and a scene with environments both in the foreground and background, I run into trouble at around 4-6 characters onscreen.

Ya, sure... the 3090 was the flagship. But it's also 5 years old and has been out of production for a while now.
If you're comfortable paying $700 USD and up for a used card with no warranty and no idea how it was used... well, then you have more money than I have to play with. I only make large purchases on items I can return under warranty. The 3090 also sucks up almost 200 more watts than the 4060 and puts out a huge amount of heat, both of which are concerns for me, since I have my workstation in my bedroom.

I think there are valid arguments for both, but with prices as they are now, the value leans heavily in favor of a used 3090. Even looking past the VRAM, the 3090 is going to remain relevant (by tech standards, at least) quite a lot longer than the 4060 will. Heat is also a valid concern, but obviously situational.
I'm rendering a 1440p frame right now, and it's using 14.1GB of VRAM (including the ~1.1GB or so that Windows, Firefox and a few other apps are using). Getting to 6000 iterations is going to take about 20 minutes. This is an important render, so I am doing it full frame at best quality. I'll then run post filters and shrink it to 1080p for maximum detail.
Not to poke at your workflow (especially if it works for you), but I feel like 6000 iterations is pretty high for 1440p. I tend to render in 4K at around 3500-4000 iterations, and that's probably still too much. For example:

View attachment 4759532

I'd try playing around with your iterations and see if you can get it down a bit. It sounds like you could probably chop your render times down by at least a few minutes. But if it works for you, it works for you.

Ya, it's an abnormally high iteration count. I had some shadows that were stubborn with noise, and this render is a gallery render, so I was pushing it more than usual. Normally I render 1080p to 5,000 or 6,000 iterations, and for 1440p renders I find 2,500-3,000 a sweet spot. And when I am just rendering characters alone to layer on a premade background, you sometimes only need 300-500 iterations. That's also usually what I use for animation frames: 500 iterations at 1440p, then run them through the denoiser afterwards. I can usually keep the time down to about 1-2 minutes per frame.
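For anyone wanting to sanity-check those numbers, here's the back-of-the-envelope math, assuming render time scales roughly linearly with iteration count (a simplification; Iray convergence isn't perfectly linear). The 10-second, 30 fps clip in the second half is a made-up example:

```python
# Back-of-the-envelope math on the iteration counts and times quoted above.
# Assumes render time scales roughly linearly with iteration count.

# Still render: ~6000 iterations in ~20 minutes.
iters_per_sec = 6000 / (20 * 60)            # ~5 iterations per second
mins_at_3500 = 3500 / iters_per_sec / 60    # ~12 minutes
print(f"~{iters_per_sec:.1f} it/s; 3500 iterations would take ~{mins_at_3500:.0f} min, "
      f"saving ~{20 - mins_at_3500:.0f} min on that frame")

# Animation: ~500 iterations per frame at roughly 90 seconds each.
frames = 30 * 10                            # a 10-second clip at 30 fps
total_hours = frames * 90 / 3600
print(f"A {frames}-frame clip at ~90 s/frame is ~{total_hours:.1f} hours of rendering")
```

Which is roughly where the "chop a few minutes off" estimate above comes from, and why dropping iterations plus denoising matters so much once you're rendering hundreds of frames.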
Hi, I'm looking for recommendations from experienced Daz users in terms of what the minimum or recommended GPU, RAM and CPU are to make good-looking renders and animations.

It's simple:
PS: Buying a used card being bad is a myth; read the link (his rate of getting bad GPUs used was about the same as my rate of getting bad new ones over my lifespan). You should make sure the card works (ideally buy it from someone who allows you to return it if it's broken, or run some benchmarks on the seller's PC with the card or something). Even if it was used for crypto-mining 24/7 for several years, it isn't necessarily any worse off than a card that's been in a regular gaming PC used for games one hour a day for the same amount of time; hardware doesn't degrade that way. If anything, if it was used for crypto mining for several years, there's no better endurance test to prove that the card is super solid. Just make sure it is actually working first. Bring a laptop with an eGPU tray, or get the seller to leave the card in a PC for you to try it out before you buy it, and run a benchmark like one of Unigine's thingies or something.

Generally speaking, buying a used GPU is a completely solid option; it's the GPU manufacturers that do not want you to do it, so they started this rumor and a lot of people bought it.
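If you do go the used route, the "run some benchmarks at the seller's PC" step is easy to script. A rough sketch, assuming an Nvidia card with nvidia-smi on the PATH; you'd kick off your benchmark or a heavy render separately and watch what the card reports:

```python
# Log what a (used) Nvidia card reports while a benchmark or heavy render runs.
# Assumes nvidia-smi is on the PATH; start the stress test in another window.
import subprocess
import time

FIELDS = "name,temperature.gpu,power.draw,fan.speed,memory.used,memory.total,clocks.sm"

for _ in range(60):  # sample once a second for a minute
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(out)
    time.sleep(1)
```

You're mainly looking for temperatures that settle somewhere sane under load, power draw in the normal range for that model, and no sudden clock drops or driver resets. None of that proves the card's history, but it catches the obviously broken ones.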
The most powerful CPU u can afford (Intel or AMD doesn't matter, AMD is better tho imo; look up benchmarks n shit, don't take my word for it).

This isn't necessary at all (coming from someone who used a 10700 with two 4090s for a bit), not with consumer/hobby software like Daz or Blender. In the case of consumer (non-Threadripper, etc.) chips, neither of these programs cares much at all about CPUs, and both actively prefer Nvidia GPUs. Even Maya is starting to lean into Nvidia.
Yeah, I expected that; but there are quite a few cases where the CPU does matter in game development and in 3D design stuff as well. Software rendering is occasionally used, where the CPU is everything, and in game development a better CPU will greatly reduce compile times, which is also very nice unless you're using an interpreter-powered engine like Ren'Py or HTML games.
From my experience, rendering on the CPU alongside a GPU slows things way down. And you're just as good with an 11700 as you are with a 14900K.
Nvidia GPUs are the only option, and it's (currently) not particularly close. At all. AMD is just sad for anything that isn't gaming, and it's typically pretty bad at that, too.
I'll just politely disagree strongly with you on this one.

Speaking as an engineer who worked in the aviation field for almost 30 years, I spent a very large percentage of my time at work monitoring and maintaining components based on set life limits, either cycles (landings and takeoffs) or hours of use. EVERYTHING has a life limit, and we had to track and replace almost every single component on an aircraft based on those limits.

It's laughable to say a GPU used in a crypto-mining farm might be better off than a GPU used for only an hour each day. Even in solid-state components, heat causes degradation of the materials. So do voltage spikes and excess current. You really have no idea how hot the used card you bought has ever gotten, or whether it's ever had a power spike because some doofus tried to hot-swap a card or something equally dumb. There are also the resellers who take used cards and then replace fans or heat pipes to make them look new, but the same concerns are there... the underlying silicon and components are original and possibly near end of life. It's especially important with the way they layer PCBs these days; heat can cause circuit paths to bleed through the layers.

I'm not saying there are no good used cards out there. But you have to do the calculus on whether the seller is trustworthy, what the return policy is, and how much the savings are compared to a new card with a warranty. Personally, once I am buying anything over $300, I tend to stick with new only, unless it's local and I can test the item first before committing.

Yes, heat causes degradation, and voltage spikes and excess current indeed do damage over time. But the worst of the heat damage is mitigated if the card has been well cooled for its lifespan, and crypto miners who know what they're doing tend to undervolt their cards, both for lower temps and to mitigate those power spikes.