The Ryzen 3900X Reviews are in... lots of content creation horsepower for $499 US

OhWee

Forum Fanatic
Modder
Game Developer
Jun 17, 2017
5,889
29,920
So, Ryzen 9 3900Xs went on sale this morning. And have already sold out at some places.

Short form: 12 cores/24 threads, AM4 socket (works in existing motherboards, to a point, with BIOS updates), and more PCIe connectivity via the X570 chipset. $499 US retail price.

The 3900X is outperforming the 2920X 12-core Threadripper chip in most content-creation benchmarks. Of course, Threadripper has other advantages, but if you aren't planning on installing a lot of memory or don't need 64 PCIe lanes...

For those that rely primarily on Iray renders, the extra cores are less important. However, some of the newer X570 boards look like they could more easily accommodate multiple GPUs...

I should direct you to this Daz post about Iray benchmarking. Someone benched PCIe 2.0 x16 vs x1, and saw virtually no difference:


My point in bringing that up is that 'settling' for even just x4 PCIe for your graphics card may not be a big issue if you are only using the card for rendering.
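To put rough numbers on that, here's a back-of-the-envelope sketch. The ~0.985 GB/s-per-lane figure is PCIe 3.0's effective rate after 128b/130b encoding; the 4 GB scene size is just an assumption for illustration:

```python
# Rough PCIe 3.0 transfer-time estimate for uploading a scene to the GPU.
# Effective per-lane bandwidth: 8 GT/s * (128/130) / 8 bits ~ 0.985 GB/s.
PER_LANE_GBPS = 0.985

def upload_seconds(scene_gb, lanes):
    """Idealized time to copy a scene of scene_gb gigabytes over `lanes` lanes."""
    return scene_gb / (PER_LANE_GBPS * lanes)

scene = 4.0  # hypothetical 4 GB scene
for lanes in (16, 8, 4, 1):
    print(f"x{lanes:<2} -> {upload_seconds(scene, lanes):5.1f} s")
```

Even at x1 the one-time upload only adds seconds; the render itself runs against VRAM-resident data, so total render time barely changes, which matches the Daz forum benchmark.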


Of course, if you are doing a scene that can't fit in the graphics card memory, or are otherwise using a CPU based rendering engine, those 12 cores and 24 threads come in handy.

Anyways, whenever the CPU comes into play in rendering, the 3900X is looking really good.

The new 8-core 7nm Ryzens look good too, and the 3700X is a fine choice, but for those times when your renders fall back to CPU only, you might appreciate having the extra cores.
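As a rough way to reason about when extra cores pay off, here's an Amdahl's law sketch; the 95% parallel fraction is an assumption (real render engines vary), not a measured number:

```python
def speedup(parallel_fraction, cores):
    """Amdahl's law: overall speedup over single-core with `cores` workers,
    given the fraction of the workload that parallelizes."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

p = 0.95  # assumed parallel fraction of a CPU render
for n in (8, 12, 16):
    print(f"{n:2d} cores -> {speedup(p, n):.1f}x over single-core")
```

The gap between 8 and 12 cores narrows as the serial portion grows, but for mostly-parallel workloads like path tracing, the extra cores still buy a meaningful chunk of time.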

The 16-core 3950X will be showing up a little later this year (September) for around $749 US, for even more AM4 goodness. Anyways, for those of us who aren't waiting on 7nm Threadripper, the 12- and 16-core 7nm AMD AM4 chips look quite promising. Just wanted to share.

Edit: edited the title. My brain was saying $499, but my fingers typed $399. Apologies!
 
Last edited:

nillamello

Member
Game Developer
Oct 11, 2018
198
626
I have been following this ever since the main announcement, and I'm still torn. I know my next system will end up being from AMD, but I'm primarily on Arnold as my render engine, so core count is a really big deal for me. I can't really justify upgrading to Ryzen 3000 until I can get some benchmarks for the 16-core as well as the next generation of Threadripper (though the higher individual boost clocks make it a much more compelling option for viewport production work). I'd also like to see some comparisons between those processors and the gen-2 Threadrippers (specifically the 32-core gen-2 flagship). I'm not expecting to need PCIe 4.0 in the near future (really fast storage is nice, but not essential), so if last-gen TR4 drops down in price, it could be a sweet dollar-per-thread deal.

Really need an upgrade for this janky old system... gimme leaks, AMD!
 

OhWee

Forum Fanatic
Modder
Game Developer
Jun 17, 2017
5,889
29,920
nillamello

I'm guessing that the 2990WX is out of your price range, although it's been coming down in price in recent months. Here's a review that compares the 12 core 3900X with the 2990WX and other Threadrippers:



Threadrippers in general have been dropping in price, so holding off until September when the 3950X drops might not be a bad strategy. The 2950X is already struggling against the 3900X; the 3950X should completely overshadow it, at least for rendering.
 

nillamello

Member
Game Developer
Oct 11, 2018
198
626
Luckily (or not so), nothing is currently in the budget for a few months, so I'll be waiting until Q4 no matter what. I figure the 3950X is going to pummel pretty much everything except the old WX in rendering. TR4 has always been about brute force, but it seems like the newest-gen Ryzens are actually just all-around winners. But then... what are the newest Threadrippers going to look like? If the rumored 64-core nuts-balls-crazy version is actually coming, I may need to mortgage my house for it.
 
  • Like
Reactions: OhWee

OhWee

Forum Fanatic
Modder
Game Developer
Jun 17, 2017
5,889
29,920
So, Wendell at Level1Techs and his friend have been doing some insane, mad-scientist things with the 3700X/3900X and the Navi 5700...

The first video kicks off with Wendell beaming about how he's running a 3900X on a $60 B450 board with 64 GB of RAM...
The tail end of that video is mostly about livestreaming, though...

The second video talks a bit more about benchmarks, PCIe 4 NVMe, and how the PCIe 4 bus actually improves performance for the content-creation workloads they discuss. Either way, the 3700X and 3900X paired with a 5700 are doing quite well for content creation!




For those of us that are stuck in Iray land, we'd still need to get a decent Nvidia GPU instead, but the rest of the stuff still applies. For those of you that aren't tied to CUDA based rendering... might be worth a looksee...

These guys are giddy about Ryzen 3000, and are chomping at the bit for 7nm Threadripper based on what the 3900X is already doing benchmark-wise. As am I...
 
  • Like
Reactions: nillamello

nillamello

Member
Game Developer
Oct 11, 2018
198
626
I'll be interested in checking the video tomorrow, but I dunno about Navi for production... Unless there is no chance at all of using a GPU render engine, I think the production landscape is so heavily skewed towards CUDA that there's almost nothing AMD can do other than create their own render engine that blows everything else out of the water (and it would have to be substantially and undeniably better, no less). Pretty much all of the big guns (Octane, Redshift, Iray, even the Arnold GPU beta) are CUDA-exclusive. The idea of effectively locking myself out of CUDA rendering isn't appealing.
 

OhWee

Forum Fanatic
Modder
Game Developer
Jun 17, 2017
5,889
29,920
AMD does have its own render engine, ProRender. It's slowly seeing integration in a few places, but yeah, Nvidia dominates the rendering world right now.

For those rendering engines that support OpenCL, the AMD cards are a good fit, assuming GPU rendering is incorporated into the engine at all. Blender supports both OpenCL-based and CUDA-based rendering.

But yeah, I think Navi is cool and all, but since I'm heavily invested in Daz Iray assets, I'm pretty much stuck in CUDA land myself. I'd like to buy the Titan RTX, but $2.5k for a video card is a hard pill to swallow. Makes me curious just how fast a CPU-based render might be on a 64-core Threadripper. If it's 'close enough'...

I like to build scenes that can exceed the GPU memory of even an 11 GB Nvidia card, which is why more VRAM is attractive to me. Since VRAM doesn't stack in most cases (NVLink aside), you kinda have to buy the card with the most memory you can afford...
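The decision that matters here boils down to a simple check. A trivial sketch (the scene size and the 1.5 GB driver/framebuffer overhead are made-up numbers; Iray's real footprint includes working buffers beyond the raw assets):

```python
def fits_in_vram(scene_gb, vram_gb, overhead_gb=1.5):
    """Return True if the scene plus an assumed driver/framebuffer overhead
    fits on the card; if not, the render falls back to the CPU."""
    return scene_gb + overhead_gb <= vram_gb

scene_gb = 10.0  # hypothetical heavy Daz scene
print(fits_in_vram(scene_gb, vram_gb=11.0))  # 11 GB card: 10 + 1.5 > 11 -> False
print(fits_in_vram(scene_gb, vram_gb=24.0))  # 24 GB card -> True
```

That cliff is why a big-VRAM card (or a strong many-core CPU fallback) matters more to scene builders than raw GPU speed.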

Anyways, I figured you'd be interested in the 7nm Ryzens at least, or 7nm Threadripper later this year - I'm looking forward to an eventual Threadripper launch! Based on the Daz3D forum discussion I linked in the OP, though, a strong case can be made for 'settling' for the 3950X, since rendering doesn't really care very much about the number of PCIe lanes assigned to a card. Sure, load and finish times will take longer at x4 or below, but once the card is 'set up' with the scene tucked into VRAM, most of the computation stays on the graphics card. In the discussion I linked, the person saw only a 1-second difference in overall render time between x1 and x16. Granted, it's a smaller scene, but yeah...
 

OhWee

Forum Fanatic
Modder
Game Developer
Jun 17, 2017
5,889
29,920
Just a fun note.

The MSI X570 Godlike has four PCIe x16 slots...



They will run at x8/x4/x4/x4 in that configuration, but for rendering that may be just fine. The question is whether four graphics cards could actually coexist on that board...

They aren't cheap, though. $699 US retail price...

I'm personally not a fan of MSI these days, thanks to my uber MSI laptop self-bricking its BIOS, but the four full-length slots are still noteworthy.

I miss the old days, when motherboards came with a bunch of full length slots... Even if they ran at slower speeds, they were still full length slots.

Of course, back in the day graphics cards had their own dedicated AGP slot. Moving graphics cards to PCIe definitely improved slot flexibility. ISA slots were a thing then too, but ISA was on the way out. But I digress.

Anyways, a bunch of full length slots seems to be more of a server thing these days.
 

gamersglory

Xpression Games
Donor
Game Developer
Aug 23, 2017
1,356
3,562
Here is the short of it: if you are going to render a lot and also use it for gaming, it's a much better bargain than Intel's HEDT.
I have an MSI Tomahawk B350 motherboard and it still runs like a champ.
 
  • Like
Reactions: OhWee