AMD unveils Ryzen 9000 CPUs, debuts new Zen 5 architecture

I went from a 3900x to a 5900x with no real noticeable difference in PC speed within applications; is it going to be the same going from the 5900x to a 9900x, I wonder? Sometimes I wonder if it's better to forget typical upgrade routes and just do a system-wide upgrade once a decade!
 
I went from a 3900x to a 5900x with no real noticeable difference in PC speed within applications; is it going to be the same going from the 5900x to a 9900x, I wonder? Sometimes I wonder if it's better to forget typical upgrade routes and just do a system-wide upgrade once a decade!
Zen 6 will be a total redesign. Zen 5 is just Zen 3++. Zen 6 is also rumoured to get a new IO die capable of faster RAM speeds, along with double the core count.
 
Zen 5 is the redesign. That's exactly why the clock rates don't need to be bumped, yet it still gets a good speed-up. Zen 6 will probably be a clock-rate bump ... although I could see a change in the L3 caching happening too. There is a desire to resolve the problem around using multiple V-cache dies.
 
Everyone's looking at Zen 5 details while what I'm thinking is: "How much will the price of the 7800X3D + MB decrease as a result of all this?"
Same here. I'm still waiting for a decent price on a B650E/X670E board, under $200. The 7800x3d under $300 is my tipping point.
 
I wouldn't say that. They lowered the TDP of most of the CPUs instead of going the Intel route, where they try to max out power draw. This is a win in my book.
But why? You can manually limit the TDPs yourself; it takes like 3-5 seconds. That's a nothingburger
 
I went from a 3900x to a 5900x with no real noticeable difference in PC speed within applications; is it going to be the same going from the 5900x to a 9900x, I wonder? Sometimes I wonder if it's better to forget typical upgrade routes and just do a system-wide upgrade once a decade!
Must be a pretty light application if you saw 0 improvement. In your case, yeah, don't bother upgrading.

For me, the moves from 3700x to 5800x to 5800x3d all produced noticeable differences. The 5800x3d is incredible.
 
But why? You can manually limit the TDPs yourself; it takes like 3-5 seconds. That's a nothingburger
Because this means you get the quoted performance at the factory TDP, unlike Intel where you can lower the TDP yourself and promptly get LESS than the advertised performance.

I don't usually worry about power draw one way or another myself, but Intel's high-end 13/14 series chips literally burning out their silicon prematurely and malfunctioning, because the modes that deliver their originally advertised performance try to shove 250 watts through the chips, would be enough of a reason for me to actually take TDP into consideration. Keep 100% performance and fry my CPU investment early, go to 95% performance and keep the chip for longer, or just pick AMD, keep 100% performance without needing to worry about the power, and probably outperform the Intel chip anyway? Not even a contest for me lol
 
But why? You can manually limit the TDPs yourself; it takes like 3-5 seconds. That's a nothingburger
As Adam said, limiting TDP can and will lower max performance. And it will still use more power than AMD.

Most people just can't do that, it's well beyond their tech knowledge (and I'm talking about the vast majority). I just spent 2 hours on a call trying to explain to somebody how to connect their new smartwatch to their phone and still didn't manage to do it, because he could not complete the "I'm not a robot" mini game and got it wrong too many times...
 
Because this means you get the quoted performance at the factory TDP, unlike Intel where you can lower the TDP yourself and promptly get LESS than the advertised performance.

I don't usually worry about power draw one way or another myself, but Intel's high-end 13/14 series chips literally burning out their silicon prematurely and malfunctioning, because the modes that deliver their originally advertised performance try to shove 250 watts through the chips, would be enough of a reason for me to actually take TDP into consideration. Keep 100% performance and fry my CPU investment early, go to 95% performance and keep the chip for longer, or just pick AMD, keep 100% performance without needing to worry about the power, and probably outperform the Intel chip anyway? Not even a contest for me lol
Why does "quote" performance even matter? Yes if you get a 125w chip (like the 7700x) and limit it to 65w it will be slower, but it's not any slower than if the 7700x was released at 65w originally. So what difference does it make whether the 9700x is a 65w chip or a 125w chip? You can limit it at whatever value you like.


Your comparisons are just silly though. You can buy a 14900 non-K or a 14900T and get 100% of the performance at 35 watts. There you go. You just have a very weird and unreasonable way of looking at things. If 90% of chip A's performance is higher than 100% of chip B's performance, then obviously I'd get A and run it at 90%. Why would I even consider B?

It reminds me of an argument I had with a friend; he kept saying that his 7700x was better than my 12900k because my CPU drew a lot of power in MT workloads. And I was like, I can limit it to 125w and it will still be faster than yours in those workloads. And his reply was exactly like yours: "oh yeah, but you don't get the advertised performance". Well, who cares? The lower-than-advertised performance is still faster than your CPU, so??
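To put that "90% of chip A vs 100% of chip B" argument into numbers, here's a minimal sketch; every figure in it is hypothetical and only illustrates the arithmetic, not any real benchmark result:

/* Sketch of the "90% of chip A vs 100% of chip B" argument.
 * All scores and scaling factors below are hypothetical. */
#include <stdio.h>

int main(void) {
    double chip_a_stock   = 38000.0; /* hypothetical MT score of chip A at its full power limit  */
    double chip_b_stock   = 30000.0; /* hypothetical MT score of chip B at its lower power limit */
    double retained       = 0.90;    /* assume A keeps ~90% of its score when power-limited,
                                        since MT performance scales sublinearly with power       */
    double chip_a_limited = chip_a_stock * retained;

    printf("Chip A, power-limited: %.0f\n", chip_a_limited);
    printf("Chip B, stock:         %.0f\n", chip_b_stock);
    printf("A (limited) is still %.1f%% faster than B (stock)\n",
           (chip_a_limited / chip_b_stock - 1.0) * 100.0);
    return 0;
}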
 
As Adam said, limiting TDP can and will lower max performance. And it will still use more power than AMD.

Yeah, but if the CPU is limited to a 65w TDP it's already "slower" than if it had been released at 125w. What practical difference does it make to you what TDP AMD decides to release their CPUs at? I just don't get why people care about that.
 
Ryzen has been outselling Intel in the DIY desktop market for a while now.
That's rather like boasting a lead in the home surgery DIY market. The OEM & VAR markets encompass the vast majority of revenues -- and do so without the hassles and high return rate of DIY sales.

At this point AMD has an infinite advantage in AVX512 performance on consumer processors
This is hardly the disadvantage you portray it as. Intel w/ AVX-256 beats AMD w/ AVX-512 on many workloads. And in any case, AVX is a dead-end technology -- wide SIMD workloads should be performed on the GPU at this point, not the CPU.

I say this despite my using exclusively AMD CPUs on my home machines ... but I really don't see AVX being anything but a legacy-support component within the very near future.
 
This is hardly the disadvantage you portray it as. Intel w/ AVX-256 beats AMD w/ AVX-512 on many workloads. And in any case, AVX is a dead-end technology -- wide SIMD workloads should be performed on the GPU at this point, not the CPU.

I say this despite my using exclusively AMD CPUs on my home machines ... but I really don't see AVX being anything but a legacy-support component within the very near future.
I would like to see examples of that.

As for AVX generally, AVX has a huge latency advantage over the GPU, something GPUs are not likely to offset for a long time. Intel also keeps developing AVX further, and ARM has its own vector instructions.
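For anyone wondering what is actually being argued about, here's a minimal sketch (not from any vendor library) of the kind of loop AVX-512 accelerates: the data stays in CPU registers and caches with no driver call or PCIe copy, which is where the latency edge over a GPU offload comes from. It assumes a compiler and CPU with AVX-512F support; otherwise it falls back to the scalar loop.

/* Minimal AVX-512 sketch: element-wise addition of two float arrays.
 * Build with e.g. gcc -O2 -mavx512f; without AVX-512F it compiles to the scalar path only. */
#include <immintrin.h>
#include <stdio.h>

static void add_f32(const float *a, const float *b, float *out, int n) {
    int i = 0;
#ifdef __AVX512F__
    for (; i + 16 <= n; i += 16) {            /* 16 floats per 512-bit register */
        __m512 va = _mm512_loadu_ps(a + i);
        __m512 vb = _mm512_loadu_ps(b + i);
        _mm512_storeu_ps(out + i, _mm512_add_ps(va, vb));
    }
#endif
    for (; i < n; i++)                        /* scalar tail / non-AVX-512 fallback */
        out[i] = a[i] + b[i];
}

int main(void) {
    float a[32], b[32], out[32];
    for (int i = 0; i < 32; i++) { a[i] = (float)i; b[i] = 2.0f * i; }
    add_f32(a, b, out, 32);
    printf("out[31] = %.1f\n", out[31]);      /* expect 93.0 */
    return 0;
}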
 
That's rather like boasting a lead in the home surgery DIY market. The OEM & VAR markets encompass the vast majority of revenues -- and do so without the hassles and high return rate of DIY sales.


This is hardly the disadvantage you portray it as. Intel w/ AVX-256 beats AMD w/ AVX-512 on many workloads. And in any case, AVX is a dead-end technology -- wide SIMD workloads should be performed on the GPU at this point, not the CPU.

I say this despite my using exclusively AMD CPUs on my home machines ... but I really don't see AVX being anything but a legacy-support component within the very near future.
"Intel w/ AVX-256 beats AMD w/ AVX-512 on many workloads." - I don't know what workloads you are talking about, but in general that should not be the case.
 
Ryzen has been outselling Intel in the DIY desktop market for a while now. It's why they've been gaining a lot of market share in recent years: it hit a record high of 23.9% in Q1 2024, up from 19.8% in Q4 2023 (Mercury Research).

Looking at Amazon's top CPUs (US Amazon), AMD currently holds the top 8 positions, with Intel at 9 and 10. We also have reports of AMD CPUs massively outselling Intel in Europe and Asia.

Mindfactory sales show AMD above 90%.

As for mobile market share, that also grew a lot because of how badly Intel dropped the ball with Meteor Lake. It's up from 16.2% to 19.3%.

A few years are not enough to offset such a large difference, but if Intel's next generations are not good then AMD will eventually surpass it.

The worst part for Intel is that AMD now has 23.6% market share in servers (up from 1.3% in just 6 years), with a massive 33% revenue share.

Intel's only saving grace is that OEMs are slow to transition away from Intel. They only barely started adding proper AMD systems to their portfolios in the last 2-3 years. I'm not sure if Intel is "sweetening" the deal for them or not (like they did in the past), but the slower-than-expected sales in this market have been showing a positive effect for AMD recently.
Should I click, should I not click on the x-marks-the-spot link... kiddo.
 
We don't know that yet. Let's wait for third-party testing :)
Seeing as AMD shared gaming benchmarks vs Intel, and this website provides benchmarks for the 7800X3D vs the Intel 14900K, you can clearly see the Ryzen 9950X will smash the 7800X3D.
[AMD gaming benchmark slide]


Cyberpunk 2077
9950X > 14900K by 13%
7800X3D: 189 fps
14900K: 169 fps
9950X: 190 fps

Horizon Zero Dawn
9950X > 14900K by 23%
7800X3D: 272 fps
14900K: 243 fps
9950X: 298 fps

F1 23
9950X > 14900K by 16%
7800X3D: 236 fps
14900K: 230 fps
9950X: 266 fps


Benchmark link: 7800X3D vs 14900K
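For what it's worth, the percentage gaps can be re-derived from the quoted FPS numbers; here's a quick sketch of the arithmetic (with these figures the Cyberpunk gap comes out nearer 12% than 13%, the others match once rounded):

/* Recomputes the 9950X vs 14900K gaps from the FPS numbers quoted above. */
#include <stdio.h>

static void gap(const char *game, double fps_9950x, double fps_14900k) {
    printf("%-18s 9950X %.0f fps vs 14900K %.0f fps -> +%.1f%%\n",
           game, fps_9950x, fps_14900k, (fps_9950x / fps_14900k - 1.0) * 100.0);
}

int main(void) {
    gap("Cyberpunk 2077",    190.0, 169.0);   /* ~ +12.4% */
    gap("Horizon Zero Dawn", 298.0, 243.0);   /* ~ +22.6% */
    gap("F1 23",             266.0, 230.0);   /* ~ +15.7% */
    return 0;
}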
 
Why does "quote" performance even matter? Yes if you get a 125w chip (like the 7700x) and limit it to 65w it will be slower, but it's not any slower than if the 7700x was released at 65w originally. So what difference does it make whether the 9700x is a 65w chip or a 125w chip? You can limit it at whatever value you like.


Your comparisons are just silly though. You can buy a 14900 non-K or a 14900T and get 100% of the performance at 35 watts. There you go. You just have a very weird and unreasonable way of looking at things. If 90% of chip A's performance is higher than 100% of chip B's performance, then obviously I'd get A and run it at 90%. Why would I even consider B?

It reminds me of an argument I had with a friend; he kept saying that his 7700x was better than my 12900k because my CPU drew a lot of power in MT workloads. And I was like, I can limit it to 125w and it will still be faster than yours in those workloads. And his reply was exactly like yours: "oh yeah, but you don't get the advertised performance". Well, who cares? The lower-than-advertised performance is still faster than your CPU, so??


My nearly 2-year-old 7700x is superior because it sits on the AM5 platform, which means upgrading to the 9800X3D later this year is a 30-minute job. After that, I plan on upgrading my XTX to a new multi-chip RDNA5 gaming card.

CPUs are only one part of a system. The backbone and motherboard you choose to build your system on matter more than the CPU you buy at the time. I've won over many of my gaming buds, as years ago several of them went through 3 different motherboards during the same 6 years I went through one AM4 system (1700x → 3900x → 5800X3D).

Tearing apart your Intel system every year to stay on top for gaming sucks.
 
Seeing as AMD shared gaming benchmarks vs Intel, and this website provides benchmarks for the 7800X3D vs the Intel 14900K, you can clearly see the Ryzen 9950X will smash the 7800X3D.
[AMD gaming benchmark slide]


Cyberpunk 2077
9950X > 14900K by 13%
7800X3D: 189 fps
14900K: 169 fps
9950X: 190 fps

Horizon Zero Dawn
9950X > 14900K by 23%
7800X3D: 272 fps
14900K: 243 fps
9950X: 298 fps

F1 23
9950X > 14900K by 16%
7800X3D: 236 fps
14900K: 230 fps
9950X: 266 fps


Benchmark link: 7800X3D vs 14900K
I'm still waiting for third-party results. I agree Zen 5 should, on paper, be faster, but these are first-party results, which could be cherry-picked and run with weird benchmark settings.
 