The Zen Thread

But they explicitly say it was:
That key was published not by AMD, but by NIST(?). So technically it is a "published key", but it isn't "AMD's published key", or "AMD published their key".

The difference here is that what the poor soul at AMD did was certainly stupid, but not quite as stupid as publishing their private key outright.
 
That key was published not by AMD, but by NIST(?). So technically it is a "published key", but it isn't "AMD's published key", or "AMD published their key".

The difference here is that what the poor soul at AMD did was certainly stupid, but not quite as stupid as publishing their private key outright.
Maybe? But obviously AMD gave NIST the key in the first place and then kept using it, so whether they actually published it or not seems more like splitting hairs than anything.

Once a private key isn't private anymore, you're supposed to stop using it, even if it's only a small exposure. And why ever give NIST a functional private key in the first place? The questions, they multiply.

So I'm still wondering why the exploit here is so much more complex than appears necessary. Are these guys trying to toot their own horn and make themselves look smarter than they actually are? Did it not occur to them to drop the idea of the whole exploit chain and just use the key to sign microcode directly?

edit: one thought.... maybe what was published was a hash of the private key?
 
But obviously AMD gave NIST the key in the first place and then kept using it, so whether they actually published it or not seems more like splitting hairs than anything.
We're still talking past each other. The presumed chronology is as follows:

1. NIST (or whoever) publishes a reference document about cryptographic standards. This document contains an example key.

2. AMD implements their microcode signing process. The person tasked with that job reads the reference document and cluelessly decides to use the published example key.

3. Some security researchers probe AMD's mechanism and implementation, and successfully attack it.

4. The same security researchers unexpectedly stumble over the fact that AMD was using an example key published in the reference documentation.

Does that sound about right?
 

Drizzt321

Ars Legatus Legionis
30,828
Subscriptor++
2. AMD implements their microcode signing process. The person tasked with that job reads the reference document and cluelessly decides to use the published example key.
Might even have been a "using example key for testing, TODO: generate and secure our own key", with the TODO never getting done for whatever various possible reasons.
 
AMD implements their microcode signing process. The person tasked with that job reads the reference document and cluelessly decides to use the published example key.
My impression was that AMD gave NIST the key, not the other way around. But in rereading the text very carefully, I think your interpretation is better, that AMD used an example key, rather than providing their key to NIST. So even dumber on AMD's part.

As an aside, that's a lot of why defensive security is so hard, because you have to get everything right. One intern makes one mistake, and you're toast.

So: yes, I think you're right. But we still haven't answered why there's this big song and dance around coming up with hash collisions, instead of just using the example key to sign evil microcode.

All the criticism about the poor choice of hash appears to be superfluous puffery. It doesn't matter what the hash was, when the private key was exposed.
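
To put that point in code: a minimal sketch (nothing to do with AMD's actual scheme; the RSA-PSS choice, the generated key, and the payload below are purely illustrative) of why hash strength is irrelevant once the signing key is out. Whoever holds the private key can sign whatever they like, and the verifier accepts it.

```python
# Illustrative sketch only - NOT AMD's actual signing scheme.
# Point: if the private key leaks, the verifier accepts anything the
# attacker signs, no matter how strong the hash inside the signature is.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Stand-in for a signing key that was supposed to stay private but leaked.
leaked_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
verifier_key = leaked_key.public_key()   # what the verifying side holds

evil_payload = b"malicious microcode blob (hypothetical)"

pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)

# Attacker signs the malicious payload with the leaked private key...
signature = leaked_key.sign(evil_payload, pss, hashes.SHA256())

# ...and verification succeeds (verify() raises if the signature is bad).
verifier_key.verify(signature, evil_payload, pss, hashes.SHA256())
print("signature accepted - the hash strength never entered into it")
```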
 
So even dumber on AMD's part.
I quoted your statement because it seems so very important to you to repeat it again and again and again.

But we still haven't answered why there's this big song and dance around coming up with hash collisions, instead of just using the example key to sign evil microcode.
Because the researchers didn't know that this was the actual key until after they had broken the code like any other self-respecting attacker: the hard way.
 
Because the researchers didn't know that this was the actual key until after they had broken the code like any other self-respecting attacker: the hard way.
But then the whole exploit doesn't matter. It's not important anymore. It was just the way they found the actual vulnerability. They don't need to be doing any of their other shit with new keys, that's all irrelevant.
 
The algorithm was broken anyways so it hardly matters what key they used.
Without the key, the poor algorithm choice didn't actually matter. AMD should have chosen something else, but without the key material to also work with, even their weak hash method would not have been broken. And with the private key, it shouldn't matter anyway; the scheme is broken before the hash even comes into play.
 
You seem to be under the misguided impression that I am defending AMD. I am not. I do have some sympathy for the poor soul, though, who was in over their head, and ultimately caused the whole mess.
That comment was aimed at @NW's incorrect focus on the hash algorithm. It was a poor choice for the purpose, but its weaknesses wouldn't have mattered without the private key exposure.

It's kind of like complaining about the poor build quality of a house that's massively on fire.
 

Xavin

Ars Legatus Legionis
30,551
Subscriptor++
ASRock finally un-beta'ed 3.20 for my m/b, so I installed it just in case. I almost didn't, given that I was stable, but I did anyway.
I think given how software-based CPUs are and how hard it is to brick an MB these days, the old conventional wisdom to never update your BIOS unless you are trying to fix a problem is probably obsolete. There are enough security vulnerabilities popping up that everyone should probably update at least every six months or a year. It goes against all my old computer tech intuition, but times change.
 
I think given how software-based CPUs are and how hard it is to brick an MB these days, the old conventional wisdom to never update your BIOS unless you are trying to fix a problem is probably obsolete. There are enough security vulnerabilities popping up that everyone should probably update at least every six months or a year. It goes against all my old computer tech intuition, but times change.
MBR BIOS boards were dead simple. UEFI BIOSes are extremely complex pieces of software, and IMO need updates just like OSes do. They are an OS.
 
I think given how software-based CPUs are and how hard it is to brick an MB these days, the old conventional wisdom to never update your BIOS unless you are trying to fix a problem is probably obsolete. There are enough security vulnerabilities popping up that everyone should probably update at least every six months or a year. It goes against all my old computer tech intuition, but times change.
I think it was around 2011 or so that updating a BIOS became common for me to do even if there was nothing wrong with the system. It was mostly because the OEMs started to release a boatload of updates, compared to the past, when updates were intermittent and usually only came when something was really wrong.
I have yet to brick a system doing an update, but my pulse was always very high back then, and kind of still is, probably because the fear of bricking anything has been deeply ingrained into me :)
 
I have yet to brick a system doing an update, but my pulse was always very high back then, and kind of still is, probably because the fear of bricking anything has been deeply ingrained into me :)
If you've got a board with the flashback feature, it doesn't need a working CPU, so even if you brick it, you can unbrick it again.
 
  • Like
Reactions: Baenwort
If you've got a board with the flashback feature, it doesn't need a working CPU, so even if you brick it, you can unbrick it again.
I know that :) But it is still a "muscle memory" from way back that is hard to shake. Come to think of it, I cannot even remember the last time someone bricked a motherboard or a computer by updating the BIOS.
 
I know that :) But it is still a "muscle memory" from way back that is hard to shake. Come to think of it, I cannot even remember the last time someone bricked a motherboard or a computer by updating the BIOS.
I've never yet done that in, um, I guess it would be about thirty years since BIOS updates became a thing? But I was still careful to buy this Zen 3 board with flashback, juuuust in case. (also because malwared BIOS files are becoming possible.)
 

Drizzt321

Ars Legatus Legionis
30,828
Subscriptor++
I know that :) But it is still a "muscle memory" from way back that is hard to shake. Come to think of it, I cannot even remember the last time someone bricked a motherboard or a computer by updating the BIOS.
About the time people stopped setting the jumpers for their FSB frequency and multiplier? :p

Yes, we're old.
 
  • Love
Reactions: Made in Hurry
Nice to see some CPU scaling tests with an RTX 5090 and some older CPUs at realistic resolutions (1440p & 4K):

TL;DR - the 5800X3D is on average 6% behind the 9800X3D, though 1% lows are noticeably lower.

It's 88% as fast at 1440p, where the CPU is more of the bottleneck than the card. And the lows are only 67% as fast.

That said, the average 1% low is 124.8, which is still pretty damn fast, more than my 120Hz screen here. With VRR, I suspect the hitch would be barely noticeable on a faster screen. I'm sure I wouldn't see it.

I'd love a 9800X3D, but this system is so solid that I don't want to mess with it. I'm happy that the CPU is holding up as well as it does. The in-place upgrade was super worthwhile.
 

Drizzt321

Ars Legatus Legionis
30,828
Subscriptor++
It's 88% as fast at 1440p, where the CPU is more of the bottleneck than the card. And the lows are only 67% as fast.

That said, the average 1% low is 124.8, which is still pretty damn fast, more than my 120Hz screen here. With VRR, I suspect the hitch would be barely noticeable on a faster screen. I'm sure I wouldn't see it.

I'd love a 9800X3D, but this system is so solid that I don't want to mess with it. I'm happy that the CPU is holding up as well as it does. The in-place upgrade was super worthwhile.
The 5800X3D was a magical upgrade, and a major platform life extender. What's even more amazing, it's in my original X370 board! And with some of my original RAM.
 

sakete

Ars Scholae Palatinae
865
Subscriptor++
So as a current 5950X owner, it appears that any upgrade from here would both consume more power and generate more heat. I get more performance, yes, but I don't like that trade off. And I mostly use the power in games, occasionally for other intense processing purposes. What I'm seeing so far is that at least in games, the latest Zen chips provide only very modest to negligible improvements in terms of framerate (when running at 4K with a RTX 4090, which is my setup).

Am I correct in this assessment?
 
Some of the crap that motherboard manufacturers do is egregious. Case in point:

Somehow the firmware was prompting Windows to re-enable Gigabyte's shitty updater service on boot, and you have to navigate to a secret menu to disable this behavior.

I have this board and while Gigabyte's hardware might be good, their UEFI interface is terrible even by the comically low standards of the industry.
 

Asral

Ars Scholae Palatinae
1,218
Subscriptor
Somehow the firmware was prompting Windows to re-enable Gigabyte's shitty updater service on boot, and you have to navigate to a secret menu to disable this behavior.
It's a Windows feature called "Windows Platform Binary Table" that has existed for years now, which is basically just a way for the BIOS/UEFI to include some software that Windows will automatically run on boot.

The intention is for this to be used only for critical software like important drivers, or stuff like anti-theft software. But of course hardware manufacturers will never miss an opportunity to abusively force-install their crappy bloatware.

The first time I came across it was on the Asus board I bought over 5 years ago (luckily I spotted the option in the BIOS and googled what it was before I installed Windows), and I pretty much assume that all motherboard manufacturers do it. It is complete bullshit, but the only way around it is to remember to disable the option in the BIOS (if possible) or to not run Windows.
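
If you're curious whether your own firmware ships a WPBT at all, here's a rough sketch that lists the ACPI tables via the Win32 EnumSystemFirmwareTables API (the 'ACPI' provider value is the documented one; the rest is just illustration, Windows-only, assuming Python with ctypes):

```python
# Quick check (Windows only): does the firmware expose a WPBT ACPI table?
# Sketch using the Win32 EnumSystemFirmwareTables API via ctypes.
import ctypes

ACPI_PROVIDER = 0x41435049  # 'ACPI' firmware table provider signature

kernel32 = ctypes.windll.kernel32

# First call with no buffer returns the required buffer size in bytes.
needed = kernel32.EnumSystemFirmwareTables(ACPI_PROVIDER, None, 0)
buf = ctypes.create_string_buffer(needed)
kernel32.EnumSystemFirmwareTables(ACPI_PROVIDER, buf, needed)

# The buffer is a packed array of 4-byte ACPI table signatures.
signatures = [buf.raw[i:i + 4] for i in range(0, needed, 4)]
print("ACPI tables:", b", ".join(signatures).decode(errors="replace"))
print("WPBT present:", b"WPBT" in signatures)
```

Even if it shows up, the BIOS toggle (when the vendor provides one) is still the cleanest way to keep the payload from running.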
 
  • Like
Reactions: Baenwort
The first time I came across it was on the Asus board I bought over 5 years ago
What was particularly egregious there was that Armoury Crate was so bad that it caused the Zen 3 motherboards it was on to crash frequently. Removing that garbage software generally made the computers immediately reliable. It appeared that pretty much all Zen 3 ASUS boards were unstable with that software running, and then with the x570 hardware issues (severe USB problems when PCIe4 was enabled was probably the worst one), Zen 3 ASUS users were having a real bad time.
 

Asral

Ars Scholae Palatinae
1,218
Subscriptor
What was particularly egregious there was that Armoury Crate was so bad that it caused the Zen 3 motherboards it was on to crash frequently. Removing that garbage software generally made the computers immediately reliable. It appeared that pretty much all Zen 3 ASUS boards were unstable with that software running, and then with the x570 hardware issues (severe USB problems when PCIe4 was enabled was probably the worst one), Zen 3 ASUS users were having a real bad time.
It was pure luck that I noticed it early in the BIOS settings and looked it up. There were a couple of things I knew I wanted to change in BIOS before installing the OS, so I just looked through all the other pages while I was in there anyway and found a weird unexplained setting that just said "Armoury crate service" or something like that. A quick google showed it was something I had no interest in so I disabled it.

In my case it was Zen 2 (on B450) though, Zen 3 wasn't released at that point. I don't know if Zen 2 was also affected or not (don't remember seeing anything about when I installed), but either way I'm glad I didn't have to find out.
 
In my case it was Zen 2 (on B450) though, Zen 3 wasn't released at that point. I don't know if Zen 2 was also affected or not (don't remember seeing anything about when I installed), but either way I'm glad I didn't have to find out.
I don't know either. Zen 3 was where AMD first hit the big time, so that's when that horrible software started to get some real exposure.

I consider it radioactive, now. Even if it's better, I just do NOT install it.
 
The real crime is that it's 2025 and Windows Update still does not distribute most (or really almost any at all) motherboard driver updates. It's been able to securely distribute drivers since the late 1990s, but sure, whatever, put a firmware backdoor in place to load motherboard drivers and crapware instead. What could go wrong?
I can see not doing BIOS updates that way, but yeah, it's weird that motherboard drivers aren't in Windows Update. I wonder what the thinking is?
 

hansmuff

Ars Tribunus Angusticlavius
9,538
Subscriptor++
So as a current 5950X owner, it appears that any upgrade from here would both consume more power and generate more heat. I get more performance, yes, but I don't like that trade off. And I mostly use the power in games, occasionally for other intense processing purposes. What I'm seeing so far is that at least in games, the latest Zen chips provide only very modest to negligible improvements in terms of framerate (when running at 4K with a RTX 4090, which is my setup).

Am I correct in this assessment?
Depending on the game, you may be correct. I actually went from a 5950X to a 9800X3D. I realize you may instead be looking at a 9950X, or waiting for the 9950X3D, since you mention more power/heat. But I want to lay this out for you:

One of the reasons for my upgrade is better 1% frame rates at 4K (4080S here), and sometimes much higher averages in some games I care about (for instance Baldur's Gate 3, some Factorio). STALKER 2 ran okay-ish, definitely playable, but it's a title where you flick the mouse around a lot, and the 5950X had a hard time keeping the frame rate over 70 sometimes. It IS playable, of course, but it was a common theme with the games I play: averages were alright-ish (all in 4K), but lows were not great at all.
The 9800X3D in STALKER 2 mostly bumps against my 144Hz refresh and doesn't go under 110fps that I have seen, and the game feels so much smoother. BG3 is an outlier where the 9800X3D downright murders everything else (again, even in 4K.)
The benefits may be quite a bit higher on the 4090.

I initially bought the 5950X because I work from home, and I compile a lot of code, run large databases, some VMs, just a lot of concurrent tasks. The 16 cores seemed a natural fit. And really that was an awesome workstation that never flailed.

In another thread on Ars, someone challenged my perceived need for the 2nd CCX with its extra 8 cores. I thought of course I need them, to which they replied "test it." I forget which thread, or I'd link it.

I disabled the 2nd CCX in the BIOS and did my regular workloads. My specific code compilation load (and I will stress this: code compilation is not a single entity, and some builds would definitely use those extra cores) just did not benefit. One CCX was just as quick as two.
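
For anyone who wants to run the same "test it" experiment without a trip to the BIOS, here's a rough sketch: pin a build to half the logical CPUs and time it against all of them. The build command is a made-up placeholder, psutil is a third-party module, and whether "first half of the logical CPUs" actually maps to one CCD depends on how your OS numbers the cores, so check your topology first.

```python
# Rough experiment: time the same build pinned to half the logical CPUs
# vs. all of them. The build command is a placeholder - use your own.
import subprocess
import time

import psutil  # third-party: pip install psutil

BUILD_CMD = ["msbuild", "MySolution.sln", "/m"]  # hypothetical example build


def timed_build(cpus):
    # Pin this Python process; child processes inherit the affinity mask
    # on both Windows and Linux, so the build is constrained to `cpus`.
    psutil.Process().cpu_affinity(cpus)
    start = time.perf_counter()
    subprocess.run(BUILD_CMD, check=True)
    return time.perf_counter() - start


all_cpus = list(range(psutil.cpu_count(logical=True)))
half_cpus = all_cpus[: len(all_cpus) // 2]  # may or may not equal one CCD

# Run each case more than once if you want to average out disk/cache effects.
print("half the CPUs:", round(timed_build(half_cpus), 1), "s")
print("all CPUs     :", round(timed_build(all_cpus), 1), "s")
```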

On the 9800X3D I can run large databases on local MSSQL, VMs, Visual Studio, etc. perfectly fine (both builds are 64GB) without wishing for more cores. The 8 cores, in my case, are totally fine for now. The improvements in IPC, clock speed and RAM performance completely make up for, and surpass, the losses from fewer cores in my use cases.

I also looked at compile benchmarks on Phoronix, where you can find the 9800X3D vs the 5950X for a Linux kernel compile. Lo and behold, the 9800X3D is 10-ish% faster than the 5950X. I see similar benefits on my end. I was already OK with 5950X-level perf there, so no concerns.

The 9800X3D uses WAY less power than the 5950X. My machines are always quiet, I buy the equipment I need to make them so, but the 9800X3D is SO very easy to keep cool. Any half-decent air cooler or AIO is absolutely sufficient.
That chip has only ever seen 80C under extreme stress, and the cooler (AIO in my case) still just laughed at it and kept the fans at ~1000 RPM. The 5950X was a different story; I still had it quiet, but it took a lot more effort, and it definitely used a good bit more power.

This is a long way of saying: carefully evaluate your workloads. You may be surprised by the benefits even going down to half the cores, though of course this is WILDLY YMMV.

But specifically for games, people love to bleat about how the GPU is the only limit at 4K. It is if you are only looking at max FPS in heavily GPU-intensive workloads, like RT. I personally find the difference very noticeable, and it's not rose-colored glasses; I always have an FPS indicator up.
 
  • Like
Reactions: sakete