Now the overclock-curious can buy a delidded AMD 9800X3D, with a warranty

I'm curious what people are doing that would require CPU overclocking at home. Does a 5-10% increase really make that much of a difference in a game if the other specs are the same?
No. And that's why I'm effectively out of the overclocking game, myself.

Gone are the days of pencil-modding a Duron's voltage to boot a 190MHz FSB on an SDR board, using BGA-mounted SDRAM to hit such ridiculous frequencies (42% over stock clock on RAM and CPU FSB).

AMD and Intel don't leave the kind of headroom they used to. If you combined a delid with a good water cooler, you might get a few hundred MHz, but I doubt you'd even get 10%. I know the 9800X3D has its cache mounted on the bottom and picked up some clock from doing so, but I'd still be leery about OC'ing it. The cache may now be a secondary heat path, but it's still a heat path.

Heavy workloads these days are all multi-threaded, and you'd just do better stepping up to a higher-end chip as opposed to trying to squeeze a lower SKU for extra frequency.

This is for people who want the best chance at the fastest chip because they want it, not because they'll pick up some enormous real-world advantage.
 
Upvote
8 (8 / 0)
This really is more about maximizing thermal coupling for cooling in order to increase the opportunistic boosting of the X3D chip. They're largely hitting their OC max out of the factory.

I'm sure you can get away with absolutely silly overclocking under liquid nitrogen, but for more normal use cases, you're going to get more performance by just using a larger/more efficient cooling system and high-performance thermal compound on the existing IHS, and then undervolting and tweaking other settings for stability and efficiency.

I do remember when CPU dies were naked straight from the factory. The big IHS problem really started with Intel a while back -- when they got caught using thermal compound under their IHS that wasn't even as good as mid-range compounds sold for coupling the IHS to CPU coolers. Delidding seemed like a no-brainer for those. AMD upped the ante with Ryzen, not only for stock performance but with a better IHS/thermal interface material design that forced Intel to fall back in step. Today, most die-to-IHS coupling compounds are just fine for anything other than extreme/competitive overclocking, where CPUs are boosted to insane speeds for short periods just to reach high numbers. If all you want to do is run applications and games at maximum practical performance, delidding isn't really as necessary as it was for a (rather Intel-centric) while.
 
Upvote
9 (9 / 0)

cerberusTI

Ars Tribunus Angusticlavius
6,949
Subscriptor++
$230 extra for the modded CPU and warranty coverage is surprisingly reasonable. People spend more than that on water-cooling gear for what are ultimately smaller improvements.
I think they still missed the market though. This would be better sold as a barebones system with a good MB, and options for either the best Noctua air cooler or whoever is seen as the best in water cooling.
 
Upvote
-4 (0 / -4)

jandrese

Ars Legatus Legionis
13,444
Subscriptor++
I remember cranking my Pentium 75 up to 100MHz by just swapping a couple of jumpers around and getting noticeably better framerates in Quake. These days you're hard pressed to have any visible advantage from an overclock. Most games end up bottlenecked on the GPU, media encoding is usually offloaded to the video card, and we're well past the point where you're going to see improvements in office software. The practical use cases have all but vanished, so all that's left are the "enthusiasts" for whom it is the overclock itself that provides the entertainment, not what you do with it.
 
Upvote
4 (4 / 0)
Yeah. They specifically say they aren't testing for OC, just stock functionality after delidding. Theoretically, you might not be planning to OC a delidded CPU and might just want better cooling for a silent PC build with a direct-die water cooler or something along those lines. The 9800X3D isn't too power hungry, so it probably isn't really needed, especially if you undervolt with Curve Optimizer or apply a lower power profile, but there could still be some benefit.
Although, the chip will automatically sort of overclock itself (well, it is technically not an overclock I guess, because this is the factory default, but…), improving performance if there's thermal headroom, right?
 
Upvote
2 (2 / 0)

taxythingy

Ars Praetorian
471
Subscriptor
Yeah. The good old days of running dual Celeron 300As. That was a ridiculously good system for the era at a pretty good price, until the 300A got so popular that the price went through the roof and they sold above MSRP.
The Core 2 Duo processors were pretty darn good value when overclocked by 50% or more. I had an E6300 up from 1.8 GHz to about 3.2 GHz and an E6600 up from 2.4 GHz to 3.6 GHz.
 
Upvote
4 (4 / 0)

phred14

Ars Scholae Palatinae
673
Subscriptor
I don't overclock my stuff anymore. Baseline performance on decent chips has been good enough for a while now that I don't find it to be worth the hassle.

That said, if overclocking is going to happen, I am very much in the "it can't overheat if it's immersed in cryogenic liquid" school of thought.
A former employer fiddled around for a bit with phase-changing fluorocarbon cooling. I know it's not the cryogenic liquid of which you speak, but it's a step on the way there, and I suspect few hobbyists will have the proper chillers for cryogenic. Besides, ordinary chips, even mil-spec, are qualified for -55C to 155C, and cryogenics are too cold. It can be done, but it's not fall-off-a-log, and on consumer chips you're likely to find new race conditions that were never simulated or tested for because the temperature is so different.

Anyway, back to fluorocarbons. They found that the phase change (evaporative cooling) deposited impurities on the backs of the chips, eventually coating them, "insulating" them, and causing thermal runaway. Attempts to clean the impurities out did not meet with success, either. They did more with such things, but abandoned allowing the medium to evaporate and just used fluid cooling.
 
Upvote
11 (11 / 0)

koolraap

Ars Tribunus Militum
2,188
I miss the days of swapping the crystal on the motherboard for another one (can't remember if it was socketed or soldered). Upgrade the clock from 25MHz to 40MHz on an AMD DX4? OMG. One hundred and SIXTY megahertz! Mega!

Those were the days, get off my lawn. I think you're on my lawn, anyway. Where are my glasses?
 
Upvote
12 (12 / 0)

ranthog

Ars Tribunus Angusticlavius
13,580
What really gets my goat, as an Australian: I can buy the delidded one from TG/Der8auer and have it shipped from Germany, and it would cost me less than current stock prices of the 9800X3D on various Aus vendor sites. 😭
We found a new use case for this! Except the custom cooling solution you'd need for the different die height would probably kill your savings.
 
Upvote
1 (1 / 0)

ranthog

Ars Tribunus Angusticlavius
13,580
No. And that's why I'm effectively out of the overclocking game, myself.

Gone are the days of pencil-modding a Duron's voltage to boot a 190MHz FSB on an SDR board, using BGA-mounted SDRAM to hit such ridiculous frequencies (42% over stock clock on RAM and CPU FSB).

AMD and Intel don't leave the kind of headroom they used to. If you combined a delid with a good water cooler, you might get a few hundred MHz, but I doubt you'd even get 10%. I know the 9800X3D has its cache mounted on the bottom and picked up some clock from doing so, but I'd still be leery about OC'ing it. The cache may now be a secondary heat path, but it's still a heat path.

Heavy workloads these days are all multi-threaded, and you'd just do better stepping up to a higher-end chip as opposed to trying to squeeze a lower SKU for extra frequency.

This is for people who want the best chance at the fastest chip because they want it, not because they'll pick up some enormous real-world advantage.
More importantly, AMD and Intel don't leave the headroom on the floor and instead give it to us in boost frequencies. Basically every chip is OC'd out of the factory the way we would OC back in the day. Everyone wins compared to when they tried to frequency-lock CPUs that could run much faster.

These days you do it because you enjoy tinkering with your computer. If you do it for the joy, then it is worth the money. That is going to be the real answer. For normal desktop users, even gamers, the really high-end stock chips are probably unnecessary compared to a mid-range chip with a good cooler.

(There are workloads that do need more power than a desktop/gaming scenario, but those scenarios usually call for long-term stability.)
 
Upvote
13 (13 / 0)

ArsPlebeian

Smack-Fu Master, in training
99
some 10K U160 SCSI drives (the 80 pin Seagate Cheetahs weren't that expensive relatively speaking), you could get pretty blazing performance (at the expense of sounding like a jet taking off when booting) for the time.
Thanks for the memories, forgot the name was Cheetah. I was a lucky kid to get some 10k SCSI drives in what I attempted to make into a gaming computer. Poor thing was quite loud when running in the 95° attic bedroom.
 
Upvote
4 (4 / 0)

Hymenoptera

Ars Scholae Palatinae
658
AMD and Intel don't leave the kind of headroom they used to.
More specifically: manufacturing processes have improved to allow tighter tolerances, and software has improved to permit tailored "boosts," which is less fun but gives more people something close to the full capability of the specific CPU sample they own.

we're well past the point where you're going to see improvements in office software.
Which is a real pity given how much faster office software used to be (but that has more to do with a lack of optimization than with CPU speed – and the subscription model gives optimization efforts no business justification, as money keeps coming in regardless of improvements).
 
Upvote
5 (5 / 0)

Thegs

Ars Scholae Palatinae
780
Subscriptor++
I've always been curious about delidding my CPU. I've got a custom water loop for noise purposes, and it's easy to feel like there are gains to be made since my 5800X3D can seemingly get so hot so quickly. Put ~122W through the CCD and it jumps up to ~85C. Whereas I can push something like ~230W through the compute core of my 3090 and it never goes over 55C. It seems like there's something wrong here, and maybe the IHS is the issue, but the surface area of the CPU's CCD is only 74 mm², compared to the 3090's 628 mm². So just with head math, my CPU is really dissipating about 1.65W per mm², compared to the GPU's roughly 0.37W per mm². I suppose that's pretty good, all things considered, which makes me unsure how much more there is to be gained from delidding...
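For anyone who wants to redo that head math, here's the same power-density comparison as a quick Python sketch (the wattages and die areas are just the ballpark figures quoted above, not measured values):

```python
# Napkin math: heat flux (W/mm^2) through the die for a 5800X3D CCD
# vs. an RTX 3090 die, using the ballpark figures from the post above.

def power_density(watts: float, area_mm2: float) -> float:
    """Power dissipated per unit of die area, in W/mm^2."""
    return watts / area_mm2

cpu_ccd = power_density(122, 74)   # 5800X3D CCD -> ~1.65 W/mm^2
gpu_die = power_density(230, 628)  # RTX 3090    -> ~0.37 W/mm^2

print(f"CPU CCD: {cpu_ccd:.2f} W/mm^2")
print(f"GPU die: {gpu_die:.2f} W/mm^2")
print(f"CPU heat is ~{cpu_ccd / gpu_die:.1f}x denser")  # ~4.5x
```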
 
Upvote
5 (5 / 0)
If you want to experiment, turn on AMD Eco Mode and run Curve Optimizer in Ryzen Master. Set Eco Mode to a 45W TDP.
Remember, that poster is complaining about idle power, which is a notable weak point of AMD's particular chiplet choices. They use an I/O chiplet that's on a larger process and idles quite hot, particularly with PCIe4 southbridges (like the X570 for the older AM4 socket, as opposed to the cooler B550.)

The numbers that poster is quoting seem too high, but it's pretty common for any Zen 3, 4, or 5 chip to idle at around 25-30 watts, where Intel can often be under 10. It's one of the only areas where they still consistently win.
These days you're hard pressed to have any visible advantage from an overclock.
The way I figure: modern chips know more about overclocking themselves than I do. About the only thing I touch is Curve Optimizer, to set an all-cores negative 20 millivolts. It runs a little cooler that way, and boosts itself a little higher.
Put ~122W through the CCD and it jumps up to ~85C.
You're probably getting suboptimal contact between the cooler and the CPU lid. You might try re-pasting it. When I did this last build, I used a 360mm AIO with Thermal Grizzly's Kryonaut (regular, not Extreme), using the X application method (a big X from corner to corner of the IHS), and have been getting fantastic results, typically ~75C under max load for long periods.
 
Upvote
10 (10 / 0)
Remember, that poster is complaining about idle power, which is a notable weak point of AMD's particular chiplet choices. They use an I/O chiplet that's on a larger process and idles quite hot, particularly with PCIe4 southbridges (like the X570 for the older AM4 socket, as opposed to the cooler B550.)

The numbers that poster is quoting seem too high, but it's pretty common for any Zen 3, 4, or 5 chip to idle at around 25-30 watts, where Intel can often be under 10. It's one of the only areas where they still consistently win.
Oh, I'm quite aware. I run a B550 myself, chosen for that reason. Idle power on AMD is higher. But, as you point out, it's 20-25W higher, and he's claiming 45W higher. I suspect that difference is down to things like PCIe 4.0, higher DRAM power consumption, possibly a bit from I/O, depending on storage choices.

Out of curiosity, I checked. Intel's idle is still much better, but the 285K is up to 12-13W from 4-6W according to THG. Interesting to see the impact of the chiplet approach hitting both companies.
 
Upvote
7 (7 / 0)
Back in the day, I paid extra for a Celeron 300 to be overclocked and tested at 450MHz. It was apparently a pretty lucrative business, because nearly all of them worked fine, so the person doing the OC testing wasn't working that hard. But that way I knew for sure that the chip would work at 450MHz, and it served me for, um, probably a couple of years. CPU upgrades were constant back then.

There was one flaw. Apparently Coppermines running at +50% clocks had one instruction that was unreliable. Some game came out that used that instruction, and got the reputation of being horribly unreliable, when in fact it was the chips at fault. But in that time frame, the OCed 450s were so common that the game took the heat instead of the chips.

I wish I could remember what the game was. It really took a sales beating when it didn't deserve one. I guess the QA team just didn't think to test with an overclocked processor.

Anyway, this kind of thing was a viable business even when it was easy. What Thermal Grizzly is doing here is not easy.
You are mistaken (or trying to revise history). The only reason Coppermine received negative attention was that Intel tried and failed to release a 1.13GHz model. The model was defective out of the box and had to be pulled. They tried too hard to clock things too high and it backfired. (Wikipedia link, you'll have to read a bit for that tidbit: https://en.wikipedia.org/wiki/Pentium_III)

Most folks with Celerons, regardless of the underlying architecture, enjoyed those overclocks back in the day.
 
Upvote
1 (1 / 0)

Thegs

Ars Scholae Palatinae
780
Subscriptor++
You're probably getting suboptimal contact between the cooler and the CPU lid. You might try re-pasting it. When I did this last build, I used a 360mm AIO with Thermal Grizzly's Kryonaut (regular, not Extreme), using the X application method (a big X from corner to corner of the IHS), and have been getting fantastic results, typically ~75C under max load for long periods.
I've checked the paste print and it looks fine and even. The 5800X3D's frequency is capped and I'm able to sustain the cap indefinitely, which is really all I care about since there's no more performance I can squeeze out, only longevity at this point. I use IC Diamond paste, not sure how it rates against Thermal Grizzly's products.
 
Upvote
1 (1 / 0)
I may have the model name confused, but not the chip itself; it was the Celeron 300A OCed to 450MHz that had the failing instruction. I don't remember what instruction it was. I'm not sure I ever knew.
I've been looking for this out of curiosity, haven't found it yet.

But I do believe you. This has happened from time to time. And it's believable that it would be a game. You have a chip that's stable at a given speed 99.999% of the time, but then a game does something odd, and suddenly there's a failure condition.

I've heard of it happening more recently, but don't recall the exact nature of the issue.
 
Upvote
1 (1 / 0)
You are mistaken (or trying to revise history). The only reason Coppermine received negative attention was that Intel tried and failed to release a 1.13GHz model. The model was defective out of the box and had to be pulled. They tried too hard to clock things too high and it backfired. (Wikipedia link, you'll have to read a bit for that tidbit: https://en.wikipedia.org/wiki/Pentium_III)

Most folks with Celerons, regardless of the underlying architecture, enjoyed those overclocks back in the day.
As I recall, it wasn't really Coppermine itself that got the negative attention, back at that point in time. There were a few factors in play.

1). Intel CPU availability above certain clock speeds was poor (733MHz, IIRC, was the tipping point). The problem was the full-speed L2 cache: Intel had to clock it at full speed, while AMD was using a clock divider for the K75 chips that hit 1GHz. This started to bite more in late 1999 / early 2000 as they raced to 1GHz.

2). RDRAM. There were VIA chipsets available with SDRAM, and good ol' 440BX, of course, but the angry discourse around RDRAM had already begun (i820 dropped in September 1999, Coppermine in October).

In the end, CuMine was a great core. I loved Tualatin -- built a desktop PC around it to use in the heat of summer in college, many, many moons ago.
 
Upvote
1 (1 / 0)
I don't see the point of overclocking in 2025. You'll get maybe what, a 4-5% perf increase?

And some software doesn't like it at all. I got the 5% increase, but then Apex Legends would time out and crash constantly - I had to bring it down to like a 2% increase for the game to work. 2-3 hours of tweaking for less than 1 FPS...
 
Upvote
0 (0 / 0)

JudgeMental

Wise, Aged Ars Veteran
142
Subscriptor++
Yup. I remember those days when the only way to go dual-core was a dual-socket system. That's part of what made that dual 300A system such a killer system at the time. The Core 2 Duos and Athlon X2s were a big deal for making multi-core systems practical for most people.
Man, when I was a teenager, I inherited the server from my dad's office. It was a dual PIII 700MHz with two SCSI drives in RAID 0. It even had a tape drive with spare carts. The server mobo had some compatibility limitations, but I found a Radeon video card that worked for it, back when Radeon was ATI. It was a BEAST - did Folding@Home and SETI@Home, and absolutely stomped all over my friends. It was also a space heater. If my room got cold, I just fired up one of the aforementioned projects and waited twenty minutes.

I miss those days of computing, even with its warts.
 
Upvote
7 (7 / 0)
I've been looking for this out of curiosity, haven't found it yet.

But I do believe you. This has happened from time to time. And it's believable that it would be a game. You have a chip that's stable at a given speed 99.999% of the time, but then a game does something odd, and suddenly there's a failure condition.

I've heard of it happening more recently, but don't recall the exact nature of the issue.
I wonder if they're misremembering the Intel 1.0x GHz fiasco where they released a factory-overclocked Pentium to compete with the Athlon 1 GHz that actually worked.

Tom's Hardware discovered the Intel chip was not stable when they were testing it with Linux compiles.

There was also the FDIV bug: https://www.tomshardware.com/pc-com...ead-a-math-bug-caused-intels-first-cpu-recall
 
Upvote
3 (3 / 0)
I wonder if they're misremembering the Intel 1.0x GHz fiasco where they released a factory-overclocked Pentium to compete with the Athlon 1 GHz that actually worked.
Nope. It was the Celeron 300A, running at 450MHz, that had the somewhat obscure instruction that failed on many (most? all?) chips.

I am not confused. I am not incorrect. I can't prove that I'm right, but I assure you that it is not any other chip. Well, there was at least one more Celeron model, a little faster, that also OCed to +50%, but I don't know if that later model was affected. I do know that mine was.
 
Upvote
3 (3 / 0)
I think this is better for those who want to go with a delidded CPU but want to outsource the work to someone who has a lot of practice doing so. But I wouldn't think they'd be new to overclocking and custom cooling.
Yeah, I'm also seeing this as a "leave it to the experts" situation. A good delidder is going to set you back like $60 even before you get enough practice in to feel comfortable popping the top off your fancy CPU.

I'd consider myself pretty capable with my hands when it comes to small fiddly things, and still there are many intricate artistic and technical processes that I've spent way more than $230 on to become anywhere close to proficient in. In fact, $200-ish is about the used asking price on eBay for those lower-range AMD 9000-series chips you'd need to cut your teeth on...
 
Upvote
3 (3 / 0)
I wonder if they're misremembering the Intel 1.0x GHz fiasco where they released a factory-overclocked Pentium to compete with the Athlon 1 GHz that actually worked.

Tom's Hardware discovered the Intel chip was not stable when they were testing it with Linux compiles.
It was the 1.13GHz P3 that failed, not the 1GHz. But you are correct that Intel was looking for a marketing knockout blow on the Athlon that blew up on them. And then the i820 had to be recalled ~6 months later.

EDIT: The 1GHz P3 was in miserably short supply, IIRC -- AMD outsold it 6:1 or something like that -- but I don't recall (and can't find) any evidence of a problem with it.
 
Upvote
5 (5 / 0)

Random_stranger

Ars Praefectus
4,591
Subscriptor
Yeah. The good old days of running dual Celeron 300As. That was a ridiculously good system for the era at a pretty good price, until the 300A got so popular that the price went through the roof and they sold above MSRP.

Yeah, I remember those days. I didn't actually BUILD one of those, because IIRC, I was on AMD at the time. 386, 486-33 upgraded to 486-dx4-133, P166 (OC to 200), K6-2-266, K6-3-450 (not 100% sure on those Ks anymore), Athlon 1600xp, ....
 
Upvote
1 (1 / 0)
It was the 1.13GHz P3 that failed, not the 1GHz. But you are correct that Intel was looking for a marketing knockout blow on the Athlon that blew up on them. And then the i820 had to be recalled ~6 months later.

EDIT: The 1GHz P3 was in miserably short supply, IIRC -- AMD outsold it 6:1 or something like that -- but I don't recall (and can't find) any evidence of a problem with it.
Thanks. I could remember it was 1.xx to "win" over the 1 GHz Athlon (which actually was 1.1 GHz), but not the exact clock speed, and a Google search didn't turn up the details. Knowing the correct clock let me find the article - https://www.tomshardware.com/reviews/intel-admits-problems-pentium-iii-1,235.html
 
Upvote
2 (2 / 0)

bigcheese

Ars Praetorian
512
Subscriptor
I don't see the point of overclocking in 2025. You'll get maybe what, a 4-5% perf increase?

And some software doesn't like it at all. I got the 5% increase, but then Apex Legends would time out and crash constantly - I had to bring it down to like a 2% increase for the game to work. 2-3 hours of tweaking for less than 1 FPS...
Yup. The OC heyday of the past mostly came down to Intel's complete market dominance, where they could afford to artificially cripple their chips and still outperform all competitors.
In a competitive market, performance is never going to be left on the table like that.
 
Upvote
2 (2 / 0)

JaneDoe

Ars Scholae Palatinae
1,360
Subscriptor
Try to undervolt it.

I have a 7800X3D, and the way to go is to reduce the voltages. Less heat -> longer bursts till the maximum temperatures are reached and the cores throttle down. No risk of frying your chips, as you pump less energy in. Got me 15% more in Cinebench for free with only non-aggressive, standard BIOS settings.
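To put a rough number on why that works: at a fixed clock, dynamic CPU power scales roughly with the square of the core voltage, so even a small undervolt frees a disproportionate amount of thermal headroom. A minimal sketch of that first-order relation (the 5% undervolt is purely an illustrative assumption, not a measured 7800X3D figure):

```python
# First-order dynamic power model: P ~ C * V^2 * f.
# At a fixed frequency f and switched capacitance C, power scales with
# the square of the core voltage V, which is why a modest undervolt
# buys a noticeable amount of thermal (and therefore boost) headroom.

def relative_power(v_new: float, v_old: float) -> float:
    """Dynamic power at the new voltage relative to the old, same clock."""
    return (v_new / v_old) ** 2

# Illustrative numbers only: a 5% undervolt, e.g. 1.20 V -> 1.14 V.
print(f"Power at 95% voltage: {relative_power(1.14, 1.20):.0%}")  # ~90%
```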
 
Upvote
7 (7 / 0)

Penforhire

Ars Tribunus Angusticlavius
6,221
It has to be about a decade ago that I saw a company working on 'microchannels' in the base of the silicon die to run coolant through. I think it was a phase-change material at the time, but it wouldn't have to be. I'm surprised it hasn't been more of a high-end commercial success, for the same reason these delidded parts exist. Must be trickier to fabricate than they anticipated.
 
Upvote
2 (2 / 0)

lalaltcdata

Smack-Fu Master, in training
84
I replaced an Intel i5-6600K + GTX 960 + 16GB DDR4 with an AMD 7900X + NVIDIA 4060 Ti + 32GB DDR5. My old system was using ~45W with just one browser open (a simple web page). My new system uses ~90W in this almost-idle situation. Both systems have recent Windows versions, Win 10 and Win 11. After years of progress, and using a recent lithography process (6nm), AMD succeeded in building a platform (CPU + chipset) that consumes almost double at idle. No reviewer is pointing out this stupid situation. Millions of computers consuming almost double the energy of what was consumed 10 years ago. I don't know how a recent Intel CPU/platform behaves, but AMD idle power usage is unbelievable.
And computing power? The increase from your old CPU+GPU to your new rig is almost sci-fi. To get the same kind of computing power in the era of your previous rig, you would have needed a lot of computers and a lot more power.
 
Upvote
1 (1 / 0)
Yeah, I remember those days. I didn't actually BUILD one of those, because IIRC, I was on AMD at the time. 386, 486-33 upgraded to 486-dx4-133, P166 (OC to 200), K6-2-266, K6-3-450 (not 100% sure on those Ks anymore), Athlon 1600xp, ....
That's a very familiar upgrade path, although mine differs in minor details (486DX-66 to P166, K6/2-300, K6/3-400 (OC to 450), Athlon 1GHz, AthlonXP 1600+)
 
Upvote
2 (2 / 0)
Try to undervolt it.

I have a 7800X3D, and the way to go is to reduce the voltages. Less heat -> longer bursts till the maximum temperatures are reached and the cores throttle down. No risk of frying your chips, as you pump less energy in. Got me 15% more in Cinebench for free with only non-aggressive, standard BIOS settings.
Can you recommend a good undervolting guide, please? Preferably for MSI motherboards?
 
Upvote
1 (1 / 0)