Leaked GeForce RTX 5060 and 5050 specs suggest Nvidia will keep playing it safe

8GB, really?

It is funny that the RTX 3060 created a real optics issue for Nvidia. They knew they could not get away with 6GB again, as they had with the RTX 2060, so instead of widening the bus they doubled the VRAM, leaving the 60-class card with more VRAM than the 80-class.

My old RX 480 had 8GB of VRAM back in the day; seeing cards today that gimped is just BS. But then Nvidia only appears to care about AI and the 90-class cards at the moment, since those carry the biggest markup.
 
Upvote
3 (3 / 0)

evan_s

Ars Tribunus Angusticlavius
6,386
Subscriptor
8GB, really?

It is funny that the RTX 3060 created a real optics issue for Nvidia. They knew they could not get away with 6GB again, as they had with the RTX 2060, so instead of widening the bus they doubled the VRAM, leaving the 60-class card with more VRAM than the 80-class.

My old RX 480 had 8GB of VRAM back in the day; seeing cards today that gimped is just BS. But then Nvidia only appears to care about AI and the 90-class cards at the moment, since those carry the biggest markup.

Yeah. The connection between bus width and practical memory capacity always produces some odd points when bus widths aren't a strict doubling of the previous one. The 3060 had the 192-bit bus of that generation, with the 3060 Ti and 3070 stepping up to a 256-bit bus: good for memory bandwidth, but it left them with less RAM. Doubling those two cards up to 16GB would have given them more memory than anything that generation except the 3090/3090 Ti, which wouldn't really have gotten rid of the odd memory sizing; it would just have changed where it was. The in-between chip sizes, 3GB instead of 2 or 4, help out, because the 3060 Ti and 3070 could have been 12GB then, but you still would have ended up with the odd step down to the 3080 at 10GB.

The 40 and 50 series have the same thing with the 192-bit bus on the 70-class cards: they get stuck at 12GB while there are 16GB cards above and below them. Again, pushing them up to 24GB would be good for the 70 cards, but then it looks odd when the 80 cards only have 16GB.
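The bus-width arithmetic here is easy to sketch: each GDDR chip talks over a 32-bit interface, so capacity is (bus width / 32) × per-chip density. A quick back-of-the-envelope in Python; the 1/2/3GB densities stand in for common GDDR6/GDDR7 parts, and clamshell designs (two chips per channel), which would double these figures, are ignored:

```python
# Rule of thumb: each GDDR chip has a 32-bit interface, so a card's
# capacity is (bus width / 32) * per-chip density. Densities of
# 1/2/3 GB per chip stand in for common GDDR6/GDDR7 parts; clamshell
# designs (two chips per channel) would double these figures.
def vram_options(bus_width_bits, chip_densities_gb=(1, 2, 3)):
    chips = bus_width_bits // 32
    return {d: chips * d for d in chip_densities_gb}

print(vram_options(192))  # 3060-style bus -> 6, 12, or 18 GB
print(vram_options(256))  # 3070-style bus -> 8, 16, or 24 GB
```

Which is exactly why a 192-bit card jumps straight from 6GB to 12GB with nothing in between.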
 
Upvote
2 (2 / 0)

jandrese

Ars Legatus Legionis
13,445
Subscriptor++
I was about to say "no, the second release of the RTX 3050 reduced the TDP, but it's still not bus-powered", but then I looked it up and found there are three releases of the 3050:
  • First release, Jan '22, 8GB memory, 130W TDP;
  • Second release, Dec '22, 8GB memory, 115W TDP;
  • Third release, Feb '24, 6GB memory (and crippled bus width), 70W TDP
So yeah, you're right. Pity it took two years to come out after the original 3050 (which is why I missed it) and that it's so crippled; NVIDIA could clearly do better within the constraints of bus power, judging from the mobile RTX chips.
Why not just have a mobile chip on a card? Bill it as a 4050 ECO or something like that.
 
Upvote
0 (0 / 0)
Why not just have a mobile chip on a card? Bill it as a 4050 ECO or something like that.
That's sometimes what companies do. Nvidia doesn't build separate mobile dies anymore; it's just that mobile GPUs have different branding than desktop GPUs.

For example: the mobile RTX 4090 uses AD103 (GN21-X11) according to Wikipedia. On desktop, that GPU is used for the RTX 4070 Ti Super and RTX 4080.

Nvidia also has a mobile RTX 4050. It's a cut-down version of the RTX 4060 (AD107, in both cases). For whatever reason, they decided not to bring this part to desktop.
 
Upvote
5 (5 / 0)
I post this every...single...thread, but please please please will someone give us a modern bus-powered (75W) card for an appropriate price - something to replace the GTX1650 (still clocking in at #10 in the Steam GPU list, and ahead of every AMD card) and the GTX1050Ti (#20, ahead of all bar one AMD card).

The only modern bus-powered cards are workstation-class cards like the RTX A2000 and the RTX 4000 SFF Ada, with insane prices.
Serious question that I don't think I've ever asked you. Why do you want this? Are you trying to keep an old system alive?

I ask because small power supplies with a single 8-pin connection for the GPU are not hard to find, and I'm wondering if you are somehow stuck on an old system with a limited upgrade path or impossible-to-replace power supply.

Part of the reason why I think you find your options limited is because the PCIe standard was designed, from launch, with the idea that GPUs would use a secondary power connector. Power delivery in the PCIe slot happens over just the first few pins, which I think is part of the reason why the slot doesn't scale higher. But GPUs were drawing power from molex connectors before PCIe even debuted, and higher-end GPUs switched over to six-pin connectors almost immediately.
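For concreteness, a rough sketch of the budgets involved (the figures are the nominal ratings from the PCIe CEM spec and the usual connector conventions, not measured limits):

```python
# Nominal PCIe power budgets (per the PCIe CEM spec): an x16 slot
# delivers at most ~75 W (66 W on 12 V plus ~10 W on 3.3 V);
# anything beyond that must come from auxiliary connectors.
SLOT_W = 75
AUX_W = {"6-pin": 75, "8-pin": 150, "12VHPWR": 600}  # nominal ratings

def max_board_power(*connectors):
    """Total nominal power budget for a card with the given aux connectors."""
    return SLOT_W + sum(AUX_W[c] for c in connectors)

print(max_board_power())          # bus-powered card: 75 W total
print(max_board_power("8-pin"))   # slot + one 8-pin: 225 W
```

So a single 8-pin connector already covers a 130W card three times over, which is why board partners don't bother designing down to the bare slot budget.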

But regardless of that, I'm curious why you want this particular feature.
 
Upvote
6 (6 / 0)

steelcobra

Ars Tribunus Angusticlavius
9,384
Yeah. The connection between bus width and practical memory capacity always produces some odd points when bus widths aren't a strict doubling of the previous one. The 3060 had the 192-bit bus of that generation, with the 3060 Ti and 3070 stepping up to a 256-bit bus: good for memory bandwidth, but it left them with less RAM. Doubling those two cards up to 16GB would have given them more memory than anything that generation except the 3090/3090 Ti, which wouldn't really have gotten rid of the odd memory sizing; it would just have changed where it was. The in-between chip sizes, 3GB instead of 2 or 4, help out, because the 3060 Ti and 3070 could have been 12GB then, but you still would have ended up with the odd step down to the 3080 at 10GB.

The 40 and 50 series have the same thing with the 192-bit bus on the 70-class cards: they get stuck at 12GB while there are 16GB cards above and below them. Again, pushing them up to 24GB would be good for the 70 cards, but then it looks odd when the 80 cards only have 16GB.
But going from GDDR6X to GDDR7 bumps the 4070's 504GB/s to the 5070's 672GB/s on the same 192-bit bus.

Edit: I'll also note that the 4070 Ti (née the 4080 12GB) only had 12GB too. They only bumped it to 16GB on the Super.
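That bandwidth jump falls out of simple arithmetic: bandwidth ≈ (bus width in bits / 8) × per-pin data rate. A small sketch, assuming the commonly cited 21 Gbps (GDDR6X) and 28 Gbps (GDDR7) data rates for these cards:

```python
# Bandwidth = (bus width in bits / 8 bits per byte) * per-pin data rate.
def bandwidth_gb_per_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_per_s(192, 21))  # 4070, 21 Gbps GDDR6X -> 504.0 GB/s
print(bandwidth_gb_per_s(192, 28))  # 5070, 28 Gbps GDDR7  -> 672.0 GB/s
```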
 
Upvote
1 (1 / 0)

steelcobra

Ars Tribunus Angusticlavius
9,384
Serious question that I don't think I've ever asked you. Why do you want this? Are you trying to keep an old system alive?

I ask because small power supplies with a single 8-pin connection for the GPU are not hard to find, and I'm wondering if you are somehow stuck on an old system with a limited upgrade path or impossible-to-replace power supply.

Part of the reason why I think you find your options limited is because the PCIe standard was designed, from launch, with the idea that GPUs would use a secondary power connector. Power delivery in the PCIe socket happens over just the first few pins, which I think is part of the reason why the slot doesn't scale higher. But GPUs were drawing power from molex connectors before PCIe even debuted, and higher-end GPUs switched over to six-pin connectors almost immediately.

But regardless of that, I'm curious why you want this particular feature.
The other reason I can think of is doing an ultra-small SFF build and not wanting to route a power cable.
 
Upvote
1 (1 / 0)
Thank you, I knew about this product; it lacks SATA ports.
Assuming you are using SATA for either storage or optical drives, do you have the option of using an external enclosure?

Either way, you said earlier that you game at 720p and that you care a lot about power efficiency. My honest advice to you would be to upgrade your monitor, but assuming you cannot or do not wish to do so -- you actually are kind of an ideal candidate for a card like the RTX 3050 6GB. If you needed to build a whole system (since you talked about possibly buying Strix Halo), you could do far worse than an AM4 Ryzen system (pick a chipset like B550 to save power), or a Core i5 from whichever Intel processor family you prefer.

Strix Halo would give you much more performance than the aforementioned RTX 3050, but you wouldn't pay $1000 - $2000 for the parts I mentioned above.

Of course that's only if you can use external enclosures for your current SATA devices. I do not think you will see any Strix Halo boards with SATA. The upcoming HP Z2 Mini G1a also uses SH, but does not offer any internal SATA ports.
 
Upvote
3 (3 / 0)

steelcobra

Ars Tribunus Angusticlavius
9,384
Assuming you are using SATA for either storage or optical drives, do you have the option of using an external enclosure?

Either way, you said earlier that you game at 720p and that you care a lot about power efficiency. My honest advice to you would be to upgrade your monitor, but assuming you cannot or do not wish to do so -- you actually are kind of an ideal candidate for a card like the RTX 3050 6GB. If you needed to build a whole system (since you talked about possibly buying Strix Halo), you could do far worse than an AM4 Ryzen system (pick a chipset like B550 to save power), or a Core i5 from whichever Intel processor family you prefer.

Strix Halo would give you much more performance than the aforementioned RTX 3050, but you wouldn't pay $1000 - $2000 for the parts I mentioned above.

Of course that's only if you can use external enclosures for your current SATA devices. I do not think you will see any Strix Halo boards with SATA. The upcoming HP Z2 Mini G1a also uses SH, but does not offer any internal SATA ports.
I mean, the more reasonable option if they just want a light, low power gaming box is one of Framework's laptop boards, the "desktop" case they sell, and stick any SATA drives in a USB enclosure.
 
Upvote
1 (1 / 0)
I mean, the more reasonable option if they just want a light, low power gaming box is one of Framework's laptop boards, the "desktop" case they sell, and stick any SATA drives in a USB enclosure.
I think that's also a completely viable option, depending on the frame rate they want to target. THG suggests the Ryzen AI HX 375 delivers ~45 FPS at 720p. I'd expect the 3050 to still outperform that, and a 6GB frame buffer really won't be a limit at that resolution.

But I 100% agree that your idea is a good one depending on what they want.
 
Upvote
1 (1 / 0)

euskalzabe

Smack-Fu Master, in training
87
It's more complicated than that.

FTC.gov states: "The key word is "suggested." A dealer is free to set the retail price of the products it sells. A dealer can set the price at the MSRP or at a different price, as long as the dealer comes to that decision on its own. However, the manufacturer can decide not to use distributors that do not adhere to its MSRP."

In other words: Yes, AMD can choose not to sell to dealers who will not sell product at MSRP, but it cannot dictate the price of individual products. It's take it or leave it. This is why EVGA left the GPU market. It didn't like Nvidia's "suggestions" for product pricing.

Sapphire can choose to stop manufacturing AMD cards. AMD can choose to stop selling Sapphire GPUs because it doesn't like the % of GPUs being sold at MSRP. What AMD can't do is force Sapphire to set pricing Sapphire doesn't want to set. AMD also cannot unilaterally dictate terms to any new supplier.
You're 100% correct; I should've made my point more clearly. When I said AMD and Nvidia could dictate pricing, I didn't mean they have an actual legal right to do so (as you said, the S in MSRP stands for "suggested"). However, they do have the ability, let's not say power, to keep AIBs from price gouging everyone else, which does not benefit AMD and Nvidia one bit, as the extreme prices we've been seeing for a few years damage their brand names. From everything I read, consumers get madder at Nvidia and AMD than at, say, Asus or Zotac.
Yes, they could. However, this would involve AMD and/or Nvidia creating its own AIB business. Nvidia has a limited business with FE cards, but neither AMD nor Nvidia wants to be a direct board manufacturer with large enough product lines to take over the entire GPU business. According to JPR, there were something like 30 million desktop GPUs sold in 2024. That's a serious manufacturing business to stand up from scratch.

The story of 3dfx is still a cautionary tale. The problem with attempting to compete with all of your former AIB partners is that customers may resent being forced to drop their own brand loyalty and switch to you directly. And it means everyone is getting exactly the same GPU, with no room for brand differentiation around fans, cooler design, cooler shape, card size, colors, RGB, or any other feature you can think of.
I also get your point, but it's not all or nothing. AMD/NV could have a reservation system for their FE cards (and reference cards for AMD, though they don't do that now). They wouldn't even have to produce more, just keep people waiting in line for MSRP. I'd rather get a card 6 months later at MSRP, while others who want it now could go the AIB route and pay whatever the overcharge is, whether for differentiation or bogus reasons. Still, the point is there would be options; right now, we're all hostage to the current system. I was able to get an EVGA 3060 during the pandemic precisely because I was waitlisted. Right now, the only way to get something at or close to MSRP is to wait until about month nine of a GPU's life, when it'll likely be at or around MSRP, before production lines stop, stock goes down, and the price goes up again as the next generation starts being produced.

My overarching point is that the market conditions we've had for 5 years now are insane, and it doesn't have to be this way. There are many ways to alleviate or entirely prevent this type of price gouging. Inexplicably, it all just keeps happening with no one doing anything about it. I thought people would give up on buying overpriced GPUs, but I guess there's a ton more shortsighted dummies out there making bad financial decisions than I thought (and yes, I know some people need GPUs for work, but that's not the majority of RX/RTX consumers).
 
Upvote
2 (2 / 0)

powerage76

Smack-Fu Master, in training
70
The only modern bus-powered cards are workstation-class cards like the RTX A2000 and the RTX 4000 SFF Ada, with insane prices.
The used market is your friend. The A2000 is pretty cheap if you're lucky; I recently got one with 12GB, cleaned it up, repasted it, and replaced the thermal pads, and it works well.
It's getting similar to the notebook market: instead of buying overpriced current-gen consumer stuff, there's always the previous gen of professional-grade hardware.
 
Upvote
0 (0 / 0)

Spunjji

Ars Scholae Palatinae
842
It's the same TSMC node, and it looks like there's not much else to improve architecturally (as there was with Maxwell).

RDNA 4 hasn't improved that much raster-wise either; AMD just massively improved their subpar RT capabilities and added proper AI cores to enable FSR 4.
Not sure that's true: the 9070 XT is about ~30% faster at 4K than the 7900 GRE with fewer compute units. A big chunk of that is clock speed, but it's not just clock speed, and clock speed is a function of architecture as well.

The 9800X3D should be compared to the Ryzen 7 1700X, which was released at $329.
Why? The 9800X3D is AMD's top-end 8-core offering, like the 1800X once was.

The 1800X ($500) was the top of the line CPU, similar to the 9950X3D that was released at $700.
Well no, their top-of-the-line CPUs back then were Threadrippers, so you'd probably compare it to the 16-core Threadripper, since that was the closest CPU comparable to what the 9950X3D offers now. It's an awkward comparison, but it makes way more sense than yours.

Desktop Zen 5 CPUs without 3D cache should never have been released in the first place.
Why? Why should people who don't need 3D cache be forced to pay for it? (Or, alternatively, why should AMD be forced to produce an extra chunk of cache die and then have TSMC do some seriously advanced packaging with it for free?)

They're just there to make you think the company isn't ripping you off. AMD has sneakily made everything quite a bit more expensive, but it's fine as long as Intel cannot compete.
Evil AMD, making users pay $699 for their top-end 16-core X3D chip, while a mere 7 years ago you could buy a top-end 14-core i9 9990XE for only $2800 or ~$3500 in today's money. See, I too can make silly comparisons.

I don't understand this "gaming" moniker. The Intel Core i5 2500K, 216 mm², perhaps the most popular CPU ever, was a gaming CPU as well. It was under $220. The 9800X3D die size? 70.6 mm² + the IO die 122 mm² that probably costs AMD $25 to produce.
The "gaming" moniker is because it's the fastest chip on the planet for gaming, but if you don't need or want that you can feel free to buy something a little more directly comparable to that i5 - like, say, a Ryzen 5 9600X, which with a $279 MSRP comes in a little below the venerable 2500K's inflation-adjusted $312 and will give you extremely healthy performance in any game out there. Naughty AMD!

Given the above, and WCCFtech's legendary pro-Nvidia slant, I'm starting to feel like your insistence that the whole internet is pro-AMD is a bit like the far right's continued insistence that the corporate billionaire-owned media in the USA is "liberal".
 
Upvote
5 (5 / 0)

Altar@

Smack-Fu Master, in training
7
Assuming you are using SATA for either storage or optical drives, do you have the option of using an external enclosure?

Either way, you said earlier that you game at 720p and that you care a lot about power efficiency. My honest advice to you would be to upgrade your monitor, but assuming you cannot or do not wish to do so -- you actually are kind of an ideal candidate for a card like the RTX 3050 6GB. If you needed to build a whole system (since you talked about possibly buying Strix Halo), you could do far worse than an AM4 Ryzen system (pick a chipset like B550 to save power), or a Core i5 from whichever Intel processor family you prefer.

Strix Halo would give you much more performance than the aforementioned RTX 3050, but you wouldn't pay $1000 - $2000 for the parts I mentioned above.

Of course that's only if you can use external enclosures for your current SATA devices. I do not think you will see any Strix Halo boards with SATA. The upcoming HP Z2 Mini G1a also uses SH, but does not offer any internal SATA ports.
Well, I currently have an Intel 4th-gen system with normal stuff attached to it. I'd want to avoid 1) a graphics card and 2) cloning a 2.5" SSD to an NVMe one, or reinstalling on a new device (as I have some stuff that's painful to manage inside), so the prospect of Strix Halo in a desktop form factor was interesting, as I (logically) thought it would have some SATA ports. For the moment, it seems not. I hope some manufacturers will change this.
 
Upvote
0 (0 / 0)

steelcobra

Ars Tribunus Angusticlavius
9,384
Well, I currently have an Intel 4th-gen system with normal stuff attached to it. I'd want to avoid 1) a graphics card and 2) cloning a 2.5" SSD to an NVMe one, or reinstalling on a new device (as I have some stuff that's painful to manage inside), so the prospect of Strix Halo in a desktop form factor was interesting, as I (logically) thought it would have some SATA ports. For the moment, it seems not. I hope some manufacturers will change this.
Honestly, just upgrade and migrate your files over; it'll cost you less and be a far better experience once you get through it.

There are also options like the Beelink SER9 to get a tiny PC with Strix Point. I don't think there are any Strix Halo tiny PCs available right now other than the Framework Desktop preorders, though.

Also, sticking with a 720p monitor while looking for a high-end APU setup is a bit weird; you can easily find 24" 1080p 120Hz monitors for $100 these days.
 
Upvote
1 (1 / 0)

Coolie

Ars Praetorian
406
Subscriptor++
Well, I currently have an Intel 4th-gen system with normal stuff attached to it. I'd want to avoid 1) a graphics card and 2) cloning a 2.5" SSD to an NVMe one, or reinstalling on a new device (as I have some stuff that's painful to manage inside), so the prospect of Strix Halo in a desktop form factor was interesting, as I (logically) thought it would have some SATA ports. For the moment, it seems not. I hope some manufacturers will change this.
Since you really want SATA…
https://www.startech.com/en-us/cards-adapters/4p-sata-m2-adapter
 
Upvote
1 (1 / 0)

Coolie

Ars Praetorian
406
Subscriptor++
There are also options like the Beelink SER9 to get a tiny PC with Strix Point. I don't think there are any Strix Halo tiny PCs available right now other than the Framework Desktop preorders, though.
HP showed this one previously:
https://liliputing.com/hp-z2-mini-g...md-strix-halo-and-up-to-96gb-graphics-memory/

GMK has stated they’ll launch one in the next 3 months or so, similar to their current Strix Point box:
https://videocardz.com/newz/gmk-con...395-strix-halo-will-launch-between-q1-q2-2025

I personally would bet against there being a SATA connection on either of those mini-PCs’ boards though…
 
Last edited:
Upvote
1 (1 / 0)

blazeoptimus

Smack-Fu Master, in training
53
Honestly, the biggest disappointment to me in the 3050 is the 130W power requirement. There are many of us who like to do SFF builds, and going to 130W limits your SFF build considerably: you will mandatorily need at least one auxiliary power connector on top of the slot's 75W. If you want to use a low-profile card, it's almost guaranteed to be dual-slot. Even with the 6GB 3050 being 75W, it took quite a while for Yeston to finally create a single-slot low-profile card. At 130W, it's hard to imagine we'll ever see a single-slot LP 5050 card. I'm hoping they do something similar to the 3050 and release a further cut-down version of the card. For people like me, it's more important that it fit into that 75W single-slot LP category; it means you can add the card to just about any type of PC build.

For the record, Strix Halo is obviously a good option, but not the be-all and end-all. A 75W 5050 would cover older machines, giving them modern graphics capabilities. It would also let machines whose CPUs have minimal integrated graphics (and there are still many of them being sold) get a nice bump in capabilities, without requiring a full-on gamer's case and power supply to accommodate them.
 
Upvote
1 (1 / 0)

steelcobra

Ars Tribunus Angusticlavius
9,384
HP showed this one previously:
https://liliputing.com/hp-z2-mini-g...md-strix-halo-and-up-to-96gb-graphics-memory/

GMK has stated they’ll launch one in the next 3 months or so, similar to their current Strix Point box:
https://videocardz.com/newz/gmk-con...395-strix-halo-will-launch-between-q1-q2-2025

I personally would bet against there being a SATA connection on either of those mini-PCs’ boards though…
Well, I did say the Framework is the only one you can put money down on today. And yeah, I think the Framework is the only one with an actual PCIe slot, though you can get an M.2 NVMe to SATA adapter card if you really need it.
 
Upvote
4 (4 / 0)

evan_s

Ars Tribunus Angusticlavius
6,386
Subscriptor
HP showed this one previously:
https://liliputing.com/hp-z2-mini-g...md-strix-halo-and-up-to-96gb-graphics-memory/

GMK has stated they’ll launch one in the next 3 months or so, similar to their current Strix Point box:
https://videocardz.com/newz/gmk-con...395-strix-halo-will-launch-between-q1-q2-2025

I personally would bet against there being a SATA connection on either of those mini-PCs’ boards though…

I wonder if the platform actually doesn't include SATA support among its baseline features. It wouldn't surprise me: as primarily a modern higher-end laptop platform, I don't think there are a lot of uses for it. I expect storage to be M.2 NVMe, and I doubt optical drives will be common. It may well require a PCIe controller on the motherboard to actually have SATA ports.
 
Upvote
3 (3 / 0)

Coolie

Ars Praetorian
406
Subscriptor++
Well, I did say the Framework is the only one you can put money down on today. And yeah, I think the Framework is the only one with an actual PCIe slot, though you can get an M.2 NVMe to SATA adapter card if you really need it.
Perhaps, but at least the HP is confirmed, and should be coming soon.

They announced a Q2 release. If that happens, it would be on the market before Framework’s first shipments go out (Q3).

HP really hasn’t done much marketing around that Z2: it was announced at CES, but there's been almost no news since. Surprising they didn’t do a quick update on the release date or something, with all the press Framework was getting.

(For the M.2-slot-to-SATA-drive adapter, you can’t go wrong with StarTech’s version I linked above… it’s expensive, but they’re a decent brand.)
 
Upvote
0 (0 / 0)
My overarching point is that the market conditions we've had for 5 years now are insane, and it doesn't have to be this way. There are many ways to alleviate or entirely prevent this type of price gouging. Inexplicably, it all just keeps happening with no one doing anything about it. I thought people would give up on buying overpriced GPUs, but I guess there's a ton more shortsighted dummies out there making bad financial decisions than I thought (and yes, I know some people need GPUs for work, but that's not the majority of RX/RTX consumers).
It hasn't just been the last five years. It's been nearly nine years since Pascal (GTX 1080) launched. I'd say that GPU prices have been substantially greater than MSRP for at least 5 years total of that nine-year period.

There have been narrow intervals when GPU prices drifted closer to MSRP and one could get a card, before cryptocurrency / COVID / AI destroyed the market again.

But when cards have stayed elevated for more than 50% of the near-decade since a given GPU launched, it's impossible to say that's a deviation from the norm. The norm is higher prices. The unusual state is affordable cards.

And yeah. I agree that AMD and NV could do more.
 
Upvote
3 (3 / 0)
Well, I currently have an Intel 4th-gen system with normal stuff attached to it. I'd want to avoid 1) a graphics card and 2) cloning a 2.5" SSD to an NVMe one, or reinstalling on a new device (as I have some stuff that's painful to manage inside), so the prospect of Strix Halo in a desktop form factor was interesting, as I (logically) thought it would have some SATA ports. For the moment, it seems not. I hope some manufacturers will change this.
While you might be able to get away with swapping Intel Haswell for AMD Strix Halo, the potential for performance degradation is high. Windows might work, but I wouldn't bet on your system running at top speed without a lot of low-level driver swapping and tests to confirm it was doing so.

As some have pointed out, however, you can get M.2 cards that offer up to 4 SATA ports, which might be sufficient for your use.
 
Upvote
4 (4 / 0)
Honestly, the biggest disappointment to me in the 3050 is the 130W power requirement. There are many of us who like to do SFF builds, and going to 130W limits your SFF build considerably: you will mandatorily need at least one auxiliary power connector on top of the slot's 75W.

If you want to use a low-profile card, it's almost guaranteed to be dual-slot. Even with the 6GB 3050 being 75W, it took quite a while for Yeston to finally create a single-slot low-profile card. At 130W, it's hard to imagine we'll ever see a single-slot LP 5050 card. I'm hoping they do something similar to the 3050 and release a further cut-down version of the card. For people like me, it's more important that it fit into that 75W single-slot LP category; it means you can add the card to just about any type of PC build.
Low profile cards are likely to be dual slot because the cooler can't be taller, so it must be thicker.

I seriously doubt that the <130W TDP graphics category is long for this world as anything other than a cast-off market for higher-TDP products. And no, we probably won't see a single-slot, low-profile RTX 5050 at 130W TDP, because it's too difficult to keep such a product cool without creating a lot of disagreeable fan noise.

But who knows? If the RTX 3050 6GB is popular enough, Nvidia might do an RTX 5050 6GB @ 75W, too.

For the record, Strix Halo is obviously a good option, but not the be-all and end-all. A 75W 5050 would cover older machines, giving them modern graphics capabilities. It would also let machines whose CPUs have minimal integrated graphics (and there are still many of them being sold) get a nice bump in capabilities, without requiring a full-on gamer's case and power supply to accommodate them.
You don't need a "full-on gamer's case" to accommodate them now. Newegg suggests mini-ITX-compatible chassis currently start around $60, and I'd expect a 400-450W PSU to handle a 130W GPU with absolutely no problem.

Do you have a fleet of already-built SFFs you are trying to keep running? Because sure, there are still chips with minimal graphics capabilities being sold (that's nothing new), but for any customer who cared at all about graphics, I'd choose a case that would accommodate a full-size, dual-slot GPU to guarantee future compatibility. Anyone who deliberately chose a super-small SFF with no room for a PCIe power cable and no space for a standard GPU would be advised that the configuration they had chosen would prevent them from meaningfully upgrading the system.
 
Upvote
2 (2 / 0)

steelcobra

Ars Tribunus Angusticlavius
9,384
Low profile cards are likely to be dual slot because the cooler can't be taller, so it must be thicker.

I seriously doubt that the <130W TDP graphics category is long for this world as anything other than a cast-off market for higher-TDP products. And no, we probably won't see a single-slot, low-profile RTX 5050 at 130W TDP, because it's too difficult to keep such a product cool without creating a lot of disagreeable fan noise.

But who knows? If the RTX 3050 6GB is popular enough, Nvidia might do an RTX 5050 6GB @ 75W, too.


You don't need a "full-on gamer's case" to accommodate them now. Newegg suggests mini-ITX-compatible chassis currently start around $60, and I'd expect a 400-450W PSU to handle a 130W GPU with absolutely no problem.

Do you have a fleet of already-built SFFs you are trying to keep running? Because sure, there are still chips with minimal graphics capabilities being sold (that's nothing new), but for any customer who cared at all about graphics, I'd choose a case that would accommodate a full-size, dual-slot GPU to guarantee future compatibility. Anyone who deliberately chose a super-small SFF with no room for a PCIe power cable and no space for a standard GPU would be advised that the configuration they had chosen would prevent them from meaningfully upgrading the system.
There's also a bunch of tiny cases that can accommodate pretty big cards, too.
 
Upvote
2 (2 / 0)

Altar@

Smack-Fu Master, in training
7
While you might be able to get away with swapping Intel Haswell for AMD Strix Halo, the potential for performance degradation is high. Windows might work, but I wouldn't bet on your system running at top speed without a lot of low-level driver swapping and tests to confirm it was doing so.

As some have pointed out, however, you can get M.2 cards that offer up to 4 SATA ports, which might be sufficient for your use.
Yes, I did not know about the M.2 cards with SATA ports. I will definitely look into it with the Framework mini-ITX motherboard.
 
Last edited:
Upvote
0 (0 / 0)

steelcobra

Ars Tribunus Angusticlavius
9,384
All of that to avoid a fresh OS on new hardware is, to be frank, incredibly stupid.

If you're that desperate just clone the disk.
Yeah, one of the big upgrades going from SATA to NVMe was just how much the faster drives reduce CPU overhead. Installing a new game on the old drive would swamp the SATA interface and max out CPU cycles holding the data in queue. With my current setup, Steam et al. at near-gigabit download speeds aren't even getting close to the max write limits.
 
Upvote
0 (0 / 0)
While you might be able to get away with swapping Intel Haswell for AMD Strix Halo, the potential for performance degradation is high.
It sounds like they're using a Haswell with the onboard graphics, not a dGPU. Saying that a Strix Halo could in any way be a performance degradation compared to a Haswell is freaking ludicrous. Everything about the Strix will be insanely faster. It's got four times the cores with at least twice the throughput per core, probably a lot more. And the iGPU will be wildly more powerful. It should be more powerful than even a dGPU from the Haswell era, which would have been something like a GTX 970.

I'd have to look it up to give more exact figures, but "one metric assload faster" will work as a lazy answer.

Obviously, I/O bound things will still be I/O bound, and not replacing the SATA drive is purely foolish, but that's the poster's problem, not the chip's or the system's.
 
Last edited:
Upvote
1 (2 / -1)

steelcobra

Ars Tribunus Angusticlavius
9,384
It sounds like they're using a Haswell with the onboard graphics, not a dGPU. Saying that a Strix Halo could in any way be a performance degradation compared to a Haswell is freaking ludicrous. Everything about the Strix will be insanely faster. It's got four times the cores with at least twice the throughput per core, probably a lot more. And the iGPU will be wildly more powerful. It should be more powerful than even a dGPU from the Haswell era, which would have been something like a GTX 970.

I'd have to look it up to give more exact figures, but "one metric assload faster" will work as a lazy answer.

Obviously, I/O bound things will still be I/O bound, and not replacing the SATA drive is purely foolish, but that's the poster's problem, not the chip's or the system's.
Or they're probably still on Win7, with several other issues that won't play well with a modern CPU.

And if that's the case, he should just convert that to a VM instead.
 
Upvote
2 (2 / 0)

Coolie

Ars Praetorian
406
Subscriptor++
Or they're probably still on Win7, with several other issues that won't play well with a modern CPU.

And if that's the case, he should just convert that to a VM instead.
While I agree converting to a VM would be best where possible, getting a GPU working properly within a VM is not always the easiest thing to do, and it's definitely not guaranteed to give playable framerates; having a box which can run Win7 natively might be useful in such a case.

(But then keeping that box as-is would probably make more sense rather than upgrading and introducing Win7 compatibility issues with post-Win10 hardware, so no idea then… only the OP can clarify.)
 
Upvote
0 (0 / 0)
It sounds like they're using a Haswell with the onboard graphics, not a dGPU. Saying that a Strix Halo could in any way be a performance degradation compared to a Haswell is freaking ludicrous. Everything about the Strix will be insanely faster. It's got four times the cores with at least twice the throughput per core, probably a lot more. And the iGPU will be wildly more powerful. It should be more powerful than even a dGPU from the Haswell era, which would have been something like a GTX 970.

I'd have to look it up to give more exact figures, but "one metric assload faster" will work as a lazy answer.

Obviously, I/O bound things will still be I/O bound, and not replacing the SATA drive is purely foolish, but that's the poster's problem, not the chip's or the system's.
sigh

You missed the context of my statement.

The OP wrote: "I'd want to avoid 1)a graphic card and 2)cloning a 2.5 SSD to a Nvme one or reinstalling on a new device (as I have some stuff painful to manage inside) so the proposition of Strix Halo in a desktop factor was interesting."

Now, I may have misunderstood the OP, but here's how I read that:

1). He doesn't want to clone a drive.
2). He doesn't want to reinstall his OS on a new device.

If you don't want to clone and you don't want to reinstall your OS, the only alternative I see is that he wants to bring his OS installation from a Haswell system to a Strix Halo system.

And -- under that set of conditions -- my warning should make sense.

While it may be possible with modern Windows to bring a 4th Gen Haswell system installation over to Strix Halo, the risk of performance degradation (relative to what Strix Halo would otherwise achieve) is high. Such a system would be leaping many generations of hardware and switching chipset vendors and CPU architectures and using chips that rely on core parking and other features that are typically detected by Windows and set at a low level during installation. AMD has said that the 9000 Series no longer requires a full OS reinstall when you switch processors between a 1xCCD and a 2xCCD model the way the old 7000 Series did, but I have no idea if that's going to apply when upgrading from a 4th Gen Intel to a modern AMD system. That's a literal decadal jump.

It's not the bad old days, where anything less than careful preparation got you a BSOD for your trouble, but no -- I would not seek to pull a 4th Gen OS from Haswell into a modern Strix Halo APU. The chance of old driver cruft sticking around and gumming things up, or low-level instability, or just a performance hit relative to what the system would otherwise achieve is too great. The only way to know if you're getting the performance you should be getting is to run both low-level and high-level benchmarks and then compare the results to other people's published numbers. That's a pain.
 
Upvote
2 (2 / 0)

steelcobra

Ars Tribunus Angusticlavius
9,384
sigh

You missed the context of my statement.

The OP wrote: "I'd want to avoid 1)a graphic card and 2)cloning a 2.5 SSD to a Nvme one or reinstalling on a new device (as I have some stuff painful to manage inside) so the proposition of Strix Halo in a desktop factor was interesting."

Now, I may have misunderstood the OP, but here's how I read that:

1). He doesn't want to clone a drive.
2). He doesn't want to reinstall his OS on a new device.

If you don't want to clone and you don't want to reinstall your OS, the only alternative I see is that he wants to bring his OS installation from a Haswell system to a Strix Halo system.

And -- under that set of conditions -- my warning should make sense.

While it may be possible with modern Windows to bring a 4th Gen Haswell system installation over to Strix Halo, the risk of performance degradation (relative to what Strix Halo would otherwise achieve) is high. Such a system would be leaping many generations of hardware and switching chipset vendors and CPU architectures and using chips that rely on core parking and other features that are typically detected by Windows and set at a low level during installation. AMD has said that the 9000 Series no longer requires a full OS reinstall when you switch processors between a 1xCCD and a 2xCCD model the way the old 7000 Series did, but I have no idea if that's going to apply when upgrading from a 4th Gen Intel to a modern AMD system. That's a literal decadal jump.

It's not the bad old days, where anything less than careful preparation got you a BSOD for your trouble, but no -- I would not seek to pull a 4th Gen OS from Haswell into a modern Strix Halo APU. The chance of old driver cruft sticking around and gumming things up, or low-level instability, or just a performance hit relative to what the system would otherwise achieve is too great. The only way to know if you're getting the performance you should be getting is to run both low-level and high-level benchmarks and then compare the results to other people's published numbers. That's a pain.
Jayz2Cents managed a 9800X3D to 9950X3D swap without a full reinstall (while risking screwing up the config of his GPU test bench) and it worked out fine. But yeah, there's no way a Haswell-native OS will play nice with Strix Halo.
 
Upvote
2 (2 / 0)
Jayz2Cents managed a 9800X3D to 9950X3D swap without a full reinstall (while risking screwing up the config of his GPU test bench) and it worked out fine. But yeah, there's no way a Haswell-native OS will play nice with Strix Halo.
9000 Series is supposed to avoid that problem. Glad to hear it does.

But I don't think AMD actively develops drivers for anything below Windows 10. So, @Altar@, be advised: unless you are running at least Windows 10, you won't have driver support with Strix Halo. It's not going to support Windows 7 or 8.
 
Upvote
1 (1 / 0)