Nvidia nudges mainstream gaming PCs forward with RTX 5060 series, starting at $299

After months of frustration trying to get the Intel ARC B580 at MSRP (or at all), I finally tapped out and bought a last-gen gaming laptop that was on a close-out sale. It feels like GPU vendors just don't care about stock or prices these days.
ATI cares, they were just caught off guard by how badly the nvidia 50xx releases have gone.

Had they known what an opportunity they'd have then maybe they could've locked in more GPU production with TSMC 6-12 months ahead of time.
 
Upvote
5 (5 / 0)

SittingDuc

Smack-Fu Master, in training
75
It is insane to me that Nvidia continues to be so stingy with VRAM.

I would’ve thought the bare minimum, lowest end cards would at least be at 12GB, but even that seems a bit too low for 2025. 16GB seems more reasonable.

AMD doesn’t seem to be as stingy (though they did launch 8GB cards last gen too), so would it be a supply chain constraint? Outside of artificially limiting these cards capabilities to encourage people to upgrade sooner, I fail to understand what the deal is with sticking to 8GB.
Greed. nVidia wants to sell as many AI cards as they can for the most money they can. But nVidia fears that a 5060 with more than 8GB of RAM becomes "useful enough" to AI customers that, instead of spending $$$$ on AI cards, they only spend $$$ on video cards. So to maximise profit in a segmented market, they cripple the video card, and gamers just have to put up with it, because video cards are a $100M/year business and AI is a $1B-10B/year business.

Of course, gamers have options other than putting up with it <looks over at shiny new Radeon board> but honestly, the AI dollar is so attractive and big and the gamer dollar is so small, nVidia doesn't have to care even if everyone leaves.
 
Upvote
7 (7 / 0)

Stickmansam

Ars Scholae Palatinae
932
It is insane to me that Nvidia continues to be so stingy with VRAM.

I would’ve thought the bare minimum, lowest end cards would at least be at 12GB, but even that seems a bit too low for 2025. 16GB seems more reasonable.

AMD doesn’t seem to be as stingy (though they did launch 8GB cards last gen too), so would it be a supply chain constraint? Outside of artificially limiting these cards capabilities to encourage people to upgrade sooner, I fail to understand what the deal is with sticking to 8GB.
The difference is that AMD targeted its 8GB cards at the 1080p performance class, while Nvidia's 8GB cards are in the 1440p performance class, where the VRAM matters more.
 
Upvote
10 (10 / 0)
Sure you could buy AMD, IF all you do is gaming and that's it. But good luck with that if you change your mind later and want compute support that Just Works. I'm not even talking about LLM BS, just basic compute support if you want to try a new Blender recipe, ignite a pyTorch, play with the Hashcat, and so on.

AMD has effectively ceded the mid-to-lower-end compute market to Nvidia. Intel's oneAPI would be viable if you could get the hardware, which you usually can't. AMD is now half-assing it, scrambling for ROCm/HIP support after finally halfway listening to what people have been trying to tell them for years: the RX 9070 still doesn't have full ROCm/HIP support on par with CUDA despite the supposed outreach efforts, and may or may not ever get official full support. (FWIW, Windows has very little presence outside of Blender in the HPC realm, even as a workstation. If it's not a Linux compute station, you're leaving a lot of performance on the table.)

Right now the only game in town is still Nvidia & CUDA, assuming you can actually get your paws on the hardware. Intel is a no-show, and AMD can't seem to get it right.
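To put "Just Works" in concrete terms, here's a minimal PyTorch sketch (assuming a CUDA build on Nvidia or a ROCm build on AMD is already installed) for checking whether GPU compute is actually usable; the whole complaint is about how often the AMD/ROCm path fails at this first hurdle on consumer cards:

```python
# Minimal sketch, not vendor-specific advice: on a ROCm build of PyTorch,
# torch.cuda.* is the same API routed through HIP, so this works on both.
import torch

if torch.cuda.is_available():
    device = torch.device("cuda")
    print("GPU compute available:", torch.cuda.get_device_name(0))
    # torch.version.hip is a version string on ROCm builds, None on CUDA builds
    print("Backend:", "ROCm/HIP" if torch.version.hip else "CUDA")
else:
    device = torch.device("cpu")
    print("No supported GPU found; falling back to CPU")

# Tiny smoke test: a matrix multiply on whichever device we ended up with
x = torch.randn(1024, 1024, device=device)
print("Result lives on:", (x @ x).device)
```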

I share your disdain for these companies putting out 8GB cards, btw. The absolute bare minimum at this point should be nothing less than 12GB VRAM, preferably 16.

I'm at the point where I'm so annoyed with the PC market, when it comes to actually getting hardware in hand, that I'm seriously looking at Macs, since they have very good, seamless compute/rendering support through Metal. Once you add in the premium on scalped hardware, you often come out better with a Mac mini.

I wonder how many people actually end up using all that CUDA stuff in the end. A lot of sites/people will recommend Nvidia even if cheaper AMD cards are faster because of "you might be a youtuber one day!" or "suurre you'll get into 3D modeling any day now". A lot of people are aspiring "content creators" and stuff but it seems weird that "good at non-gaming tasks" trumps "better at games" when talking about hardware specifically made for games. That's like people complaining about electric cars because they might want to drive up a mountain in -30C weather with heavy snow some day.
 
Upvote
12 (12 / 0)

Demento

Ars Legatus Legionis
14,499
Subscriptor
Depends on what you find "acceptable". A lot of games run fine at 8GB as long as you stick to pure raster at mid settings, no DLSS FG, no RT. So as long as you disable the features that are Nvidia's biggest selling points. With these enabled, you won't have enough VRAM even for FHD gameplay.

[two attached screenshots]
You seem to be backing me up that it's perfectly acceptable at "High" for 1440, let alone 1080. Also, please note that VRAM usage is not the same thing as VRAM requirement. Most, if not all, games will happily cache more data than they need if given the extra storage to put it in. There are plenty of examples of games that use more than 8GB on a 12GB card but will still run on 8.

I would agree that there's zero future proofing in an 8GB card, but it's quite clear that you're expected to pay for that "feature". It runs everything current at FHD/QHD just fine now, and probably will for a few years yet, given the install base.
 
Upvote
-7 (0 / -7)

mpat

Ars Tribunus Angusticlavius
6,245
Subscriptor
ATI cares, they were just caught off guard by how badly the nvidia 50xx releases have gone.

Had they known what an opportunity they'd have then maybe they could've locked in more GPU production with TSMC 6-12 months ahead of time.

It isn't ATi, it is AMD - and I'm not (just) being a nitpick here, because it actually matters. AMD reserves production slots for the company as a whole, and they have been open with the fact that their Ryzen CPUs are currently selling better than expected because Intel keeps stumbling. They have likely moved production capacity to the CPU products, as they're on the same production node (4nm TSMC).
 
Upvote
3 (3 / 0)

Fatesrider

Ars Legatus Legionis
22,963
Subscriptor
This may actually be damning to anyone willing to read through the bullshit. With MFG vs single frame gen, and even with the increased core count and faster memory, and with hand-picked benchmark results, they only manage to double framerate? I'm betting actual performance improvements will be very lackluster.
I think context will be an important factor for folks looking at buying a new video card. So the real question will be: compared to what, and at what actual cost?

Cost balances are involved outside of a "replacement". And most folks tend to approach their computers with an "if it ain't broke, don't fix it" attitude.

Ars folks are the notable exception. We're like Oliver Wendell Jones from Bloom County throwing out the six-month-old Banana Jr. 6000 because it had only one floppy drive and is obsolete in the face of the new Banana Jr. 6000e-II, which has two of them.

So, you're not wrong when comparing the newest to the previous latest-and-greatest. But who in their right mind keeps pissing in their computer soup every time the next latest-and-greatest appears, and then has to deal with the associated blood loss from riding technology's cutting edge?

In real life, your comparisons aren't going to apply to anyone other than those who have more money than God, an excellent clotting factor, and an utter indifference to financial scars and pain. (From what I gather, the average Ars reader, for example.) For the rest of the world, this could be the bee's knees by comparison, too, but depending on price and relative value compared to what they already have and what's been released in the last couple of years, it probably isn't something people will be rushing out to buy.

Corporations and other special interests, yes; home/office/business computer users...? Probably not.

I mean, yeah, everyone has their benchmarks and budgets. You're not wrong, either. But so narrowly defining the parameters to what immediately preceded it doesn't really reflect the reality most folks live in. More tellingly, as we start reaching the physical limits of the technology, I don't think the massive improvements in specs we had 20 years ago, when processors doubled in everything every three months, will be happening again with the methodology we have today.

IMHO, that's already the case, so any improvements will likely be increasingly incremental without a technological breakthrough, which, obviously, they haven't had for a bit. And that's usually not worth upgrading for, unless your bottom line massively depends on the fractions of pennies you can earn in a few seconds.
 
Upvote
2 (2 / 0)

googyflip

Smack-Fu Master, in training
4
A few have mentioned the lack of RAM increases through the 4000 and now 5000 series, which I was concerned about, but I found an article on TechRadar (unable to link it for some reason). There's a new feature on the RTX 5000 series called 'RTX Neural Shaders', a technology that helps with texture compression; they report it can save up to 7x the VRAM compared to traditional textures.
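If that claim holds up, a rough back-of-the-envelope shows why it would matter for 8GB cards (the 6GB texture budget below is a number I made up purely for illustration, and real savings depend on games actually shipping assets in the neural format):

```python
# Toy arithmetic only: the texture budget is hypothetical and the 7x
# figure is the marketing claim taken at face value.
traditional_texture_vram_gb = 6.0   # made-up texture budget for one scene
claimed_compression_ratio = 7.0     # "up to 7x" from the report

neural_texture_vram_gb = traditional_texture_vram_gb / claimed_compression_ratio
freed_gb = traditional_texture_vram_gb - neural_texture_vram_gb

print(f"Traditional textures:          {traditional_texture_vram_gb:.1f} GB")
print(f"Neural-compressed textures:    {neural_texture_vram_gb:.2f} GB")
print(f"Headroom freed on an 8GB card: {freed_gb:.2f} GB")
```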
 
Upvote
-7 (0 / -7)
As with its other 50-series announcements, Nvidia is leaning on its DLSS Multi-Frame Generation technology to make lofty performance claims—the GPUs can insert up to three AI-interpolated frames in between each pair of frames that the GPU actually renders. The 40 series could only generate a single frame, and 30-series and older GPUs don't support DLSS Frame Generation at all. This makes apples-to-apples performance comparisons difficult.

This is a really dumb question, but can't you just turn multi-frame generation off, and then you'd have a good apples-to-apples performance comparison? At least from what I've seen, the frame gen doesn't... seem to look that good? So comparing just in terms of raw power seems like a good way to see how much or how little Nvidia GPUs are actually improving over the marketing fluff.
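For a sense of how much the multiplier alone skews the headline numbers, here's a back-of-the-envelope sketch (the 40 fps base rate is hypothetical, not an Nvidia figure):

```python
# Illustrative only: how frame generation multiplies the number on the
# marketing slide without changing how many frames are actually rendered.
rendered_fps = 40              # hypothetical frames the GPU really renders per second

fg_2x_fps = rendered_fps * 2   # 40-series FG: 1 generated frame between each rendered pair
mfg_4x_fps = rendered_fps * 4  # 50-series MFG: up to 3 generated frames between each rendered pair

print(f"Actually rendered:      {rendered_fps} fps")
print(f"40-series FG headline:  {fg_2x_fps} fps")
print(f"50-series MFG headline: {mfg_4x_fps} fps")
# Input latency still tracks the 40 rendered frames in both cases, so a 4x
# bar next to a 2x bar says little about raw generational improvement.
```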
 
Upvote
3 (3 / 0)
I've yet to feel any urge to replace my 1060 personally, it already runs basically everything.
I feel the same about my 1080Ti FE. But I really want to put that in a second PC now, so last week I managed to snag a Sapphire Nitro+ RX 9070 XT for less than I paid for my 1080Ti 8 years ago. It's an overclocked card, so I wasn't expecting it to be at MSRP, but it wasn't much above MSRP. In fact, the price has gone up £85 since I placed my order. Just got to wait until next month for delivery.

I've bought a couple of Radeon cards for specific purposes - an HD3450 in an old HTPC and a couple of Radeon Pro VIIs for their FP64 performance - but I've been buying Nvidia cards for my everyday-use PC for 25 years. But not this time; I just don't like Nvidia very much at the moment.

I'd pay MSRP for a 5090, and the performance would help me forget how much I don't like Nvidia at the moment. But you can't buy one at MSRP, and I won't pay the scalpers.
 
Upvote
4 (4 / 0)
I'm not sure how I feel anymore about NV. They are a heavy supporter/enabler of LLMs and LLMs seem to be increasingly used by fascist governments for propaganda and oppression.

It's really odd. I've been in the field for years, and GPUs are probably my favorite computer HW. But it is getting harder to keep seeing them as a toy as they are weaponized.
Ho you triggered the nerds 🤣
 
Upvote
-11 (0 / -11)

sword_9mm

Ars Legatus Legionis
24,113
Subscriptor
This is the way, imo. I just bought Doom 2016 for £1.99. Looking forward to playing it at 60 FPS on Ultra with my 1060 potato.

After that maybe Alien: Isolation (£3.49).

That's what a Steam Deck is for in my world.

My PC is still rocking a 970. Hasn't been booted in over a year.
 
Upvote
3 (3 / 0)

SportivoA

Ars Scholae Palatinae
779
This is a really dumb question, but can't you just turn multi-frame generation off, and then you'd have a good apples-to-apples performance comparison? At least from what I've seen, the frame gen doesn't... seem to look that good? So comparing just in terms of raw power seems like a good way to see how much or how little Nvidia GPUs are actually improving over the marketing fluff.
The third-party reviewers can. You can't trim out the feature from Nvidia's PR numbers that are the only thing available in advance of launch. You know, in advance of 98% of stock being sold out.
 
Upvote
4 (4 / 0)

TylerH

Ars Praefectus
3,874
Subscriptor
LLMs and GPUs are both just tools. You can build a house with a hammer, or bonk someone on the head with it.
This analogy would be more apt if hammers, by design, were all built using stolen labor and materials.

LLMs--at least all the ones of note--are inherently bad things because they're all based on massive IP theft, orders of magnitude more than a single person could ever possibly achieve. Teenagers downloaded two songs on LimeWire and got sued for millions of dollars by the RIAA. Aaron Swartz got threatened with ~35 years in prison for downloading a fraction of a fraction of the amount of content that even the smallest LLMs are based on (and he gave it back).

It's the same line of reasoning as "one death is a tragedy. A million deaths is a statistic".
 
Upvote
1 (5 / -4)

TylerH

Ars Praefectus
3,874
Subscriptor
Obviously, it remains to be seen whether the company and its partners can actually stock these cards at these prices. GPUs from the top-tier RTX 5090 to the mainstream RTX 5070 have been difficult to impossible to buy at their announced MSRPs.

A solution seems obvious to me, both for Nvidia itself (should it choose to sell directly to consumers) and for marketplaces/resellers, which presumably have bulk B2B agreements directly with Nvidia to get cards that aren't price-scalped: institute a restriction of one card per checkout, and only one purchase per credit card, billing address, or shipping address, for the first 6 months to a year after release. Throw in "per account", too, for sites that require you to have an account to buy stuff (which is most of them these days).
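A minimal sketch of what that dedupe could look like on a storefront backend (everything here is hypothetical: the field names, the keys, and the one-card cap are just to show the idea):

```python
# Hypothetical launch-window purchase limiter; not any real store's API.
from dataclasses import dataclass

@dataclass(frozen=True)
class Order:
    account_id: str
    card_fingerprint: str   # hashed card number from the payment processor
    billing_address: str
    shipping_address: str

# Keys that have already been used to buy a launch-window card
seen_keys = set()

def allow_purchase(order: Order) -> bool:
    """Allow at most one card per account, payment card, or address."""
    keys = [
        ("account", order.account_id),
        ("card", order.card_fingerprint),
        ("billing", order.billing_address),
        ("shipping", order.shipping_address),
    ]
    if any(key in seen_keys for key in keys):
        return False
    seen_keys.update(keys)
    return True

# Example: a second order reusing the same shipping address is refused
first = Order("alice", "fp-123", "1 Main St", "1 Main St")
scalper = Order("alice2", "fp-456", "99 Other Rd", "1 Main St")
print(allow_purchase(first))    # True
print(allow_purchase(scalper))  # False
```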

Of course, that requires Nvidia and those marketplaces/resellers to actually care about end users rather than just their bottom line.
 
Upvote
2 (2 / 0)
I would be surprised if the 5060 matched the 4070. It would be fairly pathetic if the 5060Ti 16GB can't beat it.

I wasn't expecting anything exciting from NV this generation, especially this far down the stack. I hope others aren't either.
[attached: relative performance chart at 2560x1440]


Yeah.
 
Upvote
4 (4 / 0)

Jivejebus

Ars Scholae Palatinae
705
Subscriptor++
And yet will run the vast majority of the latest releases at high settings and 4K resolution - such as The Last Of Us Part 2.

If you are insulted by it, there are larger capacity cards available. I have no idea why people act like the base configuration is the only configuration.
Turn on ray tracing - you know, the thing Nvidia cards are supposed to be better at and sold on - and you can see 8GB clearly isn't enough.
[two attached benchmark screenshots]
 
Upvote
7 (7 / 0)

fluctuationEM

Smack-Fu Master, in training
75
Subscriptor
In June 2023, two months after the 4070 launch, I was able to buy one with Diablo 4 bundled (lol) for $550 on Newegg. Despite all its flaws, I played about a hundred hours of Diablo 4.

In June 2025, two months after the 5060 Ti launch, I probably won't be able to buy one for $550 (the Asus TUF has an MSRP of $600). Despite being a newer generation, it's objectively slower than the 4070 while consuming basically the same amount of power.

What a sad state for gaming GPUs.
 
Upvote
2 (3 / -1)
The Steam Hardware Survey would beg to differ. There are almost as many 1060s in active use as there are 2060s, 3070s or 4070s.
Yeah, that's not really what the Steam Hardware Survey shows...
  • Total for all RTX series: ~55.1%
    • RTX 50 Series: 0.20% (March survey, so hardly any yet)
    • RTX 40 Series: ~24.4%
    • RTX 30 Series: ~22.1%
    • RTX 20 Series: ~8.4%
  • Total for all GTX series: ~27.8% (GTX 10 + GTX 16: ~21.0%)
    • GTX 16 Series: ~5.3%
    • GTX 10 Series: ~15.7%
    • Older GTX (900 series and older): ~6.8%
RTX 40 and RTX 30 each on their own are more than GTX 10 and GTX 16 combined.
 
Upvote
-1 (0 / -1)

zaghahzag

Ars Scholae Palatinae
741
Subscriptor++
The third-party reviewers can. You can't trim out the feature from Nvidia's PR numbers that are the only thing available in advance of launch. You know, in advance of 98% of stock being sold out.
The higher-tier cards (5070, 5070 Ti, etc.) show a very marginal improvement over the previous generation, especially if you take into account that they use more power.
The 5060 and 5060 Ti will be the same story.
 
Upvote
1 (1 / 0)

TylerH

Ars Praefectus
3,874
Subscriptor
Despite being a newer generation, [the 5060 Ti is] objectively slower than the 4070 while consuming basically the same amount of power.
Well, yeah, it's a whole trim level lower (OK, half a trim level, since it's a Ti). But also no: the 4070 consumes 20W more power and is 1-2 feature versions behind what the 5xxx series offers (DLSS and so on). And also no: the 5060 Ti has a core clock 52MHz higher than the 4070's. It has 4GB more VRAM and an 8GHz faster effective memory speed than the 4070, too.

So, make up your mind, please. Are we talking objectively here or are we hand-waving with "basically" to get the argument we want? If you wanna compare benchmarks, OK, just do that instead of making inaccurate statements.
 
Upvote
-7 (0 / -7)

Jivejebus

Ars Scholae Palatinae
705
Subscriptor++
Well, yeah, it's a whole trim level lower (OK, half a trim level, since it's a Ti). But also no: the 4070 consumes 20W more power and is 1-2 feature versions behind what the 5xxx series offers (DLSS and so on). And also no: the 5060 Ti has a core clock 52MHz higher than the 4070's. It has 4GB more VRAM and an 8GHz faster effective memory speed than the 4070, too.

So, make up your mind, please. Are we talking objectively here or are we hand-waving with "basically" to get the argument we want? If you wanna compare benchmarks, OK, just do that instead of making inaccurate statements.
The only feature the 5060 Ti gets is DLSS multi fake frames, and that's while it loses support for 32-bit PhysX. And it's also objectively slower than the 4070. 20W is basically nothing when you're talking about a whole computer + monitor. In an alternate reality where Nvidia wasn't making truckloads of money selling AI chips, this card would be a 5050 and under $200.
 
Upvote
4 (5 / -1)
You seem to be backing me up that it's perfectly acceptable at "High" for 1440, let alone 1080. Also, please note that VRAM usage is not the same thing as VRAM requirement. Most, if not all, games will happily cache more data than they need if given the extra storage to put it in. There are plenty of examples of games that use more than 8GB on a 12GB card but will still run on 8.
Offloading data to CPU RAM, depending on the engine, means either texture or mesh pop-in super late or stutters with low single-digit frame rates. In either case it's a horrible gaming experience.
 
Upvote
4 (4 / 0)
The only feature the 5060 Ti gets is DLSS multi fake frames, and that's while it loses support for 32-bit PhysX. And it's also objectively slower than the 4070.
The 5060 Ti 16GB sits somewhere between the 4070 and 4070 Super in raw performance.

The 5060 Ti 8GB is obviously slower than the 4070, because at high settings it runs out of VRAM.

The whole thing is moot, because between tariffs and scalpers, it's not even remotely worth buying at retail price (MSRP is pure fiction).
 
Upvote
-4 (0 / -4)

Jivejebus

Ars Scholae Palatinae
705
Subscriptor++
The 5060 Ti 16GB sits somewhere between the 4070 and 4070 Super in raw performance.

The 5060 Ti 8GB is obviously slower than the 4070, because at high settings it runs out of VRAM.

The whole thing is moot, because between tariffs and scalpers, it's not even remotely worth buying at retail price (MSRP is pure fiction).
It sure as hell doesn't sit between the 4070 and 4070 Super. It's slower than a 4070; just scroll up and take a look at the average FPS performance graph from TechPowerUp.
 
Upvote
0 (2 / -2)
It sure as hell doesn't sit between the 4070 and 4070 Super. It's slower than a 4070; just scroll up and take a look at the average FPS performance graph from TechPowerUp.
You mean the performance graphs where you've intentionally omitted 4K? That's where the 4070 falls flat on its face, because of its lame 12GB VRAM.

[attached 4K benchmark chart]
 
Upvote
-4 (0 / -4)

Jivejebus

Ars Scholae Palatinae
705
Subscriptor++
You mean the performance graphs where you've intentionally omitted 4K? That's where the 4070 falls flat on its face, because of its lame 12GB VRAM.

[attached 4K benchmark chart]
Congratulations on finding the one game in the TPU test suite where the 5060 Ti is faster than the 4070. Literally the definition of cherry-picking.

ETA: Look, the 7900XTX is faster than the 4090.

[attached benchmark chart]
 
Upvote
4 (5 / -1)

Zillerzellerzoller

Seniorius Lurkius
28
Subscriptor
It's just sad at this point that the main demographic which powered NVIDIA for so long is now simply an afterthought to the bottom line for data centers and AI R&D.

I hope Intel continues investing in their ARC platform, and AMD has quite an opening now to disrupt. We desperately need competition and I hope other companies realize they don't want to be beholden to a single supplier for critical components.

I guess overall these past few years, tech doesn't feel as fun as it used to.
 
Upvote
3 (4 / -1)

Demento

Ars Legatus Legionis
14,499
Subscriptor
Offloading data to CPU RAM, depending on the engine, means either texture or mesh pop-in super late or stutters with low single-digit frame rates. In either case it's a horrible gaming experience.
Depends on the game there. Poorly written ones, yeah, you turn a setting or two down a notch to fix it. Well-written games handle it with aplomb. Cyberpunk is actually really good at it. Hell, I can even turn on some RT options and not have it stutter or do pop-in. That kind of surprised me; you wouldn't think a 3060 Ti had a lot of RT oomph to it.
 
Upvote
-2 (1 / -3)