> After months of frustration trying to get the Intel ARC B580 at MSRP (or at all), I finally tapped out and bought a last-gen gaming laptop that was on a close-out sale. It feels like GPU vendors just don't care about stock or prices these days.

ATI cares; they were just caught off guard by how badly the Nvidia 50xx launches have gone. Had they known what an opportunity they'd have, maybe they could've locked in more GPU production with TSMC 6-12 months ahead of time.
> It is insane to me that Nvidia continues to be so stingy with VRAM. I would've thought the bare-minimum, lowest-end cards would at least be at 12GB, but even that seems a bit too low for 2025. 16GB seems more reasonable.
>
> AMD doesn't seem to be as stingy (though they did launch 8GB cards last gen too), so could it be a supply-chain constraint? Outside of artificially limiting these cards' capabilities to encourage people to upgrade sooner, I fail to understand what the deal is with sticking to 8GB.

Greed. Nvidia wants to sell as many AI cards as it can, for the most money it can. But Nvidia fears that a 5060 with more than 8GB of RAM becomes "useful enough" to AI customers that, instead of spending $$$$ on AI cards, they only spend $$$ on video cards. So, to maximise profit in a segmented market, it cripples the video card, and gamers just have to put up with it, because video cards are a $100M/year business and AI is a $1B-10B/year business.
> ...RTX 5060 Ti has 8GB or 16GB for $379 and $429.

Preposterous! I can go to my local shop and purchase an 8GB thumb drive for a button and a length of wire! Nvidia are fleecing us! I blame Roger Dell for this outrageous pricing!
> It is insane to me that Nvidia continues to be so stingy with VRAM. I would've thought the bare-minimum, lowest-end cards would at least be at 12GB, but even that seems a bit too low for 2025. 16GB seems more reasonable.
>
> AMD doesn't seem to be as stingy (though they did launch 8GB cards last gen too), so could it be a supply-chain constraint? Outside of artificially limiting these cards' capabilities to encourage people to upgrade sooner, I fail to understand what the deal is with sticking to 8GB.

The difference is that AMD targeted the performance of its 8GB cards at the 1080p class, while Nvidia's 8GB cards are in the 1440p performance class, where the VRAM matters more.
Sure, you could buy AMD, IF all you do is gaming and that's it. But good luck with that if you change your mind later and want compute support that Just Works. I'm not even talking about LLM BS, just basic compute support if you want to try a new Blender recipe, ignite a PyTorch, play with Hashcat, and so on.

AMD has effectively ceded the mid-to-low-end compute market to Nvidia. Intel's oneAPI would be viable if you could get the hardware, which you usually can't. AMD is now half-assing it, scrambling for ROCm/HIP support after finally halfway listening to what people have been trying to tell them for years: the RX 9070 still doesn't have ROCm/HIP support on par with CUDA, despite the supposed outreach efforts, and may or may not ever get official full support. (FWIW, Windows has very little presence outside of Blender in the HPC realm, even as a workstation OS. If it's not a Linux compute station, you're leaving a lot of performance on the table.)

Right now the only game in town is still Nvidia and CUDA, assuming you can even get your paws on the hardware. Intel is a no-show, and AMD can't seem to get it right.

I share your disdain for these companies putting out 8GB cards, btw. The absolute bare minimum at this point should be 12GB of VRAM, preferably 16.

I'm at the point of being so annoyed with the PC market, when it comes to actually getting hardware in hand, that I'm seriously looking at Macs, since they have very good, seamless compute/rendering support through Metal. Once you add in the premium on scalped hardware, you often come out better with a Mac Mini.
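To make the "Just Works" point concrete, here's a minimal sketch of the kind of backend probe where a CUDA box sails through and everything else gets interesting. It assumes a stock PyTorch install; on AMD you need the separate ROCm build of PyTorch in the first place, which is rather the point:

```python
import torch

def pick_device() -> torch.device:
    """Pick the best available compute backend, falling back to CPU."""
    if torch.cuda.is_available():
        # True on CUDA builds, and also on ROCm builds (HIP is exposed
        # through the same torch.cuda API) -- if you got that far.
        backend = "ROCm/HIP" if torch.version.hip else "CUDA"
        print(f"GPU via {backend}: {torch.cuda.get_device_name(0)}")
        return torch.device("cuda")
    if torch.backends.mps.is_available():
        # Apple Silicon path: Metal Performance Shaders backend.
        print("GPU via MPS/Metal")
        return torch.device("mps")
    print("No GPU backend found; falling back to CPU")
    return torch.device("cpu")

x = torch.randn(1024, 1024, device=pick_device())
print((x @ x).sum().item())
```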
> Depends on what you find "acceptable". A lot of games run fine at 8GB as long as you stick to pure raster at mid settings, with no DLSS FG and no RT. So, as long as you disable the features that are Nvidia's biggest selling points. With these enabled, you won't have enough VRAM even for FHD gameplay.

You seem to be backing me up that it's perfectly acceptable at "High" for 1440, let alone 1080. Also, please note that VRAM usage is not the same thing as VRAM requirement. Most, if not all, games will happily cache more data than they need if given the extra storage to put it in. There are plenty of examples of games that use more than 8GB on a 12GB card but will still run on 8.
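On that usage-versus-requirement point: the number an overlay reports is allocation, not need. Here's a minimal sketch of reading it yourself, assuming the nvidia-ml-py package (imported as pynvml; Nvidia-only, so it won't help on an AMD card):

```python
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)

# "used" is total VRAM allocated across all processes -- including
# opportunistic caching -- not the minimum a game needs to run well.
print(f"VRAM: {mem.used / 2**30:.1f} GiB used of {mem.total / 2**30:.1f} GiB")
pynvml.nvmlShutdown()
```

A game showing 10 GiB "used" on a 12GB card by this measure may still run happily on an 8GB card with a smaller cache.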
> > As with its other 50-series announcements, Nvidia is leaning on its DLSS Multi-Frame Generation technology to make lofty performance claims: the GPUs can insert up to three AI-interpolated frames in between each pair of frames that the GPU actually renders. The 40 series could only generate a single frame, and 30-series and older GPUs don't support DLSS Frame Generation at all. This makes apples-to-apples performance comparisons difficult.
>
> This may actually be damning to anyone willing to read through the bullshit. With MFG vs. single-frame generation, and even with the increased core count and faster memory, and with hand-picked benchmark results, they only manage to double the framerate? I'm betting actual performance improvements will be very lackluster.

I think context will be an important factor for folks looking at buying a new video card. So the real question will be: compared to what, and at what actual cost?
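For anyone wondering why that makes comparisons hard, here's the back-of-the-envelope math (the numbers below are illustrative, not Nvidia's): with 4x MFG, three of every four displayed frames are interpolated, so the displayed rate roughly quadruples while input latency still tracks the rendered rate.

```python
def displayed_fps(rendered_fps: float, gen_factor: int) -> float:
    """Displayed frame rate with frame generation enabled.

    gen_factor = frames shown per rendered frame: 1 = off,
    2 = single frame gen (40 series), 4 = MFG (50 series).
    Ignores generation overhead, so treat it as an upper bound.
    """
    return rendered_fps * gen_factor

base = 30.0  # identical raw render rate on both cards (illustrative)
print(displayed_fps(base, 2))  # 40-series frame gen ->  60.0 "fps"
print(displayed_fps(base, 4))  # 50-series MFG       -> 120.0 "fps"
# A 2x win on the bar chart with zero change in rendered frames
# or input responsiveness.
```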
> I've yet to feel any urge to replace my 1060 personally; it already runs basically everything.

I feel the same about my 1080Ti FE. But I really want to put that in a second PC now, so last week I managed to snag a Sapphire Nitro+ RX 9070 XT for less than I paid for my 1080Ti eight years ago. It's an overclocked card, so I wasn't expecting MSRP, but it wasn't much above MSRP. In fact, the price has gone up £85 since I placed my order. Now I've just got to wait until next month for delivery.
> I'm not sure how I feel anymore about NV. They are a heavy supporter/enabler of LLMs, and LLMs seem to be increasingly used by fascist governments for propaganda and oppression.
>
> It's really odd. I've been in the field for years, and GPUs are probably my favorite computer hardware. But it's getting harder to keep seeing them as a toy as they are weaponized.

Oh, you triggered the nerds.
This is the way, imo. I just bought Doom 2016 for £1.99. Looking forward to playing it at 60 FPS on Ultra with my 1060 potato.
After that maybe Alien: Isolation (£3.49).
> This is a really dumb question, but can't you just turn multi-frame generation off, and then you'd have a good apples-to-apples performance comparison? At least from what I've seen, the frame gen doesn't... seem to look that good? So comparing just in terms of raw power seems like a good way to see how much, or how little, Nvidia GPUs are actually improving beyond the marketing fluff.

The third-party reviewers can. You can't trim the feature out of Nvidia's PR numbers, which are the only thing available in advance of launch. You know, in advance of 98% of stock being sold out.
> LLMs and GPUs are both just tools. You can build a house with a hammer, or bonk someone on the head with it.

This analogy would be more apt if hammers, by design, were all built using stolen labor and materials.
> I'll believe those prices when I see it.

Pricing? lol. Stock? Not anymore! Low-RAM versions? Not reviewed!
I would be surprised if the 5060 matched the 4070. It would be fairly pathetic if the 5060Ti 16GB can't beat it.
I wasn't expecting anything exciting from NV this generation, especially this far down the stack. I hope others aren't either.
About what I expected.
Yeah.
> And yet it will run the vast majority of the latest releases at high settings and 4K resolution, such as The Last of Us Part 2. If you are insulted by it, there are larger-capacity cards available. I have no idea why people act like the base configuration is the only configuration.

Turn on ray tracing, you know, the thing Nvidia cards are supposed to be better at and are sold on, and you can see 8GB clearly isn't enough.
> The Steam Hardware Survey would beg to differ. There are almost as many 1060s in active use as there are 2060s, 3070s or 4070s.

Yeah, that's not really what the Steam Hardware Survey shows...
> The third-party reviewers can. You can't trim the feature out of Nvidia's PR numbers, which are the only thing available in advance of launch. You know, in advance of 98% of stock being sold out.

The higher-tier cards (5070, 5070 Ti, etc.) show a very marginal improvement over the previous generation, especially if you take into account that they use more power.
> Despite being a newer generation, [the 5060 Ti is] objectively slower than the 4070 while consuming basically the same amount of power.

Well, yeah, it's a whole trim level lower (OK, half a trim level, since it's a Ti). But also no: the 4070 consumes 20W more power and is one or two feature versions behind what the 5xxx series offers (e.g. DLSS). And also no: the 5060 Ti's core clock is 52MHz higher than the 4070's, it has 4GB more VRAM, and its effective memory speed is 8GHz faster.
> Well, yeah, it's a whole trim level lower (OK, half a trim level, since it's a Ti). But also no: the 4070 consumes 20W more power and is one or two feature versions behind what the 5xxx series offers (e.g. DLSS). And also no: the 5060 Ti's core clock is 52MHz higher than the 4070's, it has 4GB more VRAM, and its effective memory speed is 8GHz faster.

The only feature the 5060 Ti gets is DLSS multi fake frames, and that's while it loses support for 32-bit PhysX. And it's also objectively slower than the 4070. 20W is basically nothing when you're talking about a whole computer plus monitor. In an alternate reality where Nvidia wasn't making truckloads of money selling AI chips, this card would be a 5050 and under $200.

So make up your mind, please. Are we talking objectively here, or are we hand-waving with "basically" to get the argument we want? If you want to compare benchmarks, OK, just do that instead of making inaccurate statements.
> You seem to be backing me up that it's perfectly acceptable at "High" for 1440, let alone 1080. Also, please note that VRAM usage is not the same thing as VRAM requirement. Most, if not all, games will happily cache more data than they need if given the extra storage to put it in. There are plenty of examples of games that use more than 8GB on a 12GB card but will still run on 8.

Offloading data to CPU RAM means, depending on the engine, either texture or mesh pop-in arriving super late, or stutters down to low single-digit frame rates. In either case it's a horrible gaming experience.
> The only feature the 5060 Ti gets is DLSS multi fake frames, and that's while it loses support for 32-bit PhysX. And it's also objectively slower than the 4070.

The 5060 Ti 16GB sits somewhere in the middle between the 4070 and the 4070 Super in raw performance.
> The 5060 Ti 16GB sits somewhere in the middle between the 4070 and the 4070 Super in raw performance.

It sure as hell doesn't sit between the 4070 and the 4070 Super. It's slower than a 4070; just scroll up and take a look at the average-FPS performance graph from TechPowerUp.

The 5060 Ti 8GB is obviously slower than the 4070, because at high settings it runs out of VRAM.

The whole thing is moot anyway, because between tariffs and scalpers it's not even remotely worth buying at retail price (MSRP is pure fiction).
> It sure as hell doesn't sit between the 4070 and the 4070 Super. It's slower than a 4070; just scroll up and take a look at the average-FPS performance graph from TechPowerUp.

You mean the performance graphs where you've intentionally omitted 4K? That's where the 4070 falls flat on its face, because of its lame 12GB of VRAM.
> You mean the performance graphs where you've intentionally omitted 4K? That's where the 4070 falls flat on its face, because of its lame 12GB of VRAM.

Congratulations on finding the one game in the TPU test suite where the 5060 Ti is faster than the 4070. That is literally the definition of cherry-picking.
> Offloading data to CPU RAM means, depending on the engine, either texture or mesh pop-in arriving super late, or stutters down to low single-digit frame rates. In either case it's a horrible gaming experience.

Depends on the game there. Poorly written? Yeah, you turn a setting or two down a notch to fix it. Well-written games handle it with aplomb. Cyberpunk is actually really good at it. Hell, I can even turn on some RT options and not have it stutter or pop in. That kind of surprised me; you wouldn't think a 3060 Ti had a lot of RT oomph to it.
> Might be time to replace this 2060 Super... OK, given the downvotes, maybe I'll wait for the 6060.

Go 9060. Nvidia needs to be taught a lesson about this. They literally rolled out garbage with this generation. AMD is actually trying with RDNA 4.
> fake frames

And there's a plonk.