Review: Nvidia’s GeForce RTX 5090 is the first GPU that can beat the RTX 4090

AltoClefScience

Ars Centurion
288
Subscriptor
So the follow-up to the 4090, the 5090, has better performance than its predecessor? The article was informative, but the title needs work.

It’s essentially saying water is wet.
Seriously. Talk about damning with faint praise... In case the A/B headline testing changes things, we're both reacting to this headline:

Nvidia’s GeForce RTX 5090 is the first GPU that can beat the RTX 4090

 
Upvote
73 (80 / -7)

Alyeska

Ars Tribunus Angusticlavius
6,598
Subscriptor++
So the follow-up to the 4090, the 5090, has better performance than its predecessor? The article was informative, but the title needs work.

It’s essentially saying water is wet.

I was just about to post an almost identical comment. To quote The Joker, very poor choice of words.
 
Upvote
-4 (14 / -18)

Embattle

Ars Scholae Palatinae
1,447
It is unquestionably a great card in every way we would have expected, such as raw performance; it is also undoubtedly bad in all the ways we expected in terms of price and power consumption.

I have a 7900 XTX and it still far exceeds my requirements, not that I would spend nearly double for the 5090.
 
Upvote
53 (56 / -3)

Embattle

Ars Scholae Palatinae
1,447
So the follow-up to the 4090, the 5090, has better performance than its predecessor? The article was informative, but the title needs work.

It’s essentially saying water is wet.

There really isn't a great deal more that could be said in my opinion, unless somehow it turned out slower.
 
Upvote
54 (54 / 0)

HiroTheProtagonist

Ars Praefectus
5,822
Subscriptor++
It is unquestionably a great card in every way we would have expected, such as raw performance; it is also undoubtedly bad in all the ways we expected in terms of price and power consumption.

I have a 7900 XTX and it still far exceeds my requirements, not that I would spend nearly double for the 5090.
The only thing that has me even thinking about the 50xx series is potential raytracing gains. My 3080 Ti is still plenty powerful for 4K/120fps, but the few times I've tried running raytracing at those resolutions, it feels like I may as well just be using the integrated graphics on my CPU.

So it is basically an SLI version of a 4090. I am assuming that is why they got rid of SLI: it made it too easy for people to scale up performance themselves.
SLI dying was more a result of developers not wanting to optimize games for it since it added more overhead, and while there were performance gains to be had, it often wasn't as effective as just buying a better single GPU.
 
Upvote
85 (85 / 0)
The only thing that has me even thinking about the 50xx series is potential raytracing gains. My 3080 Ti is still plenty powerful for 4K/120fps, but the few times I've tried running raytracing at those resolutions, it feels like I may as well just be using the integrated graphics on my CPU.


SLI dying was more a result of developers not wanting to optimize games for it since it added more overhead, and while there were performance gains to be had, it often wasn't as effective as just buying a better single GPU.
It also makes things like TAA very hard to do, since you need access to the previous frames, but in SLI, most of the time one GPU would have one frame while the other GPU was drawing the next.
 
Upvote
49 (49 / 0)

Dzov

Ars Legatus Legionis
14,704
Subscriptor++
What the hell is the point of a card that's 30% faster while costing 30% more and using 30% more power?
It's faster. People used to pay a huge premium for even less additional speed. If you don't need it, don't buy it. I'm looking forward to the 5080 series myself.
 
Upvote
97 (111 / -14)

zealotpewpewpew

Wise, Aged Ars Veteran
152
SLI dying was more a result of developers not wanting to optimize games for it since it added more overhead, and while there were performance gains to be had, it often wasn't as effective as just buying a better single GPU.
More than just that, once reviewers started benchmarking more than just average FPS, it became apparent that SLI's gains came at the cost of latency and consistency. So, not only was it rarely worth it in practice to buy a second old card instead of one new card, and not only did it require some level of developer support to work well, but it was just functionally worse than one fast GPU.

Which really makes me sad because I loved the ridiculous excess of quad-GPU katamaris all but belching flames out the exhaust, but, now that I could afford such a thing, there's no reason for it. My money-is-no-object rig of pointless excess has a single 4090 in it.
 
Upvote
118 (118 / 0)
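The latency-and-consistency point above can be put in numbers with a toy model. This is a made-up illustration (the frame intervals are invented, not measured): alternate-frame-rendering SLI can nearly double average FPS while delivering frames in uneven bursts, which is the micro-stutter reviewers started catching once they measured frame times instead of just average FPS.

```python
# Toy sketch (not a benchmark): why alternate-frame-rendering SLI can raise
# average FPS while worsening frame pacing. All numbers are illustrative.

def frame_stats(intervals_ms):
    """Average FPS and pacing spread for a list of frame intervals (ms)."""
    avg_ms = sum(intervals_ms) / len(intervals_ms)
    spread = max(intervals_ms) - min(intervals_ms)
    return 1000.0 / avg_ms, spread

# Single GPU: steady 20 ms per frame (50 fps, perfectly even pacing).
single = [20.0] * 8

# AFR SLI: two GPUs alternate frames. Throughput nearly doubles, but frames
# arrive in uneven pairs, e.g. 6 ms then 16 ms (micro-stutter).
sli = [6.0, 16.0] * 4

fps_single, spread_single = frame_stats(single)
fps_sli, spread_sli = frame_stats(sli)

print(f"single GPU: {fps_single:.0f} fps, pacing spread {spread_single:.0f} ms")
print(f"AFR SLI:    {fps_sli:.0f} fps, pacing spread {spread_sli:.0f} ms")
```

The average FPS nearly doubles, but the frame-to-frame interval swings by 10 ms, which the eye reads as stutter even though the FPS counter looks great.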

accantant

Ars Centurion
314
Subscriptor++
I'm never, ever, ever going to spend $2k on a GPU but, damn if that isn't a good looking card.
Yeah, it's a fascinating piece of engineering that I have no intention to ever buy.
I saw a deep-dive on how the cooler and board were designed, and it told me Nvidia does at least spend some of its enormous piles of cash on great engineers.
 
Upvote
88 (89 / -1)

Marcus Andreus

Ars Scholae Palatinae
834
Subscriptor
So the follow-up to the 4090, the 5090, has better performance than its predecessor? The article was informative, but the title needs work.

It’s essentially saying water is wet.
I mean, given what we saw with CPUs last year, specifying that it is in fact notably faster than its predecessor isn't totally irrelevant.

That said, it really feels like the 5080 will be a much better value, as much as there is value at this performance level. Looking at the various reviews, I can't conceive of a way that the 5090 could be twice as fast as a 5080. Hell, it's not twice as fast as a 4080.
 
Upvote
48 (48 / 0)

dmsilev

Ars Tribunus Angusticlavius
6,128
Subscriptor
It’s kind of crazy how much larger this GPU is than the 4090 (in core count, area, and power budget) compared to its uplift. They must be hitting some sort of scalability wall in their design.
Seems to be power limited. 25-30% more performance, 25-30% more power consumption. I vaguely recall reading that both generations were made on the same TSMC process node, so I guess that shouldn’t be too much of a shock.
 
Upvote
32 (32 / 0)
So the follow-up to the 4090, the 5090, has better performance than its predecessor? The article was informative, but the title needs work.

It’s essentially saying water is wet.

Seems to be power limited. 25-30% more performance, 25-30% more power consumption. I vaguely recall reading that both generations were made on the same TSMC process node, so I guess that shouldn’t be too much of a shock.


Other reviews show that if you undervolt it, you can get roughly a 20-25% improvement over the 4090 in both raster and RT performance at the same power draw.

With other "OC" reviews, they bump the power limit up by 5-10%, hit 600+ W power draw, and barely get a 1-2% improvement in performance...

Judging from all the reviews, it's more CPU-limited at 1440p and below than anything else...
 
Last edited:
Upvote
45 (46 / -1)
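The undervolting claim above is easy to sanity-check with back-of-envelope perf-per-watt arithmetic. The numbers below are assumptions pulled loosely from this thread (4090 at 450 W as the 1.0 baseline, 5090 stock at ~575 W and +30% performance, undervolted 5090 at ~450 W keeping a +22% uplift); real figures vary by review, game, and silicon sample.

```python
# Back-of-envelope perf-per-watt check using rough figures quoted in this
# thread. Illustrative assumptions only, not measured data.

def perf_per_watt(relative_perf, watts):
    """Relative performance (RTX 4090 = 1.0) divided by board power."""
    return relative_perf / watts

eff_4090       = perf_per_watt(1.00, 450)  # assumed baseline
eff_5090_stock = perf_per_watt(1.30, 575)  # ~30% faster, ~28% more power
eff_5090_uv    = perf_per_watt(1.22, 450)  # undervolted to ~4090 power

print(f"5090 stock vs 4090 efficiency:     {eff_5090_stock / eff_4090:.2f}x")
print(f"5090 undervolt vs 4090 efficiency: {eff_5090_uv / eff_4090:.2f}x")
```

Under these assumptions, the stock 5090 is barely more efficient than the 4090 (~2%), while the undervolted card delivers its whole uplift as an efficiency gain, which matches the "same node, power-limited" reading a few posts up.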

bernstein

Ars Scholae Palatinae
643
So DLSS FG can be useful for turning 120 fps into 240 fps, or even 60 fps into 120 fps. But it's not as helpful if you're trying to get from 20 or 30 fps up to a smooth 60 fps.
In other words: FG (frame generation) is only suitable for HFR (high-frame-rate) gaming.

Useless for gamers who never felt the need to upgrade from consoles and devices with iGPUs (laptops, handhelds, etc.). This includes every version of the non-Pro iPhone and iPad, and almost all TVs and monitors sold.
 
Upvote
1 (8 / -7)
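The "FG is only for high frame rates" point follows from simple frame-time arithmetic. Here's a deliberately simplified model (an assumption, not Nvidia's actual pipeline): interpolation has to hold back one rendered frame so it can blend between two real frames, so perceived input latency tracks the base frame time, roughly doubled, no matter how many frames are shown.

```python
# Simplified model of interpolated frame generation: output frame rate is
# multiplied, but input latency stays tied to the *base* frame time because
# the interpolator must buffer one real frame. Illustrative only.

def with_frame_gen(base_fps, multiplier):
    base_ms = 1000.0 / base_fps
    output_fps = base_fps * multiplier
    # Shown frames sit between two real frames, so the pipeline waits
    # roughly one extra base frame before display (simplified).
    latency_ms = base_ms * 2
    return output_fps, latency_ms

for base in (30, 120):
    out_fps, lat = with_frame_gen(base, 2)
    print(f"{base} fps base -> {out_fps:.0f} fps shown, ~{lat:.0f} ms latency")
```

Under this model, 120 fps doubled to 240 fps still feels like ~17 ms of lag, while 30 fps doubled to 60 fps carries ~67 ms: smooth-looking but sluggish, which is exactly the quoted article's point.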
I agree that Nvidia has used MFG to be deceptive about the performance of the new series. And I know a lot of gamers are complaining about fake frames. But if you're not a professional competitive gamer and input lag isn't super important, then MFG shouldn't be too worrisome, and, in fact, holds some promise for a lot of gamers.

MFG is something that needs to be negotiated: adjust your settings so that you're getting a decent traditional game-engine-rendered framerate that meets your needs in terms of low latency, then turn on DLSS 4 with however many interpolated additional frames you want. I think it's a great option.

I can also see the writing on the wall here. This is going to be the future. If you insist on a zillion fps, then AI can get you there; and Nvidia can provide it. And future titles are going to support it.

I think it's impressive tech.
 
Upvote
11 (21 / -10)
In the abstract it looks impressive, but OTOH it's got 30% more cores, so it would be a shock if it didn't perform 30% better and drink 30% more power. I mean, other than the frame generation stuff, you'd expect the same performance from a hypothetical RTX 4095 if it had 21760 cores.

I feel like the most impressive advancement in this card isn't the graphics performance, it's the ability to handle more heat with a smaller cooler compared to the previous generation.
 
Upvote
63 (68 / -5)
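The "it's just more cores" observation above checks out against the published CUDA core counts (16384 on the 4090, 21760 on the 5090, the latter quoted in the post). Naive linear scaling, which ignores clocks, memory bandwidth, and architecture, already predicts most of the measured uplift:

```python
# Sanity check: naive linear scaling from published CUDA core counts.
# Ignores clock speed, memory bandwidth, and architectural changes.

cores_4090 = 16384
cores_5090 = 21760

expected_uplift = cores_5090 / cores_4090 - 1
print(f"naive linear expectation: +{expected_uplift:.1%}")  # ~+32.8%
```

With reviews clustering around +25-30%, the 5090 actually slightly undershoots pure core scaling, consistent with the power-limit discussion earlier in the thread.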

Dzov

Ars Legatus Legionis
14,704
Subscriptor++
For anyone curious, Der8auer has posted a great preliminary look at undervolting, seeing some pretty strong efficiency gains. Running the card at a 70-80% power envelope barely touches the performance in some titles. I'd just do that by default, honestly.
This is good to know. I remember people doing that with the 3090 to great effect.
 
Upvote
11 (11 / 0)

zealotpewpewpew

Wise, Aged Ars Veteran
152
I can also see the writing on the wall here. This is going to be the future. If you insist on a zillion fps, then AI can get you there; and Nvidia can provide it. And future titles are going to support it.

I think it's impressive tech.
Big disagree. It is the future, which all but guarantees future titles will take it for granted and require it to hit their frame-rate targets, which means the future of gaming is a laggy 15fps with a glorified motion-smoothing filter.

In the short term, you're right, but it blows the ceiling off of how poorly a game can be optimized and still ship, and "just barely good enough to ship" is when publishers ship their games.
 
Upvote
54 (64 / -10)

zealotpewpewpew

Wise, Aged Ars Veteran
152
I am thinking this card is less about rich gamers, and more about enterprise AI data centers... OpenAI, Google, Meta, X, Tesla, etc.

For them, money is no object and performance is king, right?

On that note, as these AI companies upgrade, are they offloading lots of good GPUs from a year or two ago, cheap, in the secondary market?
These companies use racks of liquid-cooled specialty GPUs at something like $80,000 apiece, six to a server, with the gobs of VRAM needed to run commercial models. They're too expensive for gamers even used, and the main reason NVIDIA constrains the VRAM on their gaming cards is to keep them from cannibalizing their commercial AI market.
 
Upvote
77 (77 / 0)

Kjella

Ars Tribunus Militum
1,988
SLI dying was more a result of developers not wanting to optimize games for it since it added more overhead, and while there were performance gains to be had, it often wasn't as effective as just buying a better single GPU.
Not to mention less stable. I had 2x GTX 970 in SLI and eventually just sold one, because even the games that supported it tended to crash more, and that was far more annoying than a poor frame rate, particularly when it ruined a multiplayer game. I only tried that one config, so YMMV, but I consider that purchase a mistake.
 
Upvote
32 (32 / 0)