Review: Nvidia’s GeForce RTX 5090 is the first GPU that can beat the RTX 4090

OptimusP83

Ars Praefectus
3,663
Subscriptor
The last Nvidia card I got was the 3080; the prices are just insane for the 4090 and now the 5090. I'm super happy with the AMD 7900 XTX I got a year ago.
I think the last Nvidia card I bought was an 8800 GT that I put into my Mac Pro to replace the super glitchy X1900 XT. I later replaced the 8800 GT with a Radeon 4870.

I had a GTX 970, which was great, but it was a loaner from my brother, so I never actually bought one, and I gave it back to him after a short while.

All this is to say, I'm just not a person who sees enough value in Nvidia's software features to pay such a hefty premium for their cards. AMD cards are not only competitive, but beat Nvidia's cards at a similar price point in everything but RT performance, and they've been closing that gap quickly as of late. RT is cool, and I'm not nearly as dismissive of it as I was 3-4 years ago, but it's sort of just not there yet: 3-4 years ago it was a great marketing feature, but in practice it offered minimal visual improvement or absolutely tanked performance. Nvidia's brazen disregard for gaming consumers when it comes to speccing their cards with adequate VRAM, and then doubling down on it by pointing to DLSS as the solution, really rubbed me the wrong way. I really dislike Nvidia as a company, and while that factored into my decision to buy a 7800 XT recently, the main reason was that Nvidia's cards offered horrendous value with kneecapped VRAM specs.
 
Upvote
-1 (2 / -3)

Voldenuit

Ars Tribunus Angusticlavius
6,550
I think the 5090 is going to be a failure in sales.

$500 more for 30% more performance?

This would have been great if it had kept the 4090's price; the 4090 is a two-year-old GPU, after all.

Nvidia can only do this because they have no competition, but I will skip them.
I think the 5090 is going to sell out, partly because there are users who have to have the very best, but also because I honestly don't think they are making a lot of them (for consumer use). 750 mm^2 dies are no joke (reticle limits for EUV are estimated to be 850 mm^2).

How well the 5080 and 5070 do will pretty much boil down to marketing. The 5080 offers only a negligible increase in transistor count over the 4080 Super, and performance estimates have not been stellar. Leaked benchmarks right now put it at 14% faster than the 4080 in 3DMark, 8% faster in Vulkan, 9% faster in OpenCL, and about 8-10% faster in Blender.

The 5070 should be maybe 30% faster than the 4070, but it's disingenuous for Nvidia to compare it to that card instead of the 4070 Super. The Super was 16% faster than the original 4070 (which many reviewers considered a disappointment at launch), which would make the 5070 only about 12% faster than a 4070 Super.
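
Rough math on that, assuming the 30% and 16% uplift figures above hold:

```python
# Compounding the quoted uplift estimates (assumptions, not benchmarks):
uplift_5070_vs_4070 = 1.30        # claimed/estimated 5070 vs original 4070
uplift_4070s_vs_4070 = 1.16       # 4070 Super vs original 4070

uplift_5070_vs_4070s = uplift_5070_vs_4070 / uplift_4070s_vs_4070
print(f"5070 vs 4070 Super: ~{(uplift_5070_vs_4070s - 1) * 100:.0f}% faster")
# -> about 12% faster, the figure given above
```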

Nvidia is focusing heavily on MFG framerates in their marketing and hype for the 5000 series, but MFG is not a solution for latency or image quality*. It's great for increasing motion clarity on high-refresh monitors (and in 2025, high refresh means 240 Hz or 480 Hz... not exactly commonplace setups), but it's a latency hit even with Reflex 2. Nvidia heavily marketing "fake frames" as "real frames" is targeted at low-information consumers who don't understand the nuances of the technology, and it will probably be successful, going by history. I've seen plenty of people claiming they will sell their 4080 to get a 5070 because "it will be as fast as a 4090".

Even outside of the people who buy into the marketing, Nvidia is the only game in town for upper-end cards. 7900 XTX performance is more inconsistent (sometimes faster, sometimes slower than a 4080) and it draws a lot more power, and the Radeon 9070 XT is nowhere to be seen before March. Radeon 7000-series cards also won't support FSR4, so you'd be buying something that AMD is already obsoleting. If you're sitting on a mid- to high-end 1000, 2000, or 3000 series card and looking to upgrade to the latest generation, Nvidia is the only option.

*EDIT: It's also worth mentioning that DLSS4 MFG is currently only available in whitelisted games. So even if your favorite game supports DLSS FG, there's no guarantee that Nvidia has whitelisted it (yet).
 
Last edited:
Upvote
3 (3 / 0)
Seriously. I have a 4K and a 1440p monitor, both 27", sitting next to each other at the same viewing distance of ~90 cm, so I get plenty of opportunities to compare how games look across them. Honestly, it's very difficult to justify the performance cost of 4K given how good 1440p with DLAA looks.

I did plan to potentially upgrade my RTX 3070 this generation, but that's largely because of how much of an issue 8 GB of VRAM is starting to be in some games, and also because getting at least 120 fps is a noticeably better experience. I am a bit curious, though, whether getting frame generation on the RTX 30-series will convince me to wait till 2026 or beyond...
Maybe at 27" (I don't really believe you) but there are much larger screen sizes where 4k is desirable
 
Upvote
1 (1 / 0)
For anyone curious, Der8auer has posted a great preliminary look at undervolting, seeing some pretty strong efficiency gains. Running the card at a 70-80% power envelope barely touches the performance in some titles. I'd just do that by default, honestly.

I have an AMD card (6950 XT), so my experience is probably different, but I just put it in quiet mode instead because I found the process of finding a decent undervolt infuriating. I gave up when it turned out that my undervolt, 100% stable in Really Heavy Games, was not stable through two hours in paint.net with hardware acceleration enabled; the driver crashed and I lost those two hours of work.
 
Upvote
1 (1 / 0)

Numfuddle

Ars Tribunus Militum
2,207
Subscriptor
On top of the $2K price of the card, don't forget to upgrade to at least a 1 kW PSU (and that might not be enough... so get a 1200-watt PSU, and hope you can find a 1500-watt soon).
The PSU configurators for most of the PSU brands recommend a 1000 W PSU for the 5080 and a 1200 W PSU for the 5090.
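
For context, a back-of-the-envelope power budget lands in the same range (just a sketch, assuming Nvidia's stated 575 W board power for the 5090 and ballpark figures for the rest of the system):

```python
# Rough PSU sizing sketch; every number here is a ballpark assumption.
gpu_tgp_w = 575          # Nvidia's stated board power for the 5090
cpu_w = 250              # high-end desktop CPU under load
rest_of_system_w = 100   # motherboard, RAM, drives, fans, peripherals

steady_state_w = gpu_tgp_w + cpu_w + rest_of_system_w   # ~925 W
# Leave ~30% headroom for transient spikes and the PSU's efficiency sweet spot.
recommended_psu_w = steady_state_w * 1.3
print(f"steady state ~{steady_state_w} W, recommend ~{recommended_psu_w:.0f} W PSU")
# -> ~925 W sustained, ~1200 W recommended, which lines up with the configurators
```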
 
Upvote
1 (1 / 0)

Hresna

Wise, Aged Ars Veteran
157
Subscriptor++
For anyone curious, Der8auer has posted a great preliminary look at undervolting, seeing some pretty strong efficiency gains. Running the card at a 70-80% power envelope barely touches the performance in some titles. I'd just do that by default, honestly.
I saw the video a few days ago. I was surprised that he only seems to try power-limiting the card, and not actually undervolting it by adjusting the voltage/frequency curve offset in any way. I wondered if that's because it's not enabled in the drivers yet. There would surely be some additional efficiency to be gained by capping power that way, as every previous Nvidia card has allowed.
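
For what it's worth, the back-of-envelope reason a real V/F-curve undervolt should beat a plain power limit is the usual P ≈ C·f·V² scaling (a sketch; the clock and voltage numbers below are made-up illustrative values, not measured 5090 figures):

```python
# Dynamic power scales roughly with frequency * voltage^2.
REF_FREQ_GHZ = 2.8   # assumed stock clock (illustrative only)
REF_VOLTS = 1.05     # assumed stock voltage (illustrative only)

def rel_power(freq_ghz, volts):
    return (freq_ghz / REF_FREQ_GHZ) * (volts / REF_VOLTS) ** 2

scenarios = {
    # A plain power limit mostly just drops clocks (voltage follows the stock curve).
    "power limit": (2.55, 1.00),
    # A V/F offset undervolt holds nearly stock clocks at a lower voltage.
    "undervolt":   (2.75, 0.95),
}

for name, (freq, volts) in scenarios.items():
    clocks = freq / REF_FREQ_GHZ
    print(f"{name}: ~{rel_power(freq, volts):.0%} of stock power "
          f"at ~{clocks:.0%} of stock clocks")
# Similar power draw, but the undervolt keeps far more of the stock clock speed.
```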
 
Upvote
3 (3 / 0)
Not really. If you have a 360 Hz monitor (I do) and you want to play games maxed out at more than 200 fps, this is the card (I won't buy one because I already spent big on last gen).

But it would be AMAZING to play current triple A games on a good monitor with this card.

Alas, the human eye cannot see beyond 30 fps.
Maybe consider changing your ophthalmologist.
 
Upvote
1 (1 / 0)
The price is still laughable, but cheaper than I expected. It's important to note that gamers are no longer Nvidia's primary customers.

The thing about Nvidia GPUs is that the design paradigm has shifted. Years ago they'd design a rendering-focused (gaming) card and then spin off business-focused models (Quadro), which were the same cards minus the fancy gamer styling, but with locked-in drivers (for apps they paid to require them) and a steep mark-up (it had to cover those app lock-in payments; I remember when Autodesk software would be very unstable if you didn't run their business card).

Now it's business/AI first, and then they find software ways for gamers to benefit from that hardware, starting with the RTX series.

Ray tracing (RT) cores are optimized for massive parallelism and high throughput (something LLMs/Transformers like), while Tensor cores are optimized for high throughput and low latency (but lower precision). These don't necessarily benefit gaming compared to the general-compute (CUDA) cores, so they found a way to sell them via their software APIs and post-processing tricks.

This is also why Blackwell seems underwhelming compared to Ada on the gaming side: the upgrades were made for the ML side (high bandwidth, lots of potential VRAM capacity).

Hence why they want so much for their cards and really don't care much about gamers: why sell a chip run at $800 each when they can get $5K apiece from the business side?
 
Upvote
6 (6 / 0)
Techspot measured 698W at the PCIe slot and 12VHPWR connector in Cyberpunk 2077 at 4K.
That was 698W for the GPU and CPU (the graphs are labeled PCIe + EPS, the CPU power connector). Article also mentions "Note that this data combines both CPU and GPU power usage, as GeForce GPUs are known to increase CPU load in certain scenarios, which can reduce GPU load and, in turn, lower power consumption."

Even so, the worst CPU+GPU totals are still almost within spec for just the GPU connectors. 12VHPWR is good for 600W, and you can draw 75W from the PCIe slot, so what's 23W between friends? NVIDIA should really just slap a CPU socket on there and give us access to 32GB of unified system memory.
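
For anyone following along, the connector-budget arithmetic is just this (using the figures quoted above):

```python
# Spec budget for the GPU's own power inputs vs the measured CPU+GPU number.
connector_12vhpwr_w = 600       # 12VHPWR connector rating
pcie_slot_w = 75                # PCIe slot budget
spec_budget_w = connector_12vhpwr_w + pcie_slot_w    # 675 W

measured_cpu_plus_gpu_w = 698   # Techspot's Cyberpunk 2077 4K figure quoted above
print(f"over the GPU-only budget by {measured_cpu_plus_gpu_w - spec_budget_w} W")  # -> 23 W
```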
 
Upvote
2 (2 / 0)

Voldenuit

Ars Tribunus Angusticlavius
6,550
That was 698W for the GPU and CPU (the graphs are labeled PCIe + EPS, the CPU power connector). Article also mentions "Note that this data combines both CPU and GPU power usage, as GeForce GPUs are known to increase CPU load in certain scenarios, which can reduce GPU load and, in turn, lower power consumption."

Even so, the worst CPU+GPU totals are still almost within spec for just the GPU connectors. 12VHPWR is good for 600W, and you can draw 75W from the PCIe slot, so what's 23W between friends? NVIDIA should really just slap a CPU socket on there and give us access to 32GB of unified system memory.
Ah, thanks for the catch. I've been so used to GN measuring card power with shunts that I just assumed that anything not labeled 'Total System Power' in a GPU review would be GPU-only.

igorslab does separate out GPU vs System power, and the GPU itself doesn't exceed 520W sustained power in their tests.
 
Upvote
2 (2 / 0)

chalex

Ars Legatus Legionis
11,586
Subscriptor++
Ah, thanks for the catch. I've been so used to GN measuring card power with shunts that I just assumed that anything not labeled 'Total System Power' in a GPU review would be GPU-only.

igorslab does separate out GPU vs System power, and the GPU itself doesn't exceed 520W sustained power in their tests.
So basically we are in the same place we've been since the 3090 Ti, which was also just about 520W max. Nvidia officially released the GeForce RTX 3090 Ti on March 29, 2022.
 
Upvote
1 (1 / 0)
So basically we are in the same place we've been since the 3090 Ti, which was also just about 520W max. Nvidia officially released the GeForce RTX 3090 Ti on March 29, 2022.
I haven't checked the 3090 Ti comparison specifically, but the general consensus has been ~30% more power for ~30% more performance in RT and rasterization when technologies like FG or MFG are not enabled.

With MFG enabled, Nvidia can claim a substantial efficiency boost over the 4000 Series.
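
In perf-per-watt terms that consensus works out to roughly no generational gain (a sketch using the ~30%/~30% figures above, not measured data):

```python
# If performance and power both rise ~30%, perf/W is flat versus the 4090.
perf_ratio = 1.30     # 5090 vs 4090 (RT/raster, no FG or MFG)
power_ratio = 1.30

print(f"perf/W vs 4090: {perf_ratio / power_ratio:.2f}x")   # -> 1.00x, i.e. no gain
```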
 
Upvote
1 (1 / 0)

Cryxx

Wise, Aged Ars Veteran
195
Put this in a room, play for 2 hours, and it'll raise the room temp a good 5-10°F.

Space heaters start around 750 watts. Add in the CPU heat and, yeah, you have something kicking the temps up, which over time will raise the card's operating temp because the room is warmer.

If you're going to draw more power, I'd hope you get more performance.


I'd love to see the same power draw but more performance, but we won't.
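
Rough numbers on the space-heater comparison (a sketch with a made-up room size, ignoring the heat that escapes through walls, windows, and ventilation, which in practice dominates):

```python
# Back-of-envelope: how much heat does a ~750 W gaming session dump into a room?
power_w = 750                 # GPU + CPU + rest, roughly a small space heater
hours = 2
energy_j = power_w * hours * 3600          # ~5.4 MJ over the session

# Air in a ~4 m x 4 m x 2.5 m room (illustrative size).
room_volume_m3 = 4 * 4 * 2.5
air_mass_kg = room_volume_m3 * 1.2         # air density ~1.2 kg/m^3
air_heat_capacity = 1005                   # specific heat of air, J/(kg*K)

# Temperature rise if NONE of the heat escaped (in reality a lot does).
delta_k = energy_j / (air_mass_kg * air_heat_capacity)
print(f"{energy_j/1e6:.1f} MJ dumped; {delta_k:.0f} K rise with zero losses")
# The absurd no-loss number is why the real-world rise of a few degrees F
# depends almost entirely on how leaky or insulated the room is.
```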
 
Upvote
1 (1 / 0)

Distraction

Ars Centurion
226
Subscriptor
The fact that they're pushing AI-interpolated frames as the killer feature on a $2K GPU is ridiculous to me. The primary reason for a higher framerate is increased 'smoothness.' Throw a bunch of fake frames in and you increase input latency, which undercuts the responsiveness that's supposed to come with that smoothness. The claim is that at a high enough framerate, the extra frames are worth the additional latency, but if you're unable to feel that latency, then I'd expect you're past the point of diminishing returns on higher framerates anyway.
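
For a sense of scale, interpolation has to buffer the newest rendered frame before it can blend toward it, so the added delay is on the order of one base-frame interval (a simplified sketch; real numbers vary with the game, Reflex, and the render pipeline):

```python
# Interpolated frame generation holds back the newest rendered frame so it can
# blend between it and the previous one, so the extra input-to-photon delay is
# roughly one base-frame interval (plus generation overhead, ignored here).
def added_latency_ms(base_fps):
    return 1000.0 / base_fps

for fps in (30, 60, 120):
    print(f"{fps} fps base: ~{added_latency_ms(fps):.1f} ms of added latency")
# ~33 ms at 30 fps, ~17 ms at 60 fps, ~8 ms at 120 fps: only tolerable when the
# base framerate is already high, which is the point about diminishing returns.
```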

As far as VR goes, isn't latency one of the biggest contributors to motion sickness? So, wouldn't using this make things worse?

This seems aimed at impressing people with a larger FPS number, while ignoring the actual benefits that are supposed to come along with that.
 
Upvote
2 (2 / 0)

Flipper35

Ars Tribunus Militum
2,528
Heh. Wondering now: DLSS and frame generation add artifacts; would those be worse if you used this thing for VR? I remember VR had trouble pushing the needed number of frames at the right resolution even with the biggest, baddest cards running in tandem.

Disclaimer: I don't even own any VR-compatible games; everything's secondhand with me.
Started VR with an Index and a 1660 Super. Most games were playable, but FS2020 was not. A 3060 Ti runs everything without issue now, after upgrading when it came out. Most of what I do is flight sims, party games like Keep Talking, and first-person or puzzle games like I Expect You to Die. FS2020 is by far the hardest on the system. Phasmophobia is much better in VR!
 
Upvote
0 (0 / 0)
And now we get these effects from DLSS and Frame Generation. Progress!
Have you played some modern games without DLSS? Like Indiana Jones? NPC faces are blurry, DLSS or no DLSS. It's just that modern engines do everything in screen space, so proper antialiasing (MSAA) isn't really an option, leaving us with the mess that we have.
 
Upvote
1 (1 / 0)

Voldenuit

Ars Tribunus Angusticlavius
6,550
I'm glad the economy sucks so bad that $2k graphics cards are an option

A recent comment from a techtuber stuck in my head. He said, "GPUs are so bad right now, Intel has caught up."

OK, that's a harsh statement, and it glosses over a lot of the technical achievements and advancements in recent years, but it certainly captures the sentiment of a lot of people that the value-add of graphics cards isn't what it used to be.
 
Upvote
0 (0 / 0)

bernstein

Ars Scholae Palatinae
645
I don't feel that Valve has that much interest in keeping games off of Steam due to performance. Many games already have generic system requirements or jokes like "CPU: le potato".

The only requirements for publishing on Steam these days are 1.) Do you have $100? 2.) Do you have an EXE? 3.) Is it a virus?
True. But they do make it a requirement to very prominently point out how shit your game is if it requires a third-party account, third-party DRM, or kernel-level anti-cheat, and to a lesser extent if you disallow Family Sharing or Remote Play Together, or aren't Steam Deck Verified/Playable.

Personally, if you require/do any of these, I will not buy (I don't really care about kernel-level anti-cheat, but it has to be Steam Deck Playable/Verified).
 
Upvote
2 (2 / 0)