Even setting aside Frame Generation, this is a seriously fast, power-hungry GPU.
> The last Nvidia I got was the 3080, the prices are just insane for the 4090 and now 5090. I'm super happy with my AMD 7900XTX I got a year ago.

I think the last Nvidia card I bought was an 8800 GT that I put into my Mac Pro to replace the super glitchy X1900. I later replaced the 8800 GT with a Radeon 4870.
> I think the 5090 is going to be a failure in sales. $500 more for 30% more performance? This would have been great if it had retained the 4090's price; that's a two-year-old GPU. Nvidia can only do this because they have no competition, but I will skip them.

I think the 5090 is going to sell out, partly because there are users who have to have the very best, but also because I honestly don't think they are making a lot of them (for consumer use). 750 mm^2 dies are no joke (reticle limits for EUV are estimated to be 850 mm^2).
Maybe at 27" (I don't really believe you) but there are much larger screen sizes where 4k is desirableSeriously. I have a 4k and a 1440p monitor, both 27", sitting next to each other at the same viewing distance of ~90cm, so I get plenty of opportunities to compare how games look across them. Honestly, it's very difficult to justify the performance cost of 4k given how good 1440p with DLAA looks.
I did plan to potentially upgrade my RTX 3070 this generation, but it's largely because of how much of an issue 8GB of VRAM is starting to be in some games and also because getting at least 120fps is a noticeably better experience. I am a bit curious though whether getting frame generation on the RTX 3-series will convince me to wait till 2026 or beyond...
For anyone curious, Der8auer has posted a great preliminary look at undervolting, seeing some pretty strong efficiency gains. Running the card at a 70-80% power envelope barely touches the performance in some titles. I'd just do that by default, honestly.
> On top of the $2K price of the card, don't forget to upgrade to at least a 1 kW PSU (and that might not be enough . . . so get a 1200-watt PSU, and hope you can find a 1500-watt soon).

The PSU configurators for most of the PSU brands recommend a 1000 W PSU for the 5080 and a 1200 W PSU for the 5090.
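To put rough numbers on why the configurators land at 1000-1200 W, here's a minimal sizing sketch. The 575 W figure is Nvidia's rated board power for the 5090; the CPU draw, rest-of-system draw, and headroom multiplier are illustrative assumptions, not measurements.

    # Back-of-the-envelope PSU sizing for a 5090 build. The 575 W board power is
    # Nvidia's rated figure for the RTX 5090; the CPU draw, rest-of-system draw,
    # and headroom multiplier are illustrative assumptions, not measurements.
    gpu_w  = 575    # RTX 5090 rated board power
    cpu_w  = 250    # assumed high-end desktop CPU under load
    rest_w = 75     # assumed fans, drives, RAM, motherboard
    headroom = 1.4  # assumed margin for transients and PSU efficiency sweet spot

    recommended = (gpu_w + cpu_w + rest_w) * headroom
    print(f"Suggested PSU: ~{recommended:.0f} W")   # ~1260 W, near the 1200 W recommendation

With gentler assumptions the total drifts back toward 1000 W, which is roughly where the 5080 recommendation sits.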
> What the hell is the point of a card that's 30% faster while costing 30% more and using 30% more power?

31% more profit for shareholders...
> For anyone curious, Der8auer has posted a great preliminary look at undervolting, seeing some pretty strong efficiency gains. Running the card at a 70-80% power envelope barely touches the performance in some titles. I'd just do that by default, honestly.

I saw the video a few days ago. I was surprised that he only seems to try power-limiting the card, not actually undervolting it by adjusting the voltage/frequency curve offset in any way. I wondered if it's because that isn't enabled in the drivers yet. There would surely be some additional efficiency to be gained on top of a plain power cap with that method, as all previous Nvidia cards have supported it.
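For what it's worth, a plain power cap (as opposed to a V/F-curve undervolt, which generally needs vendor tools like MSI Afterburner) can be scripted with nvidia-smi. A minimal sketch in Python, assuming device index 0 and the ~80% envelope discussed above; setting the limit needs admin/root:

    # Minimal sketch: cap the card's power limit from Python by shelling out to
    # nvidia-smi. This is a plain power cap, not a V/F-curve undervolt. Device
    # index 0 and the 80% target are assumptions; setting the limit needs admin.
    import subprocess

    def power_limits(device: int = 0) -> tuple[float, float]:
        """Return (current_limit_w, default_limit_w) as reported by nvidia-smi."""
        out = subprocess.run(
            ["nvidia-smi", "-i", str(device),
             "--query-gpu=power.limit,power.default_limit",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout
        current, default = (float(v) for v in out.strip().split(","))
        return current, default

    def set_power_limit(watts: float, device: int = 0) -> None:
        """Apply a new power limit in watts (requires admin/root)."""
        subprocess.run(["nvidia-smi", "-i", str(device), "-pl", str(int(watts))],
                       check=True)

    if __name__ == "__main__":
        current, default = power_limits()
        target = default * 0.8            # the ~70-80% envelope discussed above
        print(f"Capping GPU 0: {current:.0f} W -> {target:.0f} W")
        set_power_limit(target)

The query fields and -pl flag are standard nvidia-smi options; the actual efficiency gains will vary per title, as Der8auer's numbers show.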
Excellent suggestion. I can think of no downside to helping destroy the planet I live on.
> Not really. If you have a 360Hz monitor (I have) and you want to play games maxed out at more than 200 FPS, this is the card (I won't buy one because I already spent big on last gen). But it would be AMAZING to play current triple-A games on a good monitor with this card.

Maybe consider changing your ophthalmologist.
Alas, the human eye cannot see beyond 30 fps.
> Techspot measured 698W at the PCIE slot and 12VHPWR connector in Cyberpunk 2077 at 4K.

That was 698W for the GPU and CPU (the graphs are labeled PCIe + EPS, the CPU power connector). The article also mentions: "Note that this data combines both CPU and GPU power usage, as GeForce GPUs are known to increase CPU load in certain scenarios, which can reduce GPU load and, in turn, lower power consumption."
"Alas, the human eye cannot see beyond 30fps" was, without a doubt, sarcasm, lol. "Owns a 360Hz monitor" was the biggest tip-off in that post.Maybe consider changing your ophthalmologist.
> That was 698W for the GPU and CPU (the graphs are labeled PCIe + EPS, the CPU power connector). The article also mentions: "Note that this data combines both CPU and GPU power usage, as GeForce GPUs are known to increase CPU load in certain scenarios, which can reduce GPU load and, in turn, lower power consumption."

Ah, thanks for the catch; I've been so used to GN measuring card power with shunts that I just assumed that anything not labeled 'Total System Power' in a GPU review would be GPU-only.
Even so, the worst CPU+GPU totals are still almost within spec for just the GPU connectors. 12VHPWR is good for 600W, and you can draw 75W from the PCIe slot, so what's 23W between friends? NVIDIA should really just slap a CPU socket on there and give us access to 32GB of unified system memory.
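The arithmetic behind "what's 23W between friends", using the spec limits mentioned above and Techspot's combined CPU+GPU reading:

    # The arithmetic behind "what's 23W between friends": the GPU's own
    # connector budget vs. Techspot's combined CPU+GPU reading quoted above.
    slot_w     = 75    # PCIe x16 slot limit
    hpwr_w     = 600   # 12VHPWR connector limit
    measured_w = 698   # combined CPU+GPU draw in Cyberpunk 2077 at 4K

    budget = slot_w + hpwr_w
    print(f"Connector budget: {budget} W")         # 675 W
    print(f"Measured (CPU+GPU): {measured_w} W")   # 698 W
    print(f"Overshoot: {measured_w - budget} W")   # 23 W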
> Ah, thanks for the catch; I've been so used to GN measuring card power with shunts that I just assumed that anything not labeled 'Total System Power' in a GPU review would be GPU-only.

So basically we are in the same place we have been since the 3090 Ti, which also topped out at about 520W. Nvidia officially released the GeForce RTX 3090 Ti on March 29, 2022.
igorslab does separate out GPU vs System power, and the GPU itself doesn't exceed 520W sustained power in their tests.
> So basically we are in the same place we have been since the 3090 Ti, which also topped out at about 520W. Nvidia officially released the GeForce RTX 3090 Ti on March 29, 2022.

I haven't checked the 3090 Ti comparison specifically, but the general consensus has been ~30% more power for ~30% more performance in RT and rasterization when technologies like FG or MFG are not enabled.
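A tiny sanity check of that consensus: if performance and power both rise by the same ~30%, performance per watt is unchanged. The baseline numbers below are illustrative units, not benchmarks.

    # If performance and power both rise ~30%, performance per watt is flat.
    # The baseline values are arbitrary illustrative units, not benchmarks.
    base_perf, base_power = 100.0, 450.0
    new_perf, new_power   = base_perf * 1.30, base_power * 1.30

    print(f"Baseline perf/W: {base_perf / base_power:.3f}")
    print(f"New perf/W:      {new_perf / new_power:.3f}")   # same ratio: no efficiency gain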
> Heh. Wondering now - DLSS and frame generation add artifacts; would those be worse if you used this thing for VR? I remember VR had trouble pushing the needed amount of frames at the right res even with the biggest, baddest cards running in tandem. Disclaimer: I don't even own games that are VR-compatible, everything's secondhand with me.

Started VR with an Index and a 1660 Super. Most games were playable, but FS2020 was not. A 3060 Ti runs everything without issue now, after I upgraded when it came out. Most of what I do is flight sims, party games like Keep Talking, and first-person or puzzle games like I Expect You to Die. FS2020 is by far the hardest on the system. Phasmophobia is much better in VR!
> once reviewers started benchmarking more than just average FPS, it became apparent that SLI's gains came at the cost of latency and consistency.

And now we get these effects from DLSS and Frame Generation. Progress!
> And now we get these effects from DLSS and Frame Generation. Progress!

Have you played some modern games without DLSS? Like Indiana Jones? NPC faces are blurry, DLSS or no DLSS. It's just that modern engines do everything in screen space, so proper antialiasing (MSAA) is not really an option, leaving us with what we have.
I'm glad the economy sucks so bad that $2k graphics cards are an option
> I don't feel that Valve has that much interest in keeping games off of Steam due to performance. Many games already have generic system requirements or jokes like "CPU: le potato". The only requirements for publishing on Steam these days are 1) Do you have $100? 2) Do you have an EXE? 3) Is it a virus?

True. But they do make it a requirement to very prominently point out how bad your game is if it requires a third-party account, third-party DRM, or kernel-level anti-cheat, and to a lesser extent if you disallow Family Sharing or Remote Play Together, or aren't Steam Deck Verified/Playable.
I have an early 24" 4K monitor. I keep it around because it was professionally color calibrated and because it's easy to move. Such panels definitely exist.Maybe at 27" (I don't really believe you) but there are much larger screen sizes where 4k is desirable