Even setting aside Frame Generation, this is a seriously fast, power-hungry GPU.
See full article...
The armchair hardware engineers who thought this would be a small bump over 4090 performance aren't looking so good right now. That's a pretty impressive card all around.
That said, no one is going to be paying $2000 for it. I really wish we would stop using the Founders Edition price as the baseline for costs, because those are generally impossible to find and none of the board partners sell for that price. I'd be surprised if you can actually find one for less than $2200.
Seriously. Talk about damning with faint praise... In case the A/B headline testing changes things, we're both reacting to this headline:
"Nvidia’s GeForce RTX 5090 is the first GPU that can beat the RTX 4090"
You are correct. No one is going to pay $2000 for it. You can be assured the scalpers will buy them all up on day one and charge $2500-$3000 for them, and people will buy them.
"Can it be used with the heat pump in the house?"

Heh, my workstation 4090 + w3495x can heat the room by several degrees when I ask it to run an erosion simulation. It's an underappreciated problem. A machine that's pulling 800-1100 watts for hours at a time is basically a space heater. These machines are at the point where, for a high-end build, you may need to talk with your HVAC contractor if keeping the temperature comfortable is a requirement.

The other problem is the UPS. My current 1500 VA UPS occasionally beeps because the attached computer goes over its rated capacity. With a 5090, that's going to be a regular occurrence. Consumer-grade UPSes are not ready for this.

"Heh, my workstation 4090 + w3495x can heat the room by several degrees when I ask it to run an erosion simulation …"

Back in the early oughts, we had LAN parties late at night since the machines produced so much heat back then and we lived in the desert. It would be hard on home circuits with multiples of these.
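To put rough numbers on the space-heater comparison, here is a back-of-the-envelope sketch. The wattages come from the comment above, the watts-to-BTU conversion is standard, and nothing here is a measurement from the article:

```python
# Essentially all of a PC's electrical draw ends up as heat in the room,
# so a sustained load converts directly into heating output.
WATTS_TO_BTU_PER_HR = 3.412  # standard conversion factor

def heat_output_btu_per_hr(watts: float) -> float:
    """Approximate heat dumped into the room by a machine drawing `watts` continuously."""
    return watts * WATTS_TO_BTU_PER_HR

for load in (575, 800, 1100):  # 5090 TGP alone, then the whole-system figures mentioned above
    print(f"{load:>5} W sustained ≈ {heat_output_btu_per_hr(load):,.0f} BTU/h")
# ~1100 W works out to roughly three-quarters of a typical 1500 W plug-in space heater
# (≈5,100 BTU/h), which is why it starts to matter for HVAC sizing in a small office.
```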
"Intel denied this at CES."

That’s good to hear. I hope they work their way up the performance ladder.
"It will clearly be an inferior graphics card if it is only taking up 2 PCI slots. If it was better, it would be at least 3 spaces or even 4 to be top-of-the-line. And don't forget the dedicated power supply that has its own breaker switch in my home. /s"

This video about the cooling system of the 5090 shows they had a 4-slot prototype:
"It's kind of crazy how much larger this GPU is than the 4090 (in core count, area, and power budget) compared to its uplift …"

That's my understanding from what I have read. They seem hell-bent on sticking with a monolithic chip architecture; they're gonna ride that horse all the way into the ground. And I can't help but notice: a 30% performance increase for 35% more power? Yeah, that's definitely a scalability issue.
"That's my understanding from what I have read. They seem hell-bent on sticking with a monolithic chip architecture …"

Chiplet architectures are much harder for GPUs than CPUs. AMD got an efficiency improvement from RDNA3 versus RDNA2, but not an enormous one.
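For what it's worth, the "30% more performance for 35% more power" complaint works out to a small efficiency regression rather than a gain. A quick sanity check using the two figures quoted above (claimed, not measured):

```python
# Relative performance-per-watt implied by the figures quoted above.
perf_gain = 1.30   # claimed performance vs the 4090
power_gain = 1.35  # claimed power draw vs the 4090

print(f"perf/W vs 4090: {perf_gain / power_gain:.2f}x")  # ≈ 0.96x, i.e. slightly worse efficiency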
Heh. Wondering now - DLSS and frame generation add artifacts; would those be worse if you used this thing for VR? I remember VR had trouble pushing the needed number of frames at the right resolution even with the biggest, baddest cards running in tandem.

Disclaimer: I don't even own games that are VR-compatible; everything is secondhand with me.
"TGP is apparently not 575W but 725-750W in 'some' reviews online? What are these reviewers doing wrong or not doing correctly? (This is with the Founders Edition, not AIB, so one would assume it should be locked.) Why would Nvidia publish a TBP/TGP of 575W when it's exceeding the limitations of their connector?"

Sounds more like either total system power or the reviewer messed up bad.
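One likely source of confusion is that "power" can mean three different things in a review: card-only draw, whole-system DC draw, or AC draw at the wall. A quick illustrative sketch; the CPU/platform and PSU-efficiency numbers are assumptions, not measurements:

```python
gpu_power = 575        # card-only draw at the slot + 16-pin connector (rated TGP)
cpu_and_rest = 180     # CPU, board, RAM, storage, fans under a gaming load (assumed)
psu_efficiency = 0.92  # rough Gold-class PSU efficiency at this load (assumed)

system_dc = gpu_power + cpu_and_rest   # what the PSU delivers
wall_ac = system_dc / psu_efficiency   # what a wall meter would report
print(f"card only: {gpu_power} W | system (DC): {system_dc} W | at the wall: {wall_ac:.0f} W")
# A ~700-750 W headline number is easy to reach once anything beyond the card itself is included.
```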
"Another disappointing entry in what has become a stagnant industry. Performance increases from around 2012 to 2018 were massive by comparison to what we've seen from 2018 through 2024. A 30% performance increase sounds massive too, but it's still a relatively gradual improvement within a plateau of performance increases at this point."

And the performance increases of 2000-2006 were proportionally much larger than those from 2012-2018. Remember when Nvidia had an ironclad rule about a new architecture every year, with a six-month kicker refresh in-between?
"At this point I want an S-shaped case to stick this thing out the side, or in its own little wind chamber."

I wish I could do that... the CF-reinforced plastic prints aren't as good as the stamped steel cases...
"Heh, my workstation 4090 + w3495x can heat the room by several degrees … The other problem is the UPS …"

I've had that a few times myself with a 4090 and 13900K. I'm seriously considering a second UPS for just the computer, and putting the monitor and assorted peripherals on the first one only.
"I've had that a few times myself with a 4090 and 13900K. I'm seriously considering a second UPS for just the computer …"

That definitely helps. I did that a while back and it mostly solved it for me. It should completely fix your issue, given the lower TDP of the 13900K.
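For anyone sizing a UPS around one of these builds: the watt rating, not the VA rating, is usually what trips first. A rough sketch with assumed numbers; the 900 W figure is typical for a 1500 VA consumer unit, but check your specific model:

```python
UPS_VA = 1500
UPS_WATTS = 900     # assumed watt rating for a 1500 VA consumer UPS; varies by model

# Hypothetical worst-case draw for a 5090-class build (illustrative, not measured)
gpu_w = 575         # RTX 5090 TGP
cpu_w = 250         # high-end desktop CPU under load
platform_w = 100    # board, RAM, storage, fans, PSU losses
monitor_w = 60

total = gpu_w + cpu_w + platform_w + monitor_w
print(f"estimated load: {total} W vs UPS limit: {UPS_WATTS} W ({total / UPS_WATTS:.0%})")
# ~985 W against a 900 W limit -- over capacity even before transient spikes,
# which is why splitting the monitor and peripherals onto a second unit helps.
```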
"I wonder what the underlying mechanism is? I can think of 3 things (possibly multiple/all contribute):
1) the new card puts off more heat in the computer, making the overall air warmer so it's harder for the heatsink to cool the CPU
2) the new card restricts airflow by physically blocking the space where the air would normally be flowing into the case
3) the faster card leads to higher CPU utilization as the whole system is able to run faster, leading the CPU to generate more heat because it's doing more work"

My guess is a mix of all but perhaps mostly 1. If you use an AIO, using a pull setup rather than push may make more sense with a 5090FE in the case. That way the CPU will get fresh cool air rather than air that has been mixed with the 5090FE's exhaust.
"Somewhat related question - does anybody know if there are any good resources out there rating case airflow/cooling? While I'm too cheap to ever see myself with a 5090, I'm in the market for a desktop after my old one bit the dust, and I'd like to optimize cooling as much as possible. I feel like it's always useful to keep your components as cool as possible, but there are so many options out there."

Not really, but I would look at the number and size of mounts on the front, back, and top for fans. That is really what is going to be the deciding factor.
"What the hell is the point of a card that's 30% faster while costing 30% more and using 30% more power?"

Keeping Nvidia stock price up?
It's kind of crazy how much larger this GPU is than the 4090 (in core count, area, and power budget) compared to its uplift. They must be hitting some sort of scalability wall in their design.
"As I'm actually in the market for a new build right now (quite possibly my “last one” as I will hopefully retire in a couple of years), this was a fun comment thread (no offense, Ars, but for the details I'll go to GN). I may actually “have to” go for a 5090 over a 5080, because it's AI that I'm now diving into nolens volens, grumbling, kicking and screaming. Gotta do what pays the bills - and also what makes the whole thing tax deductible where I live. My 3090 is starting to creak (no OC)."

Don't try to do it locally. Just use your 3090 for development.
"Sounds more like either total system power or the reviewer messed up bad."

Techspot measured 698W at the PCIe slot and 12VHPWR connector in Cyberpunk 2077 at 4K.
Anecdote: I have a VR display (Samsung Galaxy), but I hate using it, so I'm out of the VR loop by a couple years.
However, it is generally considered that nvidia cards have the best frame reprojection for VR displays, which syncs camera movements to your head movements regardless of whether the update happens during a frame refresh or not.
I expect the 5000 series to be at least as good as previous gens, and I also expect all the DLSS4 tech (including FG) to play nicely with their existing infrastructure.
"It will clearly be an inferior graphics card if it is only taking up 2 PCI slots …"

Why stop there? Blower-design cooling can make a comeback, but this time using traditional 120mm fans aimed out the back of a 7-slot graphics card. Especially since the actual PCB area isn't much longer than 120mm, and the height of the PCI slots is just above 120mm. Just imagine the beauty of TWO gigantic heatpipe blocks inside your case. Two cubed is four times as cool! That's reason enough to skip water-cooling, right there.
"It is a great piece of engineering. However, I have two issues that I can't shake from my mind. The first is the power-to-performance ratio: in order to get 30% extra performance it uses 30% extra power. Does this mean we have already hit the limits and we are just trying to build around them (frame generation technology seems to indicate this)? My other concern, and it is related to the first one, is that if the amount of power drawn in the previous gen was already causing serious problems (read: melted connectors), is there a way to avoid it now? The tilted plug helps with cable management, but does it help with the temps themselves?"

IIRC, it was cable connectivity issues that caused the melting, not just the amount of power drawn.
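That matches the basic math on the 16-pin connector: the load is shared across six 12 V pins, so a pin that isn't making good contact pushes the others past their rating. A rough sketch; per-pin ratings vary by connector vendor, and the roughly 9-10 A figure used here is an assumption:

```python
def amps_per_pin(total_watts: float, conducting_pins: int, volts: float = 12.0) -> float:
    """Current each 12 V pin carries if the load is shared evenly across `conducting_pins`."""
    return total_watts / volts / conducting_pins

for pins in (6, 5, 4):
    print(f"575 W over {pins} good pins: {amps_per_pin(575, pins):.1f} A per pin")
# 6 pins: ~8.0 A, 5 pins: ~9.6 A, 4 pins: ~12.0 A -- with every pin seated there is
# some margin; lose solid contact on one or two pins and the rest run hot.
```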
"It's faster. People used to pay a huge premium for even less additional speed. If you don't need it, don't buy it. I'm looking forward to the 5080 series myself."

Reviews for the 5080 are going to be interesting. The 5090 seems like it might “get away” with 4x frame generation without incurring unreasonable lag because it also has enough grunt to achieve a decent “real” frame rate. Meanwhile, the 5080 is a lot less powerful in terms of raster and ray tracing; frame gen might feel the same playability-wise, or it might feel a lot worse.
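A quick way to see why the base frame rate matters so much for how multi-frame generation feels. The numbers are illustrative only, and frame generation's own added latency is ignored here:

```python
def frame_gen_feel(base_fps: float, gen_factor: int = 4):
    """Displayed smoothness scales with gen_factor, but input is only sampled per rendered frame."""
    input_cadence_ms = 1000 / base_fps          # governs responsiveness
    displayed_fps = base_fps * gen_factor       # governs visual smoothness
    return input_cadence_ms, displayed_fps

for base in (80, 30):  # e.g. a 5090-class base rate vs a much weaker card
    cadence, shown = frame_gen_feel(base)
    print(f"base {base} fps -> displays {shown:.0f} fps, input cadence still ~{cadence:.1f} ms")
```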
Seriously. I have a 4k and a 1440p monitor, both 27", sitting next to each other at the same viewing distance of ~90cm, so I get plenty of opportunities to compare how games look across them. Honestly, it's very difficult to justify the performance cost of 4k given how good 1440p with DLAA looks.
I did plan to potentially upgrade my RTX 3070 this generation, but it's largely because of how much of an issue 8GB of VRAM is starting to be in some games and also because getting at least 120fps is a noticeably better experience. I am a bit curious though whether getting frame generation on the RTX 3-series will convince me to wait till 2026 or beyond...
"My problem is I have a 4K display for work and all resolutions except HD and 4K look blurry. Is there a 4K display (or higher) that looks sharp at 1440p? Don't have space for 2 displays either."

That's because 4K is a perfect multiple of 1080p. This allows for integer scaling.
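The scale factors make the difference obvious: 1080p divides evenly into a 4K panel, 1440p does not, so the latter has to be interpolated. A tiny check, with nothing display-specific assumed:

```python
panel_w, panel_h = 3840, 2160  # 4K panel

for w, h in [(1920, 1080), (2560, 1440)]:
    scale = panel_w / w  # same as panel_h / h for these 16:9 modes
    if scale.is_integer():
        verdict = f"integer {int(scale)}x scaling: each source pixel maps to a clean {int(scale)}x{int(scale)} block"
    else:
        verdict = f"fractional {scale}x scaling: pixels must be interpolated, so edges look soft"
    print(f"{w}x{h} on {panel_w}x{panel_h}: {verdict}")
```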
What the hell is the point of a card that's 30% faster while costing 30% more and using 30% more power?
"I can get a space heater just as good for under $50."

Buy the card instead and let it mine Bitcoin while it is heating the room. It may pay itself back eventually.
"It's on the same process. Blame TSMC."

Actually, blame Nvidia for not paying for N3 like Apple did.
"great piece of engineering"

Can't say I agree that slapping a giant fan and heatsink on top to cram in as many transistors as possible is great engineering.