The Nvidia GeForce RTX 5090 price will reportedly see a massive 25% increase over the RTX 4090, per a new leak posted on X. The leak also confirms the previously reported 32 GB of VRAM as well as the release date.
https://www.notebookcheck.net/Eye-watering-RTX-5090-price-leaks-alongside-possible-January-release-date.909797.0.html
Just remember, Nvidia: payback's a bitch...
RTX 5090...that's darling!
I'm gonna hang onto my RTX 4090 for a minute.
5090 isn't the huge leap forward that the 4090 was.
Maybe the 6090 (assuming Nvidia is still in business) will be something special.
This class of video processor is much like the Laser in 1969!
A technological solution in search of a problem it can solve.
Hydrate before coffee, or not.
Nvidia is not going to sell a card like the 5090 cheap when buyers won't need to upgrade it for many, many years.
I'm surprised the 4090 was as cheap as it was. That card is going to be good for many, many years.
Quote from: Greg on October 28, 2024, 16:31:28
RTX 5090...that's darling!
I'm gonna hang onto my RTX 4090 for a minute.
5090 isn't the huge leap forward that the 4090 was.
Maybe the 6090 (assuming Nvidia is still in business) will be something special.
This class of video processor is much like the Laser in 1969!
A technological solution in search of a problem it can solve.
Hydrate before coffee, or not.
3rd Leg Greg, I concur with you. I was just talking about how the 5090 isn't nearly as big a leap as the 3000 series to the 4000... I'll still grab one... I already have uses for my old 4090 lined up...
150 W more power and 25% more money... that's a pass for me. I'll see what the 6090 or RDNA5 brings.
The xx90 models have always been a halo product, never designed for anyone outside those with deep pockets or the need for a Titan-class GPU, just not at Titan prices. With the 5090, it looks like Nvidia is OK with seeing how far it can stress-test users on price. No doubt the lemmings with said deep pockets will follow Nvidia right off the cliff. The real question is the 5080 series and its cost. Nvidia backpedaled on the $1200 price, but they seem at ease pricing the xx80 series in the $1000 range, and with AMD outright saying they are abandoning their high end (which is to say, anything competing with the xx80 series), there is little reason for Nvidia to fall back to 3080 prices, or dare I even dream of 2080 prices. Even with inflation, those two models were insanely cheaper than the 4080 series.
All that said, I ended up paying $2000 for a 3090 FE back in 2021 after a power surge killed my system. (Just because you have a UPS on your system doesn't mean a surge can't come in through the projector's HDMI port > cable > GPU.) I needed something, so I said the heck with it and paid the extortion fee.
That release date may make sense for the author of this article, but I think most consumers look to make their big purchases on special occasions like Christmas.
Ridiculous. No thank you. I didn't pay 2k for the 4090 and I won't for the 5090 either.
A 5080 with 16 GB again is a joke. Guess I'll keep the 4080 for a decade and see if these knob ends grow a brain.
32 GB of VRAM is beyond overkill. There's no reason to need that much when no title will come close to using it. I just don't see why anyone would buy a 5090 when a 4080 Super (currently $949) will play any title flawlessly for the foreseeable future.
For developers and users of locally running AI applications, this is a great product and the price is fine, as the competition is Nvidia's workstation- and server-grade GPUs, which are even more expensive.
And 32 GB of VRAM is a nice increase for this use case, although still on the small side, so expect people to keep building rigs with several 5090s at once.
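To put the 32 GB in context for local AI, here's a minimal back-of-the-envelope sketch in Python; the model sizes and the ~20% overhead factor for KV cache and activations are my own assumptions, not figures from the leak:
```python
# Rough, illustrative VRAM estimate for running a local LLM.
# Model sizes and the ~20% overhead factor are assumptions.

def vram_needed_gb(params_billions, bytes_per_param, overhead=1.2):
    """Weights footprint plus a fudge factor for KV cache/activations."""
    return params_billions * bytes_per_param * overhead

for params in (8, 14, 32, 70):
    for label, bpp in (("fp16", 2.0), ("int4", 0.5)):
        gb = vram_needed_gb(params, bpp)
        verdict = "fits in 32 GB" if gb <= 32 else "needs multiple cards"
        print(f"{params}B @ {label}: ~{gb:.0f} GB -> {verdict}")
```
By that rough math, a 32B model quantized to int4 squeezes onto one card, while anything bigger at fp16 is already multi-GPU territory, which is why the rigs keep growing.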
The 2080 Ti was $1400 at launch. The five $999 cards that were produced that generation may as well not have existed.
At this point you're going to need to bring 220 V service into your living room.
Without P2P memory sharing, the consumer cards took a huge dive in value. The last card with memory sharing over PCIe was the 1080 Ti; it was intentionally gimped on every later consumer card.
It makes sense to gimp a consumer card to distinguish it from a real AI card with memory sharing (over PCIe or NVLink)... but still charging a premium for it is a bridge too far.
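For what it's worth, you can probe whether the cards in a box can actually do P2P. A quick sketch, assuming PyTorch with CUDA and at least two Nvidia GPUs installed:
```python
# Quick sketch: probe CUDA peer-to-peer access between installed GPUs.
# Assumes PyTorch with CUDA and a machine with two or more Nvidia cards.
import torch

n = torch.cuda.device_count()
for a in range(n):
    for b in range(n):
        if a == b:
            continue
        ok = torch.cuda.can_device_access_peer(a, b)
        print(f"GPU {a} -> GPU {b}: P2P {'available' if ok else 'blocked'}")
```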
Quote from: Numnutts on October 28, 2024, 23:03:52
32 GB of VRAM is beyond overkill. There's no reason to need that much when no title will come close to using it. I just don't see why anyone would buy a 5090 when a 4080 Super (currently $949) will play any title flawlessly for the foreseeable future.
It may be overkill for games, but I depend on CUDA for work, and my renders literally starve and crash the software, or in Blender's case crash Windows, with the measly 8 GB of VRAM on my old 3070. I lost my 4090, and it literally costs me twice the time and nerve to work on things.
16 GB is not enough either; I have loads of scenes that ate more than that on my ex-4090.
So no, it's not overkill. 24 GB is what the xx80 should have; 32 instead of 24 is nice, but at that price I want 64 GB of VRAM.
So yeah, that sucks. I and literally all my freelancer colleagues are forced to buy high-end Nvidia GPUs because:
- having CUDA is non-negotiable
- GPU renderers don't handle VRAM well, so either you have more than you need or you can say goodbye to your free time and weekends. Redshift in Maya has gotten a lot better; Blender is literally the worst and unusable with 8 GB of VRAM (and Cycles is not very fast either)
So yeah, looking forward to buying that overpriced thing...
But yeah, the 4090 is overkill and not necessary for games, no doubt about it, unless you don't know what you're doing (using 8x MSAA, setting everything to ultra, which in 99% of cases does nothing more than load uncompressed textures you will never notice, but it's a good way to tell players there's a reason to own a 4090).
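A small aside: before kicking off a heavy render, you can at least check how much headroom the card has. A minimal sketch assuming the nvidia-ml-py package (imported as pynvml) is installed; the 10% headroom threshold is an arbitrary number of mine:
```python
# Minimal sketch: check free vs. total VRAM before launching a heavy render.
# Assumes nvidia-ml-py is installed; the 10% threshold is an assumption.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
free_gb, total_gb = mem.free / 1e9, mem.total / 1e9
print(f"VRAM: {free_gb:.1f} GB free of {total_gb:.1f} GB")
if mem.free < 0.10 * mem.total:
    print("Warning: under 10% headroom; the render may starve or crash.")
pynvml.nvmlShutdown()
```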
Who has that kind of cash in this economy for a gaming GPU????
Thank you Nvidia for convincing me to keep my 3090 for 2 more years!
Well, we're here.
The simple-minded have helped the leather man have his way.
"It's more performance, so it sHoUlD be more expensive," even though for ages the norm was that the entire stack just got REPLACED at the same prices.
This is why normal folk need to be left out of everything. We have fewer and fewer nice things, and the nice things we do get are pushed further into the vanity realm because "reasons."
Y'all better be proud of yourselves.
Quote from: Greg on October 28, 2024, 16:31:28
RTX 5090...that's darling!
I'm gonna hang onto my RTX 4090 for a minute.
5090 isn't the huge leap forward that the 4090 was.
Maybe the 6090 (assuming Nvidia is still in business) will be something special.
This class of video processor is much like the Laser in 1969!
A technological solution in search of a problem it can solve.
Hydrate before coffee, or not.
Why would Nvidia go out of business? They're making stonks selling to the prosumer, HPC, and server markets.
It's Intel who looks like they may be 2 inches from the bottom of a spike pit.
Quote from: Numnutts on October 28, 2024, 23:03:52
32 GB of VRAM is beyond overkill. There's no reason to need that much when no title will come close to using it. I just don't see why anyone would buy a 5090 when a 4080 Super (currently $949) will play any title flawlessly for the foreseeable future.
You must not play anything new, or at 4K, or hyper-modded, or have paid any attention to the devs' collective VRAM outcry, or do more than game...
Or all of the above.
These things have far more uses than pushing pixels. They have for ages.
So: the card will sell out anyway and be scalped, making it even more expensive! Nothing will change from the 4090 era. AMD wishes it had it like this! 😂 It is an enthusiast card aimed at that market. The most popular card in the world is going to be the 5060 (fifty sixty). You liked that, didn't you? 🫵 You're welcome!
Nvidia's graphics card business is a nuisance at this point... it's practically a rounding error, and something they don't even want to deal with anymore... given how much they are making off their AI hardware and ventures, graphics cards are an afterthought.
Nvidia is NOT going out of business any time soon; to even suggest otherwise is an uninformed take.
Quote from: Numnutts on October 28, 2024, 23:03:52
32 GB of VRAM is beyond overkill. There's no reason to need that much when no title will come close to using it. I just don't see why anyone would buy a 5090 when a 4080 Super (currently $949) will play any title flawlessly for the foreseeable future.
I will say no, not for my use; I don't play games, I work with AI modelling. I had hoped it would come with even more VRAM: I need two of these to fine-tune my model, and that is the bare minimum. In comparison, I'd need three 4090s, so two of these are cheaper than three 4090s. For gamers I have no idea what they need, but I guess the 5080 will do fine.
So the 5090 is, like the 4090, overkill for most but a really good tool for some, which is what makes it a niche card. Not in the A6000 or A100 class, but for those with lower budgets.
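For anyone wondering why fine-tuning eats cards: the usual rough rule of thumb is ~16 bytes per parameter for full fine-tuning with Adam in mixed precision. The ~3.5B model size below is my own illustrative pick (it happens to land near the two-versus-three card counts above), not the poster's actual workload:
```python
import math

# Rule-of-thumb VRAM for full fine-tuning with Adam in mixed precision:
# ~16 bytes/parameter (fp16 weights + grads, fp32 master weights,
# two fp32 Adam moments). Activations come on top of this.
BYTES_PER_PARAM = 16
params = 3.5e9  # illustrative ~3.5B-parameter model (an assumption)

total_gb = params * BYTES_PER_PARAM / 1e9
cards_5090 = math.ceil(total_gb / 32)  # rumored 5090: 32 GB
cards_4090 = math.ceil(total_gb / 24)  # 4090: 24 GB
print(f"~{total_gb:.0f} GB of weights + grads + optimizer state")
print(f"-> at least {cards_5090}x 5090 or {cards_4090}x 4090, before activations")
```
That works out to ~56 GB, i.e. two 5090s or three 4090s at minimum, matching the poster's counts under these assumptions.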
I could use lots of CUDA and tensor cores for higher speed, so, cash permitting, I might buy, say, 7 or 8 mid- to high-end Nvidia cards. However:
- such 90-tier cards alone would amount to roughly €15,000~20,000 (plus PC hardware with water cooling)
- the yearly power bill (in Germany) would be several thousand euros (rough math below)
A compromise of 70- or 80-tier cards (7 or 8 of them) would bring the price down to €5,000~€10,000, but the yearly power bill would still be a few thousand euros.
AI inference would be fun at that speed but prohibitively expensive even with moderate wealth. It really boils down to "more than one card, or at best two, is a luxury," unless the faster work pays for itself in one's business at least proportionally to the expense.
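The power-bill estimate checks out with some quick math; all inputs here are my assumptions (600 W per card, in line with the "150 W more" figure upthread; 6 hours of load per day; ~€0.35/kWh for German household power):
```python
# Back-of-the-envelope yearly power cost for a multi-GPU rig in Germany.
# All inputs are assumptions, not measured numbers.
cards = 8
watts_per_card = 600      # in line with the "150 W more" rumor upthread
hours_per_day = 6         # assumed average load
eur_per_kwh = 0.35        # rough German household rate

kwh_per_year = cards * watts_per_card * hours_per_day * 365 / 1000
print(f"{kwh_per_year:,.0f} kWh/year -> ~€{kwh_per_year * eur_per_kwh:,.0f}/year")
# 8 * 600 W * 6 h * 365 d = 10,512 kWh -> roughly €3,700/year,
# consistent with "several thousand euros" above.
```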
It does not cost that much extra to add the RAM; this pricing is an insane markup, simply because some fools are willing to pay, not because production costs are anywhere near that high. Some will say supply and demand, but it's just greed.
Price on paper and in real life are two very different things. What's the street price going to be?
Personally, I'm waiting for AMD to jump back in after their skipped season. I'm watching AMD's AI progress, too. 7900 XTX here, mostly for 3440x1440 gaming, but getting into AI work on the side now.
Why do clickbait scumbags like this get away with appearing on primary news hubs?
From a clearly clickbait headline:
'EYE-WATERING RTX 5090 price leaks'
...to nothing at all about the actual price beyond standard guesswork, like all the other trashcan bottom feeders.
I hope AMD's strategy pays off. Gamers are right to be frustrated that pricing has gone to extremes, first from crypto mining and then from AI.
At the same time, I don't want to be limited to mainstream performance (1440p). I have an 8K display, and I want native rendering because DLSS still has distracting artifacts.
Unfortunately, the AI surge (and TSMC's virtual monopoly on advanced nodes) is keeping wafer prices high enough that there's really no way to economically produce the GPU I want at a reasonable price, short of software hacks like DLSS.
I just hope prices eventually return to normal instead of people becoming accustomed to the Ngreedia tax.
Quote from: Numnutts on October 28, 2024, 23:03:52
32 GB of VRAM is beyond overkill. There's no reason to need that much when no title will come close to using it. I just don't see why anyone would buy a 5090 when a 4080 Super (currently $949) will play any title flawlessly for the foreseeable future.
That's your problem: you're using an xx90-series card for gaming. I need it to churn out renders as fast as it can.
I use the RAM to run 3x 4K screens and my VR headset.
I need as much RAM as I can get because when I add the headset, it adds another 2x 4K screens. I have a Pimax Crystal.