Topic summary

Posted by Jace
 - February 10, 2025, 16:30:26
Fake. Raw performance is 10 to 15% better than the RTX 4080 Super, because I have both GPUs. DON'T TELL FAKE NEWS LA
Posted by Shadman
 - February 10, 2025, 12:18:06
Quote from: Alp on February 10, 2025, 03:08:55
You don't need GPU compute in the future. You don't even need a GPU at all to render graphics. A small AI chip will do the trick with a lot of power efficiency, and today's huge graphics cards will be just a memory or a souvenir. So what matters is how it performs under the latest DLSS and frame generation.

It's not the future; it's the present. It doesn't matter what might happen in 5-20 years.
Posted by Alp
 - February 10, 2025, 03:08:55
You don't need GPU compute in the future. You don't even need a GPU at all to render graphics. A small AI chip will do the trick with a lot of power efficiency, and today's huge graphics cards will be just a memory or a souvenir. So what matters is how it performs under the latest DLSS and frame generation.
Posted by Ebike ops
 - February 09, 2025, 14:41:08
Glad I didn't buy one. Not that I could have (sold out). Pity the foolz.
Posted by Illrigger
 - February 09, 2025, 11:24:13
Quote from: Hunter on February 08, 2025, 14:27:45
Would have liked a bit of learned speculation on the causes. It's not like we don't know the cards are built to be faster, so what's up? My best guess is that the drivers aren't optimized yet, or the test itself is doing something goofy and isn't properly using the hardware. There's just no good, normal reason a card with more CUDA cores, more PCIe bandwidth, etc. would be slower.

Just look at the chart to see what's going on. The 4090 is also scoring significantly slower than the RTX 6000 Ada, which is the same silicon as the 4090 with more cores unlocked and twice the RAM. What we are seeing is a small sample set of stock, FE-level cards versus a large sample set of highly optimized cards, many with every tweak possible applied.

This is why, despite PassMark and UserBenchmark touting how much better crowdsourced benchmarking is than controlled benchmarking, it's just as useless. In the end, mostly people interested in bragging about their systems run benchmarks, so the results are just as skewed by bias as YouTube reviews are; at least on YouTube you can see the methodology. For all we know, every one of the 5090s those scores came from was deliberately crippled in some way to make the card look bad so that articles like this would get written. Sure, it's unlikely, but with only 13 results in, it can't be ruled out. Until there are a few thousand more data points, talking about the results is kinda moot.
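
To put rough numbers on that sampling problem, here's a minimal Python sketch; the stock score, the 60% overclock share, and the +10% tuning gain are made-up illustration values, not PassMark data:

import random
import statistics

random.seed(42)

# Hypothetical: assume both cards have the same "true" stock score.
TRUE_SCORE = 30_000

def submission(overclock_share, oc_gain=0.10, noise=0.03):
    # One simulated benchmark upload: stock score plus run-to-run noise,
    # with some share of uploads coming from tuned/overclocked rigs.
    score = TRUE_SCORE * random.gauss(1.0, noise)
    if random.random() < overclock_share:
        score *= 1.0 + oc_gain
    return score

# Mature card: thousands of uploads, many from tuned systems.
old_card = [submission(overclock_share=0.6) for _ in range(5_000)]
# Launch-day card: 13 mostly-stock uploads, as in the article.
new_card = [submission(overclock_share=0.0) for _ in range(13)]

print(f"old card median: {statistics.median(old_card):,.0f}")
print(f"new card median: {statistics.median(new_card):,.0f}")
# Identical hardware, yet the mature listing charts several percent higher.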
Posted by NotingEssential
 - February 09, 2025, 01:47:06
Quote from: NotingEssential on February 09, 2025, 01:45:24
Quote from: OkinSama on February 08, 2025, 20:17:06
Well of course it's basically just a 4090. They're shifting paradigms to AI, and that's what happens.
Every time GPU companies shift to a new way of doing things, the first entries into that territory tend to take baby steps, as they work out how to best use the new methods.
Future models, or even just after some driver updates, will see significant performance gains.

Shifting paradigms? More like shafting customers.
If it's basically a 4090, it should cost like a 4090, and maybe Jensen could generate the price difference with AI at a later date.
Fixed mistake.
Posted by NotingEssential
 - February 09, 2025, 01:45:24
Quote from: OkinSama on February 08, 2025, 20:17:06
Well of course it's basically just a 4090. They're shifting paradigms to AI, and that's what happens.
Every time GPU companies shift to a new way of doing things, the first entries into that territory tend to take baby steps, as they work out how to best use the new methods.
Future models, or even just after some driver updates, will see significant performance gains.

Shifting paradigms? More like shafting customers.
If it's basically a 4090, it should cost like a 5090, and maybe Jensen could generate the price difference with AI at a later date.
Posted by Ka
 - February 09, 2025, 01:12:30
Could this be a driver issue?
Why does the author feel the 5090 is unlikely to improve?
Posted by OkinSama
 - February 08, 2025, 20:17:06
Well of course it's basically just a 4090. They're shifting paradigms to AI, and that's what happens.
Every time GPU companies shift to a new way of doing things, the first entries into that territory tend to take baby steps, as they work out how to best use the new methods.
Future models, or even just after some driver updates, will see significant performance gains.
Posted by Hunter
 - February 08, 2025, 14:27:45
Would have liked a bit of learned speculation on the causes. It's not like we don't know the cards are built to be faster, so what's up? My best guess is that the drivers aren't optimized yet, or the test itself is doing something goofy and isn't properly using the hardware. There's just no good, normal reason a card with more CUDA cores, more PCIe bandwidth, etc. would be slower.
Posted by Cragscrambler
 - February 08, 2025, 13:58:40
Nvidia doesn't care; gaming GPUs are nothing more than an afterthought in their business model these days. They earn the company less than 10% of its revenue, and even then 90% of that comes through OEM machines.
Posted by TJJ
 - February 08, 2025, 10:42:48
Those results seem... wrong.

While architecturally there might not be much separating the 4090 & 5090, and the 5090 does clock a little slower, the 5090 is a much bigger chip with far more CUDA cores, and vastly better memory bandwidth.

This must be a result of the small sample size and the large number of overclockers in the 4090 set.
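
For a back-of-the-envelope sense of how big that paper gap is, here's a small Python sketch using approximate publicly listed specs; core counts, boost clocks, and bandwidth figures are ballpark, so treat the output as a theoretical ceiling, not a measurement:

# Rough FP32 throughput: 2 FLOPs (one FMA) per CUDA core per clock.
def fp32_tflops(cuda_cores, boost_ghz):
    return 2 * cuda_cores * boost_ghz / 1_000

# Approximate public specs; boost clocks vary by board.
cards = {
    "RTX 4090": {"cores": 16_384, "boost_ghz": 2.52, "mem_gb_s": 1_008},
    "RTX 5090": {"cores": 21_760, "boost_ghz": 2.41, "mem_gb_s": 1_792},
}

for name, c in cards.items():
    print(f"{name}: ~{fp32_tflops(c['cores'], c['boost_ghz']):.0f} TFLOPS FP32, "
          f"~{c['mem_gb_s']} GB/s memory bandwidth")
# ~83 vs ~105 TFLOPS and ~78% more bandwidth: on paper the 5090 should win
# a compute benchmark by roughly 25-30%, not tie.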
Posted by TruthIsThere
 - February 08, 2025, 05:01:27
Uh-huh!!!

These benchmarks weren't even necessary for anyone to arrive at the conclusion that the Blackwell architecture IS the most over-hyped (by the MSM, YTers, etc.), just ridiculous, overpriced paper-launch product in the history of GPUs, maybe even in the entire tech industry.

Why?! Easy, and a super conclusive answer: Jensen's DELIBERATE failure to demonstrate the RTX 5090's FULL performance in Tensor/creative workloads on its introduction date.

90% of that weak introduction was ONLY about NVIDIA's DLSS4/Multi-Frame Generation nonsense.
Posted by Redaktion
 - February 07, 2025, 21:01:54
The GeForce RTX 5090 has failed to overtake its predecessor in a GPU compute benchmark on the well-known PassMark database site. The RTX 5080 suffered a similar fate, with the card barely managing to beat out the RTX 4070 Ti. The GeForce 50 series Blackwell cards support PCIe 5.0 connectivity via 16 data lanes.
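
For reference, a PCIe 5.0 x16 link works out to roughly 63 GB/s of raw bandwidth per direction; a quick sketch of the arithmetic:

# PCIe 5.0: 32 GT/s per lane, 128b/130b encoding, 16 lanes, 8 bits per byte.
rate_gt_s, encoding, lanes = 32, 128 / 130, 16
gb_per_s = rate_gt_s * encoding * lanes / 8
print(f"PCIe 5.0 x16: ~{gb_per_s:.1f} GB/s per direction")  # ~63.0 GB/s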

https://www.notebookcheck.net/GeForce-RTX-5090-fails-to-topple-RTX-4090-in-GPU-compute-benchmark-while-RTX-5080-struggles-against-RTX-4070-Ti.958334.0.html