
Early real-world NVIDIA GeForce RTX 3080 benchmarks point to 80% performance gains over the GeForce RTX 2080 in multiple triple-A games

Started by Redaktion, September 04, 2020, 15:17:53


Redaktion

Initial 4K benchmarks of the GeForce RTX 3080 have been posted, and things are looking good for NVIDIA's new Ampere architecture. NVIDIA's latest US$699 card can outperform the GeForce RTX 2080 by around 80% in games like Borderlands 3, Control and Doom Eternal, which is a huge leap in one generation.
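For readers sanity-checking the headline number, a relative gain like the quoted 80% falls straight out of the FPS ratio between the two cards. A minimal sketch with made-up FPS values (these are illustrative, not Digital Foundry's measured data):

```python
def percent_gain(new_fps: float, old_fps: float) -> float:
    """Relative performance gain of new_fps over old_fps, in percent."""
    return (new_fps / old_fps - 1.0) * 100.0

# Hypothetical numbers: a 2080 at 60 fps vs a 3080 at 108 fps.
print(round(percent_gain(108.0, 60.0), 1))  # 80.0
```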

https://www.notebookcheck.net/Early-real-world-NVIDIA-GeForce-RTX-3080-benchmarks-point-to-80-performance-gains-over-the-GeForce-RTX-2080-in-multiple-triple-A-games.491527.0.html

ngridia

Quote: Digital Foundry benchmarked the RTX 3080 at 4K and maximum graphics settings

This is why it is so important to wait for real, independent, objective technical reviews. Digital Foundry ran the tests under conditions where the 2080 drowned because of memory limitations. And now we have fictional +70-90% performance gains.

Sam Adams

Yay, you win the award for dumbest comment yet!  ;D

If those settings use up all the available VRAM on the 2080, it still shows a real performance increase; the gain just comes partly from the RAM limitation.

Njiwahr

Quote from: ngridia on September 04, 2020, 19:10:11
Quote: Digital Foundry benchmarked the RTX 3080 at 4K and maximum graphics settings
This is why it is so important to wait for real, independent, objective technical reviews. Digital Foundry ran the tests under conditions where the 2080 drowned because of memory limitations. And now we have fictional +70-90% performance gains.

What are you? An idiot sandwich?
If the 2080 had been VRAM limited, we'd see framerate drops all the time. You have no idea how VRAM works.
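To put rough numbers on the VRAM argument: the render targets themselves are a small fraction of an 8 GB card's memory, so 4K output alone doesn't exhaust a 2080; it's textures and asset streaming that fill the rest. A back-of-the-envelope sketch (the buffer count is an illustrative assumption, not a real engine's layout):

```python
def buffer_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Size of one uncompressed RGBA8 render target, in MiB."""
    return width * height * bytes_per_pixel / (1024 ** 2)

# One 4K RGBA8 target is only about 32 MiB...
one_4k = buffer_mib(3840, 2160)
# ...so even a dozen G-buffer/post-processing targets stay well under 1 GiB.
dozen = 12 * one_4k
print(round(one_4k, 1), round(dozen, 1))  # 31.6 379.7
```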

Ronald King

The point is that it's only going to see performance gains such as these when the 2080 would have been severely crippled.

In real-world scenarios, you know, like 1080p@60, 1080p@144, or even 1440p@60/1440p@144, the performance gain, while still a substantial 35-50%, is not going to come anywhere close to the figures being reported.

For those talking about VRAM and tossing names around:  Nobody knows you're ignorant until you open your mouth and prove it.

Mister2formw

Don't you think it's a bit odd that ONLY DF had access to the cards for these tests? It was very likely a managed test under very specific conditions. So I think it's a bit negligent on Alex's part to stoke the hype without being realistic.

Listen, I'm all for upgrading to next gen, but let's cut the tribalism and wait for objective reviews of the 30 series AND Big Navi so we can actually make informed decisions.

And I'm sure it can be 80% in specific games that can benefit from FP32. But that's not every game. Hell, my 2080 Ti could barely beat a 5700 XT in a couple of games (like Deus Ex: Mankind Divided). It's all in what it uses.

Jesse

Great.  I hope AMD comes out with something competitive and forces graphics card prices back down to earth.

Johnny80

Quote from: Mister2formw on September 05, 2020, 11:20:27
Don't you think it's a bit odd that ONLY DF had access to the cards for these tests? It was very likely a managed test under very specific conditions. So I think it's a bit negligent on Alex's part to stoke the hype without being realistic.

Listen, I'm all for upgrading to next gen, but let's cut the tribalism and wait for objective reviews of the 30 series AND Big Navi so we can actually make informed decisions.

And I'm sure it can be 80% in specific games that can benefit from FP32. But that's not every game. Hell, my 2080 Ti could barely beat a 5700 XT in a couple of games (like Deus Ex: Mankind Divided). It's all in what it uses.

There is a difference: at 1080p you will be more CPU limited, so why on earth would you not run DSR and render at 4K, reducing the need for AA? At the same time, you will notice GPU utilisation is much lower at these resolutions compared to a 2080. There's still an advantage over the Turing architecture... To be realistic, an upgrade from Pascal to Turing was a waste of money, heck, probably even a waste if you were still on Maxwell.
However, an upgrade from Turing to Ampere will definitely be worth it this time. I feel bad for those who bought Turing a couple of months ago at those high prices.
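As a side note on the DSR suggestion: a 4x DSR factor from 1080p renders exactly a 4K pixel count before downscaling, which is why it shifts the bottleneck from the CPU back onto the GPU. A quick sketch of the arithmetic (DSR factors scale total pixel count, so each axis scales by the square root):

```python
import math

def dsr_resolution(width: int, height: int, factor: float) -> tuple:
    """Internal render resolution for a given DSR factor.

    The factor multiplies the total pixel count, so each axis
    is scaled by sqrt(factor).
    """
    scale = math.sqrt(factor)
    return (int(width * scale), int(height * scale))

print(dsr_resolution(1920, 1080, 4.0))  # (3840, 2160)
```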

Brett

I'll be waiting for benchmarks of older, purely rasterized games. These increases are with RTX on. What about games that don't support RTX? I doubt the increase will be that significant...
