Nvidia's DLSS 3 was the biggest loser of CES 2024

Started by Redaktion, January 23, 2024, 15:42:22


Redaktion

NVIDIA showed off a lot at their CES keynote back on the 8th of January. They showed off their Super series of GPUs. They showed off their renewed belief in saying "AI" a lot to boost their stock market performance. But they also showed off the fact that their vaunted performance-boosting frame generation technology is pretty performance-hungry, too.

https://www.notebookcheck.net/Nvidia-s-DLSS-3-was-the-biggest-loser-of-CES-2024.795292.0.html

Nicholas Vidia

Looks like the logic of this is flawed inside out.
How were "4080 (minus generated frames)" and "DLSS 3 computation time" calculated?
Frame generation only works once every other frame and can actually just sit there and wait for the proper time to display the frame, so that the frametime or fps doesn't jump. If it's adding fps, it's at minimum about as fast (most probably faster) as rendering the scene at 1/4 resolution and upscaling it.
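As a rough illustration of that pacing point, here is a minimal Python sketch (all timings are made up, purely for illustration) of how a generated frame can be held back until the midpoint between two rendered frames so the displayed frametime stays even:

```python
# Hypothetical timings, purely to illustrate frame pacing with frame generation:
# every other displayed frame is generated, and it is held until the midpoint
# between two properly rendered frames.
render_ms = 12.0   # assumed time to properly render one frame

display_times = []
t = 0.0
for _ in range(4):
    display_times.append(t)                   # rendered frame is shown
    display_times.append(t + render_ms / 2)   # generated frame held to the midpoint
    t += render_ms

frametimes = [round(b - a, 1) for a, b in zip(display_times, display_times[1:])]
print(frametimes)   # [6.0, 6.0, 6.0, ...] -> even pacing, fps effectively doubled
```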

H.K

You have calculated it wrong. The 4080 was up to 30% ahead of the 3080 Ti in third-party non-DLSS benchmarks when it came out, so how can the 4080 Super possibly be on par with the 3080 Ti? Nvidia is greedy, but they are not stupid.

Parry Hotter

These Super graphics cards are not brand-new products, they are merely "slightly better" versions of existing graphics cards. Of course the former aren't miles ahead of the latter. Of course NVIDIA had to do lots of cherry-picking to make things look good.

m.lee

Hi all, just to clarify a few points:
Quote from: Nicholas Vidia on January 23, 2024, 21:29:20
Looks like the logic of this is flawed inside out.
How were "4080 (minus generated frames)" and "DLSS 3 computation time" calculated?
Frame generation only works once every other frame and can actually just sit there and wait for the proper time to display the frame, so that the frametime or fps doesn't jump. If it's adding fps, it's at minimum about as fast (most probably faster) as rendering the scene at 1/4 resolution and upscaling it.
- "4080 [Super] (minus generated frames)" is a simple doubling of the 4080 Super's "DLSS 3 ON" frame times, so the time between every other frame. As you say, framegen only creates every other frame, so that means we're looking at the time between every properly rendered (i.e. not generated) frame.
- This is how long it takes for the 4080 S to spit out two frames (one rendered and one generated). We can estimate how long the 4080 S would take to normally render a frame by looking at the claim of it being 1.4x as fast as the 3080 Ti, as we have framerates for the latter.
- Then we subtract "time the 4080 S should take to render 1 frame" from "time the 4080 S takes to render 1 frame and then generate 1 frame", which leaves us with a rough value of how long DLSS 3 framegen takes. In Starfield, for example, the 3080 Ti does one render in ~9.7ms, so our 1.4x speed 4080 S (DLSS 3 OFF) should render in 6.9ms. With DLSS 3 ON, two frames take 6.1 * 2 = 12.2ms to display, meaning that after taking 6.9ms to render one it takes 5.3ms to generate another... which is not great.
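For anyone who wants to check it, here is the same arithmetic as a short Python sketch, using the Starfield numbers from above (the 1.4x figure is Nvidia's own claim, not an independent measurement):

```python
# Same arithmetic as above, with the Starfield numbers (all times in ms).
ti_3080_render = 9.7          # 3080 Ti, time per properly rendered frame
claimed_speedup = 1.4         # Nvidia's "1.4x the 3080 Ti" claim for the 4080 Super
s4080_dlss3_frametime = 6.1   # 4080 Super, DLSS 3 ON, time per displayed frame

s4080_render = ti_3080_render / claimed_speedup   # ~6.9 ms, estimated DLSS 3 OFF render
pair_time = 2 * s4080_dlss3_frametime             # 12.2 ms for one rendered + one generated frame
framegen_cost = pair_time - s4080_render          # ~5.3 ms left over for frame generation

print(f"estimated frame generation cost: {framegen_cost:.1f} ms")
```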

Quote from: H.K on January 24, 2024, 06:24:33
You have calculated it wrong. The 4080 was up to 30% ahead of the 3080 Ti in third-party non-DLSS benchmarks when it came out, so how can the 4080 Super possibly be on par with the 3080 Ti? Nvidia is greedy, but they are not stupid.
There is no claim at all that 4080 S = 3080 Ti in non-DLSS scenarios. If you look at the bottom half of the first image, games without frame generation show the 4080 S as being ~41% ahead of the 3080 Ti. The crux of the article is that DLSS 3 drags performance down because it is computationally expensive.

H.K

"There is no claim at all that 4080 S = 3080 Ti in non-DLSS scenarios. If you look at the bottom half of the first image, games without frame generation show the 4080 S as being ~41% ahead of the 3080 Ti. The crux of the article is that DLSS 3 drags performance down because it is computationally expensive."

It's really not an apples-to-apples comparison. The DLSS data should not be compared against non-DLSS data at all. DLSS 3.0 not only uses upscaling, it also uses frame generation. So even if you take out the generated frames, the frames that are left are upscaled frames. So from a quality perspective they are really not comparable to frames rendered at native resolution.

As to how much performance you're losing due to frame generation, I think that's measurable (possibly?) by comparing DLSS 2.0 and DLSS 3.0 data at the same level of upscaling. But even then it's really hard to compare. It would come down to latency relative to the number of frames you're getting: with DLSS 3.0 you would still get lower latency than native (due to upscaling), but not as low a latency as you'd get with no frame generation at the same number of frames. For example:
If FHD upscaled to 2K gives you 180 fps with 60 ms latency under DLSS 3,
then without DLSS at FHD (no upscaling), if you get 180 fps your latency should be much lower.
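A minimal Python sketch of that DLSS 2.0 vs DLSS 3.0 comparison, using made-up numbers just to show the shape of the calculation (real data would have to come from measurements at the same game and the same upscaling level):

```python
# Hypothetical numbers, purely to illustrate the proposed DLSS 2.0 vs DLSS 3.0
# comparison at the same upscaling level (same game, same internal resolution).
dlss2_fps = 120.0   # assumed: upscaling only
dlss3_fps = 180.0   # assumed: upscaling + frame generation

dlss2_render_ms = 1000.0 / dlss2_fps        # ~8.3 ms per (upscaled) rendered frame
dlss3_pair_ms = 2 * (1000.0 / dlss3_fps)    # ~11.1 ms per rendered + generated pair

framegen_cost_ms = dlss3_pair_ms - dlss2_render_ms   # ~2.8 ms spent on the extra frame
print(f"estimated frame generation cost: {framegen_cost_ms:.1f} ms")

# Latency point: the 180 fps shown with frame generation does not behave like a
# native 180 fps, because only the rendered frames reflect new input, so input
# latency roughly tracks the rendered-frame rate rather than the fps counter.
```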
