RTX 4090 performs surprisingly well at 8K with DLSS 3 enabled

Started by Redaktion, October 14, 2022, 10:01:20


Redaktion

Nvidia announced the RTX 4090 on September 20. Aside from the absurdly high price, the GPU has gotten stellar reviews across the board. Not only does the card produce incredible fps numbers at 4K, but it can also do 8K/60 fps in some titles, as shown recently by The Tech Chap on YouTube.

https://www.notebookcheck.net/RTX-4090-performs-surprisingly-well-at-8K-with-DLSS-3-enabled.661426.0.html

RobertJasiek

Tech Notice has also done some compute benchmarks, such as V-Ray, and measured roughly 2x the performance of a 3090. If speed were the only consideration, the 4090 would be a no-brainer.

Different people have measured very different dB noise levels, but noise samples at load suggest that the 4090 is not exactly silent, though not loud either. Some might accept the noise; I would not.

Depending on the software, actual power draw varies from 230 to 450 W without overclocking. Everybody needs to decide whether this is responsible toward the climate and acceptable on the power bill.

Besides limited availability, I think the main problem is the weight. Even with good support stands, bent boards ("bendgate") might appear after years of use. Really, no PCIe GPU should be designed like this. Instead, it should be a full computer within a somewhat larger GPU case.

LL

Quote: "Depending on the software, actual power draw varies from 230 to 450 W non-OC. Everybody needs to decide whether this is responsible to the climate and acceptable on the power bill."

Engage your brain, Robert! Please... :o)
A PC with a 4090 rendering on the GPU is several times more economical than an Apple M1 in Blender.

Would you warn users against "bla bla climate, bills" when using an Apple M1 for rendering?

Benchmark median score of a 4090 in Blender: 12249 points.
Benchmark median score of an Apple M1 Ultra (GPU, 64 cores): 1371 points.

So to render the same scene, the Apple will take roughly 9x longer at maximum power. To even achieve the same energy efficiency, it would need to draw roughly 9x less power than a PC with a 4090 while rendering.
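The energy argument above can be sketched with a few lines of arithmetic. The two benchmark scores come from the post; the sustained wattage figures below are illustrative assumptions, not measurements:

```python
# Compare energy per identical render job using benchmark scores.
# Scores are from the post above; the power figures are assumed for
# illustration only, not measured values.
score_4090 = 12249   # Blender median score, RTX 4090
score_m1u = 1371     # Blender median score, Apple M1 Ultra (64-core GPU)

speed_ratio = score_4090 / score_m1u   # roughly 9x faster

def energy_per_job(power_watts, score):
    # For a fixed job, render time scales as 1/score, so E = P * t
    # is proportional to P / score (arbitrary but comparable units).
    return power_watts / score

# Assumed sustained render power draws (illustrative):
e_4090 = energy_per_job(450, score_4090)   # 4090 at full board power
e_m1u = energy_per_job(100, score_m1u)     # M1 Ultra GPU, assumed draw

print(f"speed ratio: {speed_ratio:.1f}x")
print(f"energy ratio (M1 Ultra / 4090): {e_m1u / e_4090:.1f}x")
```

Under these assumed wattages, the much slower render means the M1 Ultra still uses about twice the energy per job despite drawing far less power at any moment; the conclusion flips only if its real draw is roughly 9x lower than the 4090's.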


And by the way, the TDP is the same as a 3090 Ti's while the card is faster, so it is more efficient.

RobertJasiek

If efficiency were the only aim, presumably the fastest, most expensive server cards (Hopper...) would win.

Now, if we change aims and add, say, €3000 as the upper limit for a GPU price, then RTX 4090 might be the most efficient in the sense of application data speed per watt. If some heavy task needs to be done regularly and this price limit and efficiency are the only criteria, this card is the right choice.

However, for lighter tasks, less frequent tasks, or no strong need for such tasks at all (that is, for most GPU users), a GPU that simply draws fewer watts is good enough. Despite its lower efficiency, it will be better for the climate.


Randomonium

DLSS 3 has to be treated differently from any other fps-boosting tech. While techniques like DLSS 2 or FSR increase fps and also decrease latency, DLSS 3 does the opposite: it increases latency. When you are playing at 60 fps with frame generation, it can feel like you are playing at 30 fps, so the raw number is extremely misleading. DLSS 3 is only useful at very high fps, where the latency penalty is small enough that it doesn't matter, letting you boost fps even higher for high-refresh-rate monitors.
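The "60 fps feels like 30 fps" claim can be illustrated with a simplified latency model. The assumption here (not a measured figure) is that frame generation buffers one rendered frame for interpolation, so input latency tracks the rendered frame rate plus one frame of buffering:

```python
# Simplified frame-generation latency model (illustrative assumption:
# interpolation holds back one rendered frame, so input latency follows
# the *rendered* frame rate, not the displayed one).
def frame_time_ms(fps):
    return 1000.0 / fps

rendered_fps = 60                   # frames the GPU actually renders
displayed_fps = 2 * rendered_fps    # one generated frame between each pair

# Displayed smoothness improves:
print(f"displayed: {displayed_fps} fps "
      f"({frame_time_ms(displayed_fps):.1f} ms per frame)")

# ...but input latency is tied to rendered frames plus the buffered one:
latency_ms = 2 * frame_time_ms(rendered_fps)
print(f"approx. input latency: {latency_ms:.1f} ms "
      f"(comparable to ~{1000 / latency_ms:.0f} fps without generation)")
```

In this model, 60 rendered fps displayed as 120 fps carries about 33 ms of input latency, which is what native 30 fps feels like; at very high rendered fps the same penalty shrinks below the point where most players notice it.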
