
I believe in ray tracing, but I do not believe in Nvidia's RTX 3000-series GPUs

Started by Redaktion, December 13, 2020, 04:46:19


Vincent Weis

For those of you who are complaining about the article citing anecdotes and benchmarks without statistics, look it up. The author has a valid point.

Switching on RTX may be worthwhile if you're still playing at a lower resolution like 1080p, but the RTX-series cards are sold on the promise of taking us from our original world of 1080p fake-lighting games at 30-60 FPS to a world in which even entry-level cards can take us to near-4K 60-120 FPS gaming with real-time ray-traced shadows.

That isn't a reality at all.

Even on the RTX 3080, there are some heavy visual compromises that have to be made in order to consistently hit those targets. The promise of ray tracing in a vacuum is really impressive, but we can't pretend it isn't a leap too far to push for widespread ray tracing adoption while also chasing a MASSIVE jump in normal performance metrics. 4K is a ridiculously huge, mostly pointless leap in performance, but more people are still targeting it instead of 1440p. 60 FPS to 120 FPS is a difficult leap, but that's going to be the new bar for competitive games in a couple of years.
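
To put those leaps in numbers, here's a quick back-of-envelope sketch (illustrative Python; the resolutions and frame-rate targets are the ones named above, and raw pixel count ignores shading cost):

# Rough pixel-throughput comparison of the targets mentioned above.
# Pixel count alone ignores shading complexity, but it shows the scale of the jump.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")

# Going from 1080p60 to 4K120 is 4x the pixels at 2x the frame rate:
print(f"1080p60 -> 4K120: {4 * 2}x the raw pixel throughput")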

We aren't going to hit all three of these targets simultaneously, and DLSS isn't a magic bullet to solve this problem, because it isn't universal. It has support in new AAA titles, but it won't come to everything, because it has to be something NVIDIA explicitly builds machine learning models for. Don't get me wrong, DLSS is truly amazing - it's the only way to get 4K 60 with ray tracing on consistently. But we can't assume it'll be there to hold our hand and guide us to the promised land of perfect performance, and without DLSS, the RTX 3000 series can't get there unless a game is among the best-optimized titles out there and you have a 3090.

NickH

Even if good points were made, it's hard to trust a tech writer with a mobile version of a 2060. Do they make a worse RTX card? Maybe you need to ask for a raise or a side hustle. Just saying, you might find it easier to buy a better graphics card than to change a corporation. I play with RT on in all single-player games. It's this generation. Not with a 2060 though - that would just be a silly purchase. It's hard for me to trust the opinion on RT of anyone who owns a 2060. Maybe that's fine if you're tight on money and only care about Minecraft at 1080p, but that's a real niche audience. My gut tells me you just make bad choices. On another note, I really hope AMD steps up their ray tracing game. Even the author knows RT is the future.

Mapilo

Super polarising comments below. Exactly what Linus said would happen once Nvidia sent that stupid email.

Dodobird

Ray tracing is the future! But I believe everyone is being fooled with new technologies. Nvidia and AMD advertise that their technology can do this and that. I definitely believe that they are a few years away from truly delivering this ray tracing technology.
I've been a gamer a long time, mostly on console. My prime days were with the Super Nintendo and Sega Genesis. Back then, nobody complained about the hardware. Games were good.
Then about 8 years ago I was introduced to PC gaming. It is awesome. But it comes with a heavy price tag, and the performance doesn't show for it. You can spend $1000 or $5000 on a rig and that still doesn't guarantee consistent performance. You can see that video from Linus Tech Tips: he gets a fresh, freakishly expensive Origin PC and it fails to deliver.
At least with consoles, you will get pretty much the same performance on all consoles. Shitty or good performance, it will all be the same. Consistency.
At least for now, I feel like giving up on PC gaming, because no matter what I spend, consistency in performance is something that will still be missing.

Just a post

Dang, all the people saying that the OP is just poor and should get a better job have absolutely zero clue what he is even talking about lol. The complaints about Nvidia have nothing to do with your revenue but EVERYTHING to do with Nvidia's lack of care for their own market. If I bought a 3090 right now (for over $1300), I'd get around 70 FPS in Cyberpunk with RTX enabled. Now that's pretty dumb lmao. Oh, and if you don't have the 3090, you'll get around 5 FPS with RTX enabled, and only 60 FPS with RTX disabled if you're lucky lmao.

Look at benchmarks - this is all, unfortunately, true.

Nickle

First off, I really enjoyed the article and thought it was engaging.

But I'm nitpicking here, because power consumption is a soft limit. We can always buy, and people can always make, bigger power supplies, even if that may not be practical. Reducing transistor size, on the other hand, is a hard limit. We may only see 7nm for a long time now, and 5nm or even 4nm may be the hard limit. At this point the silicon layers are just a few atoms thick, and you can't make features smaller than atoms.
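
For a sense of scale, a back-of-envelope sketch (illustrative Python; 0.543 nm is silicon's textbook lattice constant, and note that node names like "5nm" are marketing labels rather than literal feature sizes):

# Back-of-envelope: how many silicon unit cells fit across a modern
# process node? (Node names are marketing labels, not literal feature
# sizes, but the order of magnitude is the point.)
SI_LATTICE_NM = 0.543  # silicon lattice constant, ~0.543 nm

for node_nm in (7, 5, 4):
    cells = node_nm / SI_LATTICE_NM
    print(f"{node_nm} nm is roughly {cells:.1f} silicon unit cells across")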

tenz

Man, you may be right, but hey, look, I'll only be playing at 1080p with ray tracing on my RTX 3060 Ti, so that's enough, I guess.
And ray tracing is still in development.

Counter shader

What are you talking about, sacrificing resolution and visuals? What kind of monitor and stuff are you playing on? I play at 3480x1600 and I have no degradation in visuals, clarity, textures, nothing when I have RTX on lol

Kenneth

Dude, you are clearly just looking for attention and have no idea what you are talking about. I have a 3090 and I can play any game, even Cyberpunk, at 4K 60 FPS with everything maxed, including ray tracing, so I don't know what performance hit you are talking about on the 30 series, but you clearly did no research.

Alin

I bought a PC with an RTX 3070. I was blown away playing Cyberpunk 2077. I do agree with the statements here though: ray tracing is the future... I'm just not sure it's here in the now. It really kills performance with it on.

kz

Regardless of whether or not I'm in agreement with the central argument, this is one of the single most terribly written tech articles I've seen since "Just buy it". This piece has the organization and writing quality of a high schooler's book report, with a logical thought process to match.

As many others have rightly pointed out, you have an RTX 2060 Mobile, which is the single lowest-end RTX card in existence, bar maybe the RTX 2060 Max-Q. The thing has just over a third of the performance of even the last-gen 2080 Ti in both rasterization- and ray-tracing-heavy games and benchmarks. Now, your experiences with this GPU aren't invalid - there's a good argument to be made that Nvidia never should have released it as an RTX-enabled card. But to then go and spin that into "No RTX cards are powerful enough, because mine isn't"? That's garbage.

Here's an idea - go get some experience with actual RTX 3000 cards, or even maybe just look up some benchmarks, before drawing conclusions about them.

Your second point goes so far out of the way of the main idea of this article that frankly I'm not sure why it's there. Your entire argument is "current-gen RTX cards are not powerful enough to run RTX", so why would you then go and put an entire paragraph talking about how current-gen RTX titles look bad? What does that have to do with anything? There are lots of issues with the way that paragraph was written too, but I'll choose to ignore them since, like I said, that paragraph is irrelevant.

And the third point - now this just has me confused. See, I can interpret these "efficiency gains" in two ways. One is "the RTX 3000 series only provides modest gains in RTX performance." The other is "the RTX 3000 series only provides modest gains in performance per watt."

If it's the former, then that's absolutely just not true. The RTX 3080 is nearly 50% more powerful than the 2080 Ti in both Port Royal and the 3DMark DXR test - that's huge. Nvidia put a huge focus on RTX performance this generation, and it clearly shows.

I don't know what you would have expected - should the 3000 series have been 300% faster? 1000%? What's your cutoff for no longer being "modest gains"?

If it's the latter, then yes - the RTX 3000 series has worryingly high power consumption, and as a result overall performance per watt hasn't gone up much. But why is that relevant here? You're arguing that the RTX 3000-series cards are not powerful enough, not that they're not efficient enough. The fact that the 3080 draws 350W isn't going to stop it from running games 3 years into the future.

And speaking of which - that's the last thing for today that I take issue with. "[The RTX 3000-series GPUs] offer neither enough rasterisation performance nor enough RTX performance to run very demanding ray-tracing-focused titles of the next 3 to 5 years."

You're absolutely right. The RTX 3000 series won't be enough to max a 2024 RTX game.

Hey, remind me again, can the GTX 980 max Valhalla at 1440p? What about RDR2? How about Far Cry 5?

But those are all pure rasterization titles!

Face it - it's never been the case that a 3-5 year old GPU can comfortably max all current-gen titles. You turn down some settings and continue on with your life. That's going to be the case for the 3080 as well - you might get used to RTX Ultra right now, but in 5 years you'll have to turn that down to RTX Medium or Low, and reduce a bunch of other graphics settings as well. That's just the way things are.

And that's why the premise of this article makes no sense. "What I do not believe is that Nvidia's RTX 3000-series GPUs will be a meaningful part of that future". You're right - once the RTX 4000 series is released, the 3000 series will be obsolete. The 5000 series will make the 4000 series obsolete. And so on, and so forth. We aren't in this game of PC hardware for what current-gen hardware has to offer us 5 years into the future, expecting that it will always remain the latest and greatest - we're concerned with what it offers us now.

This article is poorly thought out, poorly structured, and poorly written, and does not reflect the standard of writing I'd expect from Notebookcheck. Even with "Views, thoughts, and opinions expressed in the text belong solely to the author," it does no favors for Notebookcheck's credibility that an article such as this one is allowed to be published.

Sean Matheis

Here's an idea: how about you actually get yourself a 3000-series card before you spread a bunch of completely inaccurate "opinions". I mean, seriously, judging ray tracing off a 2060 Mobile? What next, gonna judge a sports car by driving around a golf cart? As a few people here have pointed out, I also have a 3090, which does Cyberpunk 2077 in 4K @ 60 FPS with ray tracing, so yeah, I don't know what performance hit you're talking about. The major difference between the 2000 and 3000 series is that the 3000 series is designed to use DLSS 2.0 in conjunction with ray tracing in order to actually deliver higher frame rates at higher resolutions with ray tracing on, something the 2000 series is incapable of. You, however, are clearly unaware of this and are treating the non-DLSS benchmarks as the ones to look at, not understanding that DLSS 2.0 actually creates a sharper, higher-fidelity image than a "native 4K" image, meaning the only reason not to use DLSS 2.0 with ray tracing is, well... if you don't know what you're doing and aren't qualified to write articles on the subject.

Fisherman01

L. M. A. O.

I hope some of y'all are getting paid by Nvidia for these comments. The point of the article is the author's doubt about the 3000 series' effectiveness in future RTX titles.

As an owner of a 3080 playing at 1440p, I can play CP2077 at maxed settings with RT on, and I need to have DLSS (Auto) on to hit that 60 FPS target. I'm not complaining - I'm already privileged to have a 3080 at all. The point of the article still stands, though: when considering the 30 series as a whole, you can see the problem looming in the future.

Generally, PC gamers strive for at least 60 FPS. If you don't believe this to be the case, I'm willing to hear your arguments, but generally it seems we PC folks like our smooth, at-least-60-FPS gameplay.

Now, you can use information currently available to us to infer how things are going to look in the future. A 3070 and a 3060 Ti (two brand-new cards, I might add) get 60 FPS at ultra settings in CP2077 with both RT and DLSS on. Here's the catch: these are the results at 1080p. You think somebody spent at minimum $500 USD on a 3070 to play at 1080p? Is it unreasonable to expect that the 30-series cards should be able to hit 60 FPS on a 1440p or 4K monitor with a $500 GPU? (Hint: you can with RT disabled.) It seems a reasonable person could conclude that if 30-series cards are already at their limits in current RT titles, the future isn't bright for their RT dreams. Obviously, you can drop your other settings from ultra to have RT on and try to hit the 60 FPS target, but why would you, when the single biggest hit to performance is RT and simply disabling it will get you there alone? Another disconcerting thing is that the amount of VRAM on these cards seems like it won't be enough for upcoming RT titles (see the HWU video "Cyberpunk 2077 Ray Tracing and DLSS Benchmark, What GPU You Need For 1080p, 1440p, 4K" for info on that).
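
To make the inference explicit, here's a naive pixel-scaling estimate (illustrative Python; it assumes frame rate scales inversely with pixel count, which real GPUs only roughly follow, starting from the 60 FPS at 1080p figure above):

# Naive estimate: if a card is pixel-throughput-bound, FPS scales
# inversely with pixel count. Real scaling is messier, but it shows
# why 60 FPS at 1080p is a worrying baseline for 1440p/4K RT gaming.
base_fps = 60.0
base_pixels = 1920 * 1080

for name, (w, h) in {"1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    est = base_fps * base_pixels / (w * h)
    print(f"Estimated {name} FPS at the same settings: ~{est:.0f}")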

Benchmarks for the points I've made are from TechSpot's article. (Can't post links, but you can look it up.)

tl;dr: Keep malding, Nvidia shills. This article doesn't make your 30-series card not great; they're just not the RT beasts they're cracked up to be (and that's OK).


k2kambo

People need to chill. He's only saying RT isn't as great as it could be and the RTX 3000 series won't set the bar for future games. As for the people bashing him with "what lower resolution? I get 60 FPS with RT on and DLSS on at 4K/2K": you're already wrong, because you're not truly playing at 2K/4K when DLSS is on. DLSS is just Nvidia's upscaling tech, like what TVs already have. It's the only reason you see a boost in performance, because it renders at a lower resolution and then upscales it. But I do agree the 2060 Mobile is a bad comparison/product for ray tracing. It would be equivalent to something like a 1650 desktop graphics card or worse.
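
To make the upscaling point concrete, a small sketch (illustrative Python; the per-axis scale factors are the commonly cited ones for DLSS 2.0's presets and may vary by game):

# DLSS renders internally at a lower resolution, then upscales.
# Commonly cited per-axis scale factors for DLSS 2.0's presets:
DLSS_MODES = {
    "Quality":     2 / 3,   # ~67% per axis
    "Balanced":    0.58,
    "Performance": 0.50,
}

output_w, output_h = 3840, 2160  # "4K" output

for mode, scale in DLSS_MODES.items():
    w, h = round(output_w * scale), round(output_h * scale)
    print(f"{mode}: internally rendered at {w}x{h}, upscaled to {output_w}x{output_h}")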

Valaska

Okay, is this some sort of multi-millionaire moot between all you tech reviewers? All because Hardware Unboxed refused to show hugely important features of the card, so Nvidia stopped sending them FREE GPUs? GPUs that two 3D modellers I work with haven't been able to get, while people like JayzTwoCents have been sent SIX of them so he can LN2 them, as if these weren't vital hardware components for running industry? You know what, good. They should stop sending review samples out; these should be purchased to avoid bias, and they sure as hell shouldn't be given out to reviewers when consumers can't get them.

Also, if you think these aren't capable of ray tracing, then explain how they function perfectly fine at 1080p, 1440p, 2K, and hell, the 3090 even manages 4K? Traditionally I buy AMD parts, since I need to buy large numbers of cards for work, but it's undeniable that the 3000 series can easily handle DXR workloads.
Did you just feel some compulsion to ride on this bandwagon and write this nothingburger of an article? I'm starting to agree more and more with what Nvidia did, tbh. All reviewers should be cut off, or not given permanent samples. Just mail one card out and make them pass it around.
