I believe in ray tracing, but I do not believe in Nvidia's RTX 3000-series GPUs

Started by Redaktion, December 13, 2020, 04:46:19

RyanD

Here's what I don't understand about Cyberpunk and ray-tracing issues. I just read that someone is pulling 50-55 fps with maxed-out settings and DLSS on Quality with a 3090. I'm getting the exact same fps, with the exact same maxed-out settings, on a 2070 Super. How? Granted, I don't think he states his resolution, but mine is 1080p.

Galy800

I think you're right. I have a 2080 Ti (a graphics card superior to the 3070, basically because of the VRAM) and I have big problems with frame rates and ray tracing. Although the card is very powerful, it suffers from large fps drops when I turn RTX on. In Cyberpunk I can play at ultra in 4K at 60-70 fps, or at ultra with RTX on ultra in 1080p at 40-60 fps (DLSS on Quality, because Performance mode looks horrible); that's a huge drop, because if I play at 1080p without RTX on I literally get more than 110 fps. Obviously that doesn't mean I can't play with RTX at 60 fps or more in other games, but the drop is always huge. In Metro Exodus I can play at 60-70 fps with extreme graphics and RTX on; without RTX I get 144 fps, the maximum frame rate my monitor supports. Same with Control, Shadow of the Tomb Raider, Watch Dogs: Legion, Minecraft RTX, Quake II RTX, Deliver Us The Moon, Wolfenstein: Youngblood, COD MW, COD BOCW, BF5, Bright Memory, DIRT 5 and, obviously, Cyberpunk. But I'm not that pessimistic; I believe Nvidia will do a nice job with the RTX 4000 series and bring us more RT cores, MORE VRAM and more CUDA cores. The VRAM, I think, is one of the big problems of the RTX 3000 cards.

Damian

Quote from: Star on December 13, 2020, 05:31:27
You misunderstand the point. The point is that the RTX 3000-series GPUs will struggle with ray tracing in the future just as much as the RTX 2000-series GPUs--especially low-end ones--do today.
Thing is, most people can't even afford a laptop with an RTX 2060; they buy one with a 1650, 1650 Ti, 5300M or 5500M. And even though that card isn't cheap, it performs poorly in ray tracing. Nvidia promised us ray tracing and DLSS; there are only a few games with both, and things don't look good. The power efficiency of Ampere is disappointing.

Damian

Quote from: Peter McLean on December 13, 2020, 14:34:52
I cannot believe the author is complaining about RT performance on their 2060 mobile. You literally have the absolute worst RTX-enabled card. What do you expect? What are you doing using a laptop for gaming and writing reviews about brand-new AAA titles?

Please do not post anything else from this author. You're just trying to clickbait people to your site. Shame on you.
I own a 3000-series card, upgraded from a 2070 Super, on a 5120x1440 ultrawide, and it has a huge performance boost.

Write articles about something you know, because you clearly shouldn't be writing about gaming, video cards or computer hardware.
Nvidia claimed the 3080 is 100% faster than the 2080, which it is not, and claimed it is 90% more efficient, which it also is not.

Nvidia also has a problem with people who prefer 120 fps gaming over enabling ray tracing; the 6900 XT is a better choice for them.


Andy R

@ Todd H
You say you care about a smooth experience but then say sub-60 fps is fine... I'd personally say anything below 60 fps is poor. I'm guessing this is where your love of 4K is coming from.

The reason I and a lot of other people prefer 1440p is not down to budget at all but to a preference for smoothness (144 Hz / 144 fps) and graphical detail over resolution; as you've stated, even with the 3090 you can't have both in all games. I could have saved myself a pretty penny when I got my 2K 144 Hz monitor and gone for a 4K 60 Hz one instead. But I looked at the difference between the two (and tested two different 4K 60 Hz monitors), and a 27-30" 4K monitor would have been pointless for me: at those sizes there is no noticeable difference as far as resolution goes, but there is a very noticeable difference in any sort of action game (not just first-person shooters) running at 60 Hz vs 144 Hz. Fair enough, if we're talking 50" and larger TVs then there would be a point to 4K, but I personally have never used my TV for PC gaming and never would. Also, anything above 30" is just too big for desk use, IMO.
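To put rough numbers on the smoothness side of that trade-off, here is a minimal sketch (in Python, using only the two refresh rates mentioned above):

# Frame-time budget at each refresh rate; a shorter budget means the image
# updates more often, which is what reads as "smoothness" in motion.
for hz in (60, 144):
    frame_time_ms = 1000 / hz
    print(f"{hz} Hz -> {frame_time_ms:.1f} ms per frame")

# 60 Hz works out to ~16.7 ms per frame vs ~6.9 ms at 144 Hz.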

Also, you seem to contradict yourself: you say fps don't matter, then say "Resident Evil, RPGs, adventure games, fighting games, sports games need a smooth experience and not an immersion breaking stuttering". One of the main and most obvious causes of stuttering gameplay is low fps, so I'm not sure what you're trying to say there.

Don't get me wrong, I would love to have 160 Hz 8K with all settings maxed out, but the reality is that, for now, 144 Hz 2K maxed out is about the best we can hope for, or, with your choices, 60 Hz 4K maxed out.

MachineShedFred

"The 30-series add-in-boards are crap because my last-gen mobile GPU can't do it."

That's what I read from this article.  Nice logic.

Hey author, you realize there's a reason that these guys are strapping those huge heat sinks onto the sides of those GPUs, which your laptop doesn't have, right?  And that mobile GPUs have completely different design constraints than what they put into desktops?  Probably because in a desktop the GPU can use 10x the power that your entire laptop does when running full throttle?  And the primary purpose for that energy isn't to create heat for a fan to blow out the back - it's doing work with it.

And quite possibly a first-gen mobile implementation of a state-of-the-art technology, where power consumption is a primary concern so that the laptop's battery isn't run flat in 15 minutes while scorching the owner's thighs, isn't exactly the highest performer in that technology.

Also, drivers and game developers never improve their implementations with time and wider use by the public. Nope, the same half-assed early implementations are the only way to ever do it, apparently. Why would a software engineering manager spend his team's valuable time on a flawless implementation of an early-life feature that 1% of GPUs support properly, with drivers that are only half-baked? When that number becomes 10% and the drivers mature, the results change. We've seen this over and over in the GPU space for 20 years.

What a piece of trash this article is.  Completely foolish assertion with unbelievably boorish conclusions drawn from it.

Spunjji

Wow, this one really brought in all the comments. To address the most obviously wrong-headed responses:

It makes no sense to defend the 2060 mobile's useless RTX performance just because it's "cheap". Look at the cost difference between designs with the RTX 2060 and the GTX 1060, its direct predecessor, or even the performance-equivalent 1070. Then check the cost difference from the 2060 to the 1660 Ti versus the (relative) lack of performance difference. Nvidia added features that increased the die size of their mid-range chip and they *charged for it*; pointing to it being the bottom of the RTX range is no defence at all for those features being useless. They could easily have made the 2070 the bottom-end RTX card and kept the 2060 as a GTX chip, but they didn't, and that makes it fair game for critique.

As for the criticisms about the 3000 series not running future RT games, people seem to be deliberately missing the point when they talk about their own habit of changing hardware every 2-3 generations. The fact is that you can still run AAA rasterized games with modern features perfectly acceptably on an R9 290X from 2013, but I very much doubt you'll be able to run RT titles from 2027 acceptably on *any* 3000-series card. For context, the 3080 has an equivalent MSRP to the R9 290X when you account for inflation. I expected that situation with the beta-test 2000 series, but the 3000 series really did not move the needle much at all on RT performance, and that really sucks.
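To put rough numbers on that price comparison, here is a minimal back-of-envelope sketch (in Python; the launch prices and the inflation figure are assumptions of mine, not figures from the post):

# Assumed figures: R9 290X launched around $549 in late 2013, the RTX 3080's
# MSRP is $699, and cumulative US CPI inflation 2013->2020 is roughly 11%.
r9_290x_2013_usd = 549
rtx_3080_2020_usd = 699
cumulative_inflation = 1.11

adjusted_290x = r9_290x_2013_usd * cumulative_inflation
print(f"R9 290X in 2020 dollars: ~${adjusted_290x:.0f} vs RTX 3080 at ${rtx_3080_2020_usd}")

# Roughly $609 vs $699: the same broad price tier, give or take.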

Nvidia are selling top-end cards for top-end money, and insisting that reviewers talk about the "future" of RT, but these cards simply will not be a meaningful part of that future. Those are just the facts.

Spunjji

Quote from: MachineShedFred on December 14, 2020, 15:12:42
"The 30-series add-in-boards are crap because my last-gen mobile GPU can't do it."

That's what I read from this article.  Nice logic.

Nice reading comprehension, bozo. That's what's known as a straw man, because it's not even close to the argument being made.

To summarise:
What a piece of trash this [comment] is.  Completely foolish assertion with unbelievably boorish conclusions drawn from it.

Qisreo

Looking at all the screenshots, including the ones in this article, makes me question why I should even care about ray tracing. I honestly can't say the two images of the woman in this article are different enough that I'd prefer one over the other. In fact, I find the one without RT to be more realistic.

I accept the immature state of RT, so I'm open to seeing how it develops. More and more of the games that appeal to me lately haven't been photo-realistic stuff. I'm enjoying better stories and interactions, and find them far more engaging than gawking at some shiny spots. I remember how disappointing Doom 3 was when it came out: the graphics were way too dark and the gameplay was just lather-rinse-repeat. Unless the average game development cycle is lengthened to allow more time for both graphics and narrative development, I see this as partly a zero-sum game where prettier graphics = shittier narrative and interactions. In the last decade we also witnessed the rise of e-sports, and a large portion of gamers went that route. For that, I can't see any need for RT.

RT is cool in theory, and in 5 years it might be cheap, accessible and practical for most things. For now, I really don't care. DLSS is neat, but deep learning is making its way into almost everything we do, and graphics is no exception.

Sora

This depends... how can I put it: if you play at 1440p with RT on, it's more than fine. 4K is the problem, but with DLSS that's less of a problem.

The future will be DLSS 3.0 and its implementations; all the current 30-series cards would see a big improvement. The problem is that Nvidia didn't have that particular tech available at launch.

It's true that RT currently isn't the most plausible option if you play at 4K, but this is only second-gen RT on Nvidia and first-gen on AMD, so people shouldn't expect perfect performance just yet. Most API features weren't usable in the first years of last gen (heck, you could even say it took way more time).

There are games that have problems making good use of Intel's higher clocks or the extra threads of Ryzen processors.

So waiting is the best option right now, which is not a good thing to tell consumers, but that's literally the only deal.

I believe in Nvidia's technology and that it's the way to go, so I'll buy a 30-series card, but I know what I'll be getting for where the tech is at this moment. The hardware improvement is big, but the software side of things is not yet 100% there :/. If DX12U is cool, that's fine, but if the card's software is not completely optimized, then you should not expect the best performance.

Nytebyrd

Most of this seems very clearly to be an issue of the struggle to make the leap to higher resolutions. With tweaking, I was able to get my 2080 Super to average around 90 fps at 2560×1440 in Cyberpunk with 'medium' (i.e. low) RTX settings and mostly high or ultra settings everywhere else.

That is likely due to Asus's new BIOS for my motherboard allowing the equivalent of SAM, plus liquid cooling and modest OCs all around. The thing is, at 4K every leap in performance is very incremental. Just on the software end, DLSS 2.0 made a bigger leap at 4K than any of the hardware improvements from either Nvidia OR AMD.

The people crying 'this 7-year-old GPU is still viable' (lmao, no it's not) are overlooking reality. To get a picture of how the implementation of RT is going to play out over card cycles, you actually need to look back at how rasterization was adopted. This isn't 'just' a new architecture, it's an entirely new rendering technique for this medium. And when you do look back, you see much the same as what's going on today.

Terrible optimization seemingly rendering brand-new hardware obsolete. Existing games with ham-fisted implementations making the new tech look like a step back. The one difference: games could have a much shorter development cycle back then, so adoption seemed to happen faster.

Today, we have only just entered the actual adoption window. Ray tracing is only now universally available across all major platforms, which is why we see pretty much every game being released now supporting it. As opposed to six months ago, when less than 10% of all gaming machines had any sort of ray-tracing support. Even then, most of those were on hardware that basically supported it in name only.

As an aside, even if AMD and Nvidia return to 15-25% generational leaps in performance, the 30 series will be obsolete in 2-3 cycles, in large part because of the leap the 30 series itself made. Compounding gains like that over a few generations is comparable to the leap the 30 series itself made, nearly doubling the 30 series' performance within about 4 years.
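For a rough sense of how those per-generation gains compound, a minimal sketch (in Python, using the 15-25% figures above and the 2-3 cycles as the number of generations):

# Compound effect of 15-25% per-generation gains over 2-3 generations.
for per_gen_gain in (0.15, 0.25):
    for generations in (2, 3):
        total = (1 + per_gen_gain) ** generations
        print(f"{per_gen_gain:.0%}/gen over {generations} gens -> {total:.2f}x overall")

# 25% per generation compounded over three generations is ~1.95x, i.e. close to doubling.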

TengkuJG

Do you know how much hard work Nvidia has done for us... for workers... for gamers... for the future? Yeah, the fps suck with RTX, but that is just the beginning of a new generation. The pandemic almost ruined the company... the stocks are very low... and everyone is mad... but when you get your hands on the new RTX 30 series, you will understand. Right now I'm also using an RTX 2060 in my PC, and when I play games with RTX in Modern Warfare it changes a lot in terms of gameplay and graphics: real-time illumination, and flashbangs and explosions light up the surroundings, so you can spot nearby enemies by the light of the blast. This is just the tip of the iceberg. The funny thing is that you complain about fps with ray tracing on with your RTX 2060 mobile graphics card, and yes, I do agree there is a performance issue, but you should be thankful. Even if Cyberpunk 2077 at high settings with RTX on only gives you 55 fps with DLSS, the only way to improve that is to set it to medium with RTX on and DLSS Quality; that should satisfy you.

klucvatnu

Quote from: TengkuJG on December 14, 2020, 19:05:28
the stocks are very low...
That's very far from reality. Those who started at the company four years back are now millionaires, because the stock went up 18 times. NVIDIA came out of its stock drop a year ago and increased the stock price 3.5 times in a year. The stock performance is unprecedented; you will hardly find another company that is doing so well on the stock market.

Quote from: TengkuJG on December 14, 2020, 19:05:28
Do you know how much hard work Nvidia has done for us
I actually do. Not enough. NVIDIA must fire half of its incompetent management; its internal culture is in disarray. Because the stock market made it a fast track to a wealthy life, it attracted too much mediocrity, favoritism and inner circles everywhere, while at the same time brilliant engineers left.

Honk

Being critical is not the same as being ungrateful. In fact, that's precisely what's needed now instead of ignoring issues.
