Topic summary

Posted by 90s Gamer
 - December 15, 2020, 16:08:51
Thing is, to fully path-trace truly photorealistic games in 16K VR at 240 fps, we would need about a hundred thousand times more rays per second than Turing delivers. Unfortunately (as seen in Control), Ampere doesn't even double Turing's ray-tracing performance.

With performance doubling every two years, it would take about 33 years. At +60% every two years (like Ampere), it would take closer to 50 years. I find 50 years unreasonable and unacceptable, and that's assuming they don't fall below 60% once transistor shrinking ends.
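For anyone who wants to check those figures, here is a minimal sketch of the compound-growth arithmetic behind the 33- and 50-year estimates; the 100,000× target and the 2× / 1.6× biennial growth rates are the assumptions stated above, not measured data.

```python
import math

TARGET = 100_000  # assumed required speedup over Turing (figure from the post above)

def years_to_reach(target, growth_per_cycle, years_per_cycle=2):
    """Years needed if performance multiplies by growth_per_cycle every cycle."""
    cycles = math.log(target) / math.log(growth_per_cycle)
    return cycles * years_per_cycle

print(years_to_reach(TARGET, 2.0))  # ~33.2 years if throughput doubles every two years
print(years_to_reach(TARGET, 1.6))  # ~49.0 years at +60% every two years
```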
Posted by Francois
 - December 15, 2020, 13:53:50
Exactly, European people are much smarter. We built, um, better GPUs. Our GPUs are awesome and much faster than American branded stuff like Nvidia or AMD or Intel or Apple. That's why capitalist Americans haven't even heard of our awesomeness. Our product prices are high to begin with, so we can't even tell the difference between scalping and regular pricing. Silly Murica.

Quote from: Smart European Customer on December 15, 2020, 05:05:14
Muricans are not too smart, like wtf... capitalism fried your tiny yet overweight brains.

You are so good, you just spent $1000 on a 30 series to play 2018 RTX titles under 60 FPS (unless u toggle some shi**y upscaler called DLSS).

In a year or less, your 30 series will be incapable of running new RTX games. Lol, it already can't hold 60 FPS with RTX and no DLSS in CP2077 xDD
And you will have to spend $1000 again just to play RTX titles released in the past 2 years at 40 fps.

Thanks to the USA, the only place u can so easily make money on shi**y products - manipulation and marketing is all u need if u wanna sell a product to muricans.
Posted by Smart European Customer
 - December 15, 2020, 05:05:14
Muricans are not too smart, like wtf... capitalism fried your tiny yet overweight brains.

You are so good, you just spent $1000 on a 30 series to play 2018 RTX titles under 60 FPS (unless u toggle some shi**y upscaler called DLSS).

In a year or less, your 30 series will be incapable of running new RTX games. Lol, it already can't hold 60 FPS with RTX and no DLSS in CP2077 xDD
And you will have to spend $1000 again just to play RTX titles released in the past 2 years at 40 fps.

Thanks to the USA, the only place u can so easily make money on shi**y products - manipulation and marketing is all u need if u wanna sell a product to muricans.
Posted by Eo09
 - December 14, 2020, 23:41:59
Nvidia and Intel have been sharing best practices? I saw the first half of Linus's rant and he actually made a ton of good points. RT is still in the oven and we will see in 5 years. Lots of innovations did not pan out in the mainstream.
Posted by S.Yu
 - December 14, 2020, 21:27:22
Wow, 6 pages of comments in a day; this has got to be (at least) the hottest article of the year on this site! I was going to say something else, but I forgot what it was after seeing all these comments.
Posted by Honk
 - December 14, 2020, 19:52:46
Being critical is not the same as being ungrateful. In fact, that's precisely what's needed now instead of ignoring issues.

Quote from: TengkuJG on December 14, 2020, 19:05:28
Do you know how much hard work Nvidia has done for us...for workers...for gamers...for the future? Yeah, the fps sucks with RTX, but that is just the beginning of a new generation. The pandemic almost ruined the company...the stocks are very low...and everyone is mad...but when you get your hands on the new RTX 30 series, you will understand. Right now I'm also using an RTX 2060 in my PC, and when I play Modern Warfare with RTX it changes a lot in terms of gameplay and graphics: real-time illumination, flashbangs and explosions lighting up the surroundings so you can spot nearby enemies by the light of the blast...and this is just the tip of the iceberg. The funny thing is that you complain about FPS with ray tracing on with your RTX 2060 mobile graphics card, and yes, I do agree there is a performance issue, but you should be thankful. Even if Cyberpunk 2077 at High settings with RTX on only gives you 55 fps with DLSS, the way to improve that is simply to drop to Medium with RTX on and DLSS Quality; that should satisfy you.
Posted by klucvatnu
 - December 14, 2020, 19:51:28
Quote from: TengkuJG on December 14, 2020, 19:05:28
the stocks are very low...

That's very far from reality. Employees who joined the company four years ago are now millionaires, because the stock has gone up 18-fold. NVIDIA shook off its last stock drop a year ago and has multiplied its share price 3.5 times in a single year. The stock performance is unprecedented; you will hardly find another company doing this well on the stock market.

Quote from: TengkuJG on December 14, 2020, 19:05:28
Do you know how much hard work Nvidia has done for us

I actually do. Not enough. NVIDIA needs to fire half of its incompetent management; its internal culture is in disarray. The stock-fueled fast track to a wealthy life has attracted too much mediocrity, favoritism and inner-circle politics everywhere, while at the same time brilliant engineers have left.
Posted by TengkuJG
 - December 14, 2020, 19:05:28
Do you know how much hard work Nvidia has done for us...for workers...for gamers...for the future? Yeah, the fps sucks with RTX, but that is just the beginning of a new generation. The pandemic almost ruined the company...the stocks are very low...and everyone is mad...but when you get your hands on the new RTX 30 series, you will understand. Right now I'm also using an RTX 2060 in my PC, and when I play Modern Warfare with RTX it changes a lot in terms of gameplay and graphics: real-time illumination, flashbangs and explosions lighting up the surroundings so you can spot nearby enemies by the light of the blast...and this is just the tip of the iceberg. The funny thing is that you complain about FPS with ray tracing on with your RTX 2060 mobile graphics card, and yes, I do agree there is a performance issue, but you should be thankful. Even if Cyberpunk 2077 at High settings with RTX on only gives you 55 fps with DLSS, the way to improve that is simply to drop to Medium with RTX on and DLSS Quality; that should satisfy you.
Posted by Nytebyrd
 - December 14, 2020, 18:42:33
Most of this seems very clearly to be the struggle of making the leap to higher resolutions. With tweaking, I was able to get my 2080 Super to average around 90 fps at 2560×1440 in Cyberpunk with 'medium' (i.e. low) RT settings and mostly high or ultra everything else.

That is likely thanks to Asus's new BIOS for my motherboard enabling the equivalent of SAM, plus liquid cooling and modest overclocks all around. The thing is, at 4K every leap in performance is very incremental. Just on the software end, DLSS 2.0 was a bigger leap at 4K than any of the hardware improvements made by either Nvidia OR AMD.

The people crying 'this 7-year-old GPU is still viable' (lmao, no it's not) are overlooking reality. To get a picture of how the rollout of RT is going to play out over card cycles, you actually need to look back at the rollout of rasterization. This isn't 'just' a new architecture, it's an entirely new rendering technique for this medium. And when you do look back, you see much the same as what's going on today.

Terrible optimization, seemingly rendering brand-new hardware obsolete. Existing games with ham-fisted implementations making the new tech look like a step back. The one difference is that games had muuuuch shorter development cycles back then, so adoption seemed to happen faster.

Today, we have only just entered the actual adoption window. Ray tracing is only now universally available across all major platforms, which is why pretty much every game being released now supports it. Compare that to six months ago, when fewer than 10% of all gaming machines had any sort of ray-tracing support, and most of those were on hardware that supported it in name only.

As an aside, even if AMD and Nvidia return to 15-25% per-generation leaps in performance, the 30 series will be obsolete in 2-3 cycles, in large part because of the leap the 30 series itself made. Two or three such generations compound to roughly the size of that same leap, nearly doubling 30-series performance within about 4 years.
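A minimal sketch of that compounding, taking the 15-25% per-generation figures above as assumptions:

```python
# Compound the assumed 15-25% per-generation uplift over 2-3 product cycles
for uplift in (1.15, 1.20, 1.25):
    for cycles in (2, 3):
        print(f"{uplift:.2f}x per gen over {cycles} gens -> {uplift ** cycles:.2f}x total")
# e.g. 1.25^3 ≈ 1.95x, i.e. close to a doubling after three generations
```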
Posted by Sora
 - December 14, 2020, 17:09:10
It depends... how can I put it: if you play at 1440p, RT on is more than fine. 4K is the problem, but with DLSS that's less of a problem.

The future lies with DLSS 3.0 and its implementations; all the current 30-series cards would see a big improvement from it. The problem is that Nvidia didn't have that particular tech available at launch.

It's true that RT currently isn't the most sensible option if you play at 4K, but this is only second-gen RT on Nvidia and first-gen on AMD, so people shouldn't expect perfect performance just yet. Most API features weren't usable in the first years of the last generation either (you could even say it took far longer).

There are still games that have trouble making good use of Intel's higher clocks or Ryzen's extra threads.

So waiting is the best option right now, which is not a nice thing to tell consumers, but that's literally the only deal.

I believe in Nvidia's technology and think it's the way to go, so I'll buy a 30-series card, but I know what I'll be getting given where the tech is at this moment: the hardware improvement is big, but the software side isn't 100% there yet :/. If the DX12 Ultimate support is good, that's fine, but if the card's software isn't fully optimized then you shouldn't expect the best performance.
Posted by Qisreo
 - December 14, 2020, 16:44:38
Looking at all the screenshots, including the ones in this article, makes me question why I should even care about ray tracing. I honestly can't say the two images of the woman in this article are so different that I would prefer one over the other. In fact, I find the one without RT more realistic.

I accept the immature state of RT, so I'm open to seeing how it develops. More and more of the games that appeal to me lately aren't the photo-realistic stuff. I'm enjoying better stories and interactions and find them way more engaging than gawking at some shiny spots. I remember how disappointing Doom 3 was when it came out: the graphics were way too dark and the gameplay was just lather-rinse-repeat. Unless the average game development cycle is lengthened to allow more time for both graphics and narrative development, I see this as partly a zero-sum game where prettier graphics = shittier narrative and interactions. In the last decade we also witnessed the rise of e-sports, and a large portion of gamers went that route. For that, I can't see the need for RT.

RT is cool in theory and in 5 years it might be cheap, accessible and practical for most things. For now, I really don't care. DLSS is neat, but deep learning is infiltrating into most everything we do and graphics is no exception.
Posted by Spunjji
 - December 14, 2020, 15:31:03
Quote from: MachineShedFred on December 14, 2020, 15:12:42
"The 30-series add-in-boards are crap because my last-gen mobile GPU can't do it."

That's what I read from this article.  Nice logic.

Nice reading comprehension, bozo. That's what's known as a straw man, because it's not even close to the argument being made.

To summarise:
What a piece of trash this [comment] is.  Completely foolish assertion with unbelievably boorish conclusions drawn from it.
Posted by Spunjji
 - December 14, 2020, 15:28:36
Wow, this one really brought in all the comments. To address the most obviously wrong-headed responses:

It makes no sense to defend the 2060 mobile's useless RTX performance just because it's "cheap". Look at the cost difference between designs with the RTX 2060 and the GTX 1060, its direct predecessor, or even the performance-equivalent 1070. Then compare the price gap between the 2060 and the 1660 Ti with the (relative) lack of performance difference. Nvidia added features that increased the die size of their mid-range chip and they *charged for it*; pointing to the 2060 being the bottom of the RTX range is no defence for those features being useless. They could easily have made the 2070 the bottom-end RTX card and kept the 2060 as a GTX chip, but they didn't, and that makes it fair game for critique.

As for the criticisms about the 3000 series not running future RT games, people seem to be deliberately missing the point when they talk about their own habit of changing hardware every 2-3 generations. The fact is that you can still run AAA rasterized games with modern features perfectly acceptably on an R9 290X from 2013, but I very much doubt you'll be able to run RT titles from 2027 acceptably on *any* 3000-series card. For context, the 3080 has a roughly equivalent MSRP to the R9 290X once you account for inflation. I expected that situation with the beta-test 2000 series, but the 3000 series really did not move the needle much at all on RT performance, and that really sucks.
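A rough, hypothetical check of that inflation comparison; the launch prices and the cumulative-inflation factor below are my own assumptions, not figures from the post:

```python
# Back-of-the-envelope inflation adjustment (assumed figures, not from the post)
R9_290X_MSRP_2013 = 549      # USD, commonly cited launch price
RTX_3080_MSRP_2020 = 699     # USD, launch price
CUMULATIVE_INFLATION = 1.11  # rough US CPI change from 2013 to 2020

adjusted = R9_290X_MSRP_2013 * CUMULATIVE_INFLATION
print(f"R9 290X in 2020 dollars: ~${adjusted:.0f} vs RTX 3080 at ${RTX_3080_MSRP_2020}")
```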

Nvidia are selling top-end cards for top-end money and insisting that reviewers talk about the "future" of RT, but these cards simply will not be a meaningful part of that future. These are just the facts.
Posted by MachineShedFred
 - December 14, 2020, 15:12:42
"The 30-series add-in-boards are crap because my last-gen mobile GPU can't do it."

That's what I read from this article.  Nice logic.

Hey author, you realize there's a reason that these guys are strapping those huge heat sinks onto the sides of those GPUs, which your laptop doesn't have, right?  And that mobile GPUs have completely different design constraints than what they put into desktops?  Probably because in a desktop the GPU can use 10x the power that your entire laptop does when running full throttle?  And the primary purpose for that energy isn't to create heat for a fan to blow out the back - it's doing work with it.

And quite possibly, a first-gen mobile implementation of a state-of-the-art technology, one where power consumption is a primary concern so that the laptop's battery isn't run flat in 15 minutes while scorching the owner's thighs, isn't exactly going to be the highest performer in that technology.

Also, drivers and game developers apparently never improve their implementations with time and wider public use.  Nope, the same half-assed early implementations are the only way it will ever be done, apparently.  Why would a software engineering manager spend his team's valuable time on a flawless implementation of an early-life feature that 1% of GPUs support properly, with drivers that are only half-baked?  When that number becomes 10% and the drivers mature, the results change.  We've seen this over and over in the GPU space for 20 years.

What a piece of trash this article is.  Completely foolish assertion with unbelievably boorish conclusions drawn from it.
Posted by Andy R
 - December 14, 2020, 14:49:01
@ Todd H
You say you care about a smooth experience but then say sub-60 fps is fine... I'd personally say anything below 60 fps is poor. I'm guessing this is where your love of 4K comes from.

The reason I and a lot of other people prefer 1440p is not down to budget at all but to a preference for smoothness (144 Hz / 144 fps) and graphical detail over resolution; as you've stated, even with the 3090 you can't have both in all games. I could have saved myself a pretty penny when I got my 1440p 144 Hz monitor and gone for a 4K 60 Hz one instead. But I compared the two (and tested two different 4K 60 Hz monitors), and a 27-30" 4K monitor was pointless for me: at those sizes there is no noticeable difference as far as resolution goes, but there is a very noticeable difference in any sort of action game (not just first-person shooters) running at 60 Hz vs 144 Hz. Fair enough, if we're talking 50" and up TVs then yes, there would be a point to 4K, but I personally have never used my TV for PC gaming and never would. Also, anything above 30" is just too big for desk use IMO.

Also, you seem to contradict yourself: you say FPS doesn't matter, then say "Resident Evil, RPGs, adventure games, fighting games, sports games need a smooth experience and not immersion-breaking stuttering". One of the main and most obvious causes of stuttering gameplay is low FPS, so I'm not sure what you're trying to say there.

Don't get me wrong, I would love to have 8K at 160 Hz with all settings maxed out, but the reality for now is that 1440p at 144 Hz maxed out is about the best we can hope for, or, with your priorities, 4K at 60 Hz maxed out.
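For anyone weighing the same trade-off, a minimal sketch of the raw numbers behind it; the resolutions and refresh rates are the ones discussed above, with no particular monitor or GPU assumed:

```python
# Pixels per frame and per-frame time budget for the two options discussed above
options = {"1440p @ 144 Hz": (2560, 1440, 144),
           "4K @ 60 Hz": (3840, 2160, 60)}

for name, (width, height, refresh_hz) in options.items():
    megapixels = width * height / 1e6
    frame_time_ms = 1000 / refresh_hz
    print(f"{name}: {megapixels:.1f} MP per frame, {frame_time_ms:.1f} ms per frame")
# 4K pushes ~2.25x the pixels of 1440p, while 144 Hz leaves ~6.9 ms per frame vs ~16.7 ms at 60 Hz
```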