
I believe in ray tracing, but I do not believe in Nvidia's RTX 3000-series GPUs

Started by Redaktion, December 13, 2020, 04:46:19

llMithrandirll

Haha, comparing the full lineup of current-generation desktop GPUs to the lowest-end mobile version of last-generation GPUs is just stupid. Plus, you seem to have completely ignored DLSS 2.0, which is exactly what Nvidia created so you can run games with RTX on.

Also, I'm sure that over the lifespan of Ampere we'll see many optimizations to the drivers and the accompanying software like DLSS that will dramatically improve performance, just like last generation. Remember how BFV wouldn't run on a 2080 Ti when it came out, but now it runs fine on the same card?

Essentially, I'm just saying there's a ton of stuff you've ignored in your 'analysis' of the 3000-series GPUs, and a ton of stuff likely coming down the pipe that nobody outside Nvidia knows about.

Finally, the 3000 series has arrived just as real-time ray tracing is becoming mainstream. If laying the foundations (I consider the Ampere cards to be the final pieces of the RTX foundation) of a new technological norm doesn't have an impact on the future of that technology, I don't know what does.

Valaska

Quote from: light on December 13, 2020, 12:16:41
Yeah, all these Nvidia fanboys are missing the point here. Ray tracing is the future, but by the time ray tracing becomes the present, Ampere will be too weak to run it.

I mean, not even a 3090, the most powerful Ampere GPU, is capable of running Cyberpunk at high framerates even with DLSS. I'd guess it'll probably take another 3-5 years for RT to truly arrive, and at that point your 3090 is going to be doing, what, 20-30 FPS with DLSS?

The only thing Ampere cards will ever be good for in terms of ray tracing is gorgeous cinematic titles where you're okay with choppy cinematic framerates.

Yeah, and a Voodoo3 wasn't capable of running MechWarrior 4 or Crysis. What's your point? Eventually all GPUs hit a performance wall. This is a nothingburger article meant to stroke the ego of Linus in the hopes of a handout from the multi-millionaires who whine about not getting enough free hardware.

We game developers work ourselves to the bone, and we have to pay for this stuff with maybe, at best, a small percentage off; only the big names get 20% off. And guess what: you need to upgrade, because old GPUs eventually become outmoded...

Who knew that GPUs and hardware eventually get too old... Oh wait, anyone with common sense. The 3000 series DOUBLED DXR performance over the 2000 series; what more can you ask? Do you not realize how much of a leap that is?

isekai king

As someone who owns an EVGA 3090 FTW3 Ultra, I genuinely feel like I lost major brain cells reading this article. The dude who wrote it clearly has no idea what he is talking about, and this piece is nothing but clickbait. He openly states he only owns a 2060 mobile (do y'all mind if I puke) and doesn't even own a 30-series GPU. Talk about a lack of credibility. That is incredibly laughable.

Personally, I have been playing Cyberpunk 2077 at 2K/60 FPS with maxed settings, including all ray tracing set to Psycho. I get a constant 60 FPS, and it does not drop below that at all.

Sincerely, me laughing at how pathetic this article is. ;)

Joseph Stalling

I have not seen any game where I must have ray tracing to find it enjoyable. The author is right: ray tracing may be the future, but presently there is little, in either hardware or software, that definitively showcases its full capabilities.

Bobby C

Excellent article, but I must point out that I don't think your mobile 2060 has enough VRAM. I have a 2060 KO in my desktop, and in COD MW at 1440p with average settings, the frame rate drops from 90 FPS to the high 40s with RTX on, though the game does look amazing with it enabled. I could only imagine that a 3090 would manage 90 FPS with RTX on, but that is a guess at this point. I am unsure about COD BOCW, as I have yet to buy it: because of Activision's control of the playlists and maps in MW, I decided that if I can't have Shoot House 24/7 and am forced to play multiplayer maps I don't like, I won't buy the new game.

Andal1

The article seems less incorrect than unnecessary. Of course the RTX 3000 series isn't the future, just like my GeForce 2 Ti, 5600U, 8800 GTS, GTX 260, GTX 580, 7970M, GTX 965M SLI, and Titan V weren't the future. The Titan V was my most future-proof GPU ever: three years on, and it still kills at rasterization. It's the same story every time something new comes out: "Oh, you don't need a DirectX 10 card, because by the time it's used, your card will be too slow to use it." Yes, that's true. It's also a waste of time to point out the obvious.

Bronal

Yawn. The 3000 series is king and will only be dethroned by one of its own successors. The author of this piece is probably new to PC gaming.

Rob maccy

I feel bad for all of you trying to rationalize why a card you don't own must be bad. Sorry, but my 3080 was an incredible grab, and the difference is mind-blowing.
I'm glad I used the EVGA Step-Up program to get my card in a reasonable time frame, too. I waited 53 days before my spot came up in the queue, and I got to use my 2070 Super while I waited.
Zero regrets; ray tracing looks incredible, and I'm loving 2077.

zbig

The 3000 series, as well as the new AMD cards, offers the largest generational leap in raw performance and in performance per dollar that we've seen in at least a decade. There's literally no reason to bash these cards unless you're bitter that you can't get hold of one, or you're a fanboy for one team or the other.

Gabe

The whole point of ray tracing is how impractical it is. Ray tracing is Nvidia's ten-year plan for why you'll continue to need a GPU instead of having that functionality incorporated into the CPU or onto the motherboard.

Maybe five years from the first RTX launch, cards will be fast enough that turning on ray tracing makes any sense at all. Five years after that, ray tracing will still be so resource-intensive that it makes sense to buy a GPU; it still won't make sense to fold that feature into the integrated graphics on CPUs. Five years after that, Nvidia had better be releasing the next big performance hog, or it risks becoming irrelevant, like Sound Blaster.

So if you're saying "wow, ray tracing is great but there isn't any silicon out there capable of doing it justice" -- that's the F'ing point!

Joyenergiser

Hilarious. You start the article off with yet another storm in a teacup about free review cards, where all tech journalists stand together on their soapbox against evil corporate injustice (yawn), and then proceed to trash the 3000 series. What a waste of time reading this.

Jonathon Schott

So here's my two cents after reading the comments. The writer of this article is exaggerating a moot point. I do not expect the 30 series to run RTX games the same way in four years, and anyone making that statement is ignoring the fact that they themselves did not expect the 700 series to run a game like The Witcher 3 as well as a 10-series card. It is the nature of technology. That is the reason I want a 30-series card: my 10-series GPU is starting to show its age, as well as it has served me.

What this article inflates unnecessarily is the point Hardware Unboxed already made: pound for pound, the RT performance of a 30-series card is EXACTLY THE SAME as a 20-series card once you normalize for the uplift in rasterization performance. The design choices made by Nvidia this generation are rubbish. Yes, the RT cores perform twice as fast, but they only included half as many. Nvidia this generation is being overly bullish and going for a mindshare grab over quality products, because AMD actually has something to compete with for once in a great while. Are they bringing ray tracing to the masses? Absolutely, by giving us a card that is moderate in price and performance like the 3060 Ti.

The author's comparison with his 2060 mobile, probably a Max-Q to boot, is hilariously skewed. No $#!+, Sherlock, your experience is sub-par. Nvidia maybe should not even have released that product, but I don't know what to tell you if you were expecting something out of it. I didn't, and that is why I don't own one; if I had purchased a laptop with one, I would not have expected much in the way of RT performance, would have been happy with performance above a 1660 Ti mobile, and would have viewed RT as more of a tech demo. So don't get mad about your own bad purchasing choices, and at least call the situation for what it is.

Nvidia has become greedy. If they wanted to push ray tracing the way they say they do, they would have taken a hit on rasterization to add more RT cores and equalize RT vs. non-RT performance. Instead, they allowed themselves to get caught in a 'who has the bigger rasterized penis' game with the consumer and AMD, and in this case we the consumer have lost. They say they care about the proliferation of AI, yet the same die in the A6000 has twice the tensor cores of a 3090 (which also does not get Titan drivers, by the way), forcing the consumer to either deal with GeForce game drivers not suited to that type of workload or spend $3,000-plus on a workstation card. And if it's the same die, that means they are turned off in software; they are part of the SM complex this gen. It's this whole 'Nvidia is a company that cares' BS that has me irked.

That email they sent to Hardware Unboxed was total garbage, this article is total garbage, most of the comments are uninformed garbage (to no one's fault), this year has been total garbage, and I've been told that what's in between my ears is total garbage. Screw this, I'm opening up a recycling center.

Todd H.

Here's my perspective. I only care about a smooth experience: no freezes and no jittery performance. Sub-60 FPS is totally fine for most titles. If you play online shooters, you don't need ray tracing or even maxed settings. FPS matters to me even less because I don't play online shooters. I do think a game using old technology should have high FPS on modern graphics cards, but some games are poorly coded and will always run like s***.

GAMERS are spoiled and don't think logically. Yes, these companies can hype a product a little too much, but Nvidia never promised 120 FPS ray-tracing performance, or even a locked 60 FPS. Ray tracing is for immersion. Why do you need so many FPS to enjoy a game world? Did we care about FPS on the first PlayStation? Do we get upset that our 2008 Honda gets worse gas mileage than a 2020 model? How about TVs? TVs introduce new technology that may not be great at first, and we aren't hard on TV companies.

I get 50 to 55 FPS in Cyberpunk, everything maxed, DLSS on Quality. That is awesome for such a demanding title, and I can't wait for future titles with better implementations. But I also have a 3090. If you buy anything in life, the most expensive is usually better: more comfortable, faster, easier on the eye. These graphics cards allow us to witness something revolutionary in gaming. If gamers weren't hung up on FPS, they'd sleep better at night. You can enjoy great-looking games at a lower FPS, knowing it can only improve (it just takes time), or you can whine about FPS, which could have a ripple effect in the industry that causes ray tracing not to be embraced. I remember when there were arguments about polygons vs. sprites; people made fun of Virtua Fighter because of its bland-looking characters and backgrounds.

GAME DEVELOPERS have to embrace hardware and utilize it more efficiently. Why do my expensive GPU and CPU struggle with Flight Simulator? For one, it uses DX11. I feel like these developers are lazy. Games should scale across more cores instead of relying solely on GHz. Do you know who benefits from single-threaded, high-frequency gaming? Intel. If gamers made a stink about developers not caring about cores, we would have games that run much faster. Do you know why SLI died? Because developers were lazy and never embraced the technology. Also, gamers complained about the cost of two graphics cards. But imagine if you could plop in another 1060 for cheap instead of buying a newer, more expensive graphics card. You have to think long-term. Game developers can make or break emerging tech if they are lazy, can't see the benefit, or cater to certain companies. AMD did not invest in any new technology to make games look even better; Nvidia did. If it were just AMD, lighting and shadows would look the same for the next 10 years. DLSS and ray tracing are forward-thinking technologies that benefit gamers. Give Nvidia credit.

Seriously, all this anger is about FPS, whereas first-person shooters (especially competitive online games) benefit the most from high FPS. Resident Evil, RPGs, adventure games, fighting games, and sports games need a smooth experience, not immersion-breaking stuttering. High FPS in these games will not make you beat a boss quicker.

I also see a trend of gamers rejecting 4K ("1440p is enough for me," says John) and/or HDR, ONLY because experiencing 4K and HDR takes money. But when these technologies come down in price, wouldn't it be great to have a huge catalog of older games that look great in 4K, HDR, and ray tracing?

Think ahead. Yes, not everybody's bank account is the same. But two things are true in tech: it takes time for things to improve, and it takes money to experience the best. Those who spend cash on new, emerging technology are funding the R&D that brings it to the masses.

These tech reviewers should be better influencers and tell everyone that FPS doesn't matter in most games. They should focus more on smoothness and on whether a game has bugs or other problems. We are FPS whores because that's all we hear in these damn benchmarks and reviews. If we went back to wanting games to look great (remember Super Mario 64? Did we have YouTube to complain about the FPS?) and didn't care about frame counts, things would be so much better in this industry. But game developers have to cut back on graphics to appease the masses who only care about FPS.

This is all my opinion.

Cid

Lol! Are you serious? I suppose you still use your 3dfx Voodoo card then? Each piece of hardware is a step, not the end. The cost of the 3000 series is lower than the 2000 series: a step in the right direction. Just like with the introduction of normal maps, the hardware isn't capable of using ray tracing everywhere... yet. To argue against progress because it's insufficient for you is asinine. As both a hardware and a game developer, I find your post somewhere between comical and pathetic.

Turtleman

Judging ray tracing when you are using a laptop 2060 is completely unfair. You talk about tanking FPS, but your user-experience scope is so tiny: the 2060 is the entry-level crap card of the 2000 series, and the 2000 series overall was underwhelming. This article is clickbait garbage. Play those games on a 3070 or better, and then talk about the future of the 3000 series.
