Topic summary

Posted by Jimbo
 - March 23, 2023, 17:48:20
This "article" is literally just paraphrasing a PR piece. The only thing worse than a PR article is a lazy reprint of it with no disclaimer. They are asserting that a 7900xtx and $600 worth of pc upgrades would game better than a 4090? Absurd, especially with cpu prices so competitive. I got my 13600k for $250 and it won't bottleneck my card at 4k so what would be the point of spending more on a cpu? 1440p is a little different but I'm fairly sure that my combo will handily beat a 7900xtx with a 7950x3d at anything above 1080p, RT or not. They simply cannot keep up, period.
Posted by Mo Ra
 - March 22, 2023, 20:31:58
They are around 22% behind the RTX 4090, and a competitor would need to be within around 5%. If they had delivered the originally claimed 54% performance-per-watt gain, they would have been within that ballpark.

Also, they could have added around 65 mm² to the GCD, which would have gotten them to RTX 4090 levels without adding more than a couple of hundred dollars to the price.
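
As a rough back-of-the-envelope sketch of the arithmetic in this post (the ~22%, ~5%, and 54% figures come from the post itself; the "measured" perf/W gain below is a placeholder estimate, not a sourced number):

```python
# Back-of-the-envelope check of the numbers in the post above.
GAP_TO_4090 = 0.22        # "around 22% behind the RTX 4090" (from the post)
TARGET_GAP = 0.05         # "would need to be within around 5%" (from the post)
CLAIMED_PPW_GAIN = 0.54   # "originally claimed 54% performance per watt" (from the post)

xtx_rel = 1.0 - GAP_TO_4090      # ~0.78x of a 4090
target_rel = 1.0 - TARGET_GAP    # ~0.95x of a 4090

# Uplift the 7900 XTX would need in order to land within ~5% of the 4090:
needed_uplift = target_rel / xtx_rel - 1.0
print(f"Required uplift: {needed_uplift:.1%}")   # ~21.8%

# If the real perf/W gain fell short of the claimed 54%, the shortfall translates
# directly into lost performance at the same board power. Plug in your own estimate:
measured_ppw_gain = 0.38  # placeholder estimate, NOT a sourced figure
lost_uplift = (1.0 + CLAIMED_PPW_GAIN) / (1.0 + measured_ppw_gain) - 1.0
print(f"Uplift left on the table vs. the claim: {lost_uplift:.1%}")  # ~11.6%
```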
Posted by DrStrange
 - March 20, 2023, 08:35:01
It wouldn't have had to be a 600 W part. Many 4090s run at 450 W and do just fine.

AMD just couldn't compete in that tier without it being commercially unviable. It's as simple as business gets.
Posted by Anonymousssss
 - March 18, 2023, 00:30:53
Another point about the drivers I want to clarify: I currently have an Nvidia GPU and have used many recent AMD GPUs. The experience with Nvidia drivers and Windows is horrendous, and it seems worse than anything I've seen with my 3 AMD GPUs. I understand that's just me, though, so I thought I would put that out there.
Posted by Anonymousssss
 - March 18, 2023, 00:26:21
Quote from: sirsquishy on March 13, 2023, 00:09:04: This article is bullshit. AMD could have built a 600W 5800XT as well. They didn't because the technology stack at the VERY top end was not in their favor then, as it is not in their favor today. Would you buy a 1600USD RX7990XT that still pulled under the RTX4090 in something like native RT? Lacks DLSS (FSR2.1 is amazing, but still..). I sure as hell wouldn't. This is why AMD did not do this. RAST performance AMD is on top due to ROP and dual-SP's now, but RT they are 2-3 gens behind still without FSR.
I don't think they are 2-3 gens behind on RT; 3 generations is all the generations of Nvidia RT GPUs there are, so that's like saying they are worse than the first generation of Nvidia RT cards, which is not true. The performance is about 1-1.5 generations behind. Nvidia quickly jumped in RT performance each generation, and so has AMD. The other thing is that RT is simply not used very much, and with FSR it really doesn't matter what the native RT performance is. Of course it needs to be halfway decent, but it doesn't need to be as good as Nvidia's. It's a waste of die space, especially for lower-end cards, and FSR more than makes up for its shortcomings.
Although they do usually play catch-up when it comes to software features like FSR, the driver software is as stable as Nvidia's (though both could be better), their software to control and monitor the GPU is far better, and so is the recording software.
The reason for the slower software roll-out is the open-source approach. If you are Nvidia and go closed source, you don't have to worry about making the software accessible or understandable to the entire community. The process AMD has to go through significantly slows them down, and on top of that they are just a smaller company. But what they do for the open-source community is important and is a great thing.
While I do agree that they probably can't release a $1600, 4090-competitive GPU, I think what they do make is pretty damn good. At least there is some other company making GPUs aside from Nvidia. I understand their pricing might not be good by any means, and whatever the reasons it is priced that way, it is cheaper and will likely stay cheaper than the competition.
Posted by Alrx
 - March 14, 2023, 06:36:43
What a load of garbage. If they could make one, they would. There is no reason to hold back at all; why would they hand Nvidia that market segment? 4090s have sold relatively well because there are gamers who won't compromise on performance and creators who don't want to spend Quadro money.

Simple fact: they can't do it, or at least not in a commercially viable way.
Posted by man_daddio
 - March 14, 2023, 05:06:45
Oh, such a virtuous AMD, looking after us little people. Who makes this BS up? And if they did say it, they are BSing. If they had made something to actually compete with a 4090, it would have been a climate disaster, because they are all in on the climate doomsday BS. They are already less efficient than Nvidia. Imagine a 4090-competitive product: it would burn your hands just idling.
Posted by Anonymousgg
 - March 13, 2023, 22:00:35
If they had hit their more optimistic internal performance estimates, they might have gotten closer to the 4090. Instead, they hit driver hell and smartly pulled back to the 4080 level, undercutting it on pricing.

AMD can easily double the Infinity Cache and add faster memory to the 7900 XTX; there's just no reason to do it if their drivers aren't ready. They can release a 7950 XTX priced at $1000-1200 later this year, after the 7900 XTX drops in price.
Posted by Dante-Inf
 - March 13, 2023, 14:13:56
So $1600 is not mainstream but $1000 is? What a bunch of horse@#$%. Competing at the 4090 level was too much for AMD.
Posted by Jeff Kap.
 - March 13, 2023, 12:17:33
All nonsense from AMD.

Their RX 6900 series could compete against the 3090 series, and suddenly its successor, the RX 7900 series, decides, 'oh, let's compete against the 4080 series instead of the 4090'.

Like someone said, AMD is now doing tier-to-tier comparisons, no longer competing on pricing:

RX 7900 series - RTX 4090 series
RX 7800 series - RTX 4080 series
RX 7700 series - RTX 4070 series
RX 7600 series - RTX 4060 series
RX 7500 series - RTX 4050 series
Posted by Shane
 - March 13, 2023, 06:57:17
Yeah... I call BS on this.

The whole article smells and sounds like elitist copium. Claiming they showed such goodwill by not releasing such a powerful and expensive card because they cared about their consumers?

Does AMD think their fan base will just go out and buy a $1600 card and cheap out everywhere else? Do they think their fans are that stupid? Especially considering that if they had the use case for such a high-end enthusiast card, why not get the 4090, which is better on many different levels: DLSS 3.0, ray tracing, AI use, etc.

I don't see why they couldn't simply build such a card, even in low numbers, for true enthusiasts or tech reviewers who could make an equal and fair comparison between the two cards. Except of course the 4090 is actually used for useful workloads.
Posted by Bennyg1
 - March 13, 2023, 03:49:44
Smells like post hoc excuse-making. I think they didn't want to try, and fail. It's not like they have won many hearts and minds by pricing their CPUs and GPUs (especially the 7900 XT) only slightly below the competition and following the exact same high-end-first, trickle-down release strategy.
Posted by Mr Majestyk
 - March 13, 2023, 02:54:02
Big deal, 99.99% of people wouldn't buy one anyway. The market is screaming out for cards under $500. The fact that both manufacturers start with the high end shows the contempt they have for us. Yeah, yeah, they are trying to offload old cards, but they created that problem by price gouging during the crypto boom, and again we are being treated like morons and expected to pay $800+.
Posted by PeterO
 - March 13, 2023, 01:25:49
If I were AMD, I too would hold back from developing a 600-watt part to compete with a 450-watt part when I suck at the only 3 valid use cases for such a part today (VR, ray tracing, and GPGPU compute).

Especially given the already inferior cooling performance of my existing products (although the more forgiving form factor redeems this).

It's one thing to want to be present in every market sector, but it's another to focus on not printing e-waste.

Ironically enough, Nvidia is struggling with the same dilemma for everything below a 4080. Even if you want to go team green, there's a heap of cheap used 30-series cards that can hit those same performance targets.