The problematic A0 silicon revision was apparently the reason why AMD delayed the launch of its flagship RDNA 3 GPUs by almost a year. Despite this, AMD still launched the RX 7900 cards on this stepping, which exhibits high GPU clock variability, power-draw issues, and disabled shader prefetch instructions.
https://www.notebookcheck.net/AMD-s-flagship-RX-7900-XTX-GPUs-discovered-to-integrate-A0-silicon-revision-affected-by-serious-hardware-issues.674766.0.html
You guys are kidding, right? AMD releasing a buggy GPU? Come on, this can't be true.
It would not be the first die with major bugs. Recall Pentium division...
Knowingly releasing a buggy product...
Reminds me of Intel's Meltdown issue...
But this is worse...
Let's just skip this year's GPUs; both Nvidia and AMD are up to no good.
This is a bunch of nothing.
An Nvidia employee said:
f*** off, they don't know what they are talking about.
Nvidia has been shipping A0 forever.
GPU competition has made it so that you have to make A0 shippable.
From doing some additional reading on this, it appears this is not only conjecture at best, and very likely NOT the case, but if it is true, it's not really a big deal, nor something unique to AMD or GPUs. This feels like a very poorly "researched" article, and by that I mean the author read a Twitter post, took it at face value, and regurgitated it without any additional research or insight.
400 amps, somehow I don't think so, more like 400W.
Quote from: vertigo on December 15, 2022, 23:57:30
From doing some additional reading on this, it appears this is not only conjecture at best, and very likely NOT the case, but if it is true, it's not really a big deal, nor something unique to AMD or GPUs. This feels like a very poorly "researched" article, and by that I mean the author read a Twitter post, took it at face value, and regurgitated it without any additional research or insight.
Surely all the reviews out there that clearly show clock variations and high power draws are just pure conjecture. The Twitter thread even links to some of those reviews with examples.
Quote from: Riscy on December 16, 2022, 00:14:21
400 amps, somehow I don't think so, more like 400W.
Amps multiplied by Volts equals Watts.
They just need to cut £200 off the XTX price.
Quote from: Bogdan Solca on December 16, 2022, 01:12:55
Quote from: vertigo on December 15, 2022, 23:57:30
From doing some additional reading on this, it appears this is not only conjecture at best, and very likely NOT the case, but if it is true, it's not really a big deal, nor something unique to AMD or GPUs. This feels like a very poorly "researched" article, and by that I mean the author read a Twitter post, took it at face value, and regurgitated it without any additional research or insight.
Surely all the reviews out there that clearly show clock variations and high power draws are just pure conjecture. The Twitter thread even links to some of those reviews with examples.
The two aren't necessarily directly related. I'm not disputing the performance and power draw issues. I'm simply saying that the source of this article, from what I've seen, is
not proof that AMD is shipping A0 retail units, and even if they are, a) that's not necessarily the reason for the issues, and b) even if it is, it's not like people don't know what they're getting, assuming they're smart enough to wait a few days for reviews. Regardless, this article takes questionable info and makes it out to be much worse than it is, or at least that's my understanding from looking elsewhere besides just here and a Twitter post.
There are errata for every processor and GPU ever released.
No new hardware release or revision for that matter, escapes it.
This story isn't even remotely accurate.
Quote from: Riscy on December 16, 2022, 00:14:21
400 amps, somehow I don't think so, more like 400W.
Yes, 200 Amps is typical service for a US house, with some older homes having 100 Amp service or less.
If the card consumed 200 Amps, it would promptly explode. It's pretty clear the correct answer is 400 Watts. I'm sure they'll fix this typo.
Quote from: Orangejulius on December 16, 2022, 04:55:08
Quote from: Riscy on December 16, 2022, 00:14:21
400 amps, somehow I don't think so, more like 400W.
Yes, 200 Amps is typical service for a US house, with some older homes having 100 Amp service or less.
If the card consumed 200 Amps, it would promptly explode. It's pretty clear the correct answer is 400 Watts. I'm sure they'll fix this typo.
You guys need to learn at least a little about electricity before making statements like this. Power (Watts) = Current (Amps) × Voltage (Volts). At 1V, 400W = 400A. That does NOT mean it's pulling 200A from the outlet. Power draw is wattage, not amperage, and it's drawing 400A @ 1V, which, again, is 400W. A typical circuit is 15 or 20A at 110-120V (US, different elsewhere), so 1725-2300W @ 115V. This is far above the 400W draw of this card. So no, it's not a typo, and there's nothing for them to fix. It's as easy as P = IE.
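For anyone who wants to check that arithmetic, here is a minimal Python sketch of the P = I × V math; the 400 A / ~1 V GPU figure and the 15-20 A / 115 V wall-circuit numbers are just the examples quoted in this thread, not measurements:

```python
# Power (W) = Current (A) x Voltage (V)
def power_watts(current_amps: float, voltage_volts: float) -> float:
    return current_amps * voltage_volts

# GPU core rail: 400 A at ~1 V is only 400 W
print(power_watts(400, 1.0))   # 400.0 W

# Typical US wall circuit: 15-20 A at ~115 V
print(power_watts(15, 115))    # 1725.0 W
print(power_watts(20, 115))    # 2300.0 W
```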
Quote from: Riscy on December 16, 2022, 00:14:21
400 amps, somehow I don't think so, more like 400W.
If the voltage is really 1 V, then 400 watts / 1 volt = 400 amps. But most likely it's misreporting, because there's no way a PSU can provide 400 amps. This entire article has issues...
Not that any of this matters, since scalpers have already bought every new GPU in stock.
Quote from: skeetskeet on December 16, 2022, 13:45:10
Quote from: Riscy on December 16, 2022, 00:14:21
400 amps, somehow I don't think so, more like 400W.
If the voltage is really 1 V, then 400 watts / 1 volt = 400 amps. But most likely it's misreporting, because there's no way a PSU can provide 400 amps. This entire article has issues...
You don't look at amps on the draw side and directly correlate with amps on the supply side. You have to use watts for that (which is why, yes, stating watts vs amps in the article would have been better). A PSU supplies power in different voltages, mainly 3.3V, 5V, and 12V. All it has to do is supply e.g. 25A @ 12V (300W), 15A @ 5V (75W), and 7.5A @ 3.3V (24.75W) to provide 400W. The GPU then uses those different voltages in different ways, e.g. the 12V will run the fans and the different voltages will be stepped down to lower voltages for various other things, and when that happens, the amperage increases. If it's stepped down to 1V, then 400W would indeed be 400A. So it's not as crazy as it sounds. That said, it does seem unlikely, since much of the voltage would be kept at 12V for the fans, so to use 400A would require well over 400W. Because of that, I retract my previous statement and agree it likely is a typo. Even if it does somehow work out to 400A and isn't a typo, it's very poor writing to describe it that way, since it's not conventional and it makes it seem much worse.
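To illustrate the point, here is a rough Python sketch of the example rail split from the post above; the per-rail currents are just the illustrative numbers from that post, not an actual PSU spec or measurement:

```python
# Hypothetical PSU rail loads from the example above: (amps, volts)
rails = [(25.0, 12.0), (15.0, 5.0), (7.5, 3.3)]

total_watts = sum(amps * volts for amps, volts in rails)
print(total_watts)  # ~399.75 W, i.e. roughly the 400 W in question

# Stepping that power down to a ~1 V core rail implies ~400 A on that rail
core_voltage = 1.0
print(total_watts / core_voltage)  # ~400 A at the GPU core, not at the PSU
```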
Quote from: vertigo on December 16, 2022, 16:06:52
You don't look at amps on the draw side and directly correlate with amps on the supply side. You have to use watts for that (which is why, yes, stating watts vs amps in the article would have been better). A PSU supplies power in different voltages, mainly 3.3V, 5V, and 12V. All it has to do is supply e.g. 25A @ 12V (300W), 15A @ 5V (75W), and 7.5A @ 3.3V (24.75W) to provide 400W. The GPU then uses those different voltages in different ways, e.g. the 12V will run the fans and the different voltages will be stepped down to lower voltages for various other things, and when that happens, the amperage increases. If it's stepped down to 1V, then 400W would indeed be 400A. So it's not as crazy as it sounds. That said, it does seem unlikely, since much of the voltage would be kept at 12V for the fans, so to use 400A would require well over 400W. Because of that, I retract my previous statement and agree it likely is a typo. Even if it does somehow work out to 400A and isn't a typo, it's very poor writing to describe it that way, since it's not conventional and it makes it seem much worse.
davidbepo mentions that the voltage actually never hits 1 V, and this seems to be consistent with the reviews reporting 0.1 V in the idle state and a maximum of 0.9 V in some games, with an average of 0.7 V. That's why I said "barely touches 1 V."
Yet another sensationalized and dubious anti-AMD article by notebookcheck. I swear these guys are taking a paycheck from Nvidia.
Quote from: Tingle on December 16, 2022, 01:32:50
They just need to cut £200 off the XTX price.
Why? The card is already the best price/performance for raw fps of all time. And the 7900 XT is probably #2 (or #3 behind the 4090). I think you may be either whining or selfish to say such a thing.
"Despite voltage being lower than expected (barely touches 1V), the cards somehow draw more than 400 amps."
Someone needs to go back to basic electronics class. The N31 chip itself drawing 400A is no big deal; why do you think the VRM sections of today's cards often have a dozen or so 70-90A "power stages"? That doesn't mean the card is drawing 400A from the PSU, as the supply is at 12V, not 1V, so it's more like 30-35A (which is easily supplied by pretty much ANY modern PSU's 12V "rails").
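A quick Python sanity check of that supply-side math, assuming the 400 W board power and ~1 V core voltage quoted in the article (these are the thread's numbers, not measurements):

```python
board_power_w = 400.0   # figure quoted in the article
core_voltage = 1.0      # "barely touches 1 V" per the article
supply_voltage = 12.0   # PCIe power connectors deliver 12 V

print(board_power_w / core_voltage)    # ~400 A on the ~1 V core rail(s)
print(board_power_w / supply_voltage)  # ~33 A drawn from the PSU's 12 V rail
```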
It's also complete BS that the launch window for N31 was missed by a year. Six months before the 6950 XT, and barely a year after N21? Come on...
Quote from: Chiccozman on December 15, 2022, 17:01:01
You guys are kidding, right? AMD releasing a buggy GPU? Come on, this can't be true.
Idiots on the internet taking speculation from idiots that don't know what they're talking about as fact? Come on, this can't be true.
Some Twitter nonsense as always
forums.tomshardware.com/threads/amd-addresses-controversy-rdna-3-shader-pre-fetching-works-fine.3789199/
So, now that AMD has responded to these accusations and multiple outlets have put these claims to the test, only to prove that they're not true, are you guys going to edit or change this article in some way, or just let it sit around so that people who happen to come across it are incorrectly informed?
Quote from: Tnep on December 21, 2022, 06:38:03
So, now that AMD has responded to these accusations and multiple outlets have put these claims to the test, only to prove that they're not true, are you guys going to edit or change this article in some way, or just let it sit around so that people who happen to come across it are incorrectly informed?
The latter, of course. I hate to say it, but NBC continues to lose more and more credibility and is little more than an echo chamber for other sites' news (and I use that term somewhat loosely) with little actual journalism of their own. I've been coming here regularly for a couple years now, but I'm starting to look elsewhere for tech-related news and reviews.
This is such nonsense
fake article