Bilibili leaker Enthusiast Citizen states that the AMD Radeon RX 7000 graphics cards will go on sale in December. AMD will launch only two flagship SKUs at its November 3 event, neither of which can outperform the GeForce RTX 4090 in rasterization or raytracing. https://www.notebookcheck.net/Underwhelming-AMD-Radeon-RX-7000-performance-estimate-renders-it-incapable-of-competing-with-Ada-Lovelace-in-rasterization-and-raytracing.661709.0.html
"users on a budget, who will have no choice but to buy the GeForce RTX 4080 16 GB and the GeForce RTX 4080 12 GB"
Do not spread manufacturers' PR that either would be a "budget" card! Quite the contrary.
I thought $300-$500 was a budget card, not $900
The GPU market is out of control. 450 W power consumption and the price tags are insane. A user on a budget will buy none at all and switch to Xbox or PlayStation, maybe even a Steam Deck.
Quote from: Time on October 13, 2022, 08:33:34
I thought $300-$500 was a budget card
In 2000, I bought a budget graphics card for less than DM 50, which would have been about €25. You have been corrupted into calling $300 budget level.
Nowadays, budget is an iGPU, such as the 680M in the 6800U.
I'm not expecting this card to beat the 4090, but to have a better price
Anil Gandu is incapable of writing unbiased, unprejudiced articles whenever it comes to AMD. He does it every time. Also look at his absurd comment that 'gamers on a budget are forced to buy the RTX 4080'.
Since when have the x80 graphics cards been a budget range?
I thought it was made very clear that the high-end RX 7000 cards are not trying to compete with the 4090.
This article is NOT fact-based, and this person has no record showing past accuracy in his prognostications. I firmly believe he is an Nvidia troll with malicious intent. It was irresponsible for this bullcrap to be published.
Green boy for sure! So much time wasted reading factless articles these days.
Total BS article. This oaf wouldn't be able to kick a cone if it was right in front of him. Baseless junk like this is what gets written nowadays.
But... so few games use ray tracing, and those that do don't look that much better, really. So WHY would the RX 7000 be hobbled by not being 'quite as good' as Nvidia's Ada Lovelace? This just seems like so much nothing. I have no doubt that ray tracing is the future, but we are still three to five years away from it being a necessity. Moreover, the 'quibbling difference' between AMD's and Nvidia's current technology for rasterization is not that pronounced.
What an absolute BS article. I expect better than people being paid by Nvidia to push articles like this.
"The obnoxiously long wait time will be painful for users on a budget, who will have no choice but to buy the GeForce RTX 4080 16 GB and the GeForce RTX 4080 12 GB in November."
LOL, what? People on a budget buying 4080s? NO ONE ON A BUDGET IS BUYING CURRENT-GEN CARDS.
I'm on a budget, you know what I have? A 2080 Super. And I'm not looking to upgrade any time soon.
A bit skeptical about this information with no evidence. AMD is aiming for 400 W or so for their highest-end part (apparently), with >50% perf/watt gen over gen, straight from AMD. Napkin math alone tells you they should be able to double the 6900 XT's raster performance, maybe even double the 6950 XT's. If they decide not to hobble their cards with garbage bandwidth and keep the Infinity Cache relatively high, it should make up for the lacklustre 4K performance a bit. Yeah, the 4090 is a beast. It's also $1,600, lol. AMD will have stupidly good yields with MCM, so they could cut Nvidia's monolithic, overpriced legs out from under them.
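For what it's worth, here is that napkin math spelled out as a quick Python sketch. Every input is an assumption, not a confirmed spec: the 300 W figure for the 6900 XT, the rumoured 400 W target, and AMD's ">50% perf/watt" claim taken at face value.

    # Napkin math: projected top RDNA 3 raster performance vs. the 6900 XT.
    # All inputs below are assumptions, not confirmed specs.
    RX_6900_XT_POWER_W = 300    # assumed 6900 XT board power
    RDNA3_POWER_W = 400         # rumoured power target for the top SKU
    PERF_PER_WATT_UPLIFT = 1.5  # AMD's ">50% perf/watt" claim, at face value

    # Performance scales roughly with power draw times the efficiency gain.
    perf_ratio = (RDNA3_POWER_W / RX_6900_XT_POWER_W) * PERF_PER_WATT_UPLIFT
    print(f"Projected raster performance: {perf_ratio:.2f}x the 6900 XT")
    # -> Projected raster performance: 2.00x the 6900 XT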
Quote
But... so few games use ray tracing, and those that do don't look that much better, really. So WHY would the RX 7000 be hobbled by not being 'quite as good' as Nvidia's Ada Lovelace?
It is not only "gaming" that matters. 3D render engines use it. Tensor cores and CUDA cores are very important for video editing and FX.
450 W is the same as the 3090 Ti, and the 4090 is much more energy efficient.
Put a 4090 to work rendering a scene and see which card consumes less to get the same result: the 4090, at the moment.
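To make that concrete: energy for a fixed render job is power times time, so a faster card can use less total energy even at the same power draw. A minimal sketch, with made-up render times purely for illustration:

    # Energy for one render job = average power draw (W) * time to finish (h).
    def energy_wh(power_w: float, render_time_h: float) -> float:
        return power_w * render_time_h

    # Hypothetical numbers: the 4090 finishes the same scene in half
    # the time at the same 450 W board power.
    energy_3090_ti = energy_wh(power_w=450, render_time_h=2.0)  # 900 Wh
    energy_4090 = energy_wh(power_w=450, render_time_h=1.0)     # 450 Wh
    print(f"3090 Ti: {energy_3090_ti:.0f} Wh vs 4090: {energy_4090:.0f} Wh")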
Quote from: RobertJasiek on October 13, 2022, 07:27:35
"users on a budget, who will have no choice but to buy the GeForce RTX 4080 16 GB and the GeForce RTX 4080 12 GB"
Do not spread manufacturers' PR that either would be a "budget" card! Quite the contrary.
I was about to bring up this comment too... users on a budget have tons of other options, including not buying one right now, lol. You can live without one, just like yesterday. Debating the 4080, but I will wait and see what AMD does and if it affects prices.
Quote from: yamiimax on October 13, 2022, 10:14:53
I'm not expecting this card to beat the 4090, but to have a better price
That's what all of us budget and mid-range buyers are hoping. Only the idiots that paid over $2k for a 3090 to fart on 4K@120 on medium/high settings think budget was an $800 3070 non-Ti. Even these new 4080s with low memory are overpriced; it should have been priced like a 3070, in the $600 range. If I were to switch to the green team, it would not be the first version of the 4000 series. The new flagship AMD cards will be able to run 4K@120, just likely 10% or less slower. As long as it gets to 120 fps, with a lower price and power requirements.
The only people waiting to grab a 4090 at these prices are the bots, lol.
Give me the lower-priced card that can do the same thing at 8-12% less performance and 200 W. Ray tracing is nice, but it's mostly great for single-player games; online FPS players don't want anything taking away frames.
Quote from: Time on October 13, 2022, 08:33:34
I thought $300-$500 was a budget card, not $900
Budget is whatever you want it to be; it's just the amount you decide to spend on something, so €3,000 is also a budget. However, low-to-mid range should be €300-500.
This article is unhinged and feels like it was written by Nvidia.
Yeah, the competition (NVIDIA) is so overpriced as to make me physically repulsed from even considering buying an Ampere- or Lovelace-based GPU, even if I could afford it. I'd rather light my spliff with banknotes than spend them on a company with such a disgusting, consumer-exploiting mindset. Selling pallets of limited-edition GPUs straight to large-scale crypto-mining outfits. I'm a small-scale miner myself (always buying second- or third-hand GPUs one at a time; also, consumer electricity prices are quite expensive here in the Netherlands, so my profits were always marginal), so I'm sympathetic to people who are just getting by and buying at most 2 or 3 GPUs in a year.
There is enough for everyone's need, but not for everyone's greed. Peace!
Nvidia has a wonderful product and creates cool resources to aid game developers. The unfair treatment of board partners; the unethical, intentionally misleading name of the 4080 12 GB; market manipulation; and the sheer laziness (or negligence) that resulted in the ticking-time-bomb power connectors: this, this is why I am buying AMD.
Love how triggered the AMD bots get whenever they cannot accept the fact that they're receiving inferior products from their beloved company.
Even today there isn't a product from either manufacturer that can match Pascal's fps/£.
Moreover, the number of new games that demand performance greater than a 1080 is shrinking every year.
Consoles have stagnated too, with ever-longer generational cycles, and the previous generation staying relevant and supported for ever longer.
The only significant progress in recent x86 tech was Zen, and the response it elicited from Intel, and even that appears to have petered out.
Where's it all going to end? :/
People say "oh no, AMD isn't beating Nvidia's top card" every generation, as if it's a surprise. AMD doesn't even try. The closest attempt since, what, the HD 2000 series has been the R9 290X, but other than that, AMD hasn't made any effort and instead spends their development dollars on other segments.
Quote from: ThatDudeThere on October 16, 2022, 07:14:51
Love how triggered the AMD bots get whenever they cannot accept the fact that they're receiving inferior products from their beloved company.
Lad, the 6950 XT traded blows with the 3090 Ti when it came to raw performance, and the 5800X3D is still overall better than 13th-gen Intel. So idk how you came to this conclusion.