Intel Core Ultra 9 285K VS AMD Ryzen 9 9950X3D

Started by Redaktion, January 05, 2026, 14:22:41

Redaktion

Thanks to the latest microcode updates, Intel's Core Ultra 9 285K closes much of the performance gap to AMD's Ryzen 9 9950X3D, delivering near-parity in many workloads. Differences now fall to single-digit margins, turning this matchup into a genuine two-way contest at the top end of the desktop CPU market.

https://www.notebookcheck.net/Intel-Core-Ultra-9-285K-VS-AMD-Ryzen-9-9950X3D.1186330.0.html

sure

Quote
Thus, in idle and single core scenarios, and even in most gaming tests where the GPU is doing most of the work, the Intel CPU consumes 20-30 W less on average.
That's nice. Supposedly AMD's I/O die is the reason for it, as it's made on a worse node. Transferring data back and forth to it is probably also a factor, as moving data is a major source of power consumption.

Quote
In Prime 95, the AMD CPU consumes up to 56 less W compared to the Intel one .. These extreme cases are not very likely to happen in real world scenarios, though.
A notation in % would actually be helpful, or stating how much the other CPU consumed.

Sure, and as the load on the CPUs grows, AMD's power-efficiency lead over Intel probably grows dramatically (plenty of AMD vs. Intel power-efficiency tests from e.g. Hardware Unboxed, Gamers Nexus, etc. exist). Intel sometimes consumes close to 4 times as much energy, although, if I remember correctly, that was for the previous gen.

Why is Cyberpunk 2077 almost 100% faster on the AMD CPU, but then why is AW2 56.85% faster on the Intel CPU? Aren't these two games mostly GPU-bound? At what resolution did you test?

Bogdan Solca

Quote from: sure on January 05, 2026, 15:05:13
A notation in % would actually be helpful, or stating how much the other CPU consumed.
You can see the exact wattage in the power consumption graph images at the end.

Quote from: sure on January 05, 2026, 15:05:13
Why is Cyberpunk 2077 almost 100% faster on the AMD CPU, but then why is AW2 56.85% faster on the Intel CPU? Aren't these two games mostly GPU-bound? At what resolution did you test?
I tested only at 4K, but these two cases are indeed very curious; I couldn't find an explanation.

sure

Quote
You can see the exact wattage in the power consumption graph images at the end.
Thanks, I see now that they contain an average value. An added % value in the sentence would still be more helpful (in the graphs one has to search).
Using the middle/average values: 193.4 W - 127.7 W = 65.7 W?
So one could also say that the AMD V-Cache CPU consumes 34% less energy, or only 66% of what the Intel CPU consumes, or that the Intel CPU consumes 51% more energy.
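For anyone who wants to redo the math, here is a minimal sketch; the 193.4 W and 127.7 W averages are the ones quoted above, and the rest is plain arithmetic, so treat it as an illustration rather than anything from the review itself:

# Average Prime 95 package power, taken from the figures quoted above
intel_w = 193.4   # Core Ultra 9 285K, average W
amd_w = 127.7     # Ryzen 9 9950X3D, average W

delta_w = intel_w - amd_w                   # absolute difference in watts
amd_share = amd_w / intel_w * 100           # AMD draw as a share of the Intel draw
amd_savings = (1 - amd_w / intel_w) * 100   # how much less AMD consumes
intel_extra = (intel_w / amd_w - 1) * 100   # how much more Intel consumes

print(f"Delta: {delta_w:.1f} W")  # 65.7 W
print(f"AMD draws {amd_share:.0f}% of the Intel figure ({amd_savings:.0f}% less); "
      f"Intel draws {intel_extra:.0f}% more")  # 66%, 34% less, 51% more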

Quote
I tested only at 4K, but these two cases are indeed very curious; I couldn't find an explanation.
I see. I read that on AMD some cores might not be utilized and that a fresh OS install would fix it; maybe it needs to be done for the Intel CPU as well. Just an idea. I can't imagine that Intel's top CPU is only half as fast in a GPU-bound 4K setting in a relatively modern game; something must be up.

Bogdan Solca

Quote from: sure on January 05, 2026, 22:34:43
Thanks, I see now that they contain an average value. An added % value in the sentence would still be more helpful (in the graphs one has to search).
Using the middle/average values: 193.4 W - 127.7 W = 65.7 W?
So one could also say that the AMD V-Cache CPU consumes 34% less energy, or only 66% of what the Intel CPU consumes, or that the Intel CPU consumes 51% more energy.
Yeah, I guess I switched the digits by mistake. Thanks for spotting this.

Quote from: sure on January 05, 2026, 22:34:43
I see. I read that on AMD some cores might not be utilized and that a fresh OS install would fix it; maybe it needs to be done for the Intel CPU as well. Just an idea. I can't imagine that Intel's top CPU is only half as fast in a GPU-bound 4K setting in a relatively modern game; something must be up.

The Win 11 operating systems were fresh. I think some game settings may have gone awry, although I did test several times for each system and the graphs show only the highest recorded FPS counts.

sure

Quote
Yeah, I guess I switched the digits by mistake. Thanks for spotting this.
I see, good that it's fixed.

Quote
The Win 11 operating systems were fresh. I think some game settings may have gone awry, although I did test several times for each system and the graphs show only the highest recorded FPS counts.
I didn't know what % to expect, so I looked around: [in Cyberpunk] the first few videos on YouTube show the 7800X3D to be 34.5-37.3% faster than the 285K, another text-based source shows 40%, but in a more recent benchmark the 7800X3D is on par with the 285K and the 9950X3D is just slightly faster*. So something is definitely up, indeed; maybe check the settings again.
(*Being just slightly faster is expected, as Zen 5 V-Cache CPUs, or any CPUs for that matter, don't bring much of an improvement at higher resolutions in modern, mostly GPU-bound games, especially at 4K. The biggest recent jump is from Zen 3 non-V-Cache to Zen 4 V-Cache at 1440p, and it is smaller at 4K.)
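To be clear about how I compare those percentages, this is the formula I mean; the FPS values below are made up purely to illustrate it, they are not from the review or from the sources above.

def percent_faster(fps_a: float, fps_b: float) -> float:
    """How much faster configuration A is than configuration B, in percent."""
    return (fps_a / fps_b - 1) * 100

# Hypothetical averages, just to show how the "% faster" figures fall out
print(percent_faster(120.0, 60.0))  # 100.0 -> "almost 100% faster"
print(percent_faster(110.0, 80.0))  # 37.5  -> mid/high-30s%, like the YouTube range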

Bogdan Solca

Quote from: sure on Yesterday at 11:00:50
I didn't know what % to expect, so I looked around: [in Cyberpunk] the first few videos on YouTube show the 7800X3D to be 34.5-37.3% faster than the 285K, another text-based source shows 40%, but in a more recent benchmark the 7800X3D is on par with the 285K and the 9950X3D is just slightly faster*. So something is definitely up, indeed; maybe check the settings again.
Unfortunately, I had to send back the XMG systems, so I cannot retest. I suppose we can consider these two cases outliers. The other 6 gaming tests should suffice to show that the Intel CPU has indeed mostly caught up with the AMD competition thanks to the microcode updates.
