Quote from: sure on January 06, 2026, 11:00:50
I wasn't informed what % to expect, so I looked around: [In Cyberpunk] the first few videos on YouTube show the 7800X3D to be 34.5-37.3% faster than the 285K, another text-based source shows 40%, but in a more recent benchmark the 7800X3D is on par with the 285K and the 9950X3D is just slightly faster*. So something is definitely up, indeed; maybe check the settings again.

Unfortunately, I had to send back the XMG systems, so I cannot retest. I suppose we can consider these two cases as outliers. The other six gaming tests should suffice to show that the Intel CPU has indeed mostly caught up with the AMD competition thanks to the microcode updates.
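For reference, this is how an "X% faster" figure like the ones quoted above is usually derived from two average frame rates. A minimal sketch in Python with placeholder FPS numbers (they are not measurements from the review or the linked videos):

```python
# Placeholder average FPS values -- NOT measurements from the review or the linked videos.
fps_7800x3d = 112.0  # hypothetical average FPS of the 7800X3D in one scene
fps_285k = 82.0      # hypothetical average FPS of the 285K in the same scene

# "X% faster" = relative FPS advantage of the faster CPU over the slower one.
faster_pct = (fps_7800x3d / fps_285k - 1) * 100
print(f"7800X3D is {faster_pct:.1f}% faster than the 285K")  # ~36.6% with these placeholder numbers
```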
Quote
Ya, guess I switched the digits by mistake. Thanks for spotting this.

I see, good that it's fixed.
Quote
The Win 11 operating systems were fresh. I think some game settings may have gone awry, although I did test several times for each system and the graphs show only the highest recorded FPS counts.

I wasn't informed what % to expect, so I looked around: [In Cyberpunk] the first few videos on YouTube show the 7800X3D to be 34.5-37.3% faster than the 285K, another text-based source shows 40%, but in a more recent benchmark the 7800X3D is on par with the 285K and the 9950X3D is just slightly faster*. So something is definitely up, indeed; maybe check the settings again.
Quote from: sure on January 05, 2026, 22:34:43
Thanks, I see now they contain an average value. An added % value in the sentence would still be more helpful (in the graphs one has to search).

Ya, guess I switched the digits by mistake. Thanks for spotting this.
Using the middle/average values: 193.4 W - 127.7 W = 65.7 W?

So one could also say that the AMD V-Cache CPU draws 34% less power (only 66% of what the Intel CPU draws), or that the Intel CPU draws 51% more.
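Just to make that arithmetic explicit, here is a quick sketch; the 193.4 W and 127.7 W averages are the ones quoted above, the rest is plain percentage math:

```python
# Average package power values as quoted above (Intel vs. AMD V-Cache CPU).
intel_avg_w = 193.4
amd_avg_w = 127.7

delta_w = intel_avg_w - amd_avg_w                   # 65.7 W absolute difference
amd_share = amd_avg_w / intel_avg_w * 100           # ~66%: AMD draws about two thirds of the Intel figure
amd_saving = (1 - amd_avg_w / intel_avg_w) * 100    # ~34% less power for the AMD CPU
intel_excess = (intel_avg_w / amd_avg_w - 1) * 100  # ~51% more power for the Intel CPU

print(f"delta: {delta_w:.1f} W, AMD/Intel: {amd_share:.0f}%, "
      f"AMD saves: {amd_saving:.0f}%, Intel draws: +{intel_excess:.0f}%")
```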
Quote from: sure on January 05, 2026, 22:34:43
I see. I read that for AMD some cores might not be utilized and a fresh OS install would fix it; maybe it needs to be done for the Intel CPU as well. Just an idea. I can't imagine that the Intel top CPU is only half as fast in a GPU-bound 4K setting in a relatively modern game; something must be up.

The Win 11 operating systems were fresh. I think some game settings may have gone awry, although I did test several times for each system and the graphs show only the highest recorded FPS counts.
Quote
You can see the exact wattage in the power consumption graph images at the end.

Thanks, I see now they contain an average value. An added % value in the sentence would still be more helpful (in the graphs one has to search).
Quote
I tested only at 4K, but these two cases are indeed very curious; I couldn't find an explanation.

I see. I read that for AMD some cores might not be utilized and a fresh OS install would fix it; maybe it needs to be done for the Intel CPU as well. Just an idea. I can't imagine that the Intel top CPU is only half as fast in a GPU-bound 4K setting in a relatively modern game; something must be up.
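If anyone wants to check the "some cores not utilized" theory without reinstalling the OS, logging per-core load during a benchmark run would show it quickly. A minimal sketch (assumes Python with the third-party psutil package; this is not part of the review's methodology):

```python
import psutil  # third-party package: pip install psutil

# Log per-core load once per second for 30 seconds while the game/benchmark runs,
# and flag cores that stay near idle.
for _ in range(30):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    near_idle = [i for i, load in enumerate(per_core) if load < 5.0]
    print(f"per-core %: {per_core}  near-idle cores: {near_idle}")
```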
Quote from: sure on January 05, 2026, 15:05:13
Notation in % would actually be helpful, or say how much the other CPU consumed.

You can see the exact wattage in the power consumption graph images at the end.
Quote from: sure on January 05, 2026, 15:05:13
Why is Cyberpunk 2077 almost 100% faster on the AMD CPU, but then why is AW2 56.85% faster on the Intel CPU? Aren't these two games mostly GPU-bound? At what resolution did you test?

I tested only at 4K, but these two cases are indeed very curious; I couldn't find an explanation.
Quote
Thus, in idle and single-core scenarios, and even in most gaming tests where the GPU is doing most of the work, the Intel CPU consumes 20-30 W less on average.

This is nice. Supposedly AMD's I/O die is the reason for it, as it is made on a worse node. Transferring data back and forth to it is probably also a factor, as moving data is a major source of power consumption.
Quote
In Prime 95, the AMD CPU consumes up to 56 W less than the Intel one ... These extreme cases are not very likely to happen in real-world scenarios, though.

Notation in % would actually be helpful, or say how much the other CPU consumed.