Ryzen AI 400 Gorgon Point APUs are here: AMD claims 29% faster multitasking and 12% faster gaming vs Core Ultra 9 288V

Started by Redaktion, January 06, 2026, 04:30:26


Redaktion

AMD has officially launched the "Gorgon Point" Ryzen AI 400 APUs at CES 2026. Featuring up to 12 Zen 5/5c cores and a Radeon 890M iGPU, the Ryzen AI 400 chips are meant to strike a balance between performance and power efficiency.

https://www.notebookcheck.net/Ryzen-AI-400-Gorgon-Point-APUs-are-here-AMD-claims-29-faster-multitasking-and-12-faster-gaming-vs-Core-Ultra-9-288V.1197454.0.html

rip RDNA4 iGPU this year

Ryzen AI 400 is just a rename of last year's Ryzen AI 300 (just like Phoenix to Hawk Point).

No direct comparison with the predecessor? I guess the higher clocks of the rename aren't high enough to avoid an embarrassing direct comparison.

Quote: "AMD is once again using the Radeon 800M series iGPUs for the Ryzen AI 400 series."
AMD upgraded the RDNA2 iGPU to RDNA3, but why not to RDNA4 this time? One would think that RDNA4's much-improved hardware-based ML upscaling, finally competitive with DLSS, would make sense for power-limited mobile devices (720p-to-1080p upscaling)?

heffeque

Intel will surely catch up on the GPU side, and is already superior to AMD on battery life, so... AMD can't afford to rest on their laurels.


Arc

12% better gaming, while the Arc B390 is over 80% faster. That's without even factoring in Intel's fine-wine drivers, so after a year's worth of updates it's maybe closer to 90%.

Quote from: RobertJasiek on January 06, 2026, 18:18:53: "AMD has never missed a chance to miss a chance!"

+1, that should be their company slogan from now on. :P

Quote from: rip RDNA4 iGPU this year on January 06, 2026, 09:35:15: "AMD upgraded the RDNA2 iGPU to RDNA3, but why not to RDNA4 this time?"

They likely want to focus their resources on AI / data center for the next 18 months, so they won't plan on upgrading their iGPUs until the end of 2027.

What makes it suck even more is that RDNA2 -> RDNA3 -> RDNA3.5 weren't all that big of updates to begin with. The only reason the iGPUs got any faster was mostly that the CU counts were increased. So this feels like 5 years of stagnation.

rip RDNA4 iGPU in 2026

Yeah, what Intel showed looks good, if it holds true also with regards to power efficiency.

Quote: "What makes it suck even more is that RDNA2 -> RDNA3 -> RDNA3.5 weren't all that big of updates to begin with."
Indeed they weren't; the RDNA3 iGPU is roughly 10% faster than RDNA2, and RDNA3.5 is less than 10% faster than RDNA3.

Quote: "The only reason the iGPUs got any faster was mostly that the CU counts were increased. So this feels like 5 years of stagnation."
680M: 12 CUs (768:48:32:12)
780M: 12 CUs (768:48:24:12)
880M: 12 CUs (768:48:24:12)
890M: 16 CUs (1024:64:32:16)
(The tuples are shaders:TMUs:ROPs:CUs. Note that 16 CUs are not 33% faster than 12 CUs, but quite a bit less, and 12 CUs are not 50% faster than 8 CUs; the toy model below illustrates why.)
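A minimal roofline-style sketch in Python of why adding CUs scales sublinearly: frame rate is capped by whichever limit is lower, compute (which grows with CUs) or memory bandwidth (which doesn't). All the constants here are made-up illustrative assumptions, not measured values.

```python
# Toy roofline-style model: FPS is capped by the lower of the
# compute limit (scales with CUs) and the bandwidth limit (fixed).
# All constants are illustrative assumptions, not measurements.

def fps(cus: int, bandwidth_gbs: float,
        fps_per_cu: float = 5.0,        # assumed compute-limited FPS per CU
        fps_per_gbs: float = 0.75) -> float:  # assumed FPS per GB/s of bandwidth
    compute_limit = cus * fps_per_cu
    bandwidth_limit = bandwidth_gbs * fps_per_gbs
    return min(compute_limit, bandwidth_limit)

bw = 102.4  # GB/s: LPDDR5X-6400 on a 128-bit bus
print(fps(12, bw))  # 60.0 -> compute-limited at 12 CUs
print(fps(16, bw))  # 76.8 -> only +28%, not +33%: the bandwidth ceiling kicks in
```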

But I agree, it feels totally like stagnation. Maybe they are limited by bandwidth, as benchmarks of slower DDR5 SODIMMs vs. faster soldered LPDDR5X show a performance increase:

YouTube:
  • "AMD Radeon 680M iGPU: DDR5-4800 vs LPDDR5-6400 Gaming Performance Comparison" <- doing the math: 60 FPS × (6400/4800) = 80 FPS, but only about 70 FPS is measured in the video, which is unfortunately not linear scaling; maybe at that bandwidth there are not enough CUs, or it's a latency issue, ... (see the sketch after this list)
  • "DDR5 SODIMM Memory Scaling 4800Mhz 5600Mhz Comparison [..] AMD 6900HX Benchmarks Gaming"

Strix Halo runs at LPDDR5X-8000 and Apple's M5 uses 9600 MT/s (so such memory is available), but I'm not sure at which MT/s Strix Point runs. Of course, the disadvantage is that LPDDR5X memory is soldered, but LPCAMM2 exists to fix this.
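For context, theoretical peak bandwidth is just the transfer rate times the bus width; a back-of-the-envelope sketch (the bus widths are my assumptions: 128-bit for a typical laptop APU, 256-bit for Strix Halo per AMD's published specs):

```python
# Back-of-the-envelope peak memory bandwidth: MT/s x bytes per transfer.
# Bus widths are assumptions (128-bit for typical laptop APUs,
# 256-bit for Strix Halo per AMD's published specs).

def peak_gbs(mts: int, bus_bits: int) -> float:
    return mts * (bus_bits // 8) / 1000  # GB/s

for name, mts, bus in [
    ("DDR5-5600 SODIMM, 128-bit",          5600, 128),
    ("LPDDR5X-6400, 128-bit",              6400, 128),
    ("LPDDR5X-8000, 256-bit (Strix Halo)", 8000, 256),
    ("LPDDR5X-9600, 128-bit",              9600, 128),
]:
    print(f"{name}: {peak_gbs(mts, bus):.1f} GB/s")
# -> 89.6, 102.4, 256.0, 153.6 GB/s respectively
```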

Imagine if the Intel+Nvidia APU turns out good... welp, AMD.

rip RDNA4 iGPU in 2026

Actually, never mind, the 76% faster figure is using 2x upscaling...

"Intel Pulls an NVIDIA": youtu.be/8wNnLtsxNkY?t=658 (Gamers Nexus)

rip RDNA4 iGPU in 2026

Unfair comparison by Intel, but Panther Lake may still be better than Strix Point?
"Did Intel Just Cook AMD at CES 2026? (Zen 6, 9850X3D, Panther Lake Analysis)" by Moore's Law Is Dead: youtu.be/q9bVaq499eE?t=787

Arc

They also show results with native rendering, but like you said, we will see.

Personally, it's not even about the chip for me but the laptop. I like how Lenovo is giving the Legion Go 2 treatment as an option on many of their consumer laptops this year (OLED, VRR, 120 Hz, 1100+ nits displays).

I need a super bright display, can't go back to 500 nits anymore.

For me, the two most interesting laptops shown thus far at this CES have both been from Lenovo: the Yoga Slim 7i Ultra Aura Edition (Panther Lake) and the Yoga Slim 7x (Snapdragon X2 Elite).

I wasn't initially even looking at the Snapdragon, but then I saw the price difference: it's almost $600 cheaper than the Panther Lake Yoga. If someone can get decent eGPU drivers working on this thing, it may become a formidable alternative, especially if it can show substantial gains in thermals, noise, and battery life. That's really what most people care about on a mobile device.

Also, I'm not sure it's a good idea to spend $1600 on something with an Intel badge when the N1X is rumoured to be launching soon. Intel's GPU driver support has not had a good historical record for longer-term support compared to AMD/Nvidia.

rip RDNA4 iGPU in 2026

Quote: "They also show results with native rendering, but like you said, we will see."
You mean this?
notebookcheck.net/Intel-unveils-Panther-Lake-iGPU-with-lofty-performance-gains-over-Lunar-Lake.1134160.0.html
notebookcheck.net/Intel-Panther-Lake-CPUs-bring-up-to-16-new-cores-30-better-efficiency-and-next-gen-Xe3-iGPU.1134710.0.html
I wasn't aware that Intel posted this so much earlier XD (I thought those were only leaks).
Yeah, alright, it looks promising, but again, independent tests will have to confirm it.

Quote: "I need a super bright display, can't go back to 500 nits anymore."
Very interesting, can you tell me why? I read that around 150 nits is enough indoors; are you using it outdoors or right next to a window? (I also heard that a 400 nits matte IPS is comparable to a 500 nits glossy OLED.)

I remember Notebookcheck tested the first Snapdragon X Elite generation, and many games were unstable and some didn't even start. If the Snapdragon X2 Elite fixes this and the performance, battery life, and general software support are up there, it could be a viable alternative. The same goes for the N1X APU.

Quote: "Intel's GPU driver support has not had a good historical record for longer-term support compared to AMD/Nvidia."
GPUs are hard, especially the drivers I assume, but if Intel were to give up now, they had better know what they are doing. Since they showed a new GPU architecture generation, it fortunately doesn't look like they are giving up.
