
Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere

Started by Redaktion, December 11, 2025, 01:29:53


veraverav

Quote from: anzej on December 13, 2025, 15:43:17I'd strongly recommend choosing a model with the dGPU, even if you don't need it. The non-dGPU models come with a smaller heatsink without liquid metal that fails to cover all the VRMs. This causes VRM overheating, which severely limits CPU package power.

You can verify this yourself by comparing spare part images for 5H41R89131 (non-dGPU) versus 5H41R89134 (dGPU).

I experienced this exact problem on my P1 G5 without a dGPU, as it could only sustain 38W CPU package power long-term due to VRM thermal issues. Lenovo Premier Support confirmed the design flaw and offered a full refund, explicitly recommending I purchase a dGPU-equipped unit instead.

TL;DR: Get the dGPU model regardless of your graphics needs if you want to get decent sustained CPU performance.

Didn't know this, but glad I didn't go for the iGPU-only model. The 45%+ discount was on the dGPU model, so Lenovo made the correct choice for me :) Thanks for posting this. Learned something new.

pelican-freeze

#16
Are there any issues with the non-standard 3.2K resolution on a laptop intended for workstation use? I like the idea of a tandem OLED, but at 3.2K it seems like it might be more of a headache than anything else. At 1440p you get better battery life, cleaner application compatibility, and more flexibility for some light gaming, while at 3.2K you lose the battery life benefits of a lower resolution and risk app compatibility issues, without getting the content-consumption advantages of a proper 4K panel.

If the appeal of a tandem OLED workstation is being able to watch 4K HDR Netflix in your downtime, any sub‑4K display is still going to get the same 1080p stream (just like an iPad or Android tablet). To get true 4K HDR streaming you need a full 4K, HDCP 2.2‑compliant display; otherwise, you're capped at 1080p. So a 3.2k tandem OLED is objectively a poor solution for content consumption.

I'm interested in this laptop, but I'm concerned about weird interface scaling in professional apps that almost certainly haven't been tested for usability at 3200×2000. And a tandem OLED that isn't suited for content consumption feels like an awkward choice when Lenovo already offers such bright IPS panels: with the 3.2K tandem OLED you get poor battery life and poor content consumption, but no material increase in brightness for visibility in bright environments on the go, since the P1 G8's 4K IPS panel option already offers 800 nits of SDR brightness. In practical SDR use the IPS panel is likely to be even brighter and offer even better visibility?

Like I'm a huge P1 / X1 Extreme fan and was excited about tandem OLED, but I can't figure out any use case where this 3.2K display makes sense, since it appears to offer a big bag of frustrating compromises without any material benefits: bad for app compatibility, too high a resolution for good battery life or gaming, not high enough for good content consumption, no advantage over the bright IPS option for outdoor use, etc.

For those considering the P16 G3: you can configure CPUs/GPUs that, on paper, should scale to higher performance with more wattage, but the AC adapter is only 180W (vs. the P1 G8's 140W). That strongly suggests the P16 G3 chassis and power delivery are not designed to let parts like Intel HX CPUs or 5070 Ti/5080/5090‑class GPUs (RTX Pro 3000/4000/5000) run anywhere near their maximum performance. If your main interest in 5080/5090‑class GPUs is the extra NVENC encoders, the P16 G3 still looks great—much lighter and more compact than the P16 G2, with the tradeoff of lower cooling capacity, a smaller total power envelope, and reduced peak performance.

The only true 16‑inch performance workstation left this year is the Dell Pro Max 16 Plus (not Premium), since it ships with a 280W AC adapter and a much more performance‑oriented triple‑fan, vapor‑chamber cooling system. So if you care primarily about performance, the sweet spot is the Dell Pro Max 16 Plus with a 275HX (24 threads, fed full wattage) and a 5070 Ti‑class GPU (RTX 3000, at a full 175W), with the higher GPU tiers really only making sense for heavy video work. It's definitely not as compact as the P16 G3 and it's about a pound heavier once you factor in the adapter, but that seems like a very reasonable trade for the additional performance considering that there isn't really any competition at that mobile performance tier anymore.

pelican-freeze

#17
Quote from: veraverav on December 11, 2025, 19:24:06
Quote from: 2k for 8GB VRAM gg on December 11, 2025, 10:08:06
Quote: 8 GB VRAM
2000 for only 8 GB VRAM? Nice trolling.
Even games have a problem with only 8 GB VRAM: youtube.com/watch?v=ric7yb1VaoA: "Gaming Laptops are in Trouble - VRAM Testing w/ ‪@Hardwareunboxed‬"
Most big games are made with consoles in mind first, and the PS5 has 16 GB of unified memory, minus roughly 4 GB for the OS, so games expect your GPU to have at least 12 GB VRAM.
Running local LLMs / AI has been a thing for a few years now; llama.cpp and its web UI are all you need. An LLM can be fully loaded into the GPU's VRAM or, if it doesn't fit, parts of it can be offloaded to system RAM. This laptop has 32 GB RAM + 8 GB VRAM. Both small, increasingly capable models and big open-weights LLMs exist, and the more RAM+VRAM your PC has, the better. Every GB helps. So going from 8 GB to 12 GB to 16 GB VRAM would already be a good to very good improvement.

But it's not really meant to be a gaming laptop. Would plugging in an eGPU resolve this bottleneck for someone who absolutely has to game?

Plugging in an eGPU wouldn't resolve the issue, as the P1 G8's CPU options are all low-wattage chips with a maximum of 6 performance cores. For an eGPU to make sense you'd need at least a 255HX (8P+12E) on the Intel side, or a chip like a Ryzen AI 9 HX 370 (8P+4E) on the AMD side. It's hard to overstate how poor a fit today's lower-power mobile CPUs are for gaming - mobile GPUs are cut down so dramatically from their desktop counterparts that this matters less when pairing mobile CPUs with mobile GPUs, but pair a mobile CPU with a desktop GPU and you'll run into serious bottlenecks if you don't pick the right chip: at least 8 performance cores, more watts, and more ability to scale performance when fed those extra watts.

TL;DR: If you want the option to use an eGPU with a ThinkPad workstation, buy a P16 Gen 3 with a 255HX (or higher). The P16 Gen 3 *should* offer even better CPU performance when it doesn't need to allocate thermal headroom to cooling an internal discrete GPU. That would be the best fit: a high-wattage, desktop-ish class CPU actually running at reasonably high / close to max wattage in the laptop itself, and an eGPU also running at reasonably high / close to max wattage outside the laptop chassis.
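
(Side note on the local-LLM point quoted above: the VRAM/RAM split really is just one knob in llama.cpp. A minimal sketch using the llama-cpp-python bindings - the model path and layer count below are placeholders, not a recommendation; raise n_gpu_layers until the 8 GB of VRAM is full and let the rest spill into system RAM:)

from llama_cpp import Llama  # pip install llama-cpp-python (built with GPU support)

# Load a quantized GGUF model; n_gpu_layers controls how many transformer
# layers are kept in VRAM, the remainder stays in system RAM on the CPU side.
llm = Llama(
    model_path="models/example-13b.Q4_K_M.gguf",  # placeholder path
    n_gpu_layers=28,  # tune to fit the 8 GB of VRAM; -1 = offload everything
    n_ctx=4096,       # context window
)

out = llm("Why does VRAM matter for local LLMs? Answer in one sentence.", max_tokens=64)
print(out["choices"][0]["text"])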

Worgarthe

Quote from: pelican-freeze on December 14, 2025, 14:48:36Are there any issues with the non-standard 3.2K resolution on a laptop intended for workstation use? I like the idea of a tandem OLED, but at 3.2K it seems like it might be more of a headache than anything else. At 1440p you get better battery life, cleaner application compatibility, and more flexibility for some light gaming, while at 3.2K you lose the battery life benefits of a lower resolution and risk app compatibility issues, without getting the content-consumption advantages of a proper 4K panel.
There is absolutely nothing wrong with this resolution (or any other). Scaling exists for that exact reason. Increased power consumption comes from the panel tech, not the resolution - a 16" 1200p 120 Hz OLED is going to consume far more power than a 16" 2400p 120 Hz IPS, for example.

Quote from: pelican-freeze on December 14, 2025, 15:28:27Plugging in an eGPU wouldn't resolve the issue, as the P1 G8's CPU options are all low-wattage chips with a maximum of 6 performance cores. For an eGPU to make sense you'd need at least a 255HX (8P+12E) on the Intel side, or a chip like a Ryzen AI 9 HX 370 (8P+4E) on the AMD side. It's hard to overstate how poor a fit today's lower-power mobile CPUs are for gaming - mobile GPUs are cut down so dramatically from their desktop counterparts that this matters less when pairing mobile CPUs with mobile GPUs, but pair a mobile CPU with a desktop GPU and you'll run into serious bottlenecks if you don't pick the right chip: at least 8 performance cores, more watts, and more ability to scale performance when fed those extra watts.

TL;DR: If you want the option to use an eGPU with a ThinkPad workstation, buy a P16 Gen 3 with a 255HX (or higher). The P16 Gen 3 *should* offer even better CPU performance when it doesn't need to allocate thermal headroom to cooling an internal discrete GPU. That would be the best fit: a high-wattage, desktop-ish class CPU actually running at reasonably high / close to max wattage in the laptop itself, and an eGPU also running at reasonably high / close to max wattage outside the laptop chassis.
Your knowledge about eGPUs and how they work is clearly somewhere between zero and nothing, nhf. Literally this whole comment is wrong.

Check egpu.io for more info and to learn something instead of being confidently wrong: https://egpu.io/best-external-graphics-card-builds/

Or r/eGPU: https://www.reddit.com/r/eGPU/

Or a 30 Watt CPU handheld - Legion Go with eGPU (fairly popular combo): https://www.youtube.com/watch?v=5opYdgDtK0s

Or, heck, even a 5-year-old ThinkPad with a 15W CPU (with UHD 620 integrated graphics) + RTX 4090: https://www.youtube.com/watch?v=EOsGqAeyCtA

Or... Basically I can keep going endlessly.

TL;DR - the truth is the polar opposite of your comment.

Edit: Forgot to include this, from Notebookcheck:

X1 Carbon Gen 6 (15W CPU) tested with eGPU, back in 2018: https://www.notebookcheck.net/Lenovo-ThinkPad-X1-Carbon-2018-WQHD-HDR-i7-Laptop-Review.284682.0.html

pelican-freeze

Quote from: Worgarthe on December 14, 2025, 16:34:16
Quote from: pelican-freeze on December 14, 2025, 14:48:36Are there any issues with the non-standard 3.2K resolution on a laptop intended for workstation use? I like the idea of a tandem OLED, but at 3.2K it seems like it might be more of a headache than anything else. At 1440p you get better battery life, cleaner application compatibility, and more flexibility for some light gaming, while at 3.2K you lose the battery life benefits of a lower resolution and risk app compatibility issues, without getting the content-consumption advantages of a proper 4K panel.
There is absolutely nothing wrong with this resolution (or any other). Scaling exists for that exact reason. Increased power consumption comes from the panel tech, not the resolution - a 16" 1200p 120 Hz OLED is going to consume far more power than a 16" 2400p 120 Hz IPS, for example.

Quote from: pelican-freeze on December 14, 2025, 15:28:27Plugging in an eGPU wouldn't resolve the issue, as the P1 G8's CPU options are all low-wattage chips with a maximum of 6 performance cores. For an eGPU to make sense you'd need at least a 255HX (8P+12E) on the Intel side, or a chip like a Ryzen AI 9 HX 370 (8P+4E) on the AMD side. It's hard to overstate how poor a fit today's lower-power mobile CPUs are for gaming - mobile GPUs are cut down so dramatically from their desktop counterparts that this matters less when pairing mobile CPUs with mobile GPUs, but pair a mobile CPU with a desktop GPU and you'll run into serious bottlenecks if you don't pick the right chip: at least 8 performance cores, more watts, and more ability to scale performance when fed those extra watts.

TL;DR: If you want the option to use an eGPU with a ThinkPad workstation, buy a P16 Gen 3 with a 255HX (or higher). The P16 Gen 3 *should* offer even better CPU performance when it doesn't need to allocate thermal headroom to cooling an internal discrete GPU. That would be the best fit: a high-wattage, desktop-ish class CPU actually running at reasonably high / close to max wattage in the laptop itself, and an eGPU also running at reasonably high / close to max wattage outside the laptop chassis.
Your knowledge about eGPUs and how they work is clearly somewhere between zero and nothing, nhf. Literally this whole comment is wrong.

Check egpu.io for more info and to learn something instead of being confidently wrong: https://egpu.io/best-external-graphics-card-builds/

Or r/eGPU: https://www.reddit.com/r/eGPU/

Or a 30 Watt CPU handheld - Legion Go with eGPU (fairly popular combo): https://www.youtube.com/watch?v=5opYdgDtK0s

Or, heck, even a 5-year-old ThinkPad with a 15W CPU (with UHD 620 integrated graphics) + RTX 4090: https://www.youtube.com/watch?v=EOsGqAeyCtA

Or... Basically I can keep going endlessly.

TL;DR - the truth is the polar opposite of your comment.

Edit: Forgot to include this, from Notebookcheck:

X1 Carbon Gen 6 (15W CPU) tested with eGPU, back in 2018: https://www.notebookcheck.net/Lenovo-ThinkPad-X1-Carbon-2018-WQHD-HDR-i7-Laptop-Review.284682.0.html


Honestly, just read what I wrote? Nothing in your comment appears to be a direct response to anything I said. Obviously you can game with lower wattage CPUs and eGPUs - you'll just get worse performance. And obviously if you want to avoid your eGPU being performance limited / bottlenecked by your CPU choice you can just choose a CPU that is a better fit for gaming.

Spending $$$$ on a high wattage desktop GPU that you know will be performance constrained by a low wattage mobile CPU is clearly not going to provide the best experience but no one is stopping you from doing it if you want. Also, you can't put a higher wattage mobile CPU intended for gaming use (255HX+) into a 4 lb ultraportable laptop so if your goal is "ultraportable laptop that can play games at home" that rules out the higher wattage enthusiast class CPUs that are designed for gaming / workstation laptop use.

So there's nothing stopping you from using an eGPU / mobile CPU combo where the CPU is the performance bottleneck. But it's hard to argue that the best mobile CPU choices for gaming are anything other than the higher-wattage mobile CPUs *explicitly* designed and marketed by Intel / AMD as mobile gaming CPUs.

Worgarthe

Quote from: pelican-freeze on December 14, 2025, 19:00:29Honestly, just read what I wrote? Nothing in your comment appears to be a direct response to anything I said. Obviously you can game with lower wattage CPUs and eGPUs - you'll just get worse performance. And obviously if you want to avoid your eGPU being performance limited / bottlenecked by your CPU choice you can just choose a CPU that is a better fit for gaming.
Worse performance, yet still significantly better than what you get with gaming on iGPU or dGPU in that same laptop (if a GPU in an eGPU dock/enclosure is more powerful, naturally).

Quote from: pelican-freeze on December 14, 2025, 19:00:29Spending $$$$ on a high wattage desktop GPU that you know will be performance constrained by a low wattage mobile CPU is clearly not going to provide the best experience but no one is stopping you from doing it if you want. Also, you can't put a higher wattage mobile CPU intended for gaming use (255HX+) into a 4 lb ultraportable laptop so if your goal is "ultraportable laptop that can play games at home" that rules out the higher wattage enthusiast class CPUs that are designed for gaming / workstation laptop use.


Quote from: pelican-freeze on December 14, 2025, 19:00:29So there's nothing stopping you from using an eGPU / mobile CPU combo where the CPU is the performance bottleneck. But it's hard to argue that the best mobile CPU choices for gaming are anything other than the higher-wattage mobile CPUs *explicitly* designed and marketed by Intel / AMD as mobile gaming CPUs.
The CPU is always a bottleneck, regardless of TDP. If you check any of the links above, for example the video with the Legion Go + 3070 eGPU, you will see that the CPU usage is basically identical in all scenarios - with or without the eGPU. For example, as clearly demonstrated in the video, in Final Fantasy 7 Rebirth the CPU at 1080p low sits at 21-24% to get 26 average fps on the iGPU, yet once the 3070 eGPU is plugged in you get 75 average fps at 1440p high settings with CPU usage at exactly the same 21-24%.

I have a 5070 Ti; when I try to play Shadow of the Tomb Raider, for example (I'm actually currently replaying the reboot series), on my X1 Carbon Gen 9 and its iGPU (i7-1165G7, Iris Xe) at the absolute lowest settings at 720p, I get barely 30 fps with 55-60% CPU usage. When I plug in my 5070 Ti I get 160+ fps at highest/maxed settings without Ray Tracing and 80-100 with Ray Tracing, all at 1440p, and the CPU is still at 55-60% usage.

That's more than what I get with my P16 Gen 2 (i7-14700HX + Ada 3500 12 GB), where I get 110-120 fps at the same highest/maxed settings without RT, and about 55-70 with RT on. With the eGPU plugged into that same P16 Gen 2, the same 5070 Ti pushes 170-ish at maxed settings, and around 100-110 with the same settings + Ray Tracing on. All at 1440p, of course.

Is a faster CPU faster than a slower CPU? Yes. Does wasting insane amounts of money on top specs make sense when a 750€-ish GPU is still going to obliterate those specs even with the slightly reduced bandwidth of an eGPU setup? Well, that's up to each person to decide how much they hate their own money ¯\_(ツ)_/¯

Again, for clarity:

Quote from: Worgarthe on December 12, 2025, 04:44:47I just checked prices in Germany, but for the P16 Gen 3 because it is possible to equip it with up to 24 GB VRAM (Blackwell 5000).

The base version 8 GB (Blackwell 1000) config goes for 2819 € currently. Prices for GPU upgrades are the following:

  • 2000 Blackwell (8 GB) +230€
  • 3000 Blackwell (12 GB) +790€
  • 4000 Blackwell (16 GB) +1420€
  • 5000 Blackwell (24 GB) +2980€ (😂)

The rest of the specs are untouched from the base config, so 245HX + 16 GB RAM + 512 GB SSD, with only the display being automatically upgraded to a 2400p panel (no option to keep the base 1200p panel with that GPU).

So I just added that GPU and literally nothing else, the laptop is now 6039€, that's an increase of +3220€ (!!) just because of 24 GB GPU!

Let's see how much VRAM we can get with 3220€, to put that in an eGPU setup while keeping the P16 Gen 3 at its base price: https://www.idealo.de/preisvergleich/OffersOfProduct/205942083_-geforce-rtx-5070-ti-gigabyte.html

3220/759 ≈ 4. So four RTX 5070 Ti. Meaning 64 GB of VRAM. Meaning 40 GB more, for the same price. The 5000 Blackwell is similar in performance to a 4070 Super and 3080 Ti. The 5070 Ti is simply far ahead performance-wise, and with four of them at full power of 300W each, it's even more tragic to compare what you get for the same amount of money... ¯\_(ツ)_/¯

pelican-freeze

#21
Quote from: Worgarthe on December 14, 2025, 19:37:09
Quote from: pelican-freeze on December 14, 2025, 19:00:29Honestly, just read what I wrote? Nothing in your comment appears to be a direct response to anything I said. Obviously you can game with lower wattage CPUs and eGPUs - you'll just get worse performance. And obviously if you want to avoid your eGPU being performance limited / bottlenecked by your CPU choice you can just choose a CPU that is a better fit for gaming.
Worse performance, yet still significantly better than what you get with gaming on iGPU or dGPU in that same laptop (if a GPU in an eGPU dock/enclosure is more powerful, naturally).

Quote from: pelican-freeze on December 14, 2025, 19:00:29Spending $$$$ on a high wattage desktop GPU that you know will be performance constrained by a low wattage mobile CPU is clearly not going to provide the best experience but no one is stopping you from doing it if you want. Also, you can't put a higher wattage mobile CPU intended for gaming use (255HX+) into a 4 lb ultraportable laptop so if your goal is "ultraportable laptop that can play games at home" that rules out the higher wattage enthusiast class CPUs that are designed for gaming / workstation laptop use.


Quote from: pelican-freeze on December 14, 2025, 19:00:29So there's nothing stopping you from using an eGPU / mobile CPU combo where the CPU is the performance bottleneck. But it's hard to argue that the best mobile CPU choices for gaming are anything other than the higher-wattage mobile CPUs *explicitly* designed and marketed by Intel / AMD as mobile gaming CPUs.
The CPU is always a bottleneck, regardless of TDP. If you check any of the links above, for example the video with the Legion Go + 3070 eGPU, you will see that the CPU usage is basically identical in all scenarios - with or without the eGPU. For example, as clearly demonstrated in the video, in Final Fantasy 7 Rebirth the CPU at 1080p low sits at 21-24% to get 26 average fps on the iGPU, yet once the 3070 eGPU is plugged in you get 75 average fps at 1440p high settings with CPU usage at exactly the same 21-24%.

I have a 5070 Ti; when I try to play Shadow of the Tomb Raider, for example (I'm actually currently replaying the reboot series), on my X1 Carbon Gen 9 and its iGPU (i7-1165G7, Iris Xe) at the absolute lowest settings at 720p, I get barely 30 fps with 55-60% CPU usage. When I plug in my 5070 Ti I get 160+ fps at highest/maxed settings without Ray Tracing and 80-100 with Ray Tracing, all at 1440p, and the CPU is still at 55-60% usage.

That's more than what I get with my P16 Gen 2 (i7-14700HX + Ada 3500 12 GB), where I get 110-120 fps at the same highest/maxed settings without RT, and about 55-70 with RT on. With the eGPU plugged into that same P16 Gen 2, the same 5070 Ti pushes 170-ish at maxed settings, and around 100-110 with the same settings + Ray Tracing on. All at 1440p, of course.

Is a faster CPU faster than a slower CPU? Yes. Does wasting insane amounts of money on top specs make sense when a 750€-ish GPU is still going to obliterate those specs even with the slightly reduced bandwidth of an eGPU setup? Well, that's up to each person to decide how much they hate their own money ¯\_(ツ)_/¯

Again, for clarity:

Quote from: Worgarthe on December 12, 2025, 04:44:47I just checked prices in Germany, but for the P16 Gen 3 because it is possible to equip it with up to 24 GB VRAM (Blackwell 5000).

The base version 8 GB (Blackwell 1000) config goes for 2819 € currently. Prices for GPU upgrades are the following:

  • 2000 Blackwell (8 GB) +230€
  • 3000 Blackwell (12 GB) +790€
  • 4000 Blackwell (16 GB) +1420€
  • 5000 Blackwell (24 GB) +2980€ (😂)

The rest of the specs are untouched from the base config, so 245HX + 16 GB RAM + 512 GB SSD, with only the display being automatically upgraded to a 2400p panel (no option to keep the base 1200p panel with that GPU).

So I just added that GPU and literally nothing else, the laptop is now 6039€, that's an increase of +3220€ (!!) just because of 24 GB GPU!

Let's see how much VRAM we can get with 3220€, to put that in an eGPU setup while keeping the P16 Gen 3 at its base price: https://www.idealo.de/preisvergleich/OffersOfProduct/205942083_-geforce-rtx-5070-ti-gigabyte.html

3220/759 ≈ 4. So four RTX 5070 Ti. Meaning 64 GB of VRAM. Meaning 40 GB more, for the same price. The 5000 Blackwell is similar in performance to a 4070 Super and 3080 Ti. The 5070 Ti is simply far ahead performance-wise, and with four of them at full power of 300W each, it's even more tragic to compare what you get for the same amount of money... ¯\_(ツ)_/¯

I feel like you're making my point for me. Reading your post implies that the best option would be a P16 Gen 3 with a 255HX gaming / workstation CPU and the cheapest GPU option (1000 Blackwell), combined with a desktop 5070 Ti in an eGPU setup?

At no point have I said that a high wattage eGPU setup won't provide better performance compared to a low wattage discrete mobile GPU, even when connected to an ultraportable laptop.

Also, if you primarily want to play older titles like Shadow of the Tomb Raider that were released before the current console generation, then it goes without saying that they will be much less CPU intensive than current generation games. I can absolutely see the CPU bottleneck being less of an issue if you primarily play games released before 2020 / before the PlayStation 5 & Xbox Series X launched with 8 performance CPU cores.

But why would you pay $$$$ for a desktop 5070 Ti if you only wanted to play older games?

Worgarthe

Quote from: pelican-freeze on December 14, 2025, 19:57:35I feel like you're making my point for me. Reading your post implies that the best option would be a P16 Gen 3 with a 255HX gaming / workstation CPU and the cheapest GPU option (1000 Blackwell), combined with a desktop 5070 Ti in an eGPU setup?
The best option is any OCuLink or Thunderbolt 5 laptop; currently there aren't that many of them to pick from. So yes, the P16 Gen 3 is one of those few, that's correct. The best one is the one with the best thermal "picture", because if it can't hold stable PL1 wattage and frequencies it will be all over the place, which is why the Razer Blade 16 is probably quite meh, as its CPU is like watching a heart monitor graph. Can't wait for an in-depth P16 Gen 3 review though!

Quote from: pelican-freeze on December 14, 2025, 19:57:35At no point have I said that a high wattage eGPU setup won't provide better performance compared to a low wattage discrete mobile GPU, even when connected to an ultraportable laptop.

Also, if you primarily want to play older titles like Shadow of the Tomb Raider that were released before the current console generation, then it goes without saying that they will be much less CPU intensive than current generation games. I can absolutely see the CPU bottleneck being less of an issue if you primarily play games released before 2020 / before the PlayStation 5 & Xbox Series X launched with 8 performance CPU cores.
Yes, you are correct with this, the part in bold especially. Six cores have proven to be enough for all games, in laptops and in desktops, perfectly enough. The 9600X is a monster of a CPU, yet inexpensive, and it trails just about 10% behind the 9800X3D, which is more expensive but also slightly faster thanks to its large cache (96 MB L3 vs 32 MB L3 in the 9600X). But if you compare the 9700X with its 8 cores to that same 9600X with its 6 cores, they are pretty much identical, down to a single fps - like getting 165 fps in Battlefield 6 with the 9700X vs 164 fps with the 9600X. If two more cores are needed to get just 1 extra fps...

The same applies to laptops - more cache is better than more cores, as long as there are 6 performance cores. TDP is insignificant as long as it's a fairly modern architecture with modern/fast IPC, and as long as it can hold PL1 without thermal throttling. You absolutely won't get any significant in-game difference (no matter how intensive a game is) between a flat 30-35W PL1 6-8 core CPU and a flat 90W PL1 20-core HX CPU. Again, if getting, say, a 220 fps average with 98 fps 1% lows instead of a 207 fps average with 96 fps 1% lows is necessary enough to justify paying a huge premium just to play games, then that's up to you, but you simply won't notice any difference outside of benchmarks and an Afterburner overlay. For other, more important stuff outside of gaming, a faster and more powerful CPU with as many cores as possible is clearly the far better pick, even with an eGPU (local LLMs, rendering, video production etc.) - one can't argue against that. But for games, six fast(er) cores beat, say, 8-12 slow(er) cores, any day, any game.

Quote from: pelican-freeze on December 14, 2025, 19:57:35But why would you pay $$$$ for a desktop 5070 Ti if you only wanted to play older games?
I play many games just fine, actually - I just finished Doom: The Dark Ages recently after getting it at a nice 50% discount on Black Friday. That's one of the most demanding games of 2025; it "ran" at a literal 2 fps on my X1 Carbon with its iGPU, but it was a decent 45-ish fps experience with the 5070 Ti at 1440p DLSS Q. However, an older CPU with just 4 cores, less IPC and lower clocks in general simply struggles to push more than that, even when given more watts (PL1 set to 29W instead of 22). It was at 95-100% usage both at 2 fps and at 45-ish fps, btw 😁

My P16 G2 was running it natively (1600p DLSS Q) at about 43-48 fps with its Ada 3500. Impressive? Meh, not really, given the TDP, TGP and overall performance difference versus my three-years-older X1 Carbon. But with the 5070 Ti eGPU it was pushing 88-96 fps at maxed settings 1440p without DLSS, and 210+ fps maxed 1440p with Frame Gen (didn't test FG on my X1C though). The 14700HX was at about 50% usage in both cases (Ada 3500 and eGPU 5070 Ti). That's basically identical performance to what you get from a 9800X3D + 5070 Ti desktop.

And to finish this rambling - the absolute top config for the P16 Gen 2 would come with an RTX 5000 (also Ada, not Blackwell). That would be insanely costly to get, about twice as expensive as what I paid for mine, and it would still get obliterated in games by a much cheaper (~780€ when I bought it) 5070 Ti in an eGPU enclosure ¯\_(ツ)_/¯

To answer your question about older game(s)... Because two new Tomb Raider games were announced two days ago - https://store.steampowered.com/news/app/203160/view/559139291408630486 - and as a fan of the series I went back to replay Rise (the 2016 game) and Shadow (the 2018 game) while waiting for the upcoming 2026 game. I even replayed the whole of Half-Life 1 and Half-Life 2 recently, because of those intensified Half-Life 3 rumours, and those can run at super-high fps on probably any Celeron or Atom with its weakest iGPU 🙂

veraverav

Quote from: pelican-freeze on December 14, 2025, 14:48:36Are there any issues with the non-standard 3.2K resolution on a laptop intended for workstation use? I like the idea of a tandem OLED, but at 3.2K it seems like it might be more of a headache than anything else. At 1440p you get better battery life, cleaner application compatibility, and more flexibility for some light gaming, while at 3.2K you lose the battery life benefits of a lower resolution and risk app compatibility issues, without getting the content-consumption advantages of a proper 4K panel.

If the appeal of a tandem OLED workstation is being able to watch 4K HDR Netflix in your downtime, any sub‑4K display is still going to get the same 1080p stream (just like an iPad or Android tablet). To get true 4K HDR streaming you need a full 4K, HDCP 2.2‑compliant display; otherwise, you're capped at 1080p. So a 3.2k tandem OLED is objectively a poor solution for content consumption.

I'm interested in this laptop, but I'm concerned about weird interface scaling in professional apps that almost certainly haven't been tested for usability at 3200×2000. And a tandem OLED that isn't suited for content consumption feels like an awkward choice when Lenovo already offers such bright IPS panels: with the 3.2K tandem OLED you get poor battery life and poor content consumption, but no material increase in brightness for visibility in bright environments on the go, since the P1 G8's 4K IPS panel option already offers 800 nits of SDR brightness. In practical SDR use the IPS panel is likely to be even brighter and offer even better visibility?

Like I'm a huge P1 / X1 Extreme fan and was excited about tandem OLED, but I can't figure out any use case where this 3.2K display makes sense, since it appears to offer a big bag of frustrating compromises without any material benefits: bad for app compatibility, too high a resolution for good battery life or gaming, not high enough for good content consumption, no advantage over the bright IPS option for outdoor use, etc.

For those considering the P16 G3: you can configure CPUs/GPUs that, on paper, should scale to higher performance with more wattage, but the AC adapter is only 180W (vs. the P1 G8's 140W). That strongly suggests the P16 G3 chassis and power delivery are not designed to let parts like Intel HX CPUs or 5070 Ti/5080/5090‑class GPUs (RTX Pro 3000/4000/5000) run anywhere near their maximum performance. If your main interest in 5080/5090‑class GPUs is the extra NVENC encoders, the P16 G3 still looks great—much lighter and more compact than the P16 G2, with the tradeoff of lower cooling capacity, a smaller total power envelope, and reduced peak performance.

The only true 16‑inch performance workstation left this year is the Dell Pro Max 16 Plus (not Premium), since it ships with a 280W AC adapter and a much more performance‑oriented triple‑fan, vapor‑chamber cooling system. So if you care primarily about performance, the sweet spot is the Dell Pro Max 16 Plus with a 275HX (24 threads, fed full wattage) and a 5070 Ti‑class GPU (RTX 3000, at a full 175W), with the higher GPU tiers really only making sense for heavy video work. It's definitely not as compact as the P16 G3 and it's about a pound heavier once you factor in the adapter, but that seems like a very reasonable trade for the additional performance considering that there isn't really any competition at that mobile performance tier anymore.

Actually the 3.2K screen run @ 200% scaling will yield a better picture than 4K @ 250% scaling.

So far I've read zero complaints about watching 4K content on a 3.2K screen or anything like that. I guess someone with this laptop can chime in if they are unable to get a 4K stream because it's a 3.2K screen. Everyone seems to love the 3.2K OLED screen.

As for the eGPU, it's simply to get around the 8 GB VRAM bottleneck without having to haul around something as large and heavy as the P16.

Worgarthe

Quote from: veraverav on December 14, 2025, 21:35:27Actually the 3.2K screen run @ 200% scaling will yield a better picture than 4K @ 250% scaling.
Correct. That's 1600x1000 (3.2K @ 200%) vs 1536x960 (4K @ 250%). Sure, the 4K is a bit sharper, but it's not really noticeable from any normal viewing distance. If one is touching the screen with their nose then they will probably notice some pixels here and there on the 3.2K, hehe. Another benefit of 3.2K is running it at 125% scaling, which gives effectively the same real estate as 2560x1600 with a 25% sharper image. 25% doesn't sound like much, but if one works with text it helps a lot, as it's easier on one's eyes no matter how good their eyesight is.
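
Spelling out the arithmetic (the logical desktop resolution is simply the native panel resolution divided by the scale factor, per axis):

$$\frac{3200 \times 2000}{2.00} = 1600 \times 1000, \qquad \frac{3840 \times 2400}{2.50} = 1536 \times 960, \qquad \frac{3200 \times 2000}{1.25} = 2560 \times 1600$$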

Quote from: veraverav on December 14, 2025, 21:35:27So far I've read zero complaints about watching 4k content on a 3.2k screen or anything like that. I guess someone with this laptop can chime in if they are unable to get a 4k stream because its a 3.2k screen. Everyone seems to love the 3.2k OLED screen.
I don't have this exact panel, but I used to own a Surface Pro 8 for a few months with a 2880x1920, so not the same resolution but very close. No issues with 4K, it all worked great (replaced the Surf with an X1 Carbon 9 though).

Quote from: veraverav on December 14, 2025, 21:35:27As for the eGPU, it simply to get around the 8GB VRAM bottleneck without having to haul around something as large and heavy as the P16.
This 👍 You pay less for eGPU dock/enclosure + normal desktop GPU, and you get more performance (for far less money) than what you would get with that same laptop if its dGPU was maxed.

I mean, again, and this is now a rhetorical question - why in the world would one pay +2980€ to go from 245HX + Blackwell 1000 to 245HX + Blackwell 5000 (that previously linked P16 Gen 3 configuration on Lenovo's site) when a 740-780€ 5070 Ti is simply going to destroy that Blackwell 5000? ¯\_(ツ)_/¯
Sure if they simply NEED to play Battlefield 6 on battery when they commute to work then ok, go crazy and get that Blackwell 5000, or they can save money and go with Blackwell 4000 (16 GB) for "just" +1420€ (and get even more obliterated by a 740-780€ 5070 Ti), but yeah...

pelican-freeze

Quote from: veraverav on December 14, 2025, 21:35:27
Quote from: pelican-freeze on December 14, 2025, 14:48:36Are there any issues with the non-standard 3.2K resolution on a laptop intended for workstation use? I like the idea of a tandem OLED, but at 3.2K it seems like it might be more of a headache than anything else. At 1440p you get better battery life, cleaner application compatibility, and more flexibility for some light gaming, while at 3.2K you lose the battery life benefits of a lower resolution and risk app compatibility issues, without getting the content-consumption advantages of a proper 4K panel.

If the appeal of a tandem OLED workstation is being able to watch 4K HDR Netflix in your downtime, any sub‑4K display is still going to get the same 1080p stream (just like an iPad or Android tablet). To get true 4K HDR streaming you need a full 4K, HDCP 2.2‑compliant display; otherwise, you're capped at 1080p. So a 3.2k tandem OLED is objectively a poor solution for content consumption.

I'm interested in this laptop, but I'm concerned about weird interface scaling in professional apps that almost certainly haven't been tested for usability at 3200×2000. And a tandem OLED that isn't suited for content consumption feels like an awkward choice when Lenovo already offers such bright IPS panels: with the 3.2K tandem OLED you get poor battery life and poor content consumption, but no material increase in brightness for visibility in bright environments on the go, since the P1 G8's 4K IPS panel option already offers 800 nits of SDR brightness. In practical SDR use the IPS panel is likely to be even brighter and offer even better visibility?

Like I'm a huge P1 / X1 Extreme fan and was excited about tandem OLED, but I can't figure out any use case where this 3.2K display makes sense, since it appears to offer a big bag of frustrating compromises without any material benefits: bad for app compatibility, too high a resolution for good battery life or gaming, not high enough for good content consumption, no advantage over the bright IPS option for outdoor use, etc.

For those considering the P16 G3: you can configure CPUs/GPUs that, on paper, should scale to higher performance with more wattage, but the AC adapter is only 180W (vs. the P1 G8's 140W). That strongly suggests the P16 G3 chassis and power delivery are not designed to let parts like Intel HX CPUs or 5070 Ti/5080/5090‑class GPUs (RTX Pro 3000/4000/5000) run anywhere near their maximum performance. If your main interest in 5080/5090‑class GPUs is the extra NVENC encoders, the P16 G3 still looks great—much lighter and more compact than the P16 G2, with the tradeoff of lower cooling capacity, a smaller total power envelope, and reduced peak performance.

The only true 16‑inch performance workstation left this year is the Dell Pro Max 16 Plus (not Premium), since it ships with a 280W AC adapter and a much more performance‑oriented triple‑fan, vapor‑chamber cooling system. So if you care primarily about performance, the sweet spot is the Dell Pro Max 16 Plus with a 275HX (24 threads, fed full wattage) and a 5070 Ti‑class GPU (RTX 3000, at a full 175W), with the higher GPU tiers really only making sense for heavy video work. It's definitely not as compact as the P16 G3 and it's about a pound heavier once you factor in the adapter, but that seems like a very reasonable trade for the additional performance considering that there isn't really any competition at that mobile performance tier anymore.

Actually the 3.2K screen run @ 200% scaling will yield a better picture than 4K @ 250% scaling.

So far I've read zero complaints about watching 4K content on a 3.2K screen or anything like that. I guess someone with this laptop can chime in if they are unable to get a 4K stream because it's a 3.2K screen. Everyone seems to love the 3.2K OLED screen.

As for the eGPU, it's simply to get around the 8 GB VRAM bottleneck without having to haul around something as large and heavy as the P16.


It's probably worth taking a moment to understand how 4k video streaming works. No mainstream streaming service will send a 4K video stream unless the device reports a 4K, HDCP 2.2 compliant display; anything below that gets a 1080p stream for DRM reasons.

The 3.2K panel will absolutely stream 1080p and may even get higher-bitrate 1080p with HDR, but it cannot be served a true 4K stream; it simply isn't a 4K, HDCP 2.2-compliant display. That said, 1080p on the 3.2K display can still be watchable - similar to how 1080p can look decent on a newer iPad Pro (another sub-4K tandem OLED panel) - but it will never be actual 4K.

So with the P1 G8's display options you basically choose between:

  • The 4K IPS panel, which can receive full 4K streams (4× the pixels) but has lower contrast and no real HDR.
  • The 3.2K tandem OLED, with better blacks and true HDR but permanently capped at 1080p video from streaming services.

There's no workaround where a service sends 4K and you downscale it to 3.2K; the 3.2k display simply doesn't qualify for 4K streaming video playback under the DRM rules. You can, however, plug the P1 G8 with the 3.2K screen into an external 4K, HDCP 2.2–compliant monitor or TV and then stream full 4K on that external display.
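
Quick pixel math behind the "4× the pixels" point above, and why a 1080p stream is still usable on this panel:

$$\frac{3840 \times 2160}{1920 \times 1080} = 4.0, \qquad \frac{3200 \times 2000}{1920 \times 1080} \approx 3.09$$

So the 4K IPS option can show a UHD stream essentially pixel-for-pixel, while the 3.2K OLED always upscales a 1080p stream by roughly 3× in pixel count.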

Hopefully that makes sense.


EDIT: Disney+ is the one real outlier that I'm aware of here. Right now Disney+ doesn't serve 4K to PCs at all - no 4K in a browser and no 4K in the Windows app - 4K streams only get served to TVs and video streaming sticks.

So, since both the 3.2K tandem OLED and the 4K IPS panels would get the same 1080p stream from Disney+, that particular service will actually look better on the 3.2K tandem OLED thanks to its contrast and HDR. The only caveat is that Disney could change its platform policy at any time, because there's no technical reason preventing Disney+ from serving 4K video to the 4K IPS display in the future.

pelican-freeze

Quote from: Worgarthe on December 14, 2025, 21:47:40I mean, again, and this is now a rhetorical question - why in the world would one pay +2980€ to go from 245HX + Blackwell 1000 to 245HX + Blackwell 5000 (that previously linked P16 Gen 3 configuration on Lenovo's site) when a 740-780€ 5070 Ti is simply going to destroy that Blackwell 5000? ¯\_(ツ)_/¯
Sure if they simply NEED to play Battlefield 6 on battery when they commute to work then ok, go crazy and get that Blackwell 5000, or they can save money and go with Blackwell 4000 (16 GB) for "just" +1420€ (and get even more obliterated by a 740-780€ 5070 Ti), but yeah...

FWIW there's a much bigger difference between the higher-end mobile GPU options for everything that isn't gaming, so it can absolutely make sense to spend more on P16 Gen 3 GPU upgrades even if the chassis is TGP‑limited.

  • RTX Pro 2000 / 5060 mobile – 1 NVENC encoder, 128‑bit bus, 8 GB VRAM
  • RTX Pro 3000 / 5070 Ti mobile – 1 NVENC encoder, 192‑bit bus, 12 GB VRAM
  • RTX Pro 4000 / 5080 mobile – 2 NVENC encoders, 256‑bit bus, 16 GB VRAM
  • RTX Pro 5000 / 5090 mobile – 3 NVENC encoders, 256‑bit bus, 24 GB VRAM

For video editing specifically, the RTX Pro 4000 mobile is the real workstation sweet spot: it delivers desktop 5080‑class encoding (100% more NVENC encoders vs the RTX Pro 3000 / 5070 Ti mobile) while staying relatively reasonable on cost and power.

The RTX Pro 5000 / 5090 mobile can then give you essentially "5090‑class" video encoding throughput, since it has the same three NVENC encoders as the desktop 5090, but for the big price jump over the RTX Pro 4000 mobile you're only getting 50% more NVENC encoders and there's zero further improvement to the memory bus.

All of these mobile parts use the same single NVDEC decoder, so you don't gain extra decode hardware by moving up the stack. On desktop, the 5080 and 5090 each have two NVDEC decoders, which can help with heavier multi‑track timeline video decode. The desktop 5070 Ti by comparison only has a single NVENC encoder / NVDEC decoder, so it's a bit limited as a workstation GPU.
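
To make the multi-encoder point concrete for batch exports: each export job is its own NVENC session, and with two or three NVENC engines those concurrent sessions have real hardware to spread across instead of queueing on a single engine. A rough Python sketch (assuming an ffmpeg build with NVENC support on the PATH; the filenames are placeholders):

import subprocess

# Two placeholder exports; with multiple NVENC engines the driver can spread
# concurrent encode sessions across them instead of serializing on one.
jobs = [
    ("timeline_a.mov", "timeline_a_hevc.mp4"),
    ("timeline_b.mov", "timeline_b_hevc.mp4"),
]

procs = [
    subprocess.Popen([
        "ffmpeg", "-y",
        "-i", src,
        "-c:v", "hevc_nvenc",  # NVENC hardware HEVC encoder
        "-b:v", "20M",         # target bitrate
        "-c:a", "copy",        # pass audio through untouched
        dst,
    ])
    for src, dst in jobs
]

for p in procs:
    p.wait()  # wait for both encodes to finish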
