NotebookCHECK - Notebook Forum

English => Reviews => Topic started by: Redaktion on December 11, 2025, 01:29:53

Title: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: Redaktion on December 11, 2025, 01:29:53
The ThinkPad P1 Gen 8 may look the same as the previous generation model, but the new tandem OLED display option can improve the HDR viewing experience tremendously for those who value it.

https://www.notebookcheck.net/Lenovo-ThinkPad-P1-16-Gen-8-review-Tandem-OLED-series-premiere.1174965.0.html
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: jhe on December 11, 2025, 04:17:40
Can't believe it, you sh1theads. FINALLY a matte OLED screen after years of fcking mirrors, and you put it under cons.
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: veraverav on December 11, 2025, 06:38:38
5.5 hrs battery life? Seems very low compared to reviews from consumers using this model.

Also, it's a P1 Gen 8, not a P1 16 Gen 8.
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: 2k for 8GB VRAM gg on December 11, 2025, 10:07:57
Quote: matte tandem OLED can appear slightly grainy up close
How do you destroy the beautiful popping colors and sharp text of a (glossy) OLED? Take grinding paper/stone and rub it on the screen; then you get the screen in this laptop.

Quote: one-year base warranty instead of three years
Wow

Quote: no ECC RAM
This is a big one. Since this is a workstation, at least having the option would be expected.
Theoretically, providing real ECC is cheap: ASUS supports real ECC on mainstream AM5 motherboards (other brands may support it unofficially), and most AMD CPUs support real ECC too, so this looks like artificial market segmentation.

Quote: LPCAMM2
This is modern; at what MT/s is it running?
Going by AIDA64's 96582 MB/s, it's running at roughly 7500 MT/s.

For running local LLMs, can the RAM be extended to 64 GB or more? If this is a workstation, support for up to 128 GB RAM would be expected, especially with all the RAM-optimized MoE LLMs.
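For anyone wanting to reproduce that MT/s estimate: a minimal sketch, assuming a 128-bit (16-byte wide) LPCAMM2 interface and roughly 80% benchmark efficiency (both figures are my assumptions, not from the review):

```python
# Rough sanity check of the MT/s estimate from the AIDA64 result quoted above.
measured_mb_s = 96_582          # AIDA64 memory bandwidth figure from the review
bus_width_bytes = 16            # assumed: 128-bit LPCAMM2 module
assumed_efficiency = 0.80       # assumed: benchmark vs. theoretical bandwidth

effective_mt_s = measured_mb_s / bus_width_bytes / assumed_efficiency
print(round(effective_mt_s))    # lands near 7500, consistent with an LPDDR5X-7467 part
```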
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: 2k for 8GB VRAM gg on December 11, 2025, 10:08:06
Quote: 8 GB VRAM
2000 for only 8 GB VRAM? Nice trolling.
Even games have a problem with only 8 GB VRAM: youtube.com/watch?v=ric7yb1VaoA: "Gaming Laptops are in Trouble - VRAM Testing w/ ‪@Hardwareunboxed‬"
Most big games are made with consoles in mind first, and the PS5 has 16 GB of unified memory, minus about 4 GB for the OS, so games expect your GPU to have at least 12 GB VRAM.
Running local LLMs / AI has been a thing for a few years now; llama.cpp and its web UI are all you need. An LLM can be fully loaded into the GPU's VRAM or, if it can't fit, parts of it can be offloaded to system RAM. This laptop has 32 GB RAM + 8 GB VRAM. Both small, capable LLMs and big open-weights LLMs exist, and the more RAM+VRAM your PC has, the better. Every GB helps. So going from 8 GB to 12 GB to 16 GB VRAM would already be a good to very good improvement.
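The VRAM/RAM split described above is the knob that llama.cpp's `-ngl` / `--n-gpu-layers` flag controls. A minimal sketch of the sizing logic, with purely illustrative numbers (the helper and its figures are my own, not part of llama.cpp):

```python
def layers_that_fit(model_gb: float, n_layers: int, vram_gb: float,
                    overhead_gb: float = 1.5) -> int:
    """Estimate how many transformer layers fit in VRAM.

    Layers that don't fit stay in system RAM, which is the split that
    llama.cpp's -ngl / --n-gpu-layers flag lets you pick. overhead_gb
    is an assumed allowance for KV cache and GPU runtime buffers.
    """
    per_layer_gb = model_gb / n_layers
    usable_gb = max(vram_gb - overhead_gb, 0.0)
    return min(n_layers, int(usable_gb / per_layer_gb))

# A ~18 GB quantized model with 48 layers on this laptop's 8 GB GPU:
print(layers_that_fit(18, 48, 8))    # 17 -> most layers spill to system RAM
# The same model on a hypothetical 24 GB card:
print(layers_that_fit(18, 48, 24))   # 48 -> fully resident in VRAM
```

This is why every extra GB of VRAM helps: each GB moves a few more layers off the (much slower) system-RAM path.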
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: davidm on December 11, 2025, 12:52:37
"AIDA64 RAM benchmark scores are notably very good" - when is NotebookCheck going to do the PC industry a favour and stop qualifying scores like this? Any Strix Halo system will have RAM nearly twice as fast, not to mention Macs. That is a huge deal for many typical workstation operations.
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: nick23 on December 11, 2025, 14:36:16
Quote
Quote: matte tandem OLED can appear slightly grainy up close

How to destroy the beautiful popping colors and the sharp text of an (glossy) OLED? Take grinding paper/stone and rub it on the screen, then you get the screen in this laptop.

In a glossy OLED those beautiful popping colors and sharp text get utterly obliterated by reflections of any light source or window behind you, and you end up constantly fidgeting with the laptop position or trying to find a dark corner to sit in. I understand glossy screens for a desktop computer where you can control the ambient lighting, but not in a laptop.

Also, whenever the screen is dark, e.g. a dark theme of your code editor or a dark movie scene, you're constantly staring at your own face. Are you really that pretty?

It's precisely the matte tandem OLED that is the main attraction of this ThinkPad for me, and I'll happily take minimal grain to avoid strong reflections.
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: veraverav on December 11, 2025, 19:24:06
Quote from: 2k for 8GB VRAM gg on December 11, 2025, 10:08:06
Quote: 8 GB VRAM
2000 for only 8 GB VRAM? Nice trolling.
Even games have a problem with only 8 GB VRAM: youtube.com/watch?v=ric7yb1VaoA: "Gaming Laptops are in Trouble - VRAM Testing w/ ‪@Hardwareunboxed‬"
Most big games are made for consoles first in mind and the PS5 has 16 GB VRAM, minus 4 GB for the OS, and games expect your GPU to have at least 12 GB VRAM.
Running local LLMs / AI has been a thing for a few years now, using llama.cpp and its webUI is all you need. A LLM can be fully loaded into the GPU's VRAM or, if the LLM can't fit, parts of it can be offloaded to system RAM. This laptop has 32 GB RAM + 8 GB VRAM. Small and better capable, big open-weights LLMs exist and the more RAM+VRAM your PC has, the better. Every GB helps. So, from 8 GB to 12 GB to 16 GB VRAM would already be a good to very good improvement.

But it's not really meant to be a gaming laptop. Would plugging in an eGPU resolve this bottleneck for someone who absolutely has to game?
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: Worgarthe on December 12, 2025, 00:31:26
Quote from: 2k for 8GB VRAM gg on December 11, 2025, 10:08:06
Running local LLMs / AI has been a thing for a few years now, using llama.cpp and its webUI is all you need. A LLM can be fully loaded into the GPU's VRAM or, if the LLM can't fit, parts of it can be offloaded to system RAM. This laptop has 32 GB RAM + 8 GB VRAM. Small and better capable, big open-weights LLMs exist and the more RAM+VRAM your PC has, the better. Every GB helps. So, from 8 GB to 12 GB to 16 GB VRAM would already be a good to very good improvement.
Genuinely curious - do you do anything else in your life apart from running local LLMs? Well, aside from spamming the same bs under quite literally every single existing review around here...

Quote from: veraverav on December 11, 2025, 19:24:06
Would plugging in an eGPU resolve this bottleneck for someone that absolutely has to game?
Yes it would, and it's cheaper too (as opposed to getting an absolute top-specs laptop with its insane price tag; talking in general here about laptops, not about the P1 G8, which tops out at 8 GB VRAM).

I have an RTX 5070 Ti (16 GB) working perfectly fine with both of my ThinkPads (X1 Carbon and P16). There's a bit of a bottleneck if you're chasing super-high fps (for example, if you can get, say, 350 on a desktop you won't really reach more than 300 here), but if you cap that to 60-165 fps there really is no difference in gaming experience, and you save significant money in the process while getting more raw GPU power. That's a 780-ish € GPU, and it will completely stomp both the Blackwells in this P1 (PRO 1000 and PRO 2000), even with the mentioned Thunderbolt bottleneck (which is marginal outside of games and very high fps).

The tradeoff is not being able to play at ultra graphics on the go. Some (homeless?) people apparently do that all day long; they simply travel and play continuously without doing anything else (apart from crying about how 8 GB VRAM is insufficient for games), and those same people struggle to open their in-game settings and set textures to high instead of ultra to significantly lower their VRAM usage. Other than that, no complaints; everything works flawlessly when I get home. I put my laptop(s) on a table, plug in a single cable, and that's it. The eGPU gets activated automatically, and I can play immediately.
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: veraverav on December 12, 2025, 01:52:52
Quote from: Worgarthe on December 12, 2025, 00:31:26
Yes it would, and it's cheaper too (as opposed to getting an absolute top specs laptop with its insane price tag; talking in general here about laptops, not about the P1 G8 which tops at 8 GB VRAM).

I have an RTX 5070 Ti (16 GB) working perfectly fine with both of my ThinkPads (X1 Carbon and P16). There's a bit of bottleneck if you're chasing super-high fps (for example, if you can get, say, 350 in a desktop you won't really reach more than 300 here), but if you cap that to 60-165 fps there really is no difference in gaming experience and you save significant money in the process too (while getting more raw GPU power). That's a 780-ish € GPU, and it will completely stomp both the Blackwells here in this P1 (PRO 1000 and PRO 2000), even with the mentioned Thunderbolt bottleneck (which is marginal outside of games and very high fps).

The tradeoff is not being able to play ultra graphics on the go because some (homeless?) people apparently do that all day long, they simply travel and play continuously without doing anything else (apart from crying how 8 GB VRAM is insufficient for games), and those same people struggle to open their in-game settings to put textures to high instead of ultra to significantly lower their VRAM usage, but other than that - no complaints, all works flawlessly when I get home - I put my laptop(s) on a table, plug in a single cable and that's it. The eGPU gets activated automatically, and I can play immediately.

Thanks for this info! I have a P1 Gen 2 and I can't remember ever playing a game on it. If I'm going to game, it will be at my desk, so I could just use an eGPU. I actually wanted the P1 Gen 8 without a dGPU, but the 45% off was only on prebuilt machines with the dGPU. I'll buy an eGPU if I ever decide I have time for games :)

Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: Worgarthe on December 12, 2025, 04:44:47
Quote from: veraverav on December 12, 2025, 01:52:52
Thanks for this info!
🫡

Quote from: veraverav on December 12, 2025, 01:52:52
I have a P1 Gen 2 and I can't remember when I played a game on it, ever. If I am going to game, it will be at my desk, so I could just use a eGPU. I actually wanted the P1 Gen 8 without a dGPU but the 45% off was only on prebuilt machines with the dGPU. I'll buy an eGPU if I ever decide I have time for games :)
Exactly this! I really don't understand the obsession with "VRAM for gaming" (in laptops), when in reality, even if there were 48 GB of it, who exactly would game on the go on battery power? And get not just reduced GPU performance but also something like 40 minutes of battery life. True gaming, yeah...
...not 😑

Laptops are awesome to carry when you run & gun around, but when you get home you dock them (and/or plug into a nice eGPU setup) and get a basically desktop-like experience, while still having full mobility if and when you need to pack and go. For less money than maxing out a GPU in a laptop. Heck, it's even possible to have multiple GPUs in an eGPU setup if one wants to do some heavy local LLM stuff; still cheaper and with more VRAM than going crazy with max laptop specs, hah!

Update:

Ok, so about this part:

Quote from: Worgarthe on December 12, 2025, 04:44:47
Heck, it's even possible to have multiple GPUs in an eGPU setup, if one wants to do some heavy local LLM stuff - still for cheaper and with more VRAM than going crazy with max specs of a laptop, hah!

I just checked prices in Germany, but for the P16 Gen 3, because it can be equipped with up to 24 GB VRAM (Blackwell 5000).

The base version 8 GB (Blackwell 1000) config goes for 2819 € currently (https://www.lenovo.com/de/de/configurator/cto/index.html?bundleId=21RQCTO1WWDE1). Prices for GPU upgrades are the following:


The rest of the specs are untouched from the base config, so 245HX + 16 GB RAM + 512 GB SSD, with only the display being automatically upgraded to a 2400p panel (there is no option to keep the base 1200p panel with that GPU).

So I added that GPU and literally nothing else, and the laptop is now 6039 €. That's an increase of +3220 € (!!) just for the 24 GB GPU!

Let's see how much VRAM we can get for 3220 €, to put into an eGPU setup while keeping the P16 Gen 3 at its base price: https://www.idealo.de/preisvergleich/OffersOfProduct/205942083_-geforce-rtx-5070-ti-gigabyte.html (https://www.idealo.de/preisvergleich/OffersOfProduct/205942083_-geforce-rtx-5070-ti-gigabyte.html)

3220/759 ≈ 4, so four RTX 5070 Ti cards, meaning 64 GB of VRAM. That's 40 GB more for the same price. The 5000 Blackwell is similar in performance to a 4070 Super and a 3080 Ti (https://www.techpowerup.com/gpu-specs/rtx-pro-5000-blackwell-embedded.c4280). The 5070 Ti is simply far ahead performance-wise, and with four of them at a full 300 W each, it's even more tragic to compare what you get for the same amount of money... ¯\_(ツ)_/¯
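The arithmetic above, spelled out (prices are the ones quoted in this thread, December 2025; they will drift):

```python
# eGPU-vs-configurator math from the post above, prices as quoted (Dec 2025).
base_config_eur = 2819            # P16 Gen 3 with Blackwell 1000 (8 GB)
upgraded_config_eur = 6039        # same config with Blackwell 5000 (24 GB)
gpu_upgrade_eur = upgraded_config_eur - base_config_eur

rtx_5070_ti_eur = 759             # idealo street price for one 16 GB card
cards = gpu_upgrade_eur // rtx_5070_ti_eur
egpu_vram_gb = cards * 16

print(gpu_upgrade_eur, cards, egpu_vram_gb)   # 3220 4 64
```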
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: pascal76 on December 12, 2025, 10:39:18
FYI, I found this comment in a YT video

"I ordered a P1 Gen 8 on release with a 265h, 64gb ram and no dedicated GPU because I wanted best battery life and lowest noise on this machine. Sent it back after a day, the fan is not just "you notice it", it's atrociously loud. Even in "everyday tasks" or on idle the fan kicks in all the time, extremely annoying."
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: anzej on December 13, 2025, 15:43:17
I'd strongly recommend choosing a model with the dGPU, even if you don't need it. The non-dGPU models come with a smaller heatsink without liquid metal that fails to cover all the VRMs. This causes VRM overheating, which severely limits CPU package power.

You can verify this yourself by comparing spare part images for 5H41R89131 (non-dGPU) versus 5H41R89134 (dGPU).

I experienced this exact problem on my P1 G5 without a dGPU, as it could only sustain 38W CPU package power long-term due to VRM thermal issues. Lenovo Premier Support confirmed the design flaw and offered a full refund, explicitly recommending I purchase a dGPU-equipped unit instead.

TL;DR: Get the dGPU model regardless of your graphics needs if you want to get decent sustained CPU performance.
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: Pascal76 on December 13, 2025, 23:28:14
Well, I bought the P14s Gen 6 Intel with no dGPU and the 265H CPU => no noise :) ... but a 14.5-inch screen :(
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: veraverav on December 13, 2025, 23:51:21
Quote from: pascal76 on December 12, 2025, 10:39:18
FYI, I found this comment in a YT video

"I ordered a P1 Gen 8 on release with a 265h, 64gb ram and no dedicated GPU because I wanted best battery life and lowest noise on this machine. Sent it back after a day, the fan is not just "you notice it", it's atrociously loud. Even in "everyday tasks" or on idle the fan kicks in all the time, extremely annoying."


"After a day"... I'm guessing updates were running in the background, or the power slider was set to "best performance", which pegs the CPU.

There is a great report on Reddit from someone with the 265H w/ dGPU and OLED getting something like 12 hrs of mixed use on battery.
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: veraverav on December 13, 2025, 23:52:50
Quote from: anzej on December 13, 2025, 15:43:17
I'd strongly recommend choosing a model with the dGPU, even if you don't need it. The non-dGPU models come with a smaller heatsink without liquid metal that fails to cover all the VRMs. This causes VRM overheating, which severely limits CPU package power.

You can verify this yourself by comparing spare part images for 5H41R89131 (non-dGPU) versus 5H41R89134 (dGPU).

I experienced this exact problem on my P1 G5 without a dGPU, as it could only sustain 38W CPU package power long-term due to VRM thermal issues. Lenovo Premier Support confirmed the design flaw and offered a full refund, explicitly recommending I purchase a dGPU-equipped unit instead.

TL;DR: Get the dGPU model regardless of your graphics needs if you want to get decent sustained CPU performance.

Didn't know this, but I'm glad I didn't go for the iGPU-only model. The 45%+ discount was on the dGPU model, so Lenovo made the correct choice for me :) Thanks for posting this. Learned something new.
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: pelican-freeze on December 14, 2025, 14:48:36
Are there any issues with the non-standard 3.2K resolution on a laptop intended for workstation use? I like the idea of a tandem OLED, but at 3.2K it seems like it might be more of a headache than anything else. At 1440p you get better battery life, cleaner application compatibility, and more flexibility for some light gaming, while at 3.2K you lose the battery life benefits of a lower resolution and risk app compatibility issues, without getting the content-consumption advantages of a proper 4K panel.

If the appeal of a tandem OLED workstation is being able to watch 4K HDR Netflix in your downtime, any sub‑4K display is still going to get the same 1080p stream (just like an iPad or Android tablet). To get true 4K HDR streaming you need a full 4K, HDCP 2.2‑compliant display; otherwise, you're capped at 1080p. So a 3.2k tandem OLED is objectively a poor solution for content consumption.

I'm interested in this laptop, but I'm concerned about weird interface scaling in professional apps that almost certainly haven't been tested for usability at 3200×2000. And a tandem OLED that isn't suited for content consumption feels like an awkward choice when Lenovo already offers such bright IPS panels: with the 3.2K tandem OLED you get poor battery life and poor content consumption, but no material increase in brightness for bright environments on the go, since the P1 G8's 4K IPS option already offers 800 nits of SDR brightness. In practical SDR use, the IPS panel is likely to be even brighter and offer even better visibility.

Like I'm a huge P1 / X1 Extreme fan and was excited about tandem OLED, but I can't figure out any use case where this 3.2K display makes sense, since it appears to offer a big bag of frustrating compromises without any material benefits: bad for app compatibility, resolution too high for good battery life, too high for gaming, not high enough for good content consumption, no advantage over the bright IPS option for outdoor use, etc.

For those considering the P16 G3: you can configure CPUs/GPUs that, on paper, should scale to higher performance with more wattage, but the AC adapter is only 180W (vs. the P1 G8's 140W). That strongly suggests the P16 G3 chassis and power delivery are not designed to let parts like Intel HX CPUs or 5070 Ti/5080/5090‑class GPUs (RTX Pro 3000/4000/5000) run anywhere near their maximum performance. If your main interest in 5080/5090‑class GPUs is the extra NVENC encoders, the P16 G3 still looks great—much lighter and more compact than the P16 G2, with the tradeoff of lower cooling capacity, a smaller total power envelope, and reduced peak performance.

The only true 16‑inch performance workstation left this year is the Dell Pro Max 16 Plus (not Premium), since it ships with a 280W AC adapter and a much more performance‑oriented triple‑fan, vapor‑chamber cooling system. So if you care primarily about performance, the sweet spot is the Dell Pro Max 16 Plus with a 275HX (24 threads, fed full wattage) and a 5070 Ti‑class GPU (RTX 3000, at a full 175W), with the higher GPU tiers really only making sense for heavy video work. It's definitely not as compact as the P16 G3 and it's about a pound heavier once you factor in the adapter, but that seems like a very reasonable trade for the additional performance considering that there isn't really any competition at that mobile performance tier anymore.
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: pelican-freeze on December 14, 2025, 15:28:27
Quote from: veraverav on December 11, 2025, 19:24:06
Quote from: 2k for 8GB VRAM gg on December 11, 2025, 10:08:06
Quote: 8 GB VRAM
2000 for only 8 GB VRAM? Nice trolling.
Even games have a problem with only 8 GB VRAM: youtube.com/watch?v=ric7yb1VaoA: "Gaming Laptops are in Trouble - VRAM Testing w/ ‪@Hardwareunboxed‬"
Most big games are made for consoles first in mind and the PS5 has 16 GB VRAM, minus 4 GB for the OS, and games expect your GPU to have at least 12 GB VRAM.
Running local LLMs / AI has been a thing for a few years now, using llama.cpp and its webUI is all you need. A LLM can be fully loaded into the GPU's VRAM or, if the LLM can't fit, parts of it can be offloaded to system RAM. This laptop has 32 GB RAM + 8 GB VRAM. Small and better capable, big open-weights LLMs exist and the more RAM+VRAM your PC has, the better. Every GB helps. So, from 8 GB to 12 GB to 16 GB VRAM would already be a good to very good improvement.

But its not really meant to be a gaming laptop. Would plugging in an eGPU resolve this bottleneck for someone that absolutely has to game?

Plugging in an eGPU wouldn't resolve the issue, as the P1 G8's CPU options are all low-wattage chips with a maximum of 6 performance cores. For an eGPU to make sense, on the Intel side you'd need at least a 255HX (8P+12E), or on the AMD side a chip like a Ryzen AI 9 HX 370 (4 Zen 5 + 8 Zen 5c cores). It's hard to overstate how poorly today's lower-power mobile CPUs fit gaming; mobile GPUs are cut down so dramatically from their desktop counterparts that this matters less when pairing mobile CPUs with mobile GPUs. Pairing a mobile CPU with a desktop GPU, you're going to run into serious bottlenecks if you don't pick the right chip: at least 8 performance cores, more watts, and more ability to scale performance when fed those extra watts.

TLDR: If you want the option to use an eGPU with a ThinkPad workstation, buy a P16 Gen 3 with a 255HX (or higher). The P16 Gen 3 *should* offer even better CPU performance when it doesn't need to allocate thermal headroom to cooling an internal discrete GPU. This would be the best fit: a high-wattage, desktop-ish-class CPU actually running reasonably close to max wattage in the laptop itself, and an eGPU also running close to max wattage outside the laptop chassis.
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: Worgarthe on December 14, 2025, 16:34:16
Quote from: pelican-freeze on December 14, 2025, 14:48:36
Are there any issues with the non-standard 3.2K resolution on a laptop intended for workstation use? I like the idea of a tandem OLED, but at 3.2K it seems like it might be more of a headache than anything else. At 1440p you get better battery life, cleaner application compatibility, and more flexibility for some light gaming, while at 3.2K you lose the battery life benefits of a lower resolution and risk app compatibility issues, without getting the content-consumption advantages of a proper 4K panel.
There is absolutely nothing wrong with this resolution (or any other); scaling exists for that exact reason. Increased power consumption comes from the panel tech, not the resolution - a 16" 1200p 120 Hz OLED is going to consume far more power than a 16" 2400p 120 Hz IPS, for example.

Quote from: pelican-freeze on December 14, 2025, 15:28:27
Plugging in an eGPU wouldn't resolve the issue as the P1 G8's CPU options are all low wattage chips configured with a maximum of 6 performance cores. For an eGPU to make sense on the Intel side you'd need at least a 255HX (8P+12E), or, on the AMD side a chip like a Ryzen AI 9 HX 370 (8P+4E). It's hard to overstate the degree to which lower power mobile CPUs right now are not a good fit for gaming - mobile GPUs are just cut down so dramatically from their desktop GPU counterparts though that this is less of an issue when pairing mobile CPUs with mobile GPUs. Pairing mobile CPUs with desktop GPUs you're going to run into serious bottlenecks if you don't pick the right chip - at least 8 performance cores, more watts, more performance scaling ability when feds those extra watts etc..

TLDR: If you want the option to use an eGPU with a thinkpad workstation buy a P16 Gen 3 with a 255HX (or higher). The P16 Gen 3 *should* offer even better CPU performance when it doesn't need to allocate thermal headroom to cooling the internal discreet GPU. This would be the best fit - high wattage desktop-ish class CPU actually running at reasonably high / close to max wattage in the laptop itself, eGPU also running at reasonably high / close to max wattage outside the laptop chassis.
Your knowledge of eGPUs and how they work is clearly somewhere between zero and nothing, nhf. Literally this whole comment is wrong.

Check egpu.io for more info and to learn something instead of being confidently wrong: https://egpu.io/best-external-graphics-card-builds/ (https://egpu.io/best-external-graphics-card-builds/)

Or r/eGPU: https://www.reddit.com/r/eGPU/ (https://www.reddit.com/r/eGPU/)

Or a 30 Watt CPU handheld - Legion Go with eGPU (fairly popular combo): https://www.youtube.com/watch?v=5opYdgDtK0s (https://www.youtube.com/watch?v=5opYdgDtK0s)

Or, heck, even a 5-year-old ThinkPad with a 15 W CPU (with UHD 620 integrated graphics) + RTX 4090: https://www.youtube.com/watch?v=EOsGqAeyCtA (https://www.youtube.com/watch?v=EOsGqAeyCtA)

Or... Basically I can keep going endlessly.

Tl;dr: the truth is the polar opposite of your comment.

Edit: Forgot to include this, from Notebookcheck:

X1 Carbon Gen 6 (15W CPU) tested with eGPU, back in 2018: https://www.notebookcheck.net/Lenovo-ThinkPad-X1-Carbon-2018-WQHD-HDR-i7-Laptop-Review.284682.0.html (https://www.notebookcheck.net/Lenovo-ThinkPad-X1-Carbon-2018-WQHD-HDR-i7-Laptop-Review.284682.0.html)
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: pelican-freeze on December 14, 2025, 19:00:29
Quote from: Worgarthe on December 14, 2025, 16:34:16
Quote from: pelican-freeze on December 14, 2025, 14:48:36
Are there any issues with the non-standard 3.2K resolution on a laptop intended for workstation use? I like the idea of a tandem OLED, but at 3.2K it seems like it might be more of a headache than anything else. At 1440p you get better battery life, cleaner application compatibility, and more flexibility for some light gaming, while at 3.2K you lose the battery life benefits of a lower resolution and risk app compatibility issues, without getting the content-consumption advantages of a proper 4K panel.
There is absolutely nothing wrong with this resolution (or any other). Scaling exist for that exact reason. Increased power consumption comes from the tech, not a resolution - a 16" 1200p 120 Hz OLED is going to consume far more power than a 16" 2400p 120 Hz IPS, for example.

Quote from: pelican-freeze on December 14, 2025, 15:28:27
Plugging in an eGPU wouldn't resolve the issue as the P1 G8's CPU options are all low wattage chips configured with a maximum of 6 performance cores. For an eGPU to make sense on the Intel side you'd need at least a 255HX (8P+12E), or, on the AMD side a chip like a Ryzen AI 9 HX 370 (8P+4E). It's hard to overstate the degree to which lower power mobile CPUs right now are not a good fit for gaming - mobile GPUs are just cut down so dramatically from their desktop GPU counterparts though that this is less of an issue when pairing mobile CPUs with mobile GPUs. Pairing mobile CPUs with desktop GPUs you're going to run into serious bottlenecks if you don't pick the right chip - at least 8 performance cores, more watts, more performance scaling ability when feds those extra watts etc..

TLDR: If you want the option to use an eGPU with a thinkpad workstation buy a P16 Gen 3 with a 255HX (or higher). The P16 Gen 3 *should* offer even better CPU performance when it doesn't need to allocate thermal headroom to cooling the internal discreet GPU. This would be the best fit - high wattage desktop-ish class CPU actually running at reasonably high / close to max wattage in the laptop itself, eGPU also running at reasonably high / close to max wattage outside the laptop chassis.
You knowledge about eGPU and how eGPU work is clearly somewhere between zero and nothing, nhf. Literally this whole comment is wrong.

Check egpu.io for more info and to learn something instead of being confidently wrong: https://egpu.io/best-external-graphics-card-builds/ (https://egpu.io/best-external-graphics-card-builds/)

Or r/eGPU: https://www.reddit.com/r/eGPU/ (https://www.reddit.com/r/eGPU/)

Or a 30 Watt CPU handheld - Legion Go with eGPU (fairly popular combo): https://www.youtube.com/watch?v=5opYdgDtK0s (https://www.youtube.com/watch?v=5opYdgDtK0s)

Or, heck, even an 5-year old ThinkPad with a 15W CPU (with UHD 620 integrated graphics) + RTX 4090: https://www.youtube.com/watch?v=EOsGqAeyCtA (https://www.youtube.com/watch?v=EOsGqAeyCtA)

Or... Basically I can keep going endlessly.

Tl; dr - the truth is polar opposite of your comment.

Edit: Forgot to include this, from Notebookcheck:

X1 Carbon Gen 6 (15W CPU) tested with eGPU, back in 2018: https://www.notebookcheck.net/Lenovo-ThinkPad-X1-Carbon-2018-WQHD-HDR-i7-Laptop-Review.284682.0.html (https://www.notebookcheck.net/Lenovo-ThinkPad-X1-Carbon-2018-WQHD-HDR-i7-Laptop-Review.284682.0.html)


Honestly, just read what I wrote? Nothing in your comment appears to be a direct response to anything I said. Obviously you can game with lower wattage CPUs and eGPUs - you'll just get worse performance. And obviously if you want to avoid your eGPU being performance limited / bottlenecked by your CPU choice you can just choose a CPU that is a better fit for gaming.

Spending $$$$ on a high wattage desktop GPU that you know will be performance constrained by a low wattage mobile CPU is clearly not going to provide the best experience but no one is stopping you from doing it if you want. Also, you can't put a higher wattage mobile CPU intended for gaming use (255HX+) into a 4 lb ultraportable laptop so if your goal is "ultraportable laptop that can play games at home" that rules out the higher wattage enthusiast class CPUs that are designed for gaming / workstation laptop use.

So there's nothing stopping you from using an eGPU / mobile CPU combo where the CPU is the performance bottleneck. It's clearly not possible to argue though that the best mobile CPU choices for gaming aren't the higher wattage mobile CPUs *explicitly* designed and marketed by Intel / AMD as mobile gaming CPUs.
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: Worgarthe on December 14, 2025, 19:37:09
Quote from: pelican-freeze on December 14, 2025, 19:00:29
Honestly, just read what I wrote? Nothing in your comment appears to be a direct response to anything I said. Obviously you can game with lower wattage CPUs and eGPUs - you'll just get worse performance. And obviously if you want to avoid your eGPU being performance limited / bottlenecked by your CPU choice you can just choose a CPU that is a better fit for gaming.
Worse performance, yet still significantly better than what you get gaming on the iGPU or dGPU in that same laptop (provided the GPU in the eGPU dock/enclosure is more powerful, naturally).

Quote from: pelican-freeze on December 14, 2025, 19:00:29
Spending $$$$ on a high wattage desktop GPU that you know will be performance constrained by a low wattage mobile CPU is clearly not going to provide the best experience but no one is stopping you from doing it if you want. Also, you can't put a higher wattage mobile CPU intended for gaming use (255HX+) into a 4 lb ultraportable laptop so if your goal is "ultraportable laptop that can play games at home" that rules out the higher wattage enthusiast class CPUs that are designed for gaming / workstation laptop use.


Quote from: pelican-freeze on December 14, 2025, 19:00:29So there's nothing stopping you from using an eGPU / mobile CPU combo where the CPU is the performance bottleneck. It's clearly not possible to argue though that the best mobile CPU choices for gaming aren't the higher wattage mobile CPUs *explicitly* designed and marketed by Intel / AMD as mobile gaming CPUs.
The CPU is always a bottleneck, regardless of TDP. If you check any of the links above, for example the video with the Legion Go + 3070 eGPU, you will see that CPU usage is basically identical in all scenarios, with or without the eGPU. As the video clearly demonstrates, in Final Fantasy 7 Rebirth at 1080p low the CPU sits at 21-24% usage to produce a 26 fps average on the iGPU, yet once the 3070 eGPU is plugged in it averages 75 fps at 1440p high settings with CPU usage at exactly the same 21-24%.

I have a 5070 Ti. When I try to play Shadow of the Tomb Raider, for example (I'm actually replaying the reboot series right now), on my X1 Carbon Gen 9 with its iGPU (i7-1165G7, Iris Xe) at the absolute lowest settings at 720p, I get barely 30 fps with 55-60% CPU usage. When I plug in the 5070 Ti I get 160+ fps at maxed settings without ray tracing and 80-100 with ray tracing, all at 1440p, and the CPU is still at 55-60% usage.

That's more than I get with my P16 Gen 2 (i7-14700HX + Ada 3500 12 GB), which manages 110-120 fps at the same maxed settings without RT, and about 55-70 with RT on. With the eGPU plugged into that same P16 Gen 2, the same 5070 Ti pushes around 170 fps at maxed settings, and around 100-110 with ray tracing on. All at 1440p, of course.

Is a faster CPU faster than a slower CPU? Yes. Does wasting insane amounts of money on top specs make sense when a 750€-ish GPU is still going to obliterate those specs, even with the slightly reduced bandwidth of an eGPU setup? Well, that's up to each person to decide how much they hate their own money ¯\_(ツ)_/¯

Again, for clarity:

Quote from: Worgarthe on December 12, 2025, 04:44:47I just checked prices in Germany, but for the P16 Gen 3 because it is possible to equip it with up to 24 GB VRAM (Blackwell 5000).

The base version 8 GB (Blackwell 1000) config goes for 2819 € currently (https://www.lenovo.com/de/de/configurator/cto/index.html?bundleId=21RQCTO1WWDE1). Prices for GPU upgrades are the following:

  • 2000 Blackwell (8 GB) +230€
  • 3000 Blackwell (12 GB) +790€
  • 4000 Blackwell (16 GB) +1420€
  • 5000 Blackwell (24 GB) +2980€ (😂)

The rest of the specs are untouched from the base config, so 245HX + 16 GB RAM + 512 GB SSD, with only the display automatically upgraded to a 2400p panel (there's no option to keep the base 1200p panel with that GPU).

So I added just that GPU and literally nothing else, and the laptop is now 6039€; that's an increase of +3220€ (!!) just because of the 24 GB GPU!

Let's see how much VRAM we can get with 3220€, to put that in an eGPU setup while keeping the P16 Gen 3 at its base price: https://www.idealo.de/preisvergleich/OffersOfProduct/205942083_-geforce-rtx-5070-ti-gigabyte.html (https://www.idealo.de/preisvergleich/OffersOfProduct/205942083_-geforce-rtx-5070-ti-gigabyte.html)

3220/759 ≈ 4, so four RTX 5070 Tis, meaning 64 GB of VRAM; that's 40 GB more for the same price. The 5000 Blackwell is similar in performance to a 4070 Super or 3080 Ti (https://www.techpowerup.com/gpu-specs/rtx-pro-5000-blackwell-embedded.c4280). The 5070 Ti is simply far ahead performance-wise, and with four of them at a full 300W each, it's even more tragic to compare what you get for the same amount of money... ¯\_(ツ)_/¯
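For what it's worth, the arithmetic in that post checks out; here's a throwaway sketch of the same sums (all euro figures are the ones quoted in the thread, not official pricing):

```python
# Sanity-checking the upgrade-cost arithmetic from this post.
# All euro figures are the ones quoted in the thread, not official pricing.
base_config = 2819       # P16 Gen 3 base (RTX Pro 1000 Blackwell, 8 GB VRAM)
maxed_gpu_config = 6039  # same config with RTX Pro 5000 Blackwell (24 GB)
surcharge = maxed_gpu_config - base_config  # 3220 € (GPU + forced display bump)

egpu_card_price = 759    # quoted street price of one desktop RTX 5070 Ti (16 GB)
cards = surcharge // egpu_card_price        # how many 5070 Tis for the same money
total_vram = cards * 16

print(surcharge, cards, total_vram)         # 3220 4 64
```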
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: pelican-freeze on December 14, 2025, 19:57:35
Quote from: Worgarthe on December 14, 2025, 19:37:09
Quote from: pelican-freeze on December 14, 2025, 19:00:29Honestly, just read what I wrote? Nothing in your comment appears to be a direct response to anything I said. Obviously you can game with lower wattage CPUs and eGPUs - you'll just get worse performance. And obviously if you want to avoid your eGPU being performance limited / bottlenecked by your CPU choice you can just choose a CPU that is a better fit for gaming.
Worse performance, yet still significantly better than what you get with gaming on iGPU or dGPU in that same laptop (if a GPU in an eGPU dock/enclosure is more powerful, naturally).

Quote from: pelican-freeze on December 14, 2025, 19:00:29Spending $$$$ on a high wattage desktop GPU that you know will be performance constrained by a low wattage mobile CPU is clearly not going to provide the best experience but no one is stopping you from doing it if you want. Also, you can't put a higher wattage mobile CPU intended for gaming use (255HX+) into a 4 lb ultraportable laptop so if your goal is "ultraportable laptop that can play games at home" that rules out the higher wattage enthusiast class CPUs that are designed for gaming / workstation laptop use.
  • https://www.notebookchat.com/index.php?topic=256048.msg694421#msg694421 (https://www.notebookchat.com/index.php?topic=256048.msg694421#msg694421)
  • https://www.notebookchat.com/index.php?topic=256048.msg694430#msg694430 (https://www.notebookchat.com/index.php?topic=256048.msg694430#msg694430)


Quote from: pelican-freeze on December 14, 2025, 19:00:29So there's nothing stopping you from using an eGPU / mobile CPU combo where the CPU is the performance bottleneck. It's clearly not possible to argue though that the best mobile CPU choices for gaming aren't the higher wattage mobile CPUs *explicitly* designed and marketed by Intel / AMD as mobile gaming CPUs.
The CPU is always a bottleneck, regardless of TDP. If you check any of the links above, for example the video with the Legion Go + 3070 eGPU, you will see that CPU usage is basically identical in all scenarios, with or without the eGPU. As the video clearly demonstrates, in Final Fantasy 7 Rebirth at 1080p low the CPU sits at 21-24% usage to produce a 26 fps average on the iGPU, yet once the 3070 eGPU is plugged in it averages 75 fps at 1440p high settings with CPU usage at exactly the same 21-24%.

I have a 5070 Ti. When I try to play Shadow of the Tomb Raider, for example (I'm actually replaying the reboot series right now), on my X1 Carbon Gen 9 with its iGPU (i7-1165G7, Iris Xe) at the absolute lowest settings at 720p, I get barely 30 fps with 55-60% CPU usage. When I plug in the 5070 Ti I get 160+ fps at maxed settings without ray tracing and 80-100 with ray tracing, all at 1440p, and the CPU is still at 55-60% usage.

That's more than I get with my P16 Gen 2 (i7-14700HX + Ada 3500 12 GB), which manages 110-120 fps at the same maxed settings without RT, and about 55-70 with RT on. With the eGPU plugged into that same P16 Gen 2, the same 5070 Ti pushes around 170 fps at maxed settings, and around 100-110 with ray tracing on. All at 1440p, of course.

Is a faster CPU faster than a slower CPU? Yes. Does wasting insane amounts of money on top specs make sense when a 750€-ish GPU is still going to obliterate those specs, even with the slightly reduced bandwidth of an eGPU setup? Well, that's up to each person to decide how much they hate their own money ¯\_(ツ)_/¯

Again, for clarity:

Quote from: Worgarthe on December 12, 2025, 04:44:47I just checked prices in Germany, but for the P16 Gen 3 because it is possible to equip it with up to 24 GB VRAM (Blackwell 5000).

The base version 8 GB (Blackwell 1000) config goes for 2819 € currently (https://www.lenovo.com/de/de/configurator/cto/index.html?bundleId=21RQCTO1WWDE1). Prices for GPU upgrades are the following:

  • 2000 Blackwell (8 GB) +230€
  • 3000 Blackwell (12 GB) +790€
  • 4000 Blackwell (16 GB) +1420€
  • 5000 Blackwell (24 GB) +2980€ (😂)

The rest of the specs are untouched from the base config, so 245HX + 16 GB RAM + 512 GB SSD, with only the display automatically upgraded to a 2400p panel (there's no option to keep the base 1200p panel with that GPU).

So I added just that GPU and literally nothing else, and the laptop is now 6039€; that's an increase of +3220€ (!!) just because of the 24 GB GPU!

Let's see how much VRAM we can get with 3220€, to put that in an eGPU setup while keeping the P16 Gen 3 at its base price: https://www.idealo.de/preisvergleich/OffersOfProduct/205942083_-geforce-rtx-5070-ti-gigabyte.html (https://www.idealo.de/preisvergleich/OffersOfProduct/205942083_-geforce-rtx-5070-ti-gigabyte.html)

3220/759 ≈ 4, so four RTX 5070 Tis, meaning 64 GB of VRAM; that's 40 GB more for the same price. The 5000 Blackwell is similar in performance to a 4070 Super or 3080 Ti (https://www.techpowerup.com/gpu-specs/rtx-pro-5000-blackwell-embedded.c4280). The 5070 Ti is simply far ahead performance-wise, and with four of them at a full 300W each, it's even more tragic to compare what you get for the same amount of money... ¯\_(ツ)_/¯

I feel like you're making my point for me. Doesn't your post imply that the best option would be a P16 Gen 3 with a 255HX gaming / workstation CPU and the cheapest GPU option (1000 Blackwell), combined with a desktop 5070 Ti in an eGPU setup?

At no point have I said that a high wattage eGPU setup won't provide better performance than a low wattage discrete mobile GPU, even when connected to an ultraportable laptop.

Also, if you primarily want to play older titles like Shadow of the Tomb Raider that were released before the current console generation, then it goes without saying that they will be much less CPU-intensive than current-generation games. I can absolutely see the CPU bottleneck being less of an issue if you primarily play games released before 2020, i.e. before the PlayStation 5 and Xbox Series X both shipped with 8 performance CPU cores.

But why would you pay $$$$ for a desktop 5070 Ti if you only wanted to play older games?
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: Worgarthe on December 14, 2025, 21:24:27
Quote from: pelican-freeze on December 14, 2025, 19:57:35I feel like you're making my point for me. Reading your post implies that the best option would be a P16 Gen 3 with a 255HX gaming / workstation CPU and the cheapest GPU option (1000 Blackwell) combined with a desktop 5070 TI in an eGPU setup?
The best option is any OCuLink or Thunderbolt 5 laptop, and there aren't that many of them to pick from at the moment. So yes, the P16 Gen 3 is one of those few, that's correct. The best one is the one with the best thermal picture, because if it can't hold stable PL1 wattage and frequencies the performance will be all over the place, which is why the Razer Blade 16 is probably quite meh; its CPU frequency graph looks like a heart monitor. Can't wait for an in-depth P16 Gen 3 review though!

Quote from: pelican-freeze on December 14, 2025, 19:57:35At no point have I said that a high wattage eGPU setup won't provide better performance than a low wattage discrete mobile GPU, even when connected to an ultraportable laptop.

Also, if you primarily want to play older titles like Shadow of the Tomb Raider that were released before the current console generation, then it goes without saying that they will be much less CPU-intensive than current-generation games. I can absolutely see the CPU bottleneck being less of an issue if you primarily play games released before 2020, i.e. before the PlayStation 5 and Xbox Series X both shipped with 8 performance CPU cores.
Yes, you are correct with this, the part in bold especially. Six cores have proven to be enough for all games, in laptops and desktops alike. The 9600X is a monster of a CPU yet inexpensive, trailing just about 10% behind the 9800X3D, which is more expensive but slightly faster thanks to its large cache (96 MB L3 vs 32 MB L3 in the 9600X). But compare the 8-core 9700X to that same 6-core 9600X and they are pretty much identical, down to a single fps, like 165 fps in Battlefield 6 with the 9700X vs 164 fps with the 9600X. If two more cores are needed to get just 1 extra fps...

The same applies to laptops: more cache beats more cores, as long as there are six performance cores. TDP is insignificant as long as it's a fairly modern architecture with modern/fast IPC and it can hold PL1 without thermal throttling. You absolutely won't get any significant in-game difference (no matter how intensive a game is) between a flat 30-35W PL1 6-8 core CPU and a flat 90W PL1 20-core HX CPU. Again, if getting, say, a 220 fps average with 98 fps 1% lows instead of a 207 fps average with 96 fps 1% lows justifies paying a huge premium just to play games, that's up to you, but you simply won't notice any difference outside of benchmarks and the Afterburner overlay. For other, more important stuff outside of gaming (local LLMs, rendering, video production etc.), a faster and more powerful CPU with as many cores as possible is clearly the far better pick, even with an eGPU; one can't argue against that. But for games, six fast(er) cores beat, say, 8-12 slow(er) cores, any day, any game.

Quote from: pelican-freeze on December 14, 2025, 19:57:35But why would you pay $$$$ for a desktop 5070 TI if you only wanted to play older games?
I play many games just fine, actually; I just finished Doom: The Dark Ages recently after getting it at a nice 50% discount on Black Friday. That's one of the most demanding games of 2025. It "ran" at a literal 2 fps on my X1 Carbon with its iGPU, but was a decent 45-ish fps experience with the 5070 Ti at 1440p DLSS Q. An older 4-core CPU with lower IPC and lower clocks in general simply struggles to push more than that, even when given more watts (PL1 set to 29W instead of 22). It was at 95-100% usage at both 2 fps and 45-ish fps, btw 😁

My P16 G2 was running it natively (1600p DLSS Q) at about 43-48 fps with its Ada 3500. Impressive? Meh, not really, given the TDP, TGP and overall performance difference versus my three-year-older X1 Carbon. But with the 5070 Ti eGPU it pushed 88-96 fps at maxed 1440p settings without DLSS, and 210+ fps maxed at 1440p with Frame Gen (didn't test FG on the X1C though). The 14700HX sat at about 50% usage in both cases (Ada 3500 and eGPU 5070 Ti). That's basically identical performance to a 9800X3D + 5070 Ti desktop.

And to finish this rambling: the absolute top config for the P16 Gen 2 would come with an RTX 5000 (also Ada, not Blackwell). That would be insanely costly to get, roughly twice what I paid for mine, and it would still get obliterated in games by a much cheaper (~780€ when I bought it) 5070 Ti in an eGPU enclosure ¯\_(ツ)_/¯

To answer your question about older game(s): two new Tomb Raider games were announced two days ago - https://store.steampowered.com/news/app/203160/view/559139291408630486 (https://store.steampowered.com/news/app/203160/view/559139291408630486) - and as a fan of the series I went back to replay Rise (the 2016 game) and Shadow (the 2018 game) while waiting for the upcoming 2026 title. I even replayed the whole of Half-Life 1 and Half-Life 2 recently because of those intensified Half-Life 3 rumours, and those can run at super-high fps on probably any Celeron or Atom with the weakest of iGPUs 🙂
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: veraverav on December 14, 2025, 21:35:27
Quote from: pelican-freeze on December 14, 2025, 14:48:36Are there any issues with the non-standard 3.2K resolution on a laptop intended for workstation use? I like the idea of a tandem OLED, but at 3.2K it seems like it might be more of a headache than anything else. At 1440p you get better battery life, cleaner application compatibility, and more flexibility for some light gaming, while at 3.2K you lose the battery life benefits of a lower resolution and risk app compatibility issues, without getting the content-consumption advantages of a proper 4K panel.

If the appeal of a tandem OLED workstation is being able to watch 4K HDR Netflix in your downtime, any sub‑4K display is still going to get the same 1080p stream (just like an iPad or Android tablet). To get true 4K HDR streaming you need a full 4K, HDCP 2.2‑compliant display; otherwise, you're capped at 1080p. So a 3.2k tandem OLED is objectively a poor solution for content consumption.

I'm interested in this laptop, but I'm concerned about weird interface scaling in professional apps that almost certainly haven't been tested for usability at 3200×2000. And having a tandem OLED that isn't suited for content consumption just feels like an awkward choice when Lenovo already offers such bright IPS panels - like with the 3.2k tandem OLED you get poor battery life and poor content consumption, but without a material increase in brightness to improve display visibility when in bright environments on the go since the P1 G8's 4k IPS panel option already offers 800 nits of SDR brightness. In practical SDR use the IPS panel is likely to be even brighter and offer even better visibility?

Like I'm a huge P1 / X1 Extreme fan and was excited about tandem OLED, but I can't figure out any use case where this 3.2k display makes sense since it appears to offer a big bag of frustrating compromises without any material benefits - bad for app compatibility, too high resolution for good battery life, too high for gaming, not high enough for good content consumption, no advantage over bright IPS option for outdoor use etc..

For those considering the P16 G3: you can configure CPUs/GPUs that, on paper, should scale to higher performance with more wattage, but the AC adapter is only 180W (vs. the P1 G8's 140W). That strongly suggests the P16 G3 chassis and power delivery are not designed to let parts like Intel HX CPUs or 5070 Ti/5080/5090‑class GPUs (RTX Pro 3000/4000/5000) run anywhere near their maximum performance. If your main interest in 5080/5090‑class GPUs is the extra NVENC encoders, the P16 G3 still looks great—much lighter and more compact than the P16 G2, with the tradeoff of lower cooling capacity, a smaller total power envelope, and reduced peak performance.

The only true 16‑inch performance workstation left this year is the Dell Pro Max 16 Plus (not Premium), since it ships with a 280W AC adapter and a much more performance‑oriented triple‑fan, vapor‑chamber cooling system. So if you care primarily about performance, the sweet spot is the Dell Pro Max 16 Plus with a 275HX (24 threads, fed full wattage) and a 5070 Ti‑class GPU (RTX 3000, at a full 175W), with the higher GPU tiers really only making sense for heavy video work. It's definitely not as compact as the P16 G3 and it's about a pound heavier once you factor in the adapter, but that seems like a very reasonable trade for the additional performance considering that there isn't really any competition at that mobile performance tier anymore.

Actually, the 3.2K screen run at 200% scaling will yield a better picture than 4K at 250% scaling.

So far I've read zero complaints about watching 4K content on a 3.2K screen or anything like that. I guess someone with this laptop can chime in if they're unable to get a 4K stream because it's a 3.2K screen. Everyone seems to love the 3.2K OLED screen.

As for the eGPU, it's simply to get around the 8 GB VRAM bottleneck without having to haul around something as large and heavy as the P16.
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: Worgarthe on December 14, 2025, 21:47:40
Quote from: veraverav on December 14, 2025, 21:35:27Actually the 3.2K screen run @ 200% scaling will yield a better picture than 4K @ 250% scaling.
Correct. That's 1600x1000 (3.2K at 200%) vs 1536x960 (4K at 250%). Sure, the 4K is a bit sharper, but it's not really noticeable from any normal viewing distance. If someone is touching the screen with their nose, they will probably notice some pixels here and there on the 3.2K, hehe. Another benefit of 3.2K is running it at 125% scaling, which gives effectively the same real estate as 2560x1600 with a 25% sharper image. 25% doesn't sound like much, but if you work with text it helps a lot, as it's easier on the eyes no matter how good your eyesight is.
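Those workspace figures are just native resolution divided by the scale factor; a quick sketch for anyone who wants to check other combinations (the function name is made up for illustration):

```python
def effective_workspace(native_w, native_h, scale_pct):
    """Logical desktop size at a given Windows display scaling percentage."""
    return native_w * 100 // scale_pct, native_h * 100 // scale_pct

print(effective_workspace(3200, 2000, 200))  # (1600, 1000)  3.2K @ 200%
print(effective_workspace(3840, 2400, 250))  # (1536, 960)   4K @ 250%
print(effective_workspace(3200, 2000, 125))  # (2560, 1600)  3.2K @ 125%
```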

Quote from: veraverav on December 14, 2025, 21:35:27So far I've read zero complaints about watching 4k content on a 3.2k screen or anything like that. I guess someone with this laptop can chime in if they are unable to get a 4k stream because its a 3.2k screen. Everyone seems to love the 3.2k OLED screen.
I don't have this exact panel, but I owned a Surface Pro 8 for a few months with a 2880x1920 display, so not the same resolution but very close. No issues with 4K, it all worked great (I replaced the Surface with an X1 Carbon Gen 9 though).

Quote from: veraverav on December 14, 2025, 21:35:27As for the eGPU, it simply to get around the 8GB VRAM bottleneck without having to haul around something as large and heavy as the P16.
This 👍 You pay less for an eGPU dock/enclosure plus a normal desktop GPU, and you get more performance (for far less money) than that same laptop would deliver with its dGPU maxed out.

I mean, again, and this is now a rhetorical question: why in the world would one pay +2980€ to go from 245HX + Blackwell 1000 to 245HX + Blackwell 5000 (that previously linked P16 Gen 3 configuration on Lenovo's site) when a 740-780€ 5070 Ti is simply going to destroy that Blackwell 5000? ¯\_(ツ)_/¯
Sure, if they simply NEED to play Battlefield 6 on battery during their commute, then ok, go crazy and get that Blackwell 5000, or save some money and go with the Blackwell 4000 (16 GB) for "just" +1420€ (and get obliterated even harder by that 740-780€ 5070 Ti), but yeah...
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: pelican-freeze on December 15, 2025, 18:48:37
Quote from: veraverav on December 14, 2025, 21:35:27
Quote from: pelican-freeze on December 14, 2025, 14:48:36Are there any issues with the non-standard 3.2K resolution on a laptop intended for workstation use? I like the idea of a tandem OLED, but at 3.2K it seems like it might be more of a headache than anything else. At 1440p you get better battery life, cleaner application compatibility, and more flexibility for some light gaming, while at 3.2K you lose the battery life benefits of a lower resolution and risk app compatibility issues, without getting the content-consumption advantages of a proper 4K panel.

If the appeal of a tandem OLED workstation is being able to watch 4K HDR Netflix in your downtime, any sub‑4K display is still going to get the same 1080p stream (just like an iPad or Android tablet). To get true 4K HDR streaming you need a full 4K, HDCP 2.2‑compliant display; otherwise, you're capped at 1080p. So a 3.2k tandem OLED is objectively a poor solution for content consumption.

I'm interested in this laptop, but I'm concerned about weird interface scaling in professional apps that almost certainly haven't been tested for usability at 3200×2000. And having a tandem OLED that isn't suited for content consumption just feels like an awkward choice when Lenovo already offers such bright IPS panels - like with the 3.2k tandem OLED you get poor battery life and poor content consumption, but without a material increase in brightness to improve display visibility when in bright environments on the go since the P1 G8's 4k IPS panel option already offers 800 nits of SDR brightness. In practical SDR use the IPS panel is likely to be even brighter and offer even better visibility?

Like I'm a huge P1 / X1 Extreme fan and was excited about tandem OLED, but I can't figure out any use case where this 3.2k display makes sense since it appears to offer a big bag of frustrating compromises without any material benefits - bad for app compatibility, too high resolution for good battery life, too high for gaming, not high enough for good content consumption, no advantage over bright IPS option for outdoor use etc..

For those considering the P16 G3: you can configure CPUs/GPUs that, on paper, should scale to higher performance with more wattage, but the AC adapter is only 180W (vs. the P1 G8's 140W). That strongly suggests the P16 G3 chassis and power delivery are not designed to let parts like Intel HX CPUs or 5070 Ti/5080/5090‑class GPUs (RTX Pro 3000/4000/5000) run anywhere near their maximum performance. If your main interest in 5080/5090‑class GPUs is the extra NVENC encoders, the P16 G3 still looks great—much lighter and more compact than the P16 G2, with the tradeoff of lower cooling capacity, a smaller total power envelope, and reduced peak performance.

The only true 16‑inch performance workstation left this year is the Dell Pro Max 16 Plus (not Premium), since it ships with a 280W AC adapter and a much more performance‑oriented triple‑fan, vapor‑chamber cooling system. So if you care primarily about performance, the sweet spot is the Dell Pro Max 16 Plus with a 275HX (24 threads, fed full wattage) and a 5070 Ti‑class GPU (RTX 3000, at a full 175W), with the higher GPU tiers really only making sense for heavy video work. It's definitely not as compact as the P16 G3 and it's about a pound heavier once you factor in the adapter, but that seems like a very reasonable trade for the additional performance considering that there isn't really any competition at that mobile performance tier anymore.

Actually, the 3.2K screen run at 200% scaling will yield a better picture than 4K at 250% scaling.

So far I've read zero complaints about watching 4K content on a 3.2K screen or anything like that. I guess someone with this laptop can chime in if they're unable to get a 4K stream because it's a 3.2K screen. Everyone seems to love the 3.2K OLED screen.

As for the eGPU, it's simply to get around the 8 GB VRAM bottleneck without having to haul around something as large and heavy as the P16.


It's probably worth taking a moment to understand how 4k video streaming works. No mainstream streaming service will send a 4K video stream unless the device reports a 4K, HDCP 2.2 compliant display; anything below that gets a 1080p stream for DRM reasons.

The 3.2K panel will absolutely stream 1080p, and may even get higher-bitrate 1080p with HDR, but it cannot be served a true 4K stream; it simply isn't a 4K, HDCP 2.2-compliant display. That said, 1080p on the 3.2K display can still be perfectly watchable, similar to how 1080p can look decent on a newer iPad Pro (another sub-4K tandem OLED panel), but it will never be actual 4K.
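The gating described above boils down to a simple rule; here's a purely illustrative sketch (the function and its inputs are made-up names for this post, not any real streaming service's API):

```python
def served_resolution(display_height_px: int, hdcp_version: float) -> str:
    """Illustrative DRM gate: a 4K stream requires a 4K-class panel AND
    HDCP 2.2; anything below that is capped at a 1080p stream."""
    if display_height_px >= 2160 and hdcp_version >= 2.2:
        return "2160p"
    return "1080p"

print(served_resolution(2400, 2.2))  # 4K (3840x2400) IPS option  -> 2160p
print(served_resolution(2000, 2.2))  # 3.2K (3200x2000) OLED      -> 1080p
```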

So with the P1 G8's display options you basically choose between:

  • the 3.2K tandem OLED, which is capped at a 1080p stream but renders it with OLED contrast and HDR, and
  • the 4K IPS panel, which qualifies for a true 4K stream.

There's no workaround where a service sends 4K and you downscale it to 3.2K; the 3.2K display simply doesn't qualify for 4K streaming video playback under the DRM rules. You can, however, plug the P1 G8 with the 3.2K screen into an external 4K, HDCP 2.2–compliant monitor or TV and then stream full 4K on that external display.

Hopefully that makes sense.


EDIT: Disney+ is the one real outlier that I'm aware of here. Right now Disney+ doesn't serve 4K to PCs at all - no 4K in a browser and no 4K in the Windows app - 4K streams only get served to TVs and video streaming sticks.

So, since both the 3.2K tandem OLED and the 4K IPS panels would get the same 1080p stream from Disney+, that particular service will actually look better on the 3.2K tandem OLED thanks to its contrast and HDR. The only caveat is that Disney could change its platform policy at any time, because there's no technical reason preventing Disney+ from serving 4K video to the 4K IPS display in the future.
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: pelican-freeze on December 15, 2025, 23:16:13
Quote from: Worgarthe on December 14, 2025, 21:47:40I mean, again, and this is now a rhetorical question: why in the world would one pay +2980€ to go from 245HX + Blackwell 1000 to 245HX + Blackwell 5000 (that previously linked P16 Gen 3 configuration on Lenovo's site) when a 740-780€ 5070 Ti is simply going to destroy that Blackwell 5000? ¯\_(ツ)_/¯
Sure, if they simply NEED to play Battlefield 6 on battery during their commute, then ok, go crazy and get that Blackwell 5000, or save some money and go with the Blackwell 4000 (16 GB) for "just" +1420€ (and get obliterated even harder by that 740-780€ 5070 Ti), but yeah...

FWIW there's a much bigger difference between the higher-end mobile GPU options for everything that isn't gaming, so it can absolutely make sense to spend more on P16 Gen 3 GPU upgrades even if the chassis is TGP‑limited.

RTX Pro 2000 / 5060 mobile – 1 NVENC encoder, 128‑bit bus, 8 GB VRAM
RTX Pro 3000 / 5070 Ti mobile – 1 NVENC encoder, 192‑bit bus, 12 GB VRAM
RTX Pro 4000 / 5080 mobile – 2 NVENC encoders, 256‑bit bus, 16 GB VRAM
RTX Pro 5000 / 5090 mobile – 3 NVENC encoders, 256‑bit bus, 24 GB VRAM

For video editing specifically, the RTX Pro 4000 mobile is the real workstation sweet spot: it delivers desktop 5080‑class encoding (100% more NVENC encoders vs the RTX Pro 3000 / 5070 Ti mobile) while staying relatively reasonable on cost and power.

The RTX Pro 5000 / 5090 mobile can then give you essentially "5090‑class" video encoding throughput, since it has the same three NVENC encoders as the desktop 5090, but for the big price jump over the RTX Pro 4000 mobile you're only getting 50% more NVENC encoders and there's zero further improvement to the memory bus.

All of these mobile parts use the same single NVDEC decoder, so you don't gain extra decode hardware by moving up the stack. On desktop, the 5080 and 5090 each have two NVDEC decoders, which can help with heavier multi-track timeline video decode. The desktop 5070 Ti, by comparison, only has a single NVENC encoder and a single NVDEC decoder, so it's a bit limited as a workstation GPU.
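To make the scaling in that list concrete, here's the same table as data with rough relative export throughput, naively assuming parallel encodes scale with NVENC engine count (a simplification for illustration, not a benchmark):

```python
# The mobile GPU lineup from this post as data; "parallel encode" below
# naively assumes throughput scales linearly with NVENC engine count.
mobile_gpus = {
    "RTX Pro 2000 / 5060 mobile":    {"nvenc": 1, "bus_bits": 128, "vram_gb": 8},
    "RTX Pro 3000 / 5070 Ti mobile": {"nvenc": 1, "bus_bits": 192, "vram_gb": 12},
    "RTX Pro 4000 / 5080 mobile":    {"nvenc": 2, "bus_bits": 256, "vram_gb": 16},
    "RTX Pro 5000 / 5090 mobile":    {"nvenc": 3, "bus_bits": 256, "vram_gb": 24},
}

baseline = mobile_gpus["RTX Pro 3000 / 5070 Ti mobile"]["nvenc"]
for name, spec in mobile_gpus.items():
    print(f"{name}: ~{spec['nvenc'] / baseline:.1f}x parallel encode, "
          f"{spec['vram_gb']} GB on a {spec['bus_bits']}-bit bus")
```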
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: Worgarthe on December 17, 2025, 22:36:43
Quote from: pelican-freeze on December 15, 2025, 23:16:13FWIW there's a much bigger difference between the higher-end mobile GPU options for everything that isn't gaming, so it can absolutely make sense to spend more on P16 Gen 3 GPU upgrades even if the chassis is TGP‑limited.

RTX Pro 2000 / 5060 mobile – 1 NVENC encoder, 128‑bit bus, 8 GB VRAM
RTX Pro 3000 / 5070 Ti mobile – 1 NVENC encoder, 192‑bit bus, 12 GB VRAM
RTX Pro 4000 / 5080 mobile – 2 NVENC encoders, 256‑bit bus, 16 GB VRAM
RTX Pro 5000 / 5090 mobile – 3 NVENC encoders, 256‑bit bus, 24 GB VRAM

For video editing specifically, the RTX Pro 4000 mobile is the real workstation sweet spot: it delivers desktop 5080‑class encoding (100% more NVENC encoders vs the RTX Pro 3000 / 5070 Ti mobile) while staying relatively reasonable on cost and power.

The RTX Pro 5000 / 5090 mobile can then give you essentially "5090‑class" video encoding throughput, since it has the same three NVENC encoders as the desktop 5090, but for the big price jump over the RTX Pro 4000 mobile you're only getting 50% more NVENC encoders and there's zero further improvement to the memory bus.

All of these mobile parts use the same single NVDEC decoder, so you don't gain extra decode hardware by moving up the stack. On desktop, the 5080 and 5090 each have two NVDEC decoders, which can help with heavier multi-track timeline video decode. The desktop 5070 Ti, by comparison, only has a single NVENC encoder and a single NVDEC decoder, so it's a bit limited as a workstation GPU.
I knew many of those things, but not all of them; fantastic comment, thank you! It's a great starting point for an interesting rabbit hole to learn more, thanks for posting this 🙏

And yes, I agree with you about the "everything that isn't gaming" part! The whole discussion about gaming (and eGPU) started because of this exact comment though (as that guy is spamming the exact same crap under pretty much every existing review around here):

Quote from: 2k for 8GB VRAM gg on December 11, 2025, 10:08:06
Quote8 GB VRAM
2000 for only 8 GB VRAM? Nice trolling.
Even games have a problem with only 8 GB VRAM: youtube.com/watch?v=ric7yb1VaoA: "Gaming Laptops are in Trouble - VRAM Testing w/ ‪@Hardwareunboxed‬"
Most big games are made for consoles first in mind and the PS5 has 16 GB VRAM, minus 4 GB for the OS, and games expect your GPU to have at least 12 GB VRAM.
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: pelican-freeze on December 21, 2025, 16:27:08
I picked up a P1 Gen 8 with the 3.2K Tandem OLED during the Black Friday sale to try it out, since Lenovo's holiday return window means no restocking fees even on opened units.

It's turned out to be more of a mixed bag than expected, and at this point I'm leaning toward returning it.

I was looking for a quality‑of‑life upgrade from my P1 Gen 4 (4K IPS, 3080, 11950H) — something that closed the gap a bit with a 16‑inch MacBook Pro in display quality, battery life, chassis temps, and performance. Unfortunately, even fully loaded, the P1 Gen 8 just doesn't narrow that gap as much as I was hoping.

The P1 Gen 8's Tandem OLED is surprisingly good for productivity, and text clarity versus a 4K IPS panel is less of a downgrade than I expected. Windows at 125% scaling on the 3.2K panel gives you effectively the same usable workspace as 150% scaling on a 4K IPS panel.
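The workspace math checks out, for what it's worth - a quick sketch (panel widths assumed to be 3200 px for the 3.2K and 3840 px for the 4K):

```python
# Effective horizontal workspace in logical pixels = native width / scale factor
def effective_width(native_width: int, scale: float) -> float:
    return native_width / scale

print(effective_width(3200, 1.25))  # 3.2K panel at 125% -> 2560.0
print(effective_width(3840, 1.50))  # 4K panel at 150%   -> 2560.0
```

Both land at the same 2560 logical pixels of width, which is why the usable workspace feels identical.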

The big problem is the matte coating, which largely kills the usual OLED black‑level and "infinite contrast" advantages for content consumption. On the plus side, for video you get excellent color (100% DCI‑P3, with BT.2020 coverage in the "good but not QD‑OLED" tier) and extremely high HDR highlight brightness. But those strengths are undermined by the washed‑out perceived contrast and elevated blacks you get from a matte OLED, which is exactly the opposite of why people want OLED in the first place. As Notebookcheck points out, the matte graininess is more noticeable in brighter scenes, so in high APL HDR scenes where the panel really lights up, a lot of the "wow" factor is lost in a combination of grain and contrast washout.

Given that the matte coating wipes out two of the main benefits of OLED for media, and the 4K IPS option is arguably better for pure workstation use, it's hard to disagree with Notebookcheck's conclusion that the IPS panel is the more sensible choice here. The Tandem OLED is compelling in theory, but the coating plus the higher power draw make the trade‑offs tough to justify. I really wanted to like it - without true OLED blacks and contrast, though, this panel ends up feeling like a bit of an odd option. On top of that, there's a slight green tint from the AOFT touch layer.

Ideally, a future P1 Gen 9 would offer a better‑thought‑out Tandem OLED option - true 4K resolution (so no text clarity downgrade), a MacBook‑class glossy anti‑reflective coating, and no annoying visible AOFT touch layer. If Lenovo can ship a Tandem OLED configuration that's actually suited for content consumption, it would be very high on my list.

Battery life is also poor (likely a combo of Tandem OLED and the Nvidia GPU), and chassis temps are much hotter than expected when used on a lap, since the bottom air intake is very easy to block. My P1 Gen 4 also gets extremely hot in "couch mode," but I was really expecting the P1 Gen 8 to be an improvement here, so the fact that it feels even more uncomfortably hot to the touch is a real letdown.

So overall I think it's hard to imagine keeping the P1 Gen 8. Maybe next year Panther Lake will help with the temp issues and Lenovo will start offering 4k Tandem OLEDs without the blur filter / vaseline look applied. Matte coatings can definitely be a great fit for IPS panels, but it just seems like a crime to put one on top of a $$$$ Tandem OLED like this.

EDIT: The more I think about my HDR experience with the P1 Gen 8's 3.2k Tandem OLED, the more I've realized it's nearly impossible to properly evaluate a Windows laptop's HDR display capabilities. HDR on Windows remains poorly implemented compared to macOS. After more testing, only Netflix and Apple TV+ actually deliver 4K or HDR streams on Windows - everything else (Disney+, Max, Paramount+, Peacock, etc.) is stuck at 1080p SDR.

And with Windows HDR toggled on, any SDR content looks washed out (grainy blacks, elevated contrast) until you manually disable it.

So while I still have issues with these new tandem OLED panels - especially the matte coating impairing the true OLED blacks - I suspect many negative comments about these new panels around the web stem from Windows' broken HDR pipeline, plus the majority of streaming services sending inferior 1080p SDR video streams to PC browsers/apps compared to the superior streams (4K/HDR) sent to TVs, streaming sticks, and high-end tablets (often higher-bitrate 1080p HDR).
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: veraverav on December 26, 2025, 00:27:23
well, I powered up my P1 Gen 8 .. 265H/OLED/64GB RAM/2TB NVMe .. looks like that 5.5 hrs of battery life might be accurate at least on the "Balanced" profile for mixed-use. Haven't tried the one power setting below that "battery efficient". Not sure if the article mentioned which profile they used.

OLED is beautiful, but found latent image retention in darker environments where the brightness is at 25%. "After images" show up on dark grey and remain for minutes after. Possibly a defective panel.

Machine runs extremely cool and quiet. Lenovo has been aggressive with power limits on the CPU. The one comment on YT from the guy who bought the non-dGPU one, which doesn't use LM -- clearly getting a dGPU is the way to go if you want cool and quiet. Will the fans ramp up if you do something intense for extended periods? Yes, but way better than my P1 Gen 2.
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: pelican-freeze on December 28, 2025, 23:11:16
Quote from: veraverav on December 26, 2025, 00:27:23well, I powered up my P1 Gen 8 .. 265H/OLED/64GB RAM/2TB NVMe .. looks like that 5.5 hrs of battery life might be accurate at least on the "Balanced" profile for mixed-use. Haven't tried the one power setting below that "battery efficient". Not sure if the article mentioned which profile they used.

OLED is beautiful, but found latent image retention in darker environments where the brightness is at 25%. "After images" show up on dark grey and remain for minutes after. Possibly a defective panel.

Machine runs extremely cool and quiet. Lenovo has been aggressive with power limits on the CPU. The one comment on YT from the guy who bought the non-dGPU one, which doesn't use LM -- clearly getting a dGPU is the way to go if you want cool and quiet. Will the fans ramp up if you do something intense for extended periods? Yes, but way better than my P1 Gen 2.

Any more thoughts? Are you happy enough with the P1 Gen 8 to keep it?
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: Fortis on December 29, 2025, 14:24:06
Lenovo made a cardinal sin by putting a clickpad without buttons on a ThinkPad. What's the point of doing so if you already have a ThinkBook P version?
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: veraverav on January 07, 2026, 06:47:03
Quote from: pelican-freeze on December 28, 2025, 23:11:16Any more thoughts? Are you happy enough with the P1 Gen 8 to keep it?

I'm asking them to replace it with the 4k IPS version, which they refuse, even though it's cheaper. They only want to replace it with an identical machine with the OLED screen. I'm not sure what to do. I love the machine aside from this OLED issue. Am I rolling the dice by asking for a replacement?

I ran the PCMark10 office battery life test, which got 10 hrs of battery life.

I'm trying to escalate my issue to see if someone can send me the 4k IPS instead. So far no luck.

I got the TB5 7500 Smart Dock and it works very well with the machine as well.
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: pelican-freeze on January 11, 2026, 15:54:13
Quote from: veraverav on January 07, 2026, 06:47:03I'm asking them to replace it with the 4k IPS version, which they refuse, even though it's cheaper. They only want to replace it with an identical machine with the OLED screen. I'm not sure what to do. I love the machine aside from this OLED issue. Am I rolling the dice by asking for a replacement?

I ran PCMark10 office battery life test which got 10hrs of battery life.

I'm trying to escalate my issue to see if someone can send me the 4k IPS instead. So far no luck.

I got the TB5 7500 Smart Dock and it works very well with the machine as well.

Did you end up getting a resolution that you were happy with? Either a replacement with an OLED without the retention issue you mentioned or a swap for IPS?

After having the laptop for a while could you share some more impressions of the display and general quality of life experience using it in day to day use? Have you been able to tweak some settings to improve the battery life?

Still trying to decide whether or not to keep mine (Lenovo's Holiday return window in the US closes January 27th, 2026). I was hoping for some more exciting laptop announcements at CES that would encourage me to return the P1 G8 and wait for newer releases, but was a bit disappointed by the new product announcements.

Next-gen Panther Lake CPUs don't seem like a proper generational leap for anything other than webcam quality (new ISP), outside of combined CPU / GPU scenarios where Panther Lake is configured to take up less of the overall power budget and allow more of it to be allocated to higher GPU wattage. So Panther Lake seems to be an extremely minimal upgrade on the performance / efficiency / IO front unless you're OK with relying entirely on Intel's new integrated GPU, or you have a discrete GPU that could see a significant performance increase if it were fed an additional 10-15 W in combined CPU / GPU load scenarios.

There were very few Panther Lake laptop announcements at CES featuring tandem OLEDs too, and the ones that were announced all had major issues - limited 2.8K or lower resolutions, cost-saving low-quality anti-reflective coatings / full-on mirror finishes, no Thunderbolt 5, poor overall port selection, integrated Intel GPU only, etc.
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: veraverav on January 19, 2026, 07:42:34
Quote from: pelican-freeze on January 11, 2026, 15:54:13Did you end up getting a resolution that you were happy with? Either a replacement with an OLED without the retention issue you mentioned or a swap for IPS?

After having the laptop for a while could you share some more impressions of the display and general quality of life experience using it in day to day use? Have you been able to tweak some settings to improve the battery life?

Still trying to decide whether or not to keep mine (Lenovo's Holiday return window in the US closes January 27th, 2026). I was hoping for some more exciting laptop announcements at CES that would encourage me to return the P1 G8 and wait for newer releases, but was a bit disappointed by the new product announcements.

Next-gen Panther Lake CPUs don't seem like a proper generational leap for anything other than webcam quality (new ISP), outside of combined CPU / GPU scenarios where Panther Lake is configured to take up less of the overall power budget and allow more of it to be allocated to higher GPU wattage. So Panther Lake seems to be an extremely minimal upgrade on the performance / efficiency / IO front unless you're OK with relying entirely on Intel's new integrated GPU, or you have a discrete GPU that could see a significant performance increase if it were fed an additional 10-15 W in combined CPU / GPU load scenarios.

There were very few Panther Lake laptop announcements at CES featuring tandem OLEDs too, and the ones that were announced all had major issues - limited 2.8K or lower resolutions, cost-saving low-quality anti-reflective coatings / full-on mirror finishes, no Thunderbolt 5, poor overall port selection, integrated Intel GPU only, etc.

I am returning it and the P1 Gen 8 with the 4k IPS panel is on its way already. I knew they wouldn't tell us about the Gen 9 at CES since they were so late launching this model.
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: Philm on January 21, 2026, 20:33:22
Quote from: veraverav on January 19, 2026, 07:42:34I am returning it and the P1 Gen 8 with the 4k IPS panel is on its way already. I knew they wouldn't tell us about the Gen 9 at CES since they were so late launching this model.

I'll be interested to know what you think of the 4k IPS display. I'm torn between that and the OLED.
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: Nasty matte OLED? on January 22, 2026, 08:44:55
QuoteMatte RGB isn't as crisp as we would like
QuoteSubjectively, the matte tandem OLED screen on our test unit appears grainier than a traditional glossy alternative especially when displaying an all-white image.
Judging by the close-up picture: nasty. What's the purpose of using an OLED then?

I'm a glossy QD-OLED TV user and it's especially awesome when gaming (in Game Mode, 144 Hz, Warm2), but if the battery life in laptops is decreased vs IPS and you have a lot of static content / burn-in risk (is that still a concern? nothing on my QD-OLED TV so far), then maybe go with the good old IPS after all, especially if you want its matte/anti-glare coating, and the 4K one is 100% DCI-P3.

The PWM frequency on this OLED doesn't seem to be a concern(?)

The OLED in SDR is "566 nits", the 4K IPS is "800 nits". I think Apple uses 500 nits in SDR as well (1000 nits in HDR only). So, at least when it comes to nits, it might not be an issue regardless of which display option one chooses.

Yes, report back what you think of the 4K IPS.
Title: Re: Lenovo ThinkPad P1 16 Gen 8 review: Tandem OLED series premiere
Post by: pelican-freeze on January 31, 2026, 03:47:01
Quote from: veraverav on December 26, 2025, 00:27:23well, I powered up my P1 Gen 8 .. 265H/OLED/64GB RAM/2TB NVMe .. looks like that 5.5 hrs of battery life might be accurate at least on the "Balanced" profile for mixed-use. Haven't tried the one power setting below that "battery efficient". Not sure if the article mentioned which profile they used.

OLED is beautiful, but found latent image retention in darker environments where the brightness is at 25%. "After images" show up on dark grey and remain for minutes after. Possibly a defective panel.

Machine runs extremely cool and quiet. Lenovo has been aggressive with power limits on the CPU. The one comment on YT from the guy who bought the non-dGPU one, which doesn't use LM -- clearly getting a dGPU is the way to go if you want cool and quiet. Will the fans ramp up if you do something intense for extended periods? Yes, but way better than my P1 Gen 2.

Did you return your OLED P1 Gen 8 / buy an IPS variant? I bought the OLED version and returned it due to all of the OLED issues discussed, but I'm curious how much better the IPS variant would be for battery life, in addition to the overall quality of life improvements - no flickering, no image retention, no OLED text quality issues, better matte coating quality, no non-standard 3.2k resolution quirks, etc.