Quote from: veraverav on Yesterday at 21:35:27Quote from: pelican-freeze on Yesterday at 14:48:36Are there any issues with the non-standard 3.2K resolution on a laptop intended for workstation use? I like the idea of a tandem OLED, but at 3.2K it seems like it might be more of a headache than anything else. At 1440p you get better battery life, cleaner application compatibility, and more flexibility for some light gaming, while at 3.2K you lose the battery life benefits of a lower resolution and risk app compatibility issues, without getting the content-consumption advantages of a proper 4K panel.
If the appeal of a tandem OLED workstation is being able to watch 4K HDR Netflix in your downtime, any sub‑4K display is still going to get the same 1080p stream (just like an iPad or Android tablet). To get true 4K HDR streaming you need a full 4K, HDCP 2.2‑compliant display; otherwise, you're capped at 1080p. So a 3.2k tandem OLED is objectively a poor solution for content consumption.
I'm interested in this laptop, but I'm concerned about weird interface scaling in professional apps that almost certainly haven't been tested for usability at 3200×2000. And a tandem OLED that isn't suited for content consumption just feels like an awkward choice when Lenovo already offers such bright IPS panels - with the 3.2k tandem OLED you get poor battery life and poor content consumption, but without a material increase in brightness to improve visibility in bright environments on the go, since the P1 G8's 4k IPS panel option already offers 800 nits of SDR brightness. In practical SDR use the IPS panel is likely to be even brighter and offer even better visibility.
Like I'm a huge P1 / X1 Extreme fan and was excited about tandem OLED, but I can't figure out any use case where this 3.2k display makes sense, since it appears to offer a big bag of frustrating compromises without any material benefits - bad for app compatibility, too high a resolution for good battery life, too high for gaming, not high enough for good content consumption, no advantage over the bright IPS option for outdoor use, etc.
For those considering the P16 G3: you can configure CPUs/GPUs that, on paper, should scale to higher performance with more wattage, but the AC adapter is only 180W (vs. the P1 G8's 140W). That strongly suggests the P16 G3 chassis and power delivery are not designed to let parts like Intel HX CPUs or 5070 Ti/5080/5090‑class GPUs (RTX Pro 3000/4000/5000) run anywhere near their maximum performance. If your main interest in 5080/5090‑class GPUs is the extra NVENC encoders, the P16 G3 still looks great—much lighter and more compact than the P16 G2, with the tradeoff of lower cooling capacity, a smaller total power envelope, and reduced peak performance.
The only true 16‑inch performance workstation left this year is the Dell Pro Max 16 Plus (not Premium), since it ships with a 280W AC adapter and a much more performance‑oriented triple‑fan, vapor‑chamber cooling system. So if you care primarily about performance, the sweet spot is the Dell Pro Max 16 Plus with a 275HX (24 threads, fed full wattage) and a 5070 Ti‑class GPU (RTX 3000, at a full 175W), with the higher GPU tiers really only making sense for heavy video work. It's definitely not as compact as the P16 G3 and it's about a pound heavier once you factor in the adapter, but that seems like a very reasonable trade for the additional performance considering that there isn't really any competition at that mobile performance tier anymore.
Actually the 3.2K screen run @ 200% scaling will yield a better picture than 4K @ 250% scaling.
So far I've read zero complaints about watching 4k content on a 3.2k screen or anything like that. I guess someone with this laptop can chime in if they are unable to get a 4k stream because it's a 3.2k screen. Everyone seems to love the 3.2k OLED screen.
As for the eGPU, it's simply to get around the 8GB VRAM bottleneck without having to haul around something as large and heavy as the P16.
Quote from: veraverav on Yesterday at 21:35:27Actually the 3.2K screen run @ 200% scaling will yield a better picture than 4K @ 250% scaling.Correct. That's 1600x1000 (3.2K 200%) vs 1536x960 (4K 250%). Sure the 4K is a bit sharper but it's not really noticeable from any normal viewing distance. If one is touching the screen with their nose then they will probably notice some pixels here and there with a 3.2K, hehe. Another benefit of 3.2K is running it at 125% scaling, which is effectively the same real estate as 2560x1600, with a 25% sharper image. 25% doesn't sound like much, but if one works with text then it helps a lot, as it's easier on one's eyes no matter how good their eyesight is.
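If anyone wants to double-check those numbers, here's a rough sketch of the scaling arithmetic in Python - the panel resolutions (3200x2000 and 3840x2400) and the scaling factors are just the ones discussed above, nothing vendor-specific:

def effective_resolution(width_px, height_px, scale_percent):
    # Logical desktop size a panel presents at a given Windows scaling factor
    factor = scale_percent / 100
    return round(width_px / factor), round(height_px / factor)

panels = {"3.2K (3200x2000)": (3200, 2000), "4K (3840x2400)": (3840, 2400)}
for name, (w, h) in panels.items():
    for scale in (125, 200, 250):
        ew, eh = effective_resolution(w, h, scale)
        print(f"{name} @ {scale}%: {ew}x{eh} of effective desktop space")

Running it reproduces the 1600x1000 vs 1536x960 comparison above, plus the 2560x1600-equivalent real estate of 3.2K @ 125%.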
Quote from: veraverav on Yesterday at 21:35:27So far I've read zero complaints about watching 4k content on a 3.2k screen or anything like that. I guess someone with this laptop can chime in if they are unable to get a 4k stream because its a 3.2k screen. Everyone seems to love the 3.2k OLED screen.I don't have this exact panel, but I used to own a Surface Pro 8 for a few months with a 2880x1920, so not the same resolution but very close. No issues with 4K, it all worked great (replaced the Surf with an X1 Carbon 9 though).
Quote from: veraverav on Yesterday at 21:35:27As for the eGPU, it's simply to get around the 8GB VRAM bottleneck without having to haul around something as large and heavy as the P16.This 👍 You pay less for an eGPU dock/enclosure + a normal desktop GPU, and you get more performance than you would get from that same laptop with its dGPU maxed out.
Quote from: pelican-freeze on Yesterday at 19:57:35I feel like you're making my point for me. Reading your post implies that the best option would be a P16 Gen 3 with a 255HX gaming / workstation CPU and the cheapest GPU option (1000 Blackwell) combined with a desktop 5070 TI in an eGPU setup?The best option is any Oculink or Thunderbolt 5 laptop; currently there aren't that many of them to pick from. So yes, the P16 Gen 3 is one of those few, that's correct. The best one is the one with the best thermal "picture", because if it can't hold stable PL1 wattage and frequencies it will be all over the place, which is why the Razer Blade 16 is probably quite meh as its CPU is like watching a heart monitor graph. Can't wait for an in-depth P16 Gen 3 review though!
Quote from: pelican-freeze on Yesterday at 19:57:35At no point have I said that a high wattage eGPU setup won't provide better performance compared to a low wattage discrete mobile GPU, even when connected to an ultraportable laptop.Yes, you are correct with this, the part in bold especially. Six cores have proven to be enough for all games, in laptops, in desktops, perfectly enough. The 9600X is a monster of a CPU, yet inexpensive, and it's trailing just about 10% behind the 9800X3D, which is more expensive but also slightly faster due to its fast cache (96 MB L3 vs 32 MB L3 in the 9600X). But if you compare the 9700X with its 8 cores to that same 9600X with its 6 cores - they are pretty much identical, down to a single fps, like getting 165 fps in Battlefield 6 with the 9700X vs 164 fps with the 9600X. If two more cores are needed to get just 1 extra fps...
Also, if you primarily want to play older titles like Shadow of the Tomb Raider that were released before the current console generation, then it goes without saying that they will be much less CPU intensive than current generation games. I can absolutely see the CPU bottleneck being less of an issue for gaming if you primarily play games released before 2020 / before the PlayStation 5 & Xbox Series X were both released with 8 performance CPU cores.
Quote from: pelican-freeze on Yesterday at 19:57:35But why would you pay $$$$ for a desktop 5070 TI if you only wanted to play older games?I play many games just fine; I actually just finished Doom The Dark Ages recently after getting it at a nice 50% discount on Black Friday. That's one of the most demanding games of 2025 - it "ran" at a literal 2 fps on my X1 Carbon with its iGPU, but it was a decent 45-ish fps experience with the 5070 Ti at 1440p DLSS Q. However, an older CPU with just 4 cores, less IPC and lower clocks in general simply struggles to push more than that, even when given more watts (PL1 set to 29W instead of 22W). It was at 95-100% usage at both 2 fps and 45-ish fps, btw 😁
Quote from: Worgarthe on Yesterday at 19:37:09Quote from: pelican-freeze on Yesterday at 19:00:29Honestly, just read what I wrote? Nothing in your comment appears to be a direct response to anything I said. Obviously you can game with lower wattage CPUs and eGPUs - you'll just get worse performance. And obviously if you want to avoid your eGPU being performance limited / bottlenecked by your CPU choice you can just choose a CPU that is a better fit for gaming.Worse performance, yet still significantly better than what you get with gaming on iGPU or dGPU in that same laptop (if a GPU in an eGPU dock/enclosure is more powerful, naturally).Quote from: pelican-freeze on Yesterday at 19:00:29Spending $$$$ on a high wattage desktop GPU that you know will be performance constrained by a low wattage mobile CPU is clearly not going to provide the best experience but no one is stopping you from doing it if you want. Also, you can't put a higher wattage mobile CPU intended for gaming use (255HX+) into a 4 lb ultraportable laptop so if your goal is "ultraportable laptop that can play games at home" that rules out the higher wattage enthusiast class CPUs that are designed for gaming / workstation laptop use.
- https://www.notebookchat.com/index.php?topic=256048.msg694421#msg694421
- https://www.notebookchat.com/index.php?topic=256048.msg694430#msg694430
Quote from: pelican-freeze on Yesterday at 19:00:29So there's nothing stopping you from using an eGPU / mobile CPU combo where the CPU is the performance bottleneck. It's clearly not possible to argue though that the best mobile CPU choices for gaming aren't the higher wattage mobile CPUs *explicitly* designed and marketed by Intel / AMD as mobile gaming CPUs.The CPU is always a bottleneck, no matter the TDP. If you check any of the links above, for example that video with the Legion Go + 3070 eGPU, you will see that the CPU usage is basically identical in all scenarios - with or without eGPU; for example, as clearly demonstrated in the video, in Final Fantasy 7 Rebirth the CPU at 1080p low is at 21-24% to get 26 average fps with the iGPU, yet once the 3070 eGPU is plugged in there is 75 average fps at 1440p high settings with the CPU usage at exactly the same 21-24%.
I have a 5070 Ti. When I try to play Shadow of the Tomb Raider, for example (I'm actually currently replaying the reboot series), with my X1 Carbon Gen 9 and its iGPU (i7 1165G7, Iris Xe) at the absolute lowest possible settings at 720p, I get barely 30 fps with 55-60% CPU usage. When I plug in my 5070 Ti I get 160+ fps at highest/maxed settings without Ray Tracing and 80-100 with Ray Tracing, all at 1440p, and the CPU is still at 55-60% usage.
Which is more than what I get with my P16 Gen 2 (i7 14700HX + ADA 3500 12 GB), where I get 110-120 at the same highest/maxed settings without RT, and about 55-70 with RT on. With eGPU plugged in to that same P16 Gen 2, the same 5070 Ti is pushing 170-ish at maxed settings, and around 100-110 with same settings + Ray Tracing on. All 1440p, of course.
Is a faster CPU faster than a slower CPU? Yes. Does wasting insane amounts of money on top specs make sense when a 750€-ish GPU is still going to obliterate those specs, even with the slightly reduced bandwidth of an eGPU setup? Well, it's up to each person to decide how much they hate their own money ¯\_(ツ)_/¯
Again, for clarity:Quote from: Worgarthe on December 12, 2025, 04:44:47I just checked prices in Germany, but for the P16 Gen 3 because it is possible to equip it with up to 24 GB VRAM (Blackwell 5000).
The base version 8 GB (Blackwell 1000) config goes for 2819 € currently. Prices for GPU upgrades are the following:
- 2000 Blackwell (8 GB) +230€
- 3000 Blackwell (12 GB) +790€
- 4000 Blackwell (16 GB) +1420€
- 5000 Blackwell (24 GB) +2980€ (😂)
The rest of the specs are untouched from the base config, so 245HX + 16 GB RAM + 512 GB SSD, with only the display being automatically upgraded to a 2400p panel (there's no option to keep the base 1200p panel with that GPU).
So I just added that GPU and literally nothing else, and the laptop is now 6039€ - an increase of +3220€ (!!) just because of the 24 GB GPU!
Let's see how much VRAM we can get with 3220€, to put that in an eGPU setup while keeping the P16 Gen 3 at its base price: https://www.idealo.de/preisvergleich/OffersOfProduct/205942083_-geforce-rtx-5070-ti-gigabyte.html
3220/759 ≈ 4.2, so four RTX 5070 Ti. Meaning 64 GB of VRAM. Meaning 40 GB more, for the same price. The 5000 Blackwell is similar in performance to a 4070 Super or a 3080 Ti. The 5070 Ti is simply far ahead performance-wise, and with four of them at a full 300W each - it's even more tragic to compare what you get for the same amount of money... ¯\_(ツ)_/¯
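To make the price math easy to check, here's the same calculation as a small Python sketch - all figures are the ones quoted above (the +3220€ uplift to the 24 GB config and the ~759€ street price of a 16 GB desktop 5070 Ti), not official list prices:

UPGRADE_TO_24GB_EUR = 3220        # total uplift from the base config to the Blackwell 5000 build above
DESKTOP_5070TI_EUR = 759          # street price per card from the idealo.de link
DESKTOP_5070TI_VRAM_GB = 16

cards = UPGRADE_TO_24GB_EUR // DESKTOP_5070TI_EUR   # whole cards the same money buys
print(f"Mobile route:  +{UPGRADE_TO_24GB_EUR} EUR for 24 GB VRAM "
      f"(~{UPGRADE_TO_24GB_EUR / 24:.0f} EUR per GB)")
print(f"Desktop route: +{cards * DESKTOP_5070TI_EUR} EUR for {cards} x 5070 Ti "
      f"= {cards * DESKTOP_5070TI_VRAM_GB} GB VRAM "
      f"(~{DESKTOP_5070TI_EUR / DESKTOP_5070TI_VRAM_GB:.0f} EUR per GB)")

That works out to roughly 134€ per GB of VRAM the mobile way versus roughly 47€ per GB the desktop way, before even considering the raw performance gap.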
Quote from: Worgarthe on Yesterday at 16:34:16Quote from: pelican-freeze on Yesterday at 14:48:36Are there any issues with the non-standard 3.2K resolution on a laptop intended for workstation use? I like the idea of a tandem OLED, but at 3.2K it seems like it might be more of a headache than anything else. At 1440p you get better battery life, cleaner application compatibility, and more flexibility for some light gaming, while at 3.2K you lose the battery life benefits of a lower resolution and risk app compatibility issues, without getting the content-consumption advantages of a proper 4K panel.There is absolutely nothing wrong with this resolution (or any other). Scaling exists for that exact reason. Increased power consumption comes from the tech, not the resolution - a 16" 1200p 120 Hz OLED is going to consume far more power than a 16" 2400p 120 Hz IPS, for example.Quote from: pelican-freeze on Yesterday at 15:28:27Plugging in an eGPU wouldn't resolve the issue as the P1 G8's CPU options are all low wattage chips configured with a maximum of 6 performance cores. For an eGPU to make sense on the Intel side you'd need at least a 255HX (8P+12E), or, on the AMD side, a chip like a Ryzen AI 9 HX 370 (8P+4E). It's hard to overstate the degree to which lower power mobile CPUs right now are not a good fit for gaming - mobile GPUs are just cut down so dramatically from their desktop GPU counterparts though that this is less of an issue when pairing mobile CPUs with mobile GPUs. Pairing mobile CPUs with desktop GPUs you're going to run into serious bottlenecks if you don't pick the right chip - at least 8 performance cores, more watts, more performance scaling ability when fed those extra watts, etc.Your knowledge of eGPUs and how they work is clearly somewhere between zero and nothing, nhf. Literally this whole comment is wrong.
TLDR: If you want the option to use an eGPU with a ThinkPad workstation, buy a P16 Gen 3 with a 255HX (or higher). The P16 Gen 3 *should* offer even better CPU performance when it doesn't need to allocate thermal headroom to cooling the internal discrete GPU. This would be the best fit - a high wattage, desktop-ish class CPU actually running at reasonably high / close to max wattage in the laptop itself, and the eGPU also running at reasonably high / close to max wattage outside the laptop chassis.
Check egpu.io for more info and to learn something instead of being confidently wrong: https://egpu.io/best-external-graphics-card-builds/
Or r/eGPU: https://www.reddit.com/r/eGPU/
Or a 30 Watt CPU handheld - Legion Go with eGPU (fairly popular combo): https://www.youtube.com/watch?v=5opYdgDtK0s
Or, heck, even a 5-year-old ThinkPad with a 15W CPU (with UHD 620 integrated graphics) + RTX 4090: https://www.youtube.com/watch?v=EOsGqAeyCtA
Or... Basically I can keep going endlessly.
Tl;dr - the truth is the polar opposite of your comment.
Edit: Forgot to include this, from Notebookcheck:
X1 Carbon Gen 6 (15W CPU) tested with eGPU, back in 2018: https://www.notebookcheck.net/Lenovo-ThinkPad-X1-Carbon-2018-WQHD-HDR-i7-Laptop-Review.284682.0.html
Quote from: veraverav on December 11, 2025, 19:24:06Quote from: 2k for 8GB VRAM gg on December 11, 2025, 10:08:06Quote8 GB VRAM2000 for only 8 GB VRAM? Nice trolling.
Even games have a problem with only 8 GB VRAM: youtube.com/watch?v=ric7yb1VaoA: "Gaming Laptops are in Trouble - VRAM Testing w/ @Hardwareunboxed"
Most big games are made with consoles in mind first, and the PS5 has 16 GB VRAM, minus 4 GB for the OS, so games expect your GPU to have at least 12 GB VRAM.
Running local LLMs / AI has been a thing for a few years now; llama.cpp and its webUI is all you need. An LLM can be fully loaded into the GPU's VRAM or, if it can't fit, parts of it can be offloaded to system RAM. This laptop has 32 GB RAM + 8 GB VRAM. Both small, quite capable models and big open-weights LLMs exist, and the more RAM+VRAM your PC has, the better. Every GB helps. So going from 8 GB to 12 GB to 16 GB VRAM would already be a good to very good improvement.
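To make the offloading part concrete, here's a minimal sketch using the llama-cpp-python bindings (one common way to drive llama.cpp from Python) - the model path and the layer count are placeholders you'd tune to whatever actually fits in 8 GB of VRAM:

# pip install llama-cpp-python (a build with GPU support)
from llama_cpp import Llama

llm = Llama(
    model_path="./models/some-open-weights-model.Q4_K_M.gguf",  # placeholder path
    n_gpu_layers=24,   # layers kept in VRAM; the remaining layers run from system RAM
    n_ctx=4096,        # context window
)

out = llm("Explain in one paragraph why VRAM capacity matters for local LLMs:", max_tokens=128)
print(out["choices"][0]["text"])

The same idea applies to the plain llama.cpp CLI and server via their -ngl / --n-gpu-layers option; the point is simply that every extra GB of VRAM lets you keep more of the model on the GPU instead of spilling it to slower system RAM.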
But it's not really meant to be a gaming laptop. Would plugging in an eGPU resolve this bottleneck for someone that absolutely has to game?
Quote from: anzej on December 13, 2025, 15:43:17I'd strongly recommend choosing a model with the dGPU, even if you don't need it. The non-dGPU models come with a smaller heatsink without liquid metal that fails to cover all the VRMs. This causes VRM overheating, which severely limits CPU package power.
You can verify this yourself by comparing spare part images for 5H41R89131 (non-dGPU) versus 5H41R89134 (dGPU).
I experienced this exact problem on my P1 G5 without a dGPU, as it could only sustain 38W CPU package power long-term due to VRM thermal issues. Lenovo Premier Support confirmed the design flaw and offered a full refund, explicitly recommending I purchase a dGPU-equipped unit instead.
TL;DR: Get the dGPU model regardless of your graphics needs if you want to get decent sustained CPU performance.
Quote from: pascal76 on December 12, 2025, 10:39:18FYI, I found this comment in a YT video
"I ordered a P1 Gen 8 on release with a 265h, 64gb ram and no dedicated GPU because I wanted best battery life and lowest noise on this machine. Sent it back after a day, the fan is not just "you notice it", it's atrociously loud. Even in "everyday tasks" or on idle the fan kicks in all the time, extremely annoying."