Recent posts
#91
Last post by Redaktion - Yesterday at 15:40:47
#92
Last post by KaiM - Yesterday at 15:32:07
But at least it's positive news that, for Europe, Honor is apparently going with a dual-cell battery instead of massively cutting battery capacity, thereby sidestepping the ridiculous 20 Wh limit problem.
#93
Last post by Redaktion - Yesterday at 15:31:29
#94
Quote from: veraverav on December 11, 2025, 19:24:06
Quote from: 2k for 8GB VRAM gg on December 11, 2025, 10:08:06
Quote: 8 GB VRAM
2000 for only 8 GB VRAM? Nice trolling.
Even games have a problem with only 8 GB VRAM: youtube.com/watch?v=ric7yb1VaoA: "Gaming Laptops are in Trouble - VRAM Testing w/ @Hardwareunboxed"
Most big games are designed with consoles in mind first, and the PS5 has 16 GB of memory, minus about 4 GB for the OS, so games expect your GPU to have at least 12 GB of VRAM.
Running local LLMs / AI has been a thing for a few years now; llama.cpp and its web UI are all you need. An LLM can be loaded fully into the GPU's VRAM or, if it doesn't fit, partly offloaded to system RAM. This laptop has 32 GB RAM + 8 GB VRAM. Open-weights LLMs range from small to big and more capable, and the more RAM+VRAM your PC has, the better - every GB helps. So going from 8 GB to 12 GB to 16 GB of VRAM would already be a good to very good improvement.
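The offload idea above can be sketched as a back-of-the-envelope calculation. Everything here is an illustrative assumption, not llama.cpp's actual memory accounting (which also budgets VRAM for the KV cache and context):

```python
# Rough sketch: estimate how many transformer layers fit in VRAM,
# assuming weights are spread evenly across layers. Illustrative
# only - real runtimes also reserve VRAM for KV cache and context.

def layers_that_fit(model_gb: float, n_layers: int,
                    vram_gb: float, reserve_gb: float = 1.5) -> int:
    """Estimate layers that fit on the GPU; the rest are offloaded
    to system RAM (model_gb / n_layers = assumed size per layer)."""
    per_layer_gb = model_gb / n_layers
    usable_vram = max(vram_gb - reserve_gb, 0.0)
    return min(n_layers, int(usable_vram / per_layer_gb))

# Example: a ~7.5 GB quantized model with 32 layers on this
# laptop's 8 GB GPU - the remainder spills over to system RAM.
gpu_layers = layers_that_fit(7.5, 32, 8.0)
cpu_layers = 32 - gpu_layers
print(gpu_layers, "layers on GPU,", cpu_layers, "offloaded to RAM")
```

With the same hypothetical model, a 12 GB or 16 GB GPU holds every layer in VRAM, which is the point about every extra GB helping.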
But it's not really meant to be a gaming laptop. Would plugging in an eGPU resolve this bottleneck for someone who absolutely has to game?
Plugging in an eGPU wouldn't resolve the issue, as the P1 G8's CPU options are all low-wattage chips with a maximum of 6 performance cores. For an eGPU to make sense you'd need at least a 255HX (8P+12E) on the Intel side, or a chip like the Ryzen AI 9 HX 370 (8P+4E) on the AMD side. It's hard to overstate how poor a fit today's lower-power mobile CPUs are for gaming. Mobile GPUs are cut down so dramatically from their desktop counterparts that this matters less when pairing mobile CPUs with mobile GPUs, but pair a mobile CPU with a desktop GPU and you'll run into serious bottlenecks unless you pick the right chip: at least 8 performance cores, more watts, and the ability to scale performance when fed those extra watts.
TL;DR: If you want the option to use an eGPU with a ThinkPad workstation, buy a P16 Gen 3 with a 255HX (or higher). The P16 Gen 3 *should* offer even better CPU performance when it doesn't need to allocate thermal headroom to cooling an internal discrete GPU. That's the best fit: a high-wattage, desktop-ish-class CPU running at or near its max wattage in the laptop itself, with the eGPU also running at or near its max wattage outside the chassis.
#95
Last post by Redaktion - Yesterday at 15:27:09
#96
Last post by Redaktion - Yesterday at 15:08:41
#97
Last post by Redaktion - Yesterday at 14:53:40
#98
Are there any issues with the non-standard 3.2K resolution on a laptop intended for workstation use? I like the idea of a tandem OLED, but at 3.2K it seems like it might be more of a headache than anything else. At 1440p you get better battery life, cleaner application compatibility, and more flexibility for some light gaming, while at 3.2K you lose the battery life benefits of a lower resolution and risk app compatibility issues, without getting the content-consumption advantages of a proper 4K panel.
If the appeal of a tandem OLED workstation is being able to watch 4K HDR Netflix in your downtime, any sub‑4K display is still going to get the same 1080p stream (just like an iPad or Android tablet). To get true 4K HDR streaming you need a full 4K, HDCP 2.2‑compliant display; otherwise, you're capped at 1080p. So a 3.2k tandem OLED is objectively a poor solution for content consumption.
I'm interested in this laptop, but I'm concerned about weird interface scaling in professional apps that almost certainly haven't been tested for usability at 3200×2000. And a tandem OLED that isn't suited for content consumption feels like an awkward choice when Lenovo already offers such bright IPS panels: with the 3.2K tandem OLED you get poor battery life and poor content consumption, but no material increase in brightness for visibility in bright environments on the go, since the P1 G8's 4K IPS option already offers 800 nits of SDR brightness. In practical SDR use the IPS panel is likely to be even brighter and offer even better visibility.
Like I'm a huge P1 / X1 Extreme fan and was excited about tandem OLED, but I can't figure out any use case where this 3.2K display makes sense. It looks like a big bag of frustrating compromises without any material benefits: bad for app compatibility, too high a resolution for good battery life, too high for gaming, not high enough for good content consumption, and no advantage over the bright IPS option for outdoor use.
For those considering the P16 G3: you can configure CPUs/GPUs that, on paper, should scale to higher performance with more wattage, but the AC adapter is only 180W (vs. the P1 G8's 140W). That strongly suggests the P16 G3 chassis and power delivery are not designed to let parts like Intel HX CPUs or 5070 Ti/5080/5090‑class GPUs (RTX Pro 3000/4000/5000) run anywhere near their maximum performance. If your main interest in 5080/5090‑class GPUs is the extra NVENC encoders, the P16 G3 still looks great—much lighter and more compact than the P16 G2, with the tradeoff of lower cooling capacity, a smaller total power envelope, and reduced peak performance.
The only true 16‑inch performance workstation left this year is the Dell Pro Max 16 Plus (not Premium), since it ships with a 280W AC adapter and a much more performance‑oriented triple‑fan, vapor‑chamber cooling system. So if you care primarily about performance, the sweet spot is the Dell Pro Max 16 Plus with a 275HX (24 threads, fed full wattage) and a 5070 Ti‑class GPU (RTX 3000, at a full 175W), with the higher GPU tiers really only making sense for heavy video work. It's definitely not as compact as the P16 G3 and it's about a pound heavier once you factor in the adapter, but that seems like a very reasonable trade for the additional performance considering that there isn't really any competition at that mobile performance tier anymore.
#99
Last post by opckieran - Yesterday at 14:44:15
Could also be due to the fact that NVMe is king now and that most people who still need a SATA SSD will be fine picking one of the many no-name options on Amazon.
#100
Last post by Alex29125 - Yesterday at 14:41:56
This watch looks so cheap