Tuxedo Infinity Book Pro 14 Gen10 Review - Linux ultrabook with AMD Zen 5 & 128 GB RAM

Started by Redaktion, August 27, 2025, 09:56:21


Redaktion

The Linux ultrabook Tuxedo InfinityBook Pro 14 is now available with AMD Zen 5 processors and upgradeable RAM. In addition to good maintenance options and numerous ports, the 120 Hz screen is now a bit brighter.

https://www.notebookcheck.net/Tuxedo-Infinity-Book-Pro-14-Gen10-Review-Linux-ultrabook-with-AMD-Zen-5-128-GB-RAM.1095463.0.html


MM128

Pathetic, you say? A pretty powerful CPU, an iGPU on par with the 1660 Ti Max-Q, and pretty fast 128 GB of RAM (yes, 128 GB) in a 1.5 kg case is pathetic to you? Well... :))

M4M

1900 euro for this ugly cheap garbage e-waste, CRAZY!! M4 MacBook Pro all day at this price, or the cheapest M4 Pro MBP smokes this s*** by 2-3x in perf, looks good, feels good, sounds good.
This e-waste Linux garbage looks ugly and hideous, just like every CS student, disgusting!!
Alpha extrovert leaders use premium Mac or Windows laptops.

AlexZ

1450 euro (~$1687) when configured with AMD Ryzen AI 9 HX 370, 64 GB RAM and 2 TB SSD. This is not a bad price IMO.

MM128

@m4m
Oh, you're here again, you poor, complexed wretch. What idiot buys a 16 GB laptop with a crippled operating system for 2500-3000 euros to use as a typewriter? Well, about 10% of the population does 😃

The "rest of us", who need to work (in a real job), use those "terrible" computers with the Windows operating system. Who is "us"? Well, the entire aviation and automotive industries, etc., etc., etc... When it comes to servers, it's Windows or Linux.

Somehow I can't remember which industry would go bankrupt without the iCrippled OS. Aaaaah, I know, maybe "artists", graphic designers and stupid YouTubers - that is, Photoshop, Lightroom and some video editing programs... That's it.

You should get a job (at least cleaning toilets), so your mom doesn't have to buy you your overpriced Mac, which you're very unhappy with because you can't play games on it 😂😂😂

MM128

For iBozo: by "complexed" I meant that you are "full of complexes", but I think you know that even without me 😂

Why 128 GB RAM?

For those who don't know:
Quote: DDR5-5600, Dual-Channel
2 channels * 64 bit * 5600 MT/s / 8 / 1000 = 89.6 GB/s theoretical, and measured in practice:
Quote: 84943 MB/s

AMD "Strix Halo" APU is also (up to) 128 GB RAM, but is quad-channel (entry level workstation memory bandwidth territory) (and has a better iGPU) (of course, also more expensive):
4 * 64-bit * 8000 / 1000 / 8 = 256 GB/s.
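
If you want to check the numbers yourself, here is the same back-of-the-envelope formula as a small Python sketch (the channel counts and transfer rates are the ones quoted above; real-world results land a bit lower, as the measured 84943 MB/s shows):

# theoretical peak bandwidth = channels * bus width (bit) * transfer rate (MT/s) / 8 bit per byte
def peak_bandwidth_gbs(channels: int, bus_width_bits: int, mega_transfers_per_s: int) -> float:
    return channels * bus_width_bits * mega_transfers_per_s / 8 / 1000  # MB/s -> GB/s

print(peak_bandwidth_gbs(2, 64, 5600))  # InfinityBook Pro 14, DDR5-5600 dual-channel: 89.6 GB/s
print(peak_bandwidth_gbs(4, 64, 8000))  # Strix Halo, LPDDR5X-8000 quad-channel: 256.0 GB/s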

The higher the memory bandwidth, the faster AI / LLMs will run on your PC.
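
Why does bandwidth matter? During token generation the active weights have to be streamed from RAM for every single token, so a rough (purely hypothetical, best-case) ceiling is bandwidth divided by the bytes read per token. A sketch, assuming a ~4-bit quant (~0.5 bytes per weight) of a Qwen3-30B-A3B-class MoE model with roughly 3 B active parameters (assumed example values, not benchmark results):

# very rough upper bound: assumes every generated token streams all active weights
# from RAM and ignores KV cache, compute limits and other overhead
def max_tokens_per_second(bandwidth_gbs: float, active_params_billion: float, bytes_per_weight: float) -> float:
    gb_read_per_token = active_params_billion * bytes_per_weight
    return bandwidth_gbs / gb_read_per_token

print(max_tokens_per_second(89.6, 3.0, 0.5))   # dual-channel DDR5-5600: ~60 tok/s ceiling
print(max_tokens_per_second(256.0, 3.0, 0.5))  # Strix Halo quad-channel: ~171 tok/s ceiling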

Ok, but why/who would need 128 GB RAM all of a sudden?
The latest hype is AI/running LLMs. Download e.g. LM Studio (wrapper for llama.cpp) and try it out.
The difference to ChatGPT is that it is private. To be fair, it's not as good as the best that proprietary AI has to offer, but it's catching up quickly, and if the open-weight LLMs are always ~1 year behind, does it really matter? (The gap is actually closing, as per the artificialanalysis.ai/?intelligence-tab=openWeights report.)

You can also download llama.cpp directly and the LLMs can be downloaded from huggingface.co.
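
Not from the review, just to illustrate: if you would rather script it than click around in LM Studio, the llama-cpp-python bindings (pip install llama-cpp-python) can run one of the GGUF quants listed below; the file name and parameters here are placeholders:

# minimal local-inference sketch with the llama-cpp-python bindings
# (assumption: a GGUF quant was already downloaded from huggingface.co)
from llama_cpp import Llama

llm = Llama(
    model_path="Qwen3-30B-A3B-Instruct-2507-Q4_K_M.gguf",  # placeholder file name
    n_ctx=8192,        # context window
    n_gpu_layers=-1,   # offload all layers if a GPU backend (e.g. Vulkan/ROCm) is compiled in
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Explain memory bandwidth in two sentences."}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
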
Current SOTA models that fit into 128 GB RAM (or fit when using a decent quant rather than the full FP16) are e.g.:
huggingface.co/unsloth/GLM-4.5-Air-GGUF
huggingface.co/unsloth/gpt-oss-120b-GGUF (from the makers of ChatGPT)
huggingface.co/unsloth/Qwen3-30B-A3B-Instruct-2507-GGUF
huggingface.co/unsloth/Qwen3-Coder-30B-A3B-Instruct-GGUF

Better, but won't fit:
huggingface.co/unsloth/Qwen3-235B-A22B-Instruct-2507-GGUF (though the 2-bit and 3-bit quants might fit; see the rough size estimate below)
huggingface.co/unsloth/GLM-4.5-GGUF
huggingface.co/unsloth/DeepSeek-V3.1-GGUF

and many more, plus thinking variants like huggingface.co/Qwen/Qwen3-30B-A3B-Thinking-2507.
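
To get a feel for what fits: a GGUF file is roughly total parameters * bits per weight / 8, and you still need headroom for the OS and the KV cache. A rough Python estimate (my own approximation, ignoring per-layer mixed precision):

# rough GGUF size estimate: total parameters * bits per weight / 8
# (ignores mixed-precision layers and the extra RAM needed for OS + KV cache)
def approx_model_size_gb(total_params_billion: float, bits_per_weight: float) -> float:
    return total_params_billion * bits_per_weight / 8

for bits in (2, 3, 4, 8, 16):
    print(f"Qwen3-235B-A22B at {bits}-bit: ~{approx_model_size_gb(235, bits):.0f} GB")
# ~59 GB at 2-bit and ~88 GB at 3-bit leave room in 128 GB; 8-bit (~235 GB) and FP16 (~470 GB) do not fit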

News and infos about AI / LLMs: reddit.com/r/LocalLLaMA/top/

NOTEBOOKCHECK, maybe write an article explaining to your readers the ability to self-host AI LLMs, or why 128 GB is suddenly a thing, because normally 32 GB of RAM is enough for most people.

1-0

Wasn't there a recent study by MIT showing that 95% of businesses using LLMs have made no money from it and actually lost money from investing in AI, if I recall correctly?

So it's useful for the 5%, and they probably would have been fine and making tons even without it too.

Interesting

I like the fact that this is one of the few AMD options that comes with the option of Intel Wi-Fi instead of the MediaTek card that's usually bundled. It also comes with a variety of keyboard layouts (personally I prefer US over UK, for the bigger left Shift key).

Perhaps the way forward for AMD is going with smaller boutique laptop vendors, because the big brands aren't doing this.

The downside is that it's slightly more expensive, by a couple hundred euros, but then again they probably don't have access to the same economies of scale as the bigger brands, since they sell far fewer units.
