Topic summary

Posted by Hahaha
 - Today at 11:52:30
LG, or any other such manufacturer for that matter, doesn't have and doesn't make any specialized hardware that would help with running LLMs; not even Apple does. For end users, inference is limited solely by memory capacity and memory bandwidth. (Again, if you want to not only run a model but also fine-tune it or even train one from the ground up, CUDA would be the easiest way to go about it.)
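
A rough back-of-the-envelope sketch of why memory bandwidth sets the ceiling on decoding speed; the model size, quantization level, and bandwidth figures below are illustrative assumptions, not measured values:

# Rough estimate of memory-bandwidth-bound decoding speed.
# Assumptions (illustrative only): a 7B-parameter model at ~4 bits per weight,
# and typical consumer memory bandwidths. Real speeds also depend on compute,
# KV-cache size, batch size, and software overhead.

model_params = 7e9                              # parameters
bytes_per_param = 0.5                           # ~4-bit quantization
model_bytes = model_params * bytes_per_param    # ~3.5 GB of weights

# Each generated token has to stream roughly all weights through memory once,
# so bandwidth / model size gives an upper bound on tokens per second.
bandwidths_gb_s = {
    "dual-channel DDR5 laptop": 80,
    "Apple M-series unified memory": 400,
    "discrete GPU with GDDR6X": 1000,
}

for name, bw in bandwidths_gb_s.items():
    tokens_per_s = (bw * 1e9) / model_bytes
    print(f"{name}: ~{tokens_per_s:.0f} tokens/s upper bound")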
Posted by Hahaha
 - Today at 11:52:21
For those who don't know: you can go to e.g. reddit/r/localLLAMA or /r/localLLM and read about all kinds of open-weight LLMs, which you can then download and run on pretty much any common hardware.

Text-to-text LLMs are limited only by how much memory you have: once the model fits into memory, you're good to go and get decent speeds (the speed is mostly memory-bandwidth dependent). CUDA support is really only required for fine-tuning, which end consumers don't need.
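
A minimal sketch of what "download and run" can look like in practice, here using the Hugging Face transformers library; the model name is just an example of an open-weight checkpoint, and any model that fits in memory works the same way:

# Minimal local text generation with an open-weight model (sketch, not a recommendation).
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="Qwen/Qwen2.5-1.5B-Instruct",  # example open-weight model; swap for any that fits your RAM/VRAM
    device_map="auto",                   # uses a GPU if available, otherwise falls back to CPU
)

result = generator(
    "Explain in one sentence why LLM inference speed depends on memory bandwidth:",
    max_new_tokens=64,
)
print(result[0]["generated_text"])

For CPU-only machines, quantized GGUF builds run through llama.cpp are a common alternative to the transformers route.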
Posted by Hahaha
 - Today at 11:38:39
Quote: The devices will feature the proprietary Exaone generative AI model
So, put an LLM on common, unremarkable (as expected) hardware, add "AI" to the product name and call it a day... lame.

Quote: with the 16-inch model weighing only 1.199 kg, according to LG.
This would be the actually interesting thing about this laptop, not the fake "special" AI capability.
Posted by Redaktion
 - Today at 10:50:47
LG Electronics has revealed it will bring new ultra-light LG Gram laptops to CES 2026. The devices will feature the proprietary Exaone generative AI model and extended battery performance.

https://www.notebookcheck.net/LG-Gram-Pro-AI-laptops-to-debut-with-Exaone-3-5-and-long-battery-life-at-CES.1194712.0.html