Quote from: gg 30B LLMs at 8-bit on January 29, 2026, 09:44:03
32 GB RAM is already e-waste for people who want to run a ~30B LLM at an 8-bit quant (30 GB). Manufacturers, we need at least 48 GB RAM. 96 GB would be good for Gpt-Oss-120B with almost full context and no complaints (well, except maybe prompt processing if there's no fast GPU, but it would still be nice).
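The sizing arithmetic in the quote can be sketched as a rough weight-only estimate (parameter count × bytes per weight); note this ignores KV cache, activations, and runtime overhead, which push real usage higher:

```python
def model_ram_gb(params_billions: float, bits_per_weight: int) -> float:
    """Rough weight-only RAM estimate in GB (1 GB = 1e9 bytes).
    Ignores KV cache, activations, and runtime overhead."""
    return params_billions * bits_per_weight / 8

print(model_ram_gb(30, 8))  # 30B model at 8-bit -> 30.0 GB of weights alone
print(model_ram_gb(30, 4))  # same model at 4-bit -> 15.0 GB
```

This is why 32 GB total system RAM leaves no headroom for a 30B model at 8-bit: the weights alone consume nearly all of it before the OS, context cache, or anything else gets a byte.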
No. We absolutely don't. Just buy a dedicated machine for that s*** on your local network, or even self-host it for external access. Putting that in a laptop is extremely dumb.