Quote from: reddit.com/r/LocalLLaMA/comments/1sq94qx/is_anyone_getting_real_coding_work_done_with..
"I've come to the conclusion that (1) 32768 is the biggest context I can get away with in an adequately smart model, and (2) it just ain't enough."

(Here is a good (V)RAM requirement calculator: huggingface.co/spaces/oobabooga/accurate-gguf-vram-calculator; paste e.g. this into its "GGUF Model URL" field: huggingface.co/unsloth/Qwen3.6-35B-A3B-GGUF/blob/main/Qwen3.6-35B-A3B-UD-Q4_K_M.gguf.)
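If you want a back-of-the-envelope number without the calculator, the usual rule of thumb is: GGUF file size + KV cache + some runtime overhead. Below is a minimal sketch of that estimate. The layer/head counts and file size in the example are illustrative assumptions, not the actual Qwen3.6-35B-A3B specs; read the real values from the model's GGUF metadata.

```python
# Rough (V)RAM estimate for a GGUF model: weights + KV cache + overhead.

def kv_cache_bytes(ctx_len, n_layers, n_kv_heads, head_dim, bytes_per_elem=2):
    """KV cache = 2 (K and V) * layers * kv_heads * head_dim * context * dtype size."""
    return 2 * n_layers * n_kv_heads * head_dim * ctx_len * bytes_per_elem

def vram_estimate_gib(model_file_gib, ctx_len, n_layers, n_kv_heads, head_dim,
                      overhead_gib=1.0):
    """Model weights + FP16 KV cache + a flat allowance for compute buffers."""
    kv_gib = kv_cache_bytes(ctx_len, n_layers, n_kv_heads, head_dim) / 2**30
    return model_file_gib + kv_gib + overhead_gib

# Hypothetical example: a ~20 GiB Q4_K_M file, 48 layers, 8 KV heads,
# head_dim 128, at the 32768-token context from the quote.
print(round(vram_estimate_gib(20.0, 32768, 48, 8, 128), 1))  # -> 27.0
```

This is why the quoted 32768-token ceiling bites: the KV cache term grows linearly with context, so doubling the context adds several more GiB on top of the weights.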
Quote: "The nearly 11-hour WLAN runtime is excellent"
With a 75 Wh battery, are you f* joking?? That's abysmal, and it was already the status quo 5 years ago with 50 Wh batteries.
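The complaint is easy to sanity-check with the implied average power draw (capacity divided by runtime). The figures below come from the comment itself; the assumption that the older 50 Wh machines hit roughly the same 11 h is how I read "status quo 5 years ago".

```python
# Average power draw implied by a battery capacity and runtime.
def avg_draw_watts(capacity_wh, runtime_h):
    """Average system power over the run = capacity / runtime."""
    return capacity_wh / runtime_h

new = avg_draw_watts(75, 11)  # the reviewed laptop
old = avg_draw_watts(50, 11)  # assumed: a 50 Wh machine matching that runtime
print(f"{new:.1f} W vs {old:.1f} W")  # -> 6.8 W vs 4.5 W
```

So the new machine draws roughly 50% more power on average to reach the same runtime, which is the commenter's point: the headline hours come from a bigger battery, not better efficiency.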