Quote: "Despite its small dimensions, the mini PC allows up to 128 GB DDR5 RAM"
If you're new and don't know why the author would say this so early in the text: this is for running AI / LLMs on your own computer, something like ChatGPT, but private and possibly better. There are even models from the same makers:
huggingface.co/openai/gpt-oss-120b
huggingface.co/unsloth/gpt-oss-120b-GGUF
For this particular 120B LLM, 64GB RAM won't be enough (unless you have a PC with a dedicated GPU, in which case you could offload part of the model to the GPU's VRAM), since some of the RAM is needed by the OS itself. So you'd need 2x48GB or 2x64GB. (Theoretically, running off an SSD works too, even an HDD, but it will be much, much, (much,) slower than running off RAM.)
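To make those numbers concrete, here's a rough back-of-envelope in Python; the ~117B total parameter count for gpt-oss-120b and ~4.25 bits/weight for a typical GGUF quant are approximations, not exact figures:

```python
# Rough estimate of how much RAM a quantized GGUF model needs (illustrative, not exact).
def gguf_size_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate in-RAM size of the model weights alone, in GB."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

# gpt-oss-120b has roughly 117B total parameters (it's a MoE model);
# ~4-5 bits/weight is typical for the available GGUF quantizations.
weights = gguf_size_gb(117, 4.25)   # ~62 GB just for the weights
overhead = 6                        # OS + KV cache + buffers (rough guess)
print(f"~{weights:.0f} GB weights + ~{overhead} GB overhead "
      f"-> too big for 64 GB, fits comfortably in 96-128 GB")
```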
Other SOTA but less over-censored LLMs (OpenAI wanted to play it safe, so they over-censored their first big open-weight model) that will fit into 128GB RAM (a quick example of running one locally follows the list):
huggingface.co/unsloth/Qwen3-30B-A3B-Instruct-2507-GGUF
huggingface.co/unsloth/Qwen3-Coder-30B-A3B-Instruct-GGUF
huggingface.co/unsloth/GLM-4.5-Air-GGUF
Qwen3-Next-80B-A3B-Instruct (llama.cpp support is still being worked on, see github.com/ggml-org/llama.cpp/issues/15940)
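If you want to see what actually running one of these looks like, here is a minimal sketch using the llama-cpp-python bindings (one common way to run GGUF files, not the only one); the filename, context size and n_gpu_layers values are illustrative assumptions, adjust them to your hardware:

```python
# pip install llama-cpp-python
from llama_cpp import Llama

llm = Llama(
    model_path="Qwen3-30B-A3B-Instruct-2507-Q4_K_M.gguf",  # a GGUF file downloaded from Hugging Face
    n_ctx=8192,        # context window; larger values need more RAM for the KV cache
    n_gpu_layers=20,   # offload some layers to GPU VRAM if you have one; 0 = run entirely from RAM
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Explain GGUF quantization in two sentences."}]
)
print(out["choices"][0]["message"]["content"])
```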