Quote from: not spending 10k on Yesterday at 22:29:54
for inferencing, only (V)RAM size and bandwidth matter
Wrong. Different inferencing applications have different hardware needs! The inferencing application I use does not care about VRAM size or bandwidth; it cares about Nvidia library versions, the counts of all types of GPU cores, GPU clock, and system RAM size (when inferencing runs for a long time on the same analysed object: typically >16 GB for runs over 5 min, >32 GB for runs over 80 min), and only then about CPU clock and possibly having more than 16 CPU threads.
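The RAM-vs-runtime relationship above can be sketched as a simple rule of thumb. The two thresholds come straight from my numbers; the sizing outside those two data points is a guess, not a measurement:

```python
def min_ram_gb(run_minutes: float) -> int:
    """Rough minimum system RAM (GB) for a long-running analysis.

    Based on two observed thresholds: >16 GB needed past 5 min,
    >32 GB needed past 80 min. Values beyond those points are guesses.
    """
    if run_minutes > 80:
        return 64   # guess: next common size above the ">32 GB" threshold
    if run_minutes > 5:
        return 32   # covers the ">16 GB" requirement
    return 16       # short runs fit in a typical 16 GB machine

print(min_ram_gb(3))    # → 16
print(min_ram_gb(10))   # → 32
print(min_ram_gb(120))  # → 64
```

Obviously this is application-specific; the point is only that RAM demand grows with how long you keep the same object under analysis, which flat "(V)RAM size and bandwidth" thinking misses.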