Gamer's Edge AI Inference
Written by Dominik Pantůček on 2025-04-24
Last time we showed how to run (or, more accurately, crawl through) LLM inference on common CPUs. It can be rather sluggish, though. That is why we dug deeper and tried to use commodity hardware - a gaming laptop - to speed things up! It is actually very interesting how much even a gaming GPU can improve the performance of AI tasks.