Channel: Raspberry Pi Forums

Advanced users • Performance increase with LLMs running in Ollama by overclocking?

I'm running Llama3.2:3b in Ollama on a Pi 5 8GB. It runs well, but responses take anywhere from 6 to 50 seconds depending on the length of the input and the amount of context fed to the model. I've never really considered overclocking for anything before, but this case might be worth it: is there actually a significant performance increase for something like an LLM?
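For anyone wanting to try this, a minimal overclocking sketch for the Pi 5 is below. The frequency and voltage values are assumptions, not tested recommendations; start lower and increase gradually, and use active cooling. Edits go in /boot/firmware/config.txt:

```
# /boot/firmware/config.txt -- example Pi 5 overclock (values are illustrative)
# Default arm_freq on the Pi 5 is 2400 MHz; 2800 is a common target.
arm_freq=2800
# Small voltage bump (in microvolts) often needed for stability at higher clocks.
over_voltage_delta=50000
```

After a reboot, you can check the actual clock and whether the firmware is throttling with `vcgencmd measure_clock arm` and `vcgencmd get_throttled`. Since llama.cpp (which Ollama uses on the Pi) is largely memory-bandwidth-bound for token generation, the gain from a CPU overclock may be smaller than the clock increase suggests, so it's worth timing the same prompt before and after, e.g. with `time ollama run llama3.2:3b "your prompt"`.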

Statistics: Posted by dave xanatos — Tue Feb 25, 2025 2:27 am — Replies 1 — Views 64


