
Hang on, starting benchmarks on my Raspberry Pi.

By the year 2035, toasters will run LLMs.
On a lark, a friend set up Ollama on an 8GB Raspberry Pi with one of the smaller models. It worked, but it was very slow. IIRC it did about 1 token/second.
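For anyone curious how a figure like that might be measured, here's a minimal sketch, assuming Ollama is running locally on its default port and a small model (the "tinyllama" name below is just a stand-in) has already been pulled:

```python
# Rough sketch: estimate tokens/second from a local Ollama instance.
# Assumes Ollama is listening on its default port (11434) and that the
# chosen small model has already been pulled with `ollama pull`.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "tinyllama",           # stand-in for whatever small model fits in 8GB
        "prompt": "Why is the sky blue?",
        "stream": False,
    },
    timeout=600,                        # a Pi at ~1 token/s needs patience
)
data = resp.json()

# The non-streaming response reports eval_count (tokens generated) and
# eval_duration (nanoseconds spent generating), which give tokens/second.
tokens_per_sec = data["eval_count"] / (data["eval_duration"] / 1e9)
print(f"{data['eval_count']} tokens at {tokens_per_sec:.2f} tokens/sec")
```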
