I'm using Ollama on my server with the WebUI. It has no GPU, so it's not quick to reply, but not too slow either.

I'm thinking about removing the VM as I just don't use it. Are there any good uses or integrations with other apps that might convince me to keep it?

      • minnixA
        5 months ago

        CPU is only one factor in specs, and a small one at that. What kind of t/s (tokens per second) performance are you getting with a standard 13B model?