I'm using Ollama on my server with the WebUI. It has no GPU, so it's not quick to reply, but not too slow either.

I'm thinking about removing the VM as I just don't use it. Are there any good uses or integrations into other apps that might convince me to keep it?
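
For reference, anything that can talk to Ollama's HTTP API (port 11434 by default) can integrate with it, which is how the WebUI connects to it in the first place; a minimal sketch, with the model name as a placeholder:

    curl http://localhost:11434/api/generate -d '{
      "model": "mistral",
      "prompt": "Why is the sky blue?",
      "stream": false
    }'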

  • minnixA · -2 points · 3 months ago

    Ollama without a GPU is pretty useless unless you’re using with Apple silicon. I’d just get rid of it until you get a GPU.

    • Possibly linux · 1 point · 3 months ago

      I have never tested it on Apple silicon, but it works fine on my laptop.

      • minnixA · 1 point · 3 months ago

        What are your laptop specs?

          • minnixA · 2 points · edited · 3 months ago

            CPU is only one factor in specs, and a small one at that. What kind of tokens-per-second (t/s) performance are you getting with a standard 13B model?
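
            If you want a concrete number: ollama run has a --verbose flag that prints timing stats after each reply, and the "eval rate" line is your t/s (model name and the figure below are just examples):

                ollama run mistral --verbose
                # ...response...
                # eval rate: 8.42 tokens/s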

            • Possibly linux · 1 point · 3 months ago

              I don't have enough RAM to run a 13B model. I just stick to Mistral 7B and it works fine.
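
              If RAM is the limit, pulling a smaller quantization of the same model is the usual workaround; the exact tag below is an example, so check the Ollama library page for what's actually published:

                  ollama pull mistral:7b-instruct-q4_0   # 4-bit build, roughly 4 GB
                  ollama run mistral:7b-instruct-q4_0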