I'm using Ollama on my server with the WebUI. It has no GPU, so replies aren't quick, but they're not too slow either.

I'm thinking about removing the VM since I just don't use it. Are there any good uses or integrations with other apps that might convince me to keep it?

  • Possibly linux
    12 months ago

I have never tested it on Apple silicon, but it works fine on my laptop.

    • minnixA
      12 months ago

      What are your laptop specs?

        • minnixA
          2 months ago

CPU is only one factor regarding specs, and a small one at that. What kind of t/s (tokens per second) performance are you getting with a standard 13B model?

          • Possibly linux
            12 months ago

I don't have enough RAM to run a 13B. I just stick to Mistral 7B and it works fine.
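
---

For anyone wanting to answer the t/s question above with a concrete number: Ollama's `/api/generate` endpoint reports `eval_count` (tokens generated) and `eval_duration` (generation time in nanoseconds) in its response, so generation speed can be computed directly. A minimal sketch, assuming a local Ollama server on its default port 11434 (the model name and prompt here are just placeholders):

```python
import json
import urllib.request


def tokens_per_second(eval_count: int, eval_duration_ns: int) -> float:
    """Ollama reports eval_duration in nanoseconds; convert to tokens/s."""
    return eval_count / (eval_duration_ns / 1e9)


def benchmark(model: str = "mistral:7b", prompt: str = "Why is the sky blue?") -> float:
    """Run one non-streaming generation against a local Ollama server
    and return the measured generation speed in tokens per second."""
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",  # Ollama's default address
        data=json.dumps({"model": model, "prompt": prompt, "stream": False}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # eval_count = tokens generated, eval_duration = time spent generating (ns)
    return tokens_per_second(body["eval_count"], body["eval_duration"])


# Usage (with Ollama running and the model pulled):
#   print(f"{benchmark('mistral:7b'):.1f} t/s")
```

On CPU-only hardware this number is what actually matters for perceived responsiveness, much more than the raw CPU spec.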