Hi all, I’d like to hear suggestions for self-hosting LLMs on a remote server and accessing them via a client app or a convenient website. I’m interested in hearing about your own setups, or about products that have left a good impression on you.

I’ve hosted Ollama before, but I don’t think it’s intended for remote use. Then again, I’m not really an expert, and maybe there are add-ons or other ways to make it work.
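
For context: Ollama does serve an HTTP API on port 11434, and setting OLLAMA_HOST=0.0.0.0 on the server makes it listen on all interfaces, so remote access is at least possible in principle. Below is a minimal sketch of a client call; the hostname and model name are placeholders, and in practice you’d want a reverse proxy or VPN in front, since the API has no built-in authentication.

```python
# Minimal sketch: calling a remote Ollama server over its HTTP API.
# Assumes the server was started with OLLAMA_HOST=0.0.0.0 (or sits behind
# a reverse proxy/VPN) and that port 11434 is reachable. The hostname and
# model name below are placeholders.
import json
import urllib.request

OLLAMA_URL = "http://my-server.example.com:11434"  # hypothetical host

payload = {
    "model": "llama3.2",               # any model already pulled on the server
    "prompt": "Why is the sky blue?",
    "stream": False,                   # one JSON object instead of a stream
}

req = urllib.request.Request(
    f"{OLLAMA_URL}/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp)["response"])
```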

Thanks in advance!

  • EmbarrassedDrum@lemmy.dbzer0.com (OP) · 1 point · 29 days ago

    No, but I have a free instance on Oracle Cloud, and that’s where I’ll run it. If it’s too slow or no good I’ll stop using it, but there’s no harm in trying.

    • ddh@lemmy.sdf.org · 2 points · 25 days ago

      I’d be interested to see how it goes. I’ve deployed Ollama plus Open WebUI on a few hosts, and small models like Llama3.2 run adequately (at least as fast as I can read) even on an old i5-8500T with no GPU. The Oracle Cloud free tier might work OK.
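
      If you want to put a number on “fast enough”, the non-streaming /api/generate response reports eval_count (tokens generated) and eval_duration (in nanoseconds), so a rough tokens-per-second check is easy. A sketch, with the hostname and model name as placeholders:

      ```python
      # Rough speed check against an Ollama host. The non-streaming response
      # from /api/generate reports eval_count (tokens generated) and
      # eval_duration (nanoseconds), which yield an approximate tokens/sec.
      # Hostname and model name are placeholders.
      import json
      import urllib.request

      OLLAMA_URL = "http://my-server.example.com:11434"  # or http://localhost:11434

      payload = {
          "model": "llama3.2",
          "prompt": "Write a short paragraph about clouds.",
          "stream": False,
      }

      req = urllib.request.Request(
          f"{OLLAMA_URL}/api/generate",
          data=json.dumps(payload).encode("utf-8"),
          headers={"Content-Type": "application/json"},
      )
      with urllib.request.urlopen(req) as resp:
          body = json.load(resp)

      tokens = body["eval_count"]
      seconds = body["eval_duration"] / 1e9  # nanoseconds -> seconds
      print(f"{tokens} tokens in {seconds:.1f} s -> {tokens / seconds:.1f} tokens/s")
      ```

      Comfortable reading speed is on the order of 5 tokens per second, so anything above that should feel usable in a chat UI.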