• ikidd@lemmy.world

    I’ve set up OpenWebUI with the docker containers, which includes Ollama in API mode, and optionally Playwright if you want to add webscraping to your RAG queries. This gives you a ChatJippity-style webpage where you can manage your models for Ollama, and add OpenAI usage as well if you want. You can manage all the users too.

    On top of that, you get API access to your own Ollama instance, and you can also configure GPU usage for your local AI if available.

    Honestly, it’s the easiest way to get local AI.
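    For reference, a minimal sketch of that kind of setup, assuming the bundled `open-webui:ollama` image from the Open WebUI project and an NVIDIA GPU (container name, ports, and volume names are just examples — check the project README for current tags and options):

    ```shell
    # One-container setup: Open WebUI with Ollama bundled.
    # --gpus all passes NVIDIA GPUs through; requires nvidia-container-toolkit on the host.
    docker run -d \
      --name open-webui \
      -p 3000:8080 \
      --gpus all \
      -v ollama:/root/.ollama \
      -v open-webui:/app/backend/data \
      --restart always \
      ghcr.io/open-webui/open-webui:ollama
    ```

    After that the web UI is on http://localhost:3000, and model downloads plus user management happen from the browser.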

    • Jakeroxs@sh.itjust.works

      How did you get your Nextcloud AI workers to actually function in docker? I’m not using the AIO image, and I attempted to edit the systemd script provided in the docs, but it fails :/ Manually running the command I modified works until it runs out of time.