• Quazatron@lemmy.world · 23 hours ago

      Look up Alpaca and Ollama. If you are using Linux, they are just a Flatpak away.

      If not, you can run Ollama in a Docker container with an Open-WebUI frontend.
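
      A minimal sketch of talking to that setup from Python, assuming the container maps Ollama's default port 11434 to localhost and that you have already pulled a model:

      ```python
      import requests

      # List the models the local Ollama instance has pulled
      # (assumes the container exposes Ollama's default port 11434).
      tags = requests.get("http://localhost:11434/api/tags").json()
      print([m["name"] for m in tags.get("models", [])])

      # One-off generation against a pulled model, e.g. llama3.2.
      reply = requests.post(
          "http://localhost:11434/api/generate",
          json={"model": "llama3.2", "prompt": "Say hello.", "stream": False},
      ).json()
      print(reply["response"])
      ```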

      The model I used was Llama 3.2, and I basically told it to simulate GLaDOS.
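
      For the GLaDOS part, here is a rough sketch of what "telling it" looks like through Ollama's chat endpoint, with the persona set in the system message (the exact prompt wording is just an example, not what I used verbatim):

      ```python
      import requests

      # Give the model a persona through the system message, then chat normally.
      # Assumes Ollama is reachable on its default port and llama3.2 is pulled.
      messages = [
          {"role": "system",
           "content": "You are GLaDOS from Portal: dry, sarcastic, passive-aggressive."},
          {"role": "user", "content": "Good morning."},
      ]
      resp = requests.post(
          "http://localhost:11434/api/chat",
          json={"model": "llama3.2", "messages": messages, "stream": False},
      ).json()
      print(resp["message"]["content"])
      ```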

    • trolololol@lemmy.world · 1 day ago

      You can also just tell your favorite chatbot to do that, if that’s all you’re after or if you have a really bad GPU.

      LM Studio is the most stable and user-friendly one I’ve found by far, but try to download a model that fits inside your GPU’s VRAM, or else it will be super slow or crash.
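
      If you go the LM Studio route, it can also expose a local OpenAI-compatible server (port 1234 is its usual default); a hedged sketch of hitting it from Python, where the model name is just a placeholder for whatever you loaded:

      ```python
      import requests

      # LM Studio's local server speaks the OpenAI chat-completions format.
      # Port 1234 is its usual default; adjust if you changed it.
      resp = requests.post(
          "http://localhost:1234/v1/chat/completions",
          json={
              "model": "local-model",  # placeholder; the server uses whichever model is loaded
              "messages": [{"role": "user", "content": "Hello there."}],
              "temperature": 0.7,
          },
      ).json()
      print(resp["choices"][0]["message"]["content"])
      ```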