I’m not very familiar with LLMs. How do you install a local copy?
Look up Alpaca and Ollama. If you're on Linux, they're just a Flatpak away.
If not, you can run Ollama in Docker with an Open WebUI frontend.
The model I used was Llama 3.2, and I basically told it to simulate GLaDOS.
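If you'd rather script it than click around, here's a minimal sketch of talking to a local Ollama install from Python and giving it a GLaDOS-style system prompt. It assumes Ollama is already serving on its default port 11434, that you've pulled the model with `ollama pull llama3.2`, and the persona prompt itself is just an example, not anything official.

```python
# Minimal sketch: ask a local Ollama server to have Llama 3.2 play GLaDOS.
# Assumes Ollama is running on its default port (11434) and llama3.2 is pulled.
import requests

GLADOS_PROMPT = (
    "You are GLaDOS from Portal. Answer with dry, passive-aggressive sarcasm, "
    "but still actually answer the question."
)

def ask_glados(question: str) -> str:
    resp = requests.post(
        "http://localhost:11434/api/chat",
        json={
            "model": "llama3.2",
            "messages": [
                {"role": "system", "content": GLADOS_PROMPT},
                {"role": "user", "content": question},
            ],
            "stream": False,  # one complete JSON response instead of a stream
        },
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["message"]["content"]

print(ask_glados("How do I install a local LLM?"))
```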
LM Studio is probably the easiest way
No, promote open-source platforms instead; LM Studio is closed-source. Try https://jan.ai/ instead.
Fair, I use Open WebUI + Ollama personally, but it's slightly tricky to set up. I wasn't aware there were open-source options with a built-in model browser and hardware compatibility estimates.
Thanks Ollama
You can also just tell your favorite chatbot to do that, if that's all you're after or you have a really bad GPU.
LM Studio is the most stable and user-friendly option I've found by far, but try to download a model that fits inside your GPU's VRAM, or it will be super slow or crash.
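As a rough rule of thumb for "fits inside your GPU": a quantized model needs roughly its parameter count times the bytes per parameter of the quantization, plus some headroom for context. The numbers in this sketch are ballpark assumptions, not exact sizes for any particular file.

```python
# Back-of-the-envelope check: will a quantized model roughly fit in VRAM?
# Bytes-per-parameter values are ballpark figures for common quantizations,
# and the 20% overhead for context/KV cache is just a rough assumption.
BYTES_PER_PARAM = {"fp16": 2.0, "q8_0": 1.0, "q4_k_m": 0.6}

def fits_in_vram(params_billions: float, quant: str, vram_gb: float) -> bool:
    weights_gb = params_billions * BYTES_PER_PARAM[quant]  # model weights
    needed_gb = weights_gb * 1.2                            # +20% for context etc.
    print(f"{params_billions}B @ {quant}: ~{needed_gb:.1f} GB needed, {vram_gb} GB available")
    return needed_gb <= vram_gb

fits_in_vram(8, "q4_k_m", 8)    # an 8B model on an 8 GB card: tight but workable
fits_in_vram(70, "q4_k_m", 24)  # a 70B model won't fit on a 24 GB card
```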