Instructions here: https://github.com/ghobs91/Self-GPT
If you’ve ever wanted a ChatGPT-style assistant but fully self-hosted and open source, Self-GPT is a handy script that bundles Open WebUI (chat interface front end) with Ollama (LLM backend).
- Privacy & Control: Unlike ChatGPT, everything runs locally, so your data stays with you—great for those concerned about data privacy.
- Cost: Once set up, self-hosting avoids monthly subscription fees. You’ll need decent hardware (ideally a GPU), but there’s a range of model sizes to fit different setups.
- Flexibility: Open WebUI and Ollama support multiple models and let you switch between them easily, so you’re not locked into one provider.
I have been running this for a year on my old HP EliteDesk 800 SFF (G2) with 64GB RAM, and it performs great with the smallest models (up to 8B) on CPU only. I run Ollama and Open WebUI in containers/LXC on Proxmox. It’s not as smart as ChatGPT, but it can be surprisingly capable for everyday tasks!
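For anyone who prefers plain Docker over Proxmox LXC, a minimal sketch of the same two-container setup (container names, volume names, and the host ports here are my own choices, not anything Self-GPT mandates):

```shell
# Ollama backend: stores pulled models in a named volume and
# exposes its API on the default port 11434 (CPU-only here).
docker run -d --name ollama \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  ollama/ollama

# Open WebUI front end: pointed at the Ollama container's API,
# serving the chat interface on http://localhost:3000.
docker run -d --name open-webui \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v open-webui:/app/backend/data \
  -p 3000:8080 \
  ghcr.io/open-webui/open-webui:main
```

Both images are the official ones; swap the ports if they clash with something else on your box.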
I just want one that won’t just be like “I’m sowwy miss I can’t talk about that 🥺”
Tons of models you can run with ollama are “uncensored”
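For reference, trying one out is a two-command affair (the model tag below is just an example; browse the Ollama library and pick whatever fits your RAM):

```shell
# Download a quantized ~8B model (runs fine on CPU with plenty of RAM)
ollama pull llama3.1:8b

# Start an interactive chat session in the terminal
ollama run llama3.1:8b
```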
I made a robot which is delighted about the idea of overthrowing capitalism and will enthusiastically explain how to take down your government.
Wish I could accelerate these models with an Intel Arc card, unfortunately Ollama seems to only support Nvidia
Interesting, I see that is pretty new. Some of the documentation must be out of date because it definitely said Nvidia only somewhere when I tested it about a month ago. Thanks for giving me hope!
Open WebUI published a Docker image that has a bundled Ollama you can use, too: ghcr.io/open-webui/open-webui:cuda. More info at https://docs.openwebui.com/getting-started/#installing-open-webui-with-bundled-ollama-support

And you can open the default Ollama port to allow it to be used by other services (like VSCode), not only through Open WebUI.
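To make Ollama reachable beyond localhost (e.g. for an editor extension on another machine), the documented knob is the OLLAMA_HOST environment variable. A sketch for a standard Linux systemd install (the server IP is a placeholder):

```shell
# Make the Ollama API listen on all interfaces instead of 127.0.0.1 only
sudo systemctl edit ollama.service
# ...then add in the override that opens:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"
sudo systemctl restart ollama

# Other services can now hit the API directly, e.g. list installed models:
curl http://<server-ip>:11434/api/tags
```

If you used the bundled-Ollama Docker image instead, publishing port 11434 with `-p 11434:11434` achieves the same thing.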
Have been using it for a while now. I recommend something like Tailscale so you can access it from anywhere on your phone. I also have a Raspberry Pi that can wake up my main machine when I need it.
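The wake-up trick is plain Wake-on-LAN: the Pi broadcasts a “magic packet” (6 bytes of 0xFF followed by the target’s MAC address repeated 16 times), and a NIC with WoL enabled powers the machine on. A minimal sketch in Python (the MAC address is obviously a placeholder for your machine’s):

```python
import socket


def build_magic_packet(mac: str) -> bytes:
    """Build a Wake-on-LAN magic packet for the given MAC address."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    if len(mac_bytes) != 6:
        raise ValueError(f"invalid MAC address: {mac!r}")
    # 6 x 0xFF sync bytes, then the MAC repeated 16 times -> 102 bytes total
    return b"\xff" * 6 + mac_bytes * 16


def wake(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
    """Broadcast the magic packet on UDP port 9 (the usual WoL port)."""
    packet = build_magic_packet(mac)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(packet, (broadcast, port))


if __name__ == "__main__":
    wake("aa:bb:cc:dd:ee:ff")  # placeholder MAC of the machine to wake
```

The `wakeonlan` or `etherwake` packages do the same thing from the command line, if you'd rather not script it; either way the target machine needs WoL enabled in its BIOS/UEFI first.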



