Hacker News

Ollama is a good one. LM Studio is great for those who are unsure where to start, since it will help you pick a model that fits your system specs.

If you use Open WebUI (I recommend running it via Docker), you can access your Ollama-hosted model from the browser on any device on your network. Tailscale will help make it accessible remotely.
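As a sketch of that setup, assuming Ollama is already running on the host's default port (11434), something like the following Docker command starts Open WebUI and points it at the host's Ollama instance (the container name and volume name here are just illustrative choices):

```shell
# Run Open WebUI in Docker, bridging to the Ollama server on the host.
# --add-host lets the container reach the host via host.docker.internal.
docker run -d \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```

After that, the UI should be reachable at http://localhost:3000 on the host, or at the machine's LAN/Tailscale address from other devices.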

I'm currently working on an open-source long-term memory system designed to work with Ollama, to help local models be more competitive with the big players so we are not so beholden to these big companies.




