Ask HN: Favorite setup for self-hosted code assistants?

3 points by strangecasts 2 days ago

Mainly programming in Python, using Visual Studio Code on Windows 10 (with WSL set up). Model-wise, I have an RTX 4070 with 12 GB of VRAM (mostly due to Mini-ITX size restrictions).

I understand self-hosted models don't really compare to Copilot/Claude, but I have projects which rule out the use of external services, and I would appreciate having a solution which works offline and also allows tinkering with the models.

Most of the existing plugins seem to be thin wrappers around Ollama, which is nice and keeps setup straightforward, but it also means there are many near-identical options to choose from. Which solutions do people like?
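(For context on why the plugins are so interchangeable: the core of what they do is a single HTTP call to Ollama's local /api/generate endpoint. A rough sketch of that call in Python, with the model name and prompt just as examples:)

```python
import json
import urllib.request

# Ollama's default local endpoint
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> bytes:
    """Build the JSON body that /api/generate expects.

    stream=False asks for one complete JSON response instead of
    a stream of partial chunks.
    """
    return json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode("utf-8")

def complete(model: str, prompt: str) -> str:
    """Send one completion request and return the generated text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (assumes Ollama is running and the model is pulled):
# print(complete("codellama:7b", "def fizzbuzz(n):"))
```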

Continue.dev seems closest to what I want, but I ran into an issue with its Ollama integration where it would exhaust the TCP connection pool(?) when starting up the server :(

beeburrt 14 hours ago

Oooh, this is a good question. As someone always looking for new and/or better ways of learning, this idea is appealing. Sorry, I don't have an answer; hopefully, you get some good responses. I'm commenting to more easily find this in the future.