Hacker News

Sounds great.

Does this run with VSCode and how hard is it to set this up?




Yes, the VSCode extension is a one-click install, and so is Ollama, a separate project that provides local inference.

You'll then have to download a model, which Ollama makes very easy. Which one to choose will depend on your hardware, but the biggest Qwen2.5-Coder you can fit is a very solid starting place. It's not ready for your grandma, but it's easy enough that I'd trust a junior dev to get it done.
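A minimal sketch of the Ollama side of that setup, assuming a 7B variant fits your hardware (the exact tag and size are an assumption, not from the comment; check Ollama's model library for what's available):

```shell
# Install Ollama (macOS/Linux convenience script from ollama.com),
# then pull a Qwen2.5-Coder variant sized for your machine.
# Smaller tags (e.g. 1.5b) run on modest hardware; larger ones need more RAM/VRAM.
ollama pull qwen2.5-coder:7b

# Quick smoke test: run a one-off prompt against the local model.
ollama run qwen2.5-coder:7b "Write a Python function that reverses a string."
```

Once the model is pulled, the Continue extension can be pointed at the local Ollama server in its settings; which model tag you reference there is up to you.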


What's the extension name?


Continue; I talk about it at length in the GP post.


Ah, thanks.

I just read the parent post, lol.



