Does this run with VSCode and how hard is it to set this up?

yes, the vscode extension is a one-click install, and so is ollama, which is a separate project that provides local inference
you'll then have to download a model, which ollama makes very easy. choosing which one depends on your hardware, but the biggest Qwen2.5-Coder variant you can fit in memory is a very solid starting place.
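for reference, the ollama side looks roughly like this. the model tags and VRAM numbers below are ballpark examples, not a definitive guide, so check the ollama model library for what's current and what your machine can actually hold:

```
# linux one-liner; on macOS/windows use the installer from ollama.com
curl -fsSL https://ollama.com/install.sh | sh

# pull a model sized to your hardware (example tags;
# roughly, 7b fits in ~8GB of VRAM, 32b wants 24GB+)
ollama pull qwen2.5-coder:7b

# quick smoke test in the terminal before wiring up the editor
ollama run qwen2.5-coder:7b "write hello world in python"
```

once that answers in the terminal, the vscode extension just needs to be pointed at the local ollama instance.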
it's not ready for your grandma, but it's easy enough that I'd trust a junior dev to get it done