
I was hoping I could run my LLM CLI tool against Ollama via their localhost API, but it looks like they don't offer an OpenAI-compatible endpoint yet.

If they add that, it will work out of the box: https://llm.datasette.io/en/stable/other-models.html#openai-...

Otherwise someone would need to write a plugin for it, which would probably be pretty simple. I imagine it would look a bit like the llm-mistral plugin, adapted for the Ollama API design: https://github.com/simonw/llm-mistral/blob/main/llm_mistral....
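For reference, here is a rough sketch of what talking to Ollama's local API could look like from Python, assuming the default port 11434 and the /api/generate endpoint; the model name "llama2" and the helper names are just illustrative, and a real plugin would wrap this in LLM's plugin interface rather than call urllib directly:

```python
import json
import urllib.request

OLLAMA_HOST = "http://localhost:11434"  # Ollama's default local address (assumption)


def build_payload(prompt, model="llama2"):
    """Build the JSON body for Ollama's /api/generate endpoint."""
    # stream=False asks for one complete JSON response instead of streamed chunks
    return {"model": model, "prompt": prompt, "stream": False}


def generate(prompt, model="llama2"):
    """POST the prompt to the local Ollama server and return the response text."""
    req = urllib.request.Request(
        f"{OLLAMA_HOST}/api/generate",
        data=json.dumps(build_payload(prompt, model)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

A plugin would mostly be glue mapping LLM's prompt/response objects onto requests like this one.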



