Suppose I’ve written code that calls the OpenAI API. Is there a library that helps me easily switch to a local or other LLM? I.e., a library that (ideally) exposes the same OpenAI interface for several models, or failing that, at least a single consistent interface.
OpenLLM plans to provide an OpenAI-compatible API, which lets you keep using OpenAI's Python client to talk to OpenLLM — you just need to change the base URL to point at your OpenLLM server. This feature is currently a work in progress.
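Once that lands, switching would look roughly like the sketch below with the OpenAI Python SDK (v1.x). The host, port, and model name are placeholders here — use whatever your OpenLLM server actually exposes, and note that local servers typically ignore the API key:

```python
from openai import OpenAI

# Point the standard OpenAI client at a local OpenAI-compatible server.
# The URL and model name are assumptions; adjust them to your setup.
client = OpenAI(
    base_url="http://localhost:3000/v1",  # hypothetical OpenLLM server address
    api_key="not-needed",                 # placeholder; local servers usually don't validate it
)

response = client.chat.completions.create(
    model="llama-2-7b-chat",  # whichever model your server is serving
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```

The nice part of this approach is that the rest of your code stays unchanged — only the client construction differs between OpenAI and the local server.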