We are! There’s a bunch of features we’d like to add, and for the most part we expect to be moving on to other projects (so no promises on when we’ll get to it), but we do absolutely want to add support for other models.
I’d like to be able to try it out with locally hosted server endpoints, and those are OpenAI-compatible (as open-source model providers generally are), so that’s probably the quickest to implement if I’m not missing something about the networking.
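To illustrate what “OpenAI-compatible” buys you here, a rough stdlib-only sketch: the request body and route are the same JSON shape the OpenAI API uses, so pointing at a local server is mostly a matter of swapping the base URL. The localhost port and model name below are made up; any compatible server (llama.cpp’s server, Ollama, etc.) exposes the same `/v1/chat/completions` route.

```python
import json
from urllib import request

# Hypothetical local endpoint -- port and path prefix depend on your server.
BASE_URL = "http://localhost:8080/v1"

# Same payload shape as the hosted OpenAI API.
payload = {
    "model": "local-model",  # model name is whatever the local server registers
    "messages": [
        {"role": "user", "content": "Hello!"},
    ],
}

req = request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        # Local servers usually ignore the key but some clients require one.
        "Authorization": "Bearer not-needed-locally",
    },
)

# request.urlopen(req) would return the usual OpenAI-style JSON response;
# left out here since it needs a server actually listening on localhost.
print(req.get_full_url())
```

So if the client’s networking layer just takes a configurable base URL, local endpoints should come along nearly for free.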