If I'm writing a new lightweight application that requires LLM-based text completion to power a feature, is there a standard way to request the user's operating system to provide a completion?
For instance, imagine I'm writing a small TUI for browsing JSONL files and want to add a natural-language query feature. Is there an emerging, implementation-agnostic standard for a request like "Translate this natural-language query to jq: {natlang-query}" that hands back the completion?
If we don't have this yet, what would it take to get this built and broadly available?
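To make the feature concrete, the prompt the TUI would send to whatever completion backend is available might look like the sketch below. This is a hypothetical helper, not an existing standard; the wording of the template is my own.

```python
def build_jq_prompt(natlang_query: str) -> str:
    """Build the prompt a TUI could hand to a completion backend.

    Hypothetical sketch: there is no standard prompt format for
    natural-language-to-jq translation; this just shows the shape
    of the request the question describes.
    """
    return (
        "Translate this natural-language query into a jq filter. "
        "Reply with only the jq expression, no explanation.\n"
        f"Query: {natlang_query}\n"
        "jq filter: "
    )
```

The "reply with only the jq expression" instruction matters in practice: the TUI wants something it can pass straight to jq, not prose with the filter embedded in it.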
This is, of course, cross-platform and works both with models accessible through an API and with local ones.
I'm afraid this might not solve your problem, though: it's not an out-of-the-box solution. The user has to either provide their own API key or install Ollama and wire it up themselves.
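To show what that wiring looks like in practice: both hosted providers and a local Ollama server expose an OpenAI-compatible chat-completions endpoint, so a small client can stay implementation-agnostic by varying only the base URL and key. This is a stdlib-only sketch; the base URL and model name are placeholders the user would supply.

```python
import json
import urllib.request


def build_completion_request(base_url, model, prompt, api_key=None):
    """Build an HTTP request for an OpenAI-compatible
    /chat/completions endpoint. The same code works against a
    hosted provider (with an API key) or a local Ollama server
    (base_url "http://localhost:11434/v1", no key needed)."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    headers = {"Content-Type": "application/json"}
    if api_key:
        headers["Authorization"] = f"Bearer {api_key}"
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/chat/completions",
        data=json.dumps(payload).encode(),
        headers=headers,
        method="POST",
    )


# Usage (actual network call, not run here):
# req = build_completion_request("http://localhost:11434/v1", "llama3",
#                                "Translate to jq: show all names")
# with urllib.request.urlopen(req) as resp:
#     reply = json.load(resp)["choices"][0]["message"]["content"]
```

The point is that the only per-backend configuration is the base URL and the optional key, which is exactly the part the user currently has to set up by hand; an OS-level standard would amount to making that discovery automatic.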