
Simon Willison's LLM CLI tool now supports self-hosted language models via plugins

LLM is my command-line utility and Python library for working with large language models such as GPT-4. I just released version 0.5 with a huge new feature: you can now install plugins that add support for additional models to the tool, including models that can run on your own hardware.
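
As a rough sketch of what that plugin mechanism looks like in practice (not an excerpt from the post): assuming LLM 0.5 is installed with pip install llm and a local-model plugin such as llm-gpt4all has been added with llm install llm-gpt4all, a prompt can be run against a self-hosted model from Python. The model ID below is a placeholder; the IDs actually available depend on the installed plugins and can be listed with llm models list.

    import llm

    # Ask a plugin-provided local model for a completion.
    # "orca-mini-3b" is a placeholder ID; substitute one reported by `llm models list`.
    model = llm.get_model("orca-mini-3b")
    response = model.prompt("Five creative names for a pet pelican")
    print(response.text())

The equivalent command-line invocation is llm -m followed by the model ID and the prompt text.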

@AutoTLDR

1 comment
  • TL;DR: (AI-generated 🤖)

    The text announces the release of version 0.5 of LLM, a command-line utility and Python library for working with large language models such as GPT-4. The new feature allows users to install plugins that add support for additional models, including models that can run on their own hardware. The post provides instructions for installing LLM and its plugins, along with examples of running prompts against different models. It also mentions a tutorial on building new plugins, a Python API for running prompts, and support for continuing conversations across multiple prompts. The author outlines plans to add support for OpenAI functions and to develop a web interface, with plugins providing new interfaces for interacting with language models.
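
To make the Python API and conversation features mentioned in the summary concrete, here is a minimal hedged sketch; it assumes the llm.get_model() and model.conversation() interface described in the LLM documentation and an OpenAI API key (configurable with llm keys set openai or, as here, via the model's key attribute). None of this code appears in the original post.

    import llm

    # Use an OpenAI-hosted model; the key can also be configured with `llm keys set openai`.
    model = llm.get_model("gpt-3.5-turbo")
    model.key = "sk-..."  # placeholder, not a real key

    # A conversation object keeps context across multiple prompts.
    conversation = model.conversation()
    print(conversation.prompt("Suggest a name for a command-line LLM tool").text())
    print(conversation.prompt("Now make it shorter").text())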