
text-generation-webui
Local LLM interface with text, vision, and training
Text-generation-webui is the Swiss Army knife for running LLMs locally. One web interface, every backend — llama.cpp, ExLlamaV2, Transformers, AutoGPTQ. Load a model, chat with it, fine-tune it, run it as an API. It's the Gradio-powered cockpit for local AI.
If you want to experiment with open-weight models without touching a command line, this is your starting point. It supports model quantization, LoRA training, multimodal input, and OpenAI-compatible API endpoints. Ollama is simpler but less flexible — great for quick inference, not for training. LM Studio has a prettier UI but is closed-source. vLLM is faster for production serving but has no UI.
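Because the API is OpenAI-compatible, any OpenAI-style client code can talk to a locally loaded model. Below is a minimal sketch using only the Python standard library; it assumes the UI was launched with the `--api` flag, which by default exposes endpoints at `http://127.0.0.1:5000/v1`. The helper names `build_payload` and `chat` are illustrative, not part of the project.

```python
import json
from urllib import request

# Assumption: text-generation-webui was started with --api, serving an
# OpenAI-compatible endpoint at the default local port 5000.
API_URL = "http://127.0.0.1:5000/v1/chat/completions"


def build_payload(prompt: str, max_tokens: int = 200) -> dict:
    """Build an OpenAI-style chat completion request body."""
    return {
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
        "temperature": 0.7,
    }


def chat(prompt: str) -> str:
    """POST the request to the local server and return the reply text."""
    body = json.dumps(build_payload(prompt)).encode("utf-8")
    req = request.Request(
        API_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]
```

Note that no model name is required in the payload: the server routes every request to whichever model is currently loaded in the UI.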
Best for tinkerers and indie hackers who want full control over their AI stack without cloud API bills.
The catch: it's AGPL-3.0, so building a commercial product on top requires care. Setup can be finicky — GPU drivers, CUDA versions, and Python dependencies love to conflict. And performance won't match purpose-built inference servers like vLLM or TGI for production workloads.
About
- Stars: 46,358
- Forks: 5,901