Compare tools for running AI models locally on your own hardware. From desktop apps to inference frameworks — run LLMs privately without cloud APIs.
Desktop app for discovering, downloading, and running local LLMs. Beautiful UI with built-in model search and chat interface.
Open-source ChatGPT alternative that runs 100% offline. Desktop app with local model management and OpenAI-compatible API.
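An "OpenAI-compatible API" means the local server accepts the same `/v1/chat/completions` request shape as OpenAI's cloud API, so existing client code can point at localhost instead. A minimal sketch of that request body follows; the base URL and model name are placeholders, since the port and model identifier vary by tool and configuration.

```python
import json

# Placeholder endpoint -- the actual host/port depends on which local
# runner you use and how it is configured.
BASE_URL = "http://localhost:8080/v1"

def build_chat_request(model, prompt):
    """Build the JSON body expected by a /chat/completions endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

body = build_chat_request("local-model", "Hello!")
# POSTing json.dumps(body) to BASE_URL + "/chat/completions" would return
# a completion, assuming the local server is running.
print(json.dumps(body))
```

Because the request shape is standard, any OpenAI client library can usually be reused by overriding its base URL.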
The foundational C/C++ inference engine that powers most local LLM tools. Highly optimized for CPU and GPU inference with quantization support.
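Quantization shrinks model weights by storing them at lower precision, trading a small accuracy loss for much lower memory use. A toy illustration of the idea, using simple symmetric int8 quantization (real engines use more sophisticated block-wise schemes):

```python
def quantize_int8(weights):
    """Symmetric int8 quantization: scale floats into [-127, 127]."""
    scale = max(abs(w) for w in weights) / 127 or 1.0
    return [round(w / scale) for w in weights], scale

def dequantize(quantized, scale):
    """Recover approximate floats from the int8 values."""
    return [q * scale for q in quantized]

w = [0.5, -1.0, 0.25]
q, s = quantize_int8(w)
approx = dequantize(q, s)
# Each recovered value is within half a quantization step of the original.
```

Each weight now fits in one byte instead of four, which is why quantized models run on consumer hardware.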
Self-hosted web interface for running LLMs. Supports Ollama and OpenAI-compatible backends. Extensible with plugins and tools.
We track 8 local LLM runner tools. Ollama is among the most popular options. See our full directory above for pricing, features, and open-source status of each tool.
Yes, several local LLM runners offer free tiers or are fully open-source. Check the pricing column in our directory to find free and open-source options.
Consider your specific needs: budget (free vs. paid), whether an open-source license matters to you, integration requirements, and the specific AI models you want to run. Our directory includes feature highlights to help you compare.