Ultra-fast inference on custom LPU hardware. Extremely low latency for supported models.
Groq is an inference provider. All models listed here can be accessed through Groq via OpenRouter's unified API.
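As a sketch of what that access looks like, the request below targets OpenRouter's OpenAI-compatible chat completions endpoint and uses the `provider` routing field to prefer Groq. The model slug and API key are illustrative placeholders, not specifics from this page.

```python
import json
import urllib.request

API_KEY = "YOUR_OPENROUTER_API_KEY"  # placeholder, not a real key

payload = {
    # Example model slug; substitute any Groq-served model listed here.
    "model": "meta-llama/llama-3.1-8b-instruct",
    "messages": [{"role": "user", "content": "Hello!"}],
    # OpenRouter provider routing: prefer Groq and do not fall back
    # to other providers if it is unavailable.
    "provider": {"order": ["Groq"], "allow_fallbacks": False},
}

request = urllib.request.Request(
    "https://openrouter.ai/api/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)

# Sending the request needs a valid key; with one set, the call is:
# with urllib.request.urlopen(request) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
print(json.dumps(payload, indent=2))
```

Because the endpoint is OpenAI-compatible, the same payload also works with the official OpenAI SDK by pointing its base URL at `https://openrouter.ai/api/v1`.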
Groq currently offers 318 AI models accessible via API. These span various sizes and specializations to fit different use cases and budgets.
Pricing varies by model. See the pricing table above, sorted by cost with the cheapest option at the top. Some models may have free tiers available.
Use our provider comparison tool to see how Groq's pricing stacks up against competitors like OpenAI, Anthropic, and Google across similar model tiers.