Deploy LiteLLM
Call 100+ LLM APIs using the OpenAI format: Bedrock, Azure, OpenAI, Cohere, Anthropic, Ollama, SageMaker, Hugging Face, Replicate, Groq, and more.
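The key idea behind LiteLLM is that every provider is called with the same OpenAI-style request body; only the model string changes. A minimal sketch (the model names are illustrative, not a fixed list):

```python
import json

def openai_format_payload(model: str, prompt: str) -> dict:
    """Build a chat-completion request body in the OpenAI format.

    LiteLLM routes this same request shape to whichever provider
    backs the given model; nothing else in the payload changes.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# Swapping providers is just swapping the model string:
for model in ("gpt-4o", "anthropic/claude-3-5-sonnet", "ollama/llama3"):
    body = json.dumps(openai_format_payload(model, "Hello!"))
```

Because the shape is constant, any OpenAI-compatible client library can talk to the proxy unmodified.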
Stack Overview
This template runs the following images:

- ghcr.io/berriai/litellm-database:main-stable
- postgres:16-alpine
- redis:7-alpine

Why Self-Host LiteLLM?
LiteLLM is a pre-configured AI stack that you can deploy to your own server in seconds with TinyPod. The template includes three services — litellm, postgres, and redis — all wired together and ready to use.
Self-hosting means your data stays on your server, you control updates and access, and there are no per-user fees or vendor lock-in. TinyPod handles the infrastructure so you can focus on using the software.
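For a sense of how the three services fit together, here is a rough Docker Compose sketch of an equivalent stack. The environment variable names, credentials, and port follow common LiteLLM proxy conventions, but the exact wiring TinyPod generates may differ:

```yaml
services:
  litellm:
    image: ghcr.io/berriai/litellm-database:main-stable
    ports:
      - "4000:4000"               # LiteLLM proxy's default port
    environment:
      DATABASE_URL: postgresql://llmproxy:dbpassword@postgres:5432/litellm
      REDIS_HOST: redis           # used for caching / rate limiting
      REDIS_PORT: "6379"
      LITELLM_MASTER_KEY: sk-change-me   # placeholder — set your own
    depends_on:
      - postgres
      - redis

  postgres:
    image: postgres:16-alpine
    environment:
      POSTGRES_USER: llmproxy
      POSTGRES_PASSWORD: dbpassword      # placeholder — set your own
      POSTGRES_DB: litellm

  redis:
    image: redis:7-alpine
```

Postgres stores keys, users, and spend logs for the proxy; Redis backs caching and rate limiting.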
How to Deploy LiteLLM
Create a TinyPod account
Sign up for free and get a 7-day trial with full access.
Select the LiteLLM template
Find LiteLLM in our template library and click to configure.
Customize resources
Adjust CPU and RAM for each service, or keep the defaults. Name your project and hit deploy.
Stack is live
All 3 services start together with networking pre-configured. Get a free subdomain with automatic SSL.
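Once the stack is live, you can point any OpenAI-format client at your subdomain. A small stdlib-only sketch that builds such a request (the subdomain and key below are placeholders you would replace with your own):

```python
import json
import urllib.request

BASE_URL = "https://my-stack.example.com"  # placeholder: your TinyPod subdomain
API_KEY = "sk-change-me"                   # placeholder: your LiteLLM master key

def chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-format chat completion request for the LiteLLM proxy."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

req = chat_request("gpt-4o", "Hello!")
# Send it with: urllib.request.urlopen(req)
```

Because the endpoint speaks the OpenAI format, official OpenAI SDKs also work by setting their base URL to your subdomain.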