
Deploy Litellm

Call all LLM APIs using the OpenAI format: Bedrock, Azure, OpenAI, Cohere, Anthropic, Ollama, SageMaker, HuggingFace, Replicate, Groq, and 100+ more LLMs.

AI · 3 services

Stack Overview

Services: 3
Total CPU: 1 Core
Total Memory: 1.0 GB

web: litellm (ghcr.io/berriai/litellm-database:main-stable)
worker: postgres (postgres:16-alpine)
worker: redis (redis:7-alpine)
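To make the wiring concrete, here is an illustrative Docker Compose sketch of the three services above. The service names, ports, credentials, and environment variables are assumptions for the sketch, not TinyPod's actual generated configuration:

```yaml
# Illustrative only: how litellm, postgres, and redis could be wired together.
# Credentials and variable names below are placeholders.
services:
  litellm:
    image: ghcr.io/berriai/litellm-database:main-stable
    ports:
      - "4000:4000"
    environment:
      DATABASE_URL: postgresql://litellm:changeme@postgres:5432/litellm
      REDIS_HOST: redis
    depends_on:
      - postgres
      - redis
  postgres:
    image: postgres:16-alpine
    environment:
      POSTGRES_USER: litellm
      POSTGRES_PASSWORD: changeme
      POSTGRES_DB: litellm
  redis:
    image: redis:7-alpine
```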

Deploy Litellm on TinyPod

$5/server/mo

Pre-configured. One-click deploy. No DevOps.

Deploy Litellm

Why Self-Host Litellm?

Litellm is a pre-configured AI stack that you can deploy to your own server in seconds with TinyPod. The template includes 3 services (litellm, postgres, redis), all wired together and ready to use.

Self-hosting means your data stays on your server, you control updates and access, and there are no per-user fees or vendor lock-in. TinyPod handles the infrastructure so you can focus on using the software.
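The "OpenAI format" mentioned above means every provider is called with the same request shape, and only the model name changes. A minimal sketch of that payload, with a hypothetical helper and an illustrative proxy URL (not your actual deployment address):

```python
# Build an OpenAI-style chat completion request. A self-hosted LiteLLM
# proxy accepts this same payload regardless of the backing provider;
# only the "model" string changes (e.g. "bedrock/...", "ollama/...").
def build_chat_request(model: str, prompt: str) -> dict:
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

req = build_chat_request("claude-3-haiku", "Hello!")
# The payload would be POSTed to the proxy's OpenAI-compatible endpoint,
# e.g. (URL and key are placeholders):
#   requests.post("https://your-stack.example.com/v1/chat/completions",
#                 json=req, headers={"Authorization": "Bearer sk-..."})
```

Because the request shape is uniform, swapping providers is a one-line change to the model string rather than a client rewrite.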

How to Deploy Litellm

1

Create a TinyPod account

Sign up for free and get a 7-day trial with full access.

2

Select the Litellm template

Find Litellm in our template library and click to configure.

3

Customize resources

Adjust CPU and RAM for each service, or keep the defaults. Name your project and hit deploy.

4

Stack is live

All 3 services start together with networking pre-configured. Get a free subdomain with automatic SSL.

Similar AI Templates