We detected 356 companies using LiteLLM. The most common industry is Software Development (23%), and the most common company size is 11-50 employees (30%). We identify these companies by discovering internal subdomains and mining certificate transparency logs.
Who usually uses LiteLLM and for what use cases?
Source: Analysis of job postings that mention LiteLLM (using the Bloomberry Jobs API)
Job titles that mention LiteLLM
Based on an analysis of job titles from postings that mention LiteLLM.
Job Title                      Share
Backend Engineer               31%
Machine Learning Engineer      21%
AI Engineer                    18%
DevOps/Platform Engineer       12%
I found that Backend Engineers (31%) and Machine Learning Engineers (21%) dominate LiteLLM adoption, followed by AI Engineers (18%) and DevOps/Platform Engineers (12%). The buyers are primarily engineering leaders building AI infrastructure at scale. Directors of AI Development, Heads of Data Science, and VPs of Engineering are hiring teams to create what multiple postings call an "AI Platform" or "AI Gateway" that centralizes model access, cost control, and observability. Their strategic priority is moving from AI experiments to production-grade systems that serve thousands of users.
Day-to-day users are hands-on engineers implementing LiteLLM as their routing layer for multi-model architectures. They're building RAG pipelines, agentic workflows, and LLM-powered applications while using LiteLLM to manage "model routing, cost control, load balancing, and failover mechanisms." Practitioners integrate it with frameworks like LangChain and LangGraph, implement "caching, rate/cost optimization, and governance across Azure OpenAI and other providers," and ensure their AI services are "scalable, secure, and high-performance."
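The routing-and-failover pattern these postings describe can be shown in miniature with plain Python. This is an illustrative sketch, not LiteLLM's actual Router API: the `call_model` callable and the model names are hypothetical stand-ins for real provider calls.

```python
# Sketch of a multi-model routing layer with failover, in the spirit of
# the architectures these teams build. `call_model` is a hypothetical
# stand-in for a real provider call; model names are illustrative.

def route_with_failover(prompt, models, call_model):
    """Try each model in priority order; return (model, reply) on first success."""
    errors = {}
    for model in models:
        try:
            return model, call_model(model, prompt)
        except Exception as exc:  # a production router would narrow this
            errors[model] = exc   # record the failure and fall through
    raise RuntimeError(f"all models failed: {errors}")

# Usage: the primary provider times out, so the call fails over.
def flaky_call(model, prompt):
    if model == "gpt-4o":
        raise TimeoutError("provider timeout")
    return f"{model}: answer to {prompt!r}"

used, reply = route_with_failover("hello", ["gpt-4o", "claude-sonnet"], flaky_call)
```

The design point is that the application code never learns which provider answered; centralizing that decision is exactly what makes cross-provider cost control and governance possible.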
The core pain point is operational complexity at scale. Companies want to "optimize inference costs," achieve "reliability, scalability, and performance," and provide "secure and scalable usage of AWS services" for AI workloads. One posting emphasized building "the foundational systems that enable self-service AI-powered software engineering" while another sought engineers to "drive cost optimization: model selection, caching, token budgeting, and request batching at scale." These organizations are moving beyond prototypes to enterprise-grade AI infrastructure that balances innovation speed with governance and cost control.
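The "token budgeting" idea from that posting reduces to simple arithmetic: a per-model price table, a cost per call, and a hard spending cap. A minimal sketch, assuming made-up prices (these are not real provider rates) and hypothetical model names:

```python
# Illustrative per-1K-token prices in USD (made up, not actual rates).
PRICE_PER_1K = {
    "cheap-model":   {"input": 0.0005, "output": 0.0015},
    "premium-model": {"input": 0.01,   "output": 0.03},
}

def call_cost(model, input_tokens, output_tokens):
    """Dollar cost of one call, from the price table."""
    p = PRICE_PER_1K[model]
    return (input_tokens * p["input"] + output_tokens * p["output"]) / 1000

class TokenBudget:
    """Reject calls once cumulative spend would exceed the cap."""
    def __init__(self, cap_usd):
        self.cap_usd = cap_usd
        self.spent = 0.0

    def charge(self, model, input_tokens, output_tokens):
        cost = call_cost(model, input_tokens, output_tokens)
        if self.spent + cost > self.cap_usd:
            raise RuntimeError("budget exceeded")
        self.spent += cost
        return cost

budget = TokenBudget(cap_usd=1.00)
budget.charge("cheap-model", 2000, 500)  # 0.00175 USD, well under the cap
```

Model selection then becomes the same calculation run in reverse: route to the cheapest model whose projected cost fits the remaining budget.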
What types of companies use LiteLLM?
Source: Analysis of LinkedIn bios of 356 companies that use LiteLLM
Company Characteristics
Shows how much more likely LiteLLM customers are to have each trait compared to all companies. For example, 2.0x means customers are twice as likely to have that characteristic.
Trait                                          Likelihood
Industry: Information Technology & Services    13.7x
Industry: Software Development                 13.6x
Funding Stage: Seed                            11.6x
Industry: IT Services and IT Consulting        8.6x
Country: Brazil                                7.0x
Company Size: 1,001-5,000                      6.4x
I noticed something surprising analyzing a 38-company sample of these LiteLLM users: there's no single typical customer profile. These companies span an enormous range, from a 3-person French nonprofit promoting ecological transition to TAL Education Group, a $3.3B publicly traded Chinese edtech giant with 10,000+ employees. They include furniture retailers in Poland, cable manufacturers in China, broadband providers in Oklahoma, and biotechnology incubators in Cambridge.
The maturity levels vary wildly. I see seed-stage startups like 1DigitalStack ($1.4M raised) and 20Seconds ($110K angel funding) alongside post-IPO giants and established family businesses like Gustav Selter GmbH, now in its 6th generation since 1829. Employee counts range from 1 to 10,000+. Many have no disclosed funding at all, suggesting bootstrapped operations or private companies that don't share financial data publicly.
What other technologies do LiteLLM customers also use?
Source: Analysis of tech stacks from 356 companies that use LiteLLM
Commonly Paired Technologies
Shows how much more likely LiteLLM customers are to use each tool compared to the general population. For example, 287x means customers are 287 times more likely to use that tool.
I noticed that companies using LiteLLM are building serious AI infrastructure with a strong emphasis on observability and operational control. The combination of Langfuse for LLM monitoring, Grafana for metrics, and Argo CD for deployments tells me these are engineering-led organizations treating AI as production infrastructure, not experimental side projects. They're likely product-led companies building AI features into their core offerings, or they're selling AI automation tools directly to other businesses.
The pairing with N8N is particularly revealing. N8N is a workflow automation platform, which suggests these companies are either building AI-powered automation products or using LiteLLM to add intelligence to existing workflow tools. The extremely high correlation with Langfuse makes perfect sense because when you're routing LLM calls through a proxy like LiteLLM, you need deep observability into costs, latency, and quality. Meanwhile, Argo CD appearing so frequently indicates these teams are running Kubernetes-based deployments and treating their AI infrastructure with the same GitOps rigor as their other services.
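That observability need is easy to see in miniature: any proxy layer can time each call and record model, latency, and payload sizes, which is roughly the span data a tool like Langfuse stores. A hypothetical sketch (the `call_model` stub and trace fields are illustrative, not Langfuse's actual schema):

```python
import time

def traced_call(model, prompt, call_model, traces):
    """Wrap a model call and append a trace record (a Langfuse-style span).

    `call_model` is a hypothetical provider call; the trace fields here
    are illustrative, not any real tool's schema.
    """
    start = time.perf_counter()
    reply = call_model(model, prompt)
    traces.append({
        "model": model,
        "latency_s": time.perf_counter() - start,
        "prompt_chars": len(prompt),
        "reply_chars": len(reply),
    })
    return reply

# Usage with a stubbed provider call.
traces = []
reply = traced_call("demo-model", "ping", lambda m, p: "pong", traces)
```

Because every request already flows through the proxy, this single choke point is where cost, latency, and quality data get captured for free, which is why the proxy-plus-observability pairing is so common.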
The full stack screams product-led growth with technical buyers. These companies need AWS for scalable infrastructure, they're deploying continuously with Argo CD, and they're monitoring everything with Grafana. This isn't a sales-led motion where you're buying enterprise software through procurement. These are engineering teams evaluating tools, running proof of concepts, and making bottom-up adoption decisions. They're likely Series A to Series C startups or forward-thinking engineering teams at larger companies, mature enough to need production-grade observability but still moving fast enough to adopt newer tools.