
Proxis

The platform for enterprise AI agent automations, starting with email.

Proxis is the AI automation platform built for enterprises. Proxis is creating AI email agents that empower brands and enterprise teams in the creator economy to close deals automatically. Think: your inbox, on autopilot. We initially fine-tuned and distilled LLMs for enterprise use and quickly realized that companies need more than smarter models: they need full-stack automation. So Proxis built AI agents that handle the tedious, time-consuming work, from autonomous sales closers to customer service agents. The mission is bold: to build the world’s most capable team of AI agents for every enterprise. Proxis imagines a future where a single founder can run a billion-dollar company, backed by a custom-built army of agents. Proxis is rethinking how agents are built, trained, and deployed from the ground up.
Active Founders

Liam Collins, Founder

CEO and cofounder of Proxis. I built zero-to-one systems as a software engineer for two years and, before that, was an investment banker in renewable energy. I received my bachelor’s degree from Columbia University and City University of Hong Kong through a dual-degree program, and left my MBA at Wharton to build Proxis.
Proxis
Founded: 2024
Batch: Summer 2024
Team Size: 1
Status: Active
Primary Partner: David Lieb
Company Launches
Proxis - the first dedicated platform for LLM distillation and serving

TL;DR: Proxis is the first dedicated platform for LLM distillation and serving, unlocking production-ready models at 1/10th of the cost.

The Problem

  • Fine-tuning frontier models like GPT-4 on proprietary data is expensive and locks customers into a single external closed-source model provider.
  • These large models cost roughly 100x more per token than smaller Llama 3.1 models and run nearly 10x slower, making them impractical to deploy at scale.
  • Closed-source lock-in means customers can’t tweak or tune their own model, or deploy on-prem for sensitive-data workloads.

The Solution

Model Distillation:

  • In model distillation, a large teacher model effectively fine-tunes a smaller student model without the need for labeled or structured datasets: the student learns to match the teacher’s outputs directly. This ‘condenses’ large models down to the compute cost of small models (see the sketch after this list).
  • Distillation yields near-frontier model quality with far cheaper inference: roughly 5x the speed at 1/10th the cost.
  • With the release of Llama 3.1 405B in late July 2024, open-source, frontier-scale distillation became available to the public for the first time.
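
To make the mechanism above concrete, here is a minimal sketch of logit-based distillation in PyTorch: a frozen teacher produces softened next-token distributions on unlabeled text, and the student is trained to match them with a temperature-scaled KL divergence, so no labeled dataset is required. The tiny placeholder models, vocabulary size, temperature, and random batches are illustrative assumptions, not Proxis’s actual pipeline.

import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, DIM_TEACHER, DIM_STUDENT, TEMP = 1000, 256, 64, 2.0

class TinyLM(nn.Module):
    """Stand-in for a causal LM head: token ids -> next-token logits."""
    def __init__(self, dim):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, dim)
        self.head = nn.Linear(dim, VOCAB)

    def forward(self, ids):
        return self.head(self.embed(ids))  # (batch, seq_len, VOCAB)

teacher = TinyLM(DIM_TEACHER).eval()   # frozen "large" model
student = TinyLM(DIM_STUDENT)          # small model being trained
optimizer = torch.optim.AdamW(student.parameters(), lr=1e-3)

for step in range(100):
    # Distillation only needs raw (unlabeled) token sequences.
    ids = torch.randint(0, VOCAB, (8, 32))
    with torch.no_grad():
        teacher_logits = teacher(ids)
    student_logits = student(ids)
    # Student matches the teacher's temperature-softened distribution (KL divergence).
    loss = F.kl_div(
        F.log_softmax(student_logits / TEMP, dim=-1),
        F.softmax(teacher_logits / TEMP, dim=-1),
        reduction="batchmean",
    ) * (TEMP ** 2)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

In practice the teacher would be a frontier-scale model such as Llama 3.1 405B and the student a much smaller open model, and the softened targets are often combined with a standard next-token loss.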

Growth & Market

  • Llama model downloads have grown 10x in the last year and are approaching 350 million downloads to date. There were 20 million downloads in the last month alone, making Llama the leading open source model family.
  • Llama usage by token volume across major cloud service providers more than doubled in the three months from May through July 2024, when Llama 3.1 was released.
  • Monthly usage (token volume) of Llama grew 10x from January to July 2024 for some of the largest cloud service providers.

Source: Meta

The Team

Jackson (CTO) optimized the Gemini model at Google for efficient deployment at massive scale. Liam (CEO) built zero-to-one systems as a software engineer.

The Ask

Give it a go! Sign up for our waitlist here to access Proxis-hosted Llama models at a lower cost than current offerings.
