
Flower

Train AI on distributed data

Flower is an open-source framework for training AI on distributed data using federated learning. Companies like Banking Circle, Nokia, Porsche, and Brave use Flower to easily improve their AI models on sensitive data that is distributed across organizational silos or user devices. Almost all AI today is based on centralized public data — a small fraction of the data we have; we believe that training on orders of magnitude more data will unlock the next leaps in AI.

Flower
Founded: 2020
Team Size: 5
Location:
Group Partner: Nicolas Dessaigne

Active Founders

Daniel J. Beutel

Daniel is one of the creators of Flower, the first fully agnostic federated learning framework, which is now being used at many Fortune 500 companies and most top universities worldwide. He previously held roles as Head of AI and CTO and has considerable experience in running and scaling engineering teams. Daniel is a CS PhD candidate at the University of Cambridge and has an MSc (with distinction) in Software Engineering from the University of Oxford.


Nic Lane

Nic Lane is a co-founder of Flower Labs and has been working on Flower since its creation. Nic is also a full Professor in Machine Learning Systems at the University of Cambridge. Previously, he was on the faculty of Oxford and UCL. Nic additionally has 12 years of experience in industrial research (including Microsoft Research and Nokia Bell Labs), most recently as co-founder and Head of the Samsung AI Center in Cambridge, a 50-person AI lab started in 2018.


Taner Topal

Taner is one of the creators of Flower, the first fully agnostic federated learning framework, which is now being used at many Fortune 500 companies and most top universities worldwide. He previously held roles as Head of Engineering and CTO in two prior startups. Some companies using products he helped build include Porsche, Lufthansa, and Vattenfall. Taner is also a Visiting Researcher at the University of Cambridge.


Company Launches

Hi everyone - we're Daniel, Taner, and Nic, and we're excited to introduce Flower:

TL;DR 🌼 Flower is an open-source framework for training AI on distributed data using federated learning. Companies like Brave, Banking Circle, and Nokia use us to improve their models with sensitive data that they could not leverage before.

→ Before you continue: give us a star on GitHub ⭐️

Problem: What's holding AI back?

Almost all of the AI breakthroughs you know of — from ChatGPT and Google Translate to DALL·E and Stable Diffusion — were trained with public data available on the web. Conventional machine learning needs all data collected in a central place; the motto has always been to “move the data to the computation.” This approach has given us some impressive results, but we are seeing those advances struggle to make their way into other important areas, like healthcare.

Why is that? The reason is simple: most data is sensitive and distributed across organizational silos or user devices. It cannot be collected in a central place, which prevents us from training state-of-the-art models for many important use cases:

  • 👩‍🎨 Generative AI: Many scenarios require sensitive data that users or organizations are reluctant to upload to the cloud.
  • ❤️‍🩹 Healthcare: We have excellent AI model architectures, and we could train cancer detection models better than any doctor in the world, but no single organization has enough data.
  • 🏦 Finance: Preventing financial fraud is hard because individual banks face data regulations, and in isolation, they don't have enough fraud cases to train good models.
  • 🏎️ Automotive: Autonomous driving could transform our lives, but, again, individual car makers struggle to gather the data to cover the long tail of possible edge cases.
  • 💻 Personal computing: Users don't want certain kinds of data to be stored in the cloud, hence the recent success of privacy-enhancing alternatives like the Signal messenger or the Brave browser.
  • 📚 Foundation models: The more data, and the more diverse data, we have to train foundation models, the better they generalize. But again, most data is sensitive and thus can't be incorporated, even as these models grow bigger and need ever more data.

Solution: Federated learning (and other PETs)

Federated learning can train AI models on distributed and sensitive data by moving the training to the data (instead of moving the data to the training); it just collects the insights from the learning process, and the data stays where it is. And because the data never moves, we can train AI on sensitive data spread across organizational silos or user devices to improve models with data that could never be leveraged until now.

This is, of course, more challenging: we must move AI models to data silos or user devices, train locally, send updated models back, aggregate them, and repeat. Flower provides the open-source infrastructure to easily use federated learning (and other privacy-enhancing technologies, or PETs for short) with all the tools you know and love today — PyTorch, TensorFlow, JAX, Hugging Face, fastai, Weights & Biases. Just bring your existing project and easily “federate” it using Flower.
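The train-locally-then-aggregate loop described above is the core of federated averaging (FedAvg). Here is a minimal, dependency-free sketch of a single round; the function names and the toy "training" step are illustrative, not the Flower API — in Flower, local training and aggregation are handled by its client and server abstractions.

```python
# One round of federated averaging (FedAvg), sketched in pure Python.
# The model is a flat list of weights; each simulated client "trains"
# locally and returns updated weights plus its sample count. The server
# never sees raw data, only weight updates.

def local_train(weights, num_samples, shift):
    # Stand-in for local training: nudge every weight by a client-specific
    # delta. A real client would run SGD on its private data instead.
    return [w + shift for w in weights], num_samples

def fedavg(client_results):
    # Weighted average of client updates: sum(n_i * w_i) / sum(n_i),
    # so clients with more data contribute proportionally more.
    total = sum(n for _, n in client_results)
    num_weights = len(client_results[0][0])
    return [
        sum(w[i] * n for w, n in client_results) / total
        for i in range(num_weights)
    ]

global_weights = [0.0, 0.0]
# Three clients with different data sizes; their raw data never leaves them.
results = [
    local_train(global_weights, num_samples=100, shift=1.0),
    local_train(global_weights, num_samples=300, shift=2.0),
    local_train(global_weights, num_samples=100, shift=3.0),
]
global_weights = fedavg(results)
print(global_weights)  # → [2.0, 2.0]
```

A real deployment repeats this round many times, with the server broadcasting the new global weights back to clients between rounds; Flower adds the networking, scheduling, and framework integrations on top of this basic loop.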

Asks: How can you help?