Federated learning in production (part 1)

May 30, 2025 • 44 min
🎧 Listen Now

🤖 AI Summary

Overview

This episode explores the practical applications of federated learning, a privacy-preserving machine learning technique, with insights from Patrick Foley, lead AI architect at Intel. The discussion covers real-world use cases, challenges in distributed AI experiments, and the role of frameworks like OpenFL and Flower in enabling secure, collaborative model training across silos. Special attention is given to healthcare applications, privacy concerns, and the future of federated learning in production environments.

Notable Quotes

- Federated learning allows us to train models without centralizing data, addressing privacy concerns and enabling collaboration across untrusted parties. – Patrick Foley, on the core value of federated learning.

- Generative AI has been a catalyst for unlocking siloed data, pushing federated learning forward as a method for machine learning at scale. – Patrick Foley, on the impact of Gen AI on federated learning adoption.

- Governance is the missing piece for federated learning at scale, especially in competitive environments where trust must be established. – Patrick Foley, on the challenges of scaling federated learning.

🧠 What is Federated Learning?

- Patrick Foley explains federated learning as a method where models are sent to local data sources for training, rather than centralizing data. This approach is ideal for scenarios with privacy concerns or massive datasets.

- Privacy-preserving techniques like mutual TLS, differential privacy, and homomorphic encryption are critical to ensuring data security during federated learning experiments.

- Federated learning frameworks, such as OpenFL and Flower, facilitate distributed training by managing model aggregation and communication between collaborators and aggregators (a minimal sketch of one aggregation round follows this list).
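
To make the collaborator/aggregator loop concrete, here is a minimal sketch of federated averaging (FedAvg) in plain Python/NumPy. It is not OpenFL's or Flower's API: the `local_train` logistic-regression step and the synthetic site data are hypothetical placeholders, and a real deployment would add authenticated channels (e.g., mutual TLS), a real model, and many more rounds.

```python
# Minimal FedAvg sketch (illustrative only; not OpenFL's or Flower's API).
# Each "collaborator" trains locally; the "aggregator" averages the resulting
# weights, weighted by how many local samples each site holds.
import numpy as np

def local_train(weights, local_data, lr=0.01):
    """Placeholder for a site's local training step (toy logistic regression)."""
    X, y = local_data
    preds = 1.0 / (1.0 + np.exp(-X @ weights))   # sigmoid predictions
    grad = X.T @ (preds - y) / len(y)            # gradient of the log-loss
    return weights - lr * grad

def federated_round(global_weights, sites):
    """One round: send global weights out, train locally, aggregate by sample count."""
    updates, counts = [], []
    for X, y in sites:                           # raw data never leaves the site
        updates.append(local_train(global_weights.copy(), (X, y)))
        counts.append(len(y))
    # Weighted average of the locally trained weights (FedAvg).
    return np.average(np.stack(updates), axis=0, weights=np.array(counts, dtype=float))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sites = [(rng.normal(size=(50, 3)), rng.integers(0, 2, size=50)) for _ in range(3)]
    w = np.zeros(3)
    for _ in range(10):
        w = federated_round(w, sites)
    print("global weights after 10 rounds:", w)
```

The key property is the one described above: only model weights travel between the sites and the aggregator, while the raw data stays local.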

🏥 Healthcare Use Case: Brain Tumor Segmentation

- Intel collaborated with the University of Pennsylvania to deploy federated learning across 70 hospitals for brain tumor segmentation using 3D convolutional neural networks.

- Hospitals acted as collaborators, while UPenn served as the aggregator. The federated model reached 99% of the accuracy of a model trained on the same data pooled centrally (a sketch of a typical segmentation metric used for such comparisons follows this list).

- Challenges included mislabeled data and the need for secure communication between hospitals, highlighting the importance of governance and privacy-preserving infrastructure.
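
The episode does not state which metric backs the 99% figure. Brain tumor segmentation models are commonly scored with the Dice similarity coefficient, so the hypothetical sketch below shows how such a score could be computed when comparing a federated model's predictions against those of a centrally trained one; the 3D masks here are synthetic placeholders.

```python
# Dice similarity coefficient for binary segmentation masks
# (illustrative; the episode does not specify the exact metric used).
import numpy as np

def dice(pred: np.ndarray, truth: np.ndarray, eps: float = 1e-8) -> float:
    """Dice = 2|A intersect B| / (|A| + |B|) for boolean masks of the same shape."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    return float((2.0 * intersection + eps) / (pred.sum() + truth.sum() + eps))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    truth = rng.random((64, 64, 64)) > 0.7   # hypothetical 3D tumor mask
    pred = truth.copy()
    pred[:8] = ~pred[:8]                     # corrupt part of the prediction
    print(f"Dice score: {dice(pred, truth):.3f}")
```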

🔒 Privacy and Security in Federated Learning

- Privacy concerns are a primary driver for federated learning adoption, especially in industries like healthcare and finance.

- Confidential computing technologies, such as Intel SGX, enable secure enclaves where even root users cannot access sensitive data or model weights.

- Tools like Privacy Meter help quantify data leakage risks, while governance platforms ensure trust and compliance among collaborators (a simplified sketch of clipping and noising model updates before aggregation follows this list).
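
As one illustration of the privacy techniques mentioned above, the sketch below clips each collaborator's model update to a maximum L2 norm and adds Gaussian noise to the aggregate, in the spirit of differential privacy. It is a deliberate simplification: there is no privacy accountant or calibrated noise scale, and it is not code from OpenFL, Privacy Meter, or any specific DP library.

```python
# Simplified DP-style treatment of model updates before aggregation:
# clip each update's L2 norm, then add Gaussian noise to the average.
# Illustrative only; real deployments calibrate noise with a privacy accountant.
import numpy as np

def clip_update(update: np.ndarray, max_norm: float) -> np.ndarray:
    """Scale the update down if its L2 norm exceeds max_norm."""
    norm = np.linalg.norm(update)
    return update * min(1.0, max_norm / (norm + 1e-12))

def noisy_aggregate(updates, max_norm=1.0, noise_std=0.1, rng=None):
    """Average the clipped updates and add Gaussian noise to the result."""
    rng = rng or np.random.default_rng()
    clipped = np.stack([clip_update(u, max_norm) for u in updates])
    mean = clipped.mean(axis=0)
    return mean + rng.normal(scale=noise_std, size=mean.shape)

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    client_updates = [rng.normal(size=10) for _ in range(5)]  # hypothetical updates
    print(noisy_aggregate(client_updates, rng=rng))
```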

🌐 OpenFL and the Federated Learning Ecosystem

- OpenFL, an open-source federated learning framework developed by Intel, focuses on real-world deployment challenges, particularly in privacy-sensitive environments.

- Donated to the Linux Foundation, OpenFL is now community-driven, with contributions from organizations like Flower Labs and Indiana University.

- Collaboration with Flower Labs has enabled interoperability between the two frameworks, allowing Flower workloads to run on OpenFL infrastructure within secure environments (a minimal Flower client skeleton follows this list).
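
For a sense of what a Flower workload looks like on the client side, here is a minimal skeleton assuming Flower's NumPyClient interface from the `flwr` package. The toy model, loss, and server address are placeholders, the exact entry points vary between Flower versions, and none of the OpenFL or secure-enclave wiring discussed in the episode is shown.

```python
# Minimal Flower (flwr) client skeleton, assuming the NumPyClient interface.
# The model and data are hypothetical placeholders; the API surface shown
# here may differ between Flower versions.
import numpy as np
import flwr as fl

class ToyClient(fl.client.NumPyClient):
    def __init__(self):
        self.weights = np.zeros(3)                # hypothetical model parameters

    def get_parameters(self, config):
        return [self.weights]

    def fit(self, parameters, config):
        self.weights = parameters[0] + 0.01       # stand-in for local training
        return [self.weights], 100, {}            # params, local example count, metrics

    def evaluate(self, parameters, config):
        loss = float(np.sum(parameters[0] ** 2))  # stand-in for a real loss
        return loss, 100, {}

if __name__ == "__main__":
    # Connects to a running Flower server; the address is a placeholder.
    fl.client.start_numpy_client(server_address="127.0.0.1:8080", client=ToyClient())
```

A corresponding Flower server coordinates the rounds and aggregates the returned parameters; in the interoperability setup described above, that workload would be deployed on OpenFL-managed infrastructure.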

🚀 Future of Federated Learning

- Federated learning is poised to expand beyond research into production environments, driven by the need for privacy-preserving AI at scale.

- Generative AI is accelerating interest in federated learning by unlocking siloed data and enabling more accurate, capable models.

- Governance and cross-competitive collaboration will be key to federated learning's success, allowing industries to share data securely while protecting intellectual property.

AI-generated content may not be accurate or complete and should not be relied upon as a sole source of truth.

📋 Episode Description

In the first of a two-part series of episodes on federated learning, we dive into the evolving world of federated learning and distributed AI frameworks with Patrick Foley from Intel. We explore how frameworks like OpenFL and Flower are enabling secure, collaborative model training across silos, especially in sensitive fields like healthcare. The conversation touches on real-world use cases, the challenges of distributed ML/AI experiments, and why privacy-preserving techniques may become essential for deploying AI to production.

Featuring:

  • Patrick Foley – Lead AI Architect, Intel

Links:

Sponsors:

  • NordLayer is a toggle-ready network security platform built for modern businesses. It combines VPN, access control, and threat protection in one easy-to-use platform. No hardware. No complex setup. Just secure connection and full control, in less than 10 minutes. Up to 22% off NordLayer yearly plans, plus 10% on top with the coupon code practically-10.