Previous Achievements of the Summer School
Over the years, the Federated Learning Summer School has brought together leading researchers, industry experts, and students. Past editions have featured:
- High-impact keynote talks on Federated Learning, Edge AI, and Data Privacy.
- Hands-on workshops and coding sessions with state-of-the-art FL frameworks.
- Collaborative research projects in security, healthcare, and IoT-based FL.
- Networking opportunities for students, researchers, and industry leaders.
Federated Learning Summer School Statistics
[Charts: Participants Breakdown and Summer School Outcomes]
Handling Non-IID Data in Federated Learning: An Experimental Evaluation Towards Unified Metrics
This study surveys strategies for handling non-IID data in federated learning and introduces a new metric to assess data skew without accessing client data. The proposed metric aids in selecting effective strategies, enhancing both research and real-world applications.
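The paper's metric is not reproduced here, but as a point of reference, the sketch below shows a conventional label-skew measure: the mean Jensen-Shannon distance between each client's label distribution and the global one. Unlike the proposed metric, it requires access to client label counts, and the function name and example numbers are purely illustrative.

```python
# Illustrative baseline for quantifying label skew across clients.
# NOT the metric proposed in the paper (which avoids touching client data).
import numpy as np
from scipy.spatial.distance import jensenshannon

def label_skew(client_label_counts):
    """client_label_counts: list of per-client label histograms (same length)."""
    counts = np.asarray(client_label_counts, dtype=float)
    client_dists = counts / counts.sum(axis=1, keepdims=True)   # per-client label distribution
    global_dist = counts.sum(axis=0) / counts.sum()              # pooled label distribution
    return float(np.mean([jensenshannon(c, global_dist) for c in client_dists]))

# Two strongly skewed clients vs. an almost-IID pair.
print(label_skew([[95, 5], [5, 95]]))    # high skew
print(label_skew([[48, 52], [52, 48]]))  # close to zero
```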
MPCFL: Towards Multi-party Computation for Secure Federated Learning Aggregation
This study introduces MPCFL, a secure Federated Learning algorithm using multi-party computation and secret sharing to prevent data leakage. Evaluated on benchmarks, it enhances security and lays the foundation for robust privacy-preserving FL aggregation techniques.
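As background for the aggregation step, here is a minimal sketch of additive secret sharing, the basic MPC primitive the summary refers to. It is not the MPCFL protocol itself; the two-party setup, field modulus, and fixed-point scaling are assumptions made for the example.

```python
# Minimal sketch of additive secret sharing for secure aggregation (illustrative only).
import numpy as np

PRIME = 2**31 - 1   # field modulus for integer shares (assumed)
SCALE = 10**6       # fixed-point scaling for float updates (assumed)

def share(update, n_parties, rng):
    """Split a model update into n additive shares that sum to it mod PRIME."""
    fixed = np.round(update * SCALE).astype(np.int64) % PRIME
    shares = [rng.integers(0, PRIME, size=update.shape) for _ in range(n_parties - 1)]
    last = (fixed - sum(shares)) % PRIME
    return shares + [last]

def reconstruct_sum(partial_sums):
    """Combining each party's local sum of shares reveals only the sum of all
    client updates, never any individual update."""
    total = sum(partial_sums) % PRIME
    total = np.where(total > PRIME // 2, total - PRIME, total)  # back to signed fixed-point
    return total.astype(np.float64) / SCALE

rng = np.random.default_rng(0)
client_updates = [rng.normal(size=4) for _ in range(3)]

# Each client splits its update between 2 computation parties.
party_buckets = [[], []]
for u in client_updates:
    for bucket, sh in zip(party_buckets, share(u, n_parties=2, rng=rng)):
        bucket.append(sh)

# Each party aggregates its shares locally; only the combined result is revealed.
partials = [sum(b) % PRIME for b in party_buckets]
print(np.allclose(reconstruct_sum(partials), sum(client_updates), atol=1e-5))
```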
Towards Accelerating the Adoption of Federated Learning for Heterogeneous Data
This study examines Federated Machine Learning (FML) for addressing data heterogeneity, privacy, and ownership challenges. It integrates the FEDMA algorithm and evaluates it on the FEMNIST dataset to simulate real-world heterogeneous AI scenarios.
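For readers reproducing heterogeneous settings, below is a hedged sketch of Dirichlet label partitioning, a common way to simulate non-IID client data. The paper relies on FEMNIST's natural per-writer split, so this is only a stand-in; the alpha value and data shapes are invented for illustration.

```python
# Illustrative non-IID partitioning via Dirichlet label splits (not the paper's setup).
import numpy as np

def dirichlet_partition(labels, n_clients, alpha, rng):
    """Assign sample indices to clients so each class is split according to a
    Dirichlet(alpha) draw; smaller alpha means stronger skew."""
    client_idx = [[] for _ in range(n_clients)]
    for c in np.unique(labels):
        idx = np.where(labels == c)[0]
        rng.shuffle(idx)
        proportions = rng.dirichlet([alpha] * n_clients)
        cuts = (np.cumsum(proportions)[:-1] * len(idx)).astype(int)
        for client, part in zip(client_idx, np.split(idx, cuts)):
            client.extend(part.tolist())
    return client_idx

rng = np.random.default_rng(0)
labels = rng.integers(0, 10, size=1000)        # stand-in for real dataset labels
parts = dirichlet_partition(labels, n_clients=5, alpha=0.3, rng=rng)
print([len(p) for p in parts])                 # uneven, skewed client shards
```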
Data Skew in Federated Learning: An Experimental Evaluation on Aggregation Algorithms
This study examines data skew challenges in Federated Learning, particularly for facial ethnicity classification with non-IID data. It evaluates FL aggregation algorithms and introduces an adaptive method to enhance model robustness, fairness, and privacy across diverse applications.
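For context, the sketch below shows plain weighted federated averaging (FedAvg-style aggregation), the baseline such evaluations typically start from. The adaptive method introduced in the paper is not reproduced here, and the client weights and dataset sizes are illustrative.

```python
# Minimal FedAvg-style aggregation sketch (baseline, not the paper's adaptive method).
import numpy as np

def fedavg(client_weights, client_sizes):
    """Average client model weights, weighting each client by its local dataset
    size; heavily skewed clients therefore dominate the global model, which is
    the failure mode adaptive aggregation schemes try to correct."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

# Three clients with very different amounts of local data (non-IID skew).
weights = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([0.5, 0.5])]
sizes = [900, 50, 50]
print(fedavg(weights, sizes))   # dominated by the large client
```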
Bayesian Federated Learning with Stochastic Variational Inference
This study introduces Bayesian Federated Learning with Stochastic Variational Inference (BayFL-SVI) to improve non-IID data handling and model aggregation. It enhances accuracy and convergence rates, offering a strong foundation for future FL research and optimization.
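As a rough illustration of the Bayesian aggregation idea, the sketch below combines per-client Gaussian posteriors by precision-weighted averaging. This is a generic construction, not the BayFL-SVI procedure from the paper, and the example means and variances are assumptions.

```python
# Illustrative precision-weighted combination of client posteriors (generic sketch).
import numpy as np

def aggregate_gaussians(means, variances):
    """Combine per-client Gaussian posteriors q_i(w) = N(mu_i, sigma_i^2) into a
    single Gaussian by precision-weighted averaging."""
    precisions = [1.0 / v for v in variances]
    total_prec = sum(precisions)
    mean = sum(p * m for p, m in zip(precisions, means)) / total_prec
    return mean, 1.0 / total_prec

# Two clients report posteriors over the same scalar weight.
mu, var = aggregate_gaussians(means=[np.array([0.2]), np.array([0.6])],
                              variances=[np.array([0.01]), np.array([0.04])])
print(mu, var)   # pulled toward the more confident (lower-variance) client
```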
FedPROM: A Zero-Trust Federated Learning Approach with Multi-Criteria Client Selection
This study introduces FedPROM, a multi-criteria decision-making (MCDM) framework that uses the PROMETHEE method to optimize client selection in Federated Learning. It improves convergence speed and accuracy, increasing FL efficiency in resource-constrained environments.
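To make the client-selection idea concrete, the sketch below implements a small PROMETHEE-II ranking over hypothetical client criteria. The criteria, weights, and "usual" preference function are assumptions for illustration and may differ from FedPROM's actual configuration.

```python
# Illustrative PROMETHEE-II ranking for client selection (assumed criteria and weights).
import numpy as np

def promethee_net_flows(scores, weights, maximize):
    """scores: (n_clients, n_criteria) matrix; returns the net outranking flow
    per client, with higher values meaning a more preferable client."""
    n, m = scores.shape
    pref = np.zeros((n, n))
    for k in range(m):
        diff = scores[:, k][:, None] - scores[:, k][None, :]
        if not maximize[k]:
            diff = -diff
        pref += weights[k] * (diff > 0).astype(float)   # "usual" preference function
    pref /= weights.sum()
    phi_plus = pref.sum(axis=1) / (n - 1)    # how strongly a client outranks others
    phi_minus = pref.sum(axis=0) / (n - 1)   # how strongly it is outranked
    return phi_plus - phi_minus

# Hypothetical criteria: local accuracy, dataset size (maximize); latency in s (minimize).
scores = np.array([[0.81, 1200, 0.9],
                   [0.78, 3000, 2.1],
                   [0.85,  400, 0.4]])
weights = np.array([0.5, 0.3, 0.2])
net = promethee_net_flows(scores, weights, maximize=[True, True, False])
selected = np.argsort(net)[::-1][:2]    # pick the top-2 ranked clients
print(net, selected)
```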