
Turning the Curse of Heterogeneity in Federated Learning into a Blessing for Out-of-Distribution Detection

Deep neural networks have achieved huge successes in many challenging prediction tasks, yet they often suffer from out-of-distribution (OoD) samples, misclassifying them with high confidence. Recent advances show promising OoD detection …
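
As a point of reference for the detection setting above, a widely used baseline is the maximum-softmax-probability (MSP) score, which flags low-confidence inputs as OoD. The sketch below is that generic baseline, not this paper's method; `net` and the 0.9 threshold are placeholders.

```python
import torch
import torch.nn.functional as F

def msp_score(net: torch.nn.Module, x: torch.Tensor) -> torch.Tensor:
    """Maximum softmax probability: in-distribution inputs tend to get
    high max confidence, while OoD inputs (ideally) score lower."""
    net.eval()
    with torch.no_grad():
        probs = F.softmax(net(x), dim=1)   # (batch, num_classes)
    return probs.max(dim=1).values

# Flag inputs whose confidence falls below a validation-tuned threshold:
# is_ood = msp_score(net, batch) < 0.9   # `net` and 0.9 are placeholders
```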

Federated Robustness Propagation: Sharing Adversarial Robustness in Federated Learning

Federated learning (FL) has emerged as a popular distributed learning scheme that learns a model from a set of participating users without requiring raw data to be shared. One major challenge of FL comes from heterogeneity among users, who may have …

Outsourcing Training without Uploading Data via Efficient Collaborative Open-Source Sampling

We propose a new privacy-preserving learning framework that outsources training to the cloud without uploading data, providing more data without injecting noise into gradients or samples.

Trap and Replace: Defending Backdoor Attacks by Trapping Them into an Easy-to-Replace Subnetwork

Deep neural networks (DNNs) are vulnerable to backdoor attacks. Previous works have shown that it is extremely challenging to unlearn the undesired backdoor behavior from the network, since the entire network can be affected by the backdoor samples. In this …
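
A minimal sketch of the trap-and-replace idea named in the title, assuming the model is split into a shared stem plus a small, easily replaceable "trap" head that absorbs the backdoor; the retraining recipe below is illustrative, not the paper's exact procedure.

```python
import torch
import torch.nn as nn

def replace_trap_head(stem: nn.Module, old_head: nn.Linear,
                      clean_loader, epochs: int = 5) -> nn.Linear:
    """Discard the small (possibly backdoored) head and retrain a fresh
    one on a small clean dataset, keeping the shared stem frozen."""
    head = nn.Linear(old_head.in_features, old_head.out_features)
    opt = torch.optim.SGD(head.parameters(), lr=0.01, momentum=0.9)
    loss_fn = nn.CrossEntropyLoss()
    stem.eval()
    for p in stem.parameters():
        p.requires_grad_(False)          # only the new head is trained
    for _ in range(epochs):
        for x, y in clean_loader:        # small trusted clean set
            opt.zero_grad()
            loss = loss_fn(head(stem(x)), y)
            loss.backward()
            opt.step()
    return head
```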

Resilient and Communication Efficient Learning for Heterogeneous Federated Systems

The rise of Federated Learning (FL) is bringing machine learning to edge computing by utilizing data scattered across edge devices. However, the heterogeneity of edge network topologies and the uncertainty of wireless transmission are two major …

Dynamic Privacy Budget Allocation Improves Data Efficiency of Differentially Private Gradient Descent

Protecting privacy in learning while maintaining model performance has become increasingly critical in many applications that involve sensitive data. Private Gradient Descent (PGD) is a commonly used private learning framework that noises …
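
For reference, a common PGD instantiation clips the gradient to bound sensitivity and then adds Gaussian noise. The sketch below shows that generic step; the decaying noise schedule in the comment is a hypothetical stand-in for dynamic budget allocation, not the paper's actual rule.

```python
import torch

def private_gradient_step(params, grads, lr, clip_norm, sigma):
    """One private step: clip the full gradient to norm clip_norm to
    bound sensitivity, then add Gaussian noise with std sigma * clip_norm."""
    total_norm = torch.sqrt(sum((g ** 2).sum() for g in grads))
    scale = torch.clamp(clip_norm / (total_norm + 1e-12), max=1.0)
    for p, g in zip(params, grads):
        noise = torch.randn_like(g) * sigma * clip_norm
        p.data.add_(-lr * (g * scale + noise))

# A hypothetical "dynamic" schedule: reduce sigma over steps, e.g.
# sigma_t = sigma_0 / (t + 1) ** 0.25, spending more of the privacy
# budget on later, more informative updates.
```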

Efficient Split-Mix Federated Learning for On-Demand and In-Situ Customization

Efficient federated learning for heterogeneous clients with different memory sizes.

Federated Adversarial Debiasing for Fair and Transferable Representations

A distributed domain/group debiasing framework for unsupervised domain adaptation or fairness enhancement.
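
A standard building block for adversarial debiasing is the gradient reversal layer (as in DANN-style domain-adversarial training). The sketch below shows that generic component only, not the paper's federated protocol; `discriminator` in the usage note is a placeholder.

```python
import torch

class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; negates (and scales) the gradient
    in the backward pass, so the feature extractor learns to fool the
    domain/group discriminator while the discriminator improves."""
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_out):
        return -ctx.lam * grad_out, None

def grad_reverse(x, lam=1.0):
    return GradReverse.apply(x, lam)

# Usage sketch: group_logits = discriminator(grad_reverse(features))
# Training the discriminator on this output pushes features toward
# domain/group invariance.
```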

Data-Free Knowledge Distillation for Heterogeneous Federated Learning

Federated Learning (FL) is a decentralized machine-learning paradigm in which a global server iteratively averages the model parameters of local users without accessing their data. User heterogeneity has imposed significant challenges on FL, which …
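
The server-side averaging described above is the FedAvg aggregation rule. A minimal sketch of one round, weighting clients by local dataset size (function names are illustrative; float-valued state dicts are assumed):

```python
import torch

def fedavg(global_model, client_states, client_sizes):
    """One aggregation round: average client parameters, weighted by
    each client's local dataset size."""
    total = float(sum(client_sizes))
    avg = {k: torch.zeros_like(v)
           for k, v in global_model.state_dict().items()}
    for state, n in zip(client_states, client_sizes):
        for k, v in state.items():
            avg[k] += v * (n / total)
    global_model.load_state_dict(avg)
    return global_model
```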

Learning Model-Based Privacy Protection under Budget Constraints

Protecting privacy in gradient-based learning has become increasingly critical as more sensitive information is being used. Many existing solutions seek to protect the sensitive gradients by constraining the overall privacy cost within a constant …
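
For intuition on a constant overall budget: under basic sequential composition, per-step privacy costs add up, so any per-step allocation must sum to the total budget. A toy illustration (the decaying schedule is hypothetical, not the paper's policy):

```python
# Basic sequential composition: per-step costs add up, so a fixed
# total budget eps_total constrains what each step may spend.
eps_total, T = 1.0, 100

# Uniform allocation: every step gets the same share.
uniform = [eps_total / T] * T

# A hypothetical decaying allocation that still sums to eps_total.
weights = [1.0 / (t + 1) ** 0.5 for t in range(T)]
decaying = [eps_total * w / sum(weights) for w in weights]

assert abs(sum(uniform) - eps_total) < 1e-9
assert abs(sum(decaying) - eps_total) < 1e-9
```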