FedRPO: Federated relaxed Pareto optimization for acoustic event classification
2023
Performance and robustness of real-world Acoustic Event Classification (AEC) solutions depend on the ability to train on diverse data from a wide range of end-point devices and acoustic environments. Federated Learning (FL) provides a framework to leverage annotated and non-annotated AEC data from servers and client devices in a privacy-preserving manner. In this work we propose a novel Federated Relaxed Pareto Optimization (FedRPO) method for semi-supervised FL with heterogeneous client data. In contrast to the federated averaging class of FL algorithms (FedAvg), which performs unconstrained weighted aggregation across all data sources, FedRPO enables special treatment of data with high-quality annotations versus data with pseudo-labels of unknown and varying quality. In particular, FedRPO computes the updates to the global model by solving a constrained linear program, with explicit Pareto constraints to prevent performance degradation on annotated data and a controlled relaxation of the Pareto constraints on pseudo-labeled data to prevent learning patterns that conflict with the annotated data. We show that FedRPO significantly outperforms FedAvg on AEC tasks using an Amazon-internal de-identified dataset. For supervised learning, FedRPO improved precision by 32.5% over FedAvg while maintaining recall at 90%. Combined with FixMatch [1] for semi-supervised learning, FedRPO outperformed FedAvg on precision by 50.5% at 90% recall.
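
To make the aggregation idea concrete, the following is a minimal, illustrative sketch in Python (NumPy/SciPy) of how a constrained linear program could select aggregation weights over client updates: a first-order estimate of each source's loss change is linear in the weights, so Pareto-style non-degradation constraints on annotated sources and relaxed constraints on pseudo-labeled sources form a linear program. The function name fedrpo_style_weights, the first-order linearization, and the relaxation knob eps are assumptions made for this sketch, not the paper's actual formulation.

# Illustrative sketch (not the paper's formulation): choose client aggregation
# weights via a linear program so that, to first order, the aggregated update
# does not degrade any annotated-data objective, while pseudo-labeled
# objectives are only softly constrained via a slack eps.
import numpy as np
from scipy.optimize import linprog


def fedrpo_style_weights(client_updates, grads_annotated, grads_pseudo, eps=0.05):
    """Solve for mixing weights w over client updates.

    client_updates  : (K, D) array, one flattened model update per client.
    grads_annotated : (A, D) array, loss gradients on annotated sources
                      evaluated at the current global model.
    grads_pseudo    : (P, D) array, loss gradients on pseudo-labeled sources.
    eps             : relaxation of the Pareto constraints on pseudo-labeled
                      sources (hypothetical knob for this sketch).

    The aggregated update is d = client_updates.T @ w.  A first-order estimate
    of the loss change for source i is g_i . d, which is linear in w, so
    Pareto-style constraints can be expressed in a linear program over w.
    """
    K = client_updates.shape[0]

    # Objective: maximize total first-order descent across all sources,
    # i.e. minimize sum_i g_i . d = (client_updates @ sum_i g_i) . w
    g_sum = grads_annotated.sum(axis=0) + grads_pseudo.sum(axis=0)
    c = client_updates @ g_sum

    # Pareto constraints: g_a . d <= 0 for every annotated source (no
    # degradation), relaxed to g_p . d <= eps for pseudo-labeled sources.
    A_ub = np.vstack([
        grads_annotated @ client_updates.T,   # (A, K)
        grads_pseudo @ client_updates.T,      # (P, K)
    ])
    b_ub = np.concatenate([
        np.zeros(grads_annotated.shape[0]),
        np.full(grads_pseudo.shape[0], eps),
    ])

    # Weights form a convex combination of client updates.
    A_eq = np.ones((1, K))
    b_eq = np.array([1.0])

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=(0.0, 1.0), method="highs")
    if not res.success:
        # Infeasible constraints: fall back to plain uniform (FedAvg-like) weights.
        return np.full(K, 1.0 / K)
    return res.x


# Toy usage: 4 clients, a 10-dimensional model, 2 annotated and 2 pseudo-labeled sources.
rng = np.random.default_rng(0)
w = fedrpo_style_weights(rng.normal(size=(4, 10)),
                         rng.normal(size=(2, 10)),
                         rng.normal(size=(2, 10)))
print(w)  # mixing weights used to form the global update
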