FedCCA: Federated Canonical Correlation Analysis
Zhengquan Luo · Kai Fong Ernest Chong · Pengfei Wei · Changyou Chen · Peilin Zhao · Renmin Han · Chunlai Zhou · Yunlong Wang · Zhiqiang Xu
Abstract
Canonical Correlation Analysis (CCA) is a key tool for cross-modal learning, but centralized solutions are impractical due to the heavy cost of high-dimensional covariance operations and the privacy sensitivity of distributed data. To address these challenges, we propose FedCCA, a federated framework that replaces explicit inverses and inner least-squares solves with a truncated von Neumann series, reducing matrix inversions to lightweight matrix–vector multiplications while retaining provable convergence. This series formulation not only improves efficiency, but also provides explicit and tunable control of truncation error, and its structure naturally splits into client-side multiplications and a server-side projection step, making it particularly suitable for federated deployment. Building on this foundation, we incorporate Gaussian differential privacy and derive practical upper and lower bounds on the required noise variance, which yield end-to-end $(\varepsilon,\delta)$ guarantees together with convergence stability. Empirical results on five datasets confirm that FedCCA achieves accuracy comparable to centralized CCA and consistently outperforms ALS/TALS baselines in both sub-optimality gap and convergence speed, all while maintaining rigorous privacy protection.
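The paper's details are not shown here, but the abstract's core trick — replacing an explicit matrix inverse with a truncated von Neumann series so that only matrix–vector products are needed — can be illustrated with a minimal sketch. This is a generic demonstration of the series, not FedCCA itself; the function name `neumann_solve` and the scaling step are illustrative assumptions. The series $A^{-1} = \sum_{k=0}^{\infty}(I-A)^k$ converges when the spectral radius of $I-A$ is below one, which the sketch ensures by normalizing $A$ by its spectral norm.

```python
import numpy as np

def neumann_solve(A, b, num_terms=200):
    """Approximate A^{-1} b with a truncated von Neumann series:
    x = sum_{k=0}^{K-1} (I - A)^k b, computed with one matvec per term.
    Requires spectral radius of (I - A) < 1. Truncation error is
    explicit and tunable via num_terms, as the abstract describes."""
    x = np.zeros_like(b)
    term = b.copy()
    for _ in range(num_terms):
        x += term
        term = term - A @ term  # term <- (I - A) @ term
    return x

# Illustrative setup (not from the paper): a well-conditioned SPD
# matrix, rescaled so its eigenvalues lie in (0, 1] and the series converges.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)     # SPD, eigenvalues bounded away from 0
A = A / np.linalg.norm(A, 2)    # spectral-norm scaling => ||I - A||_2 < 1
b = rng.standard_normal(5)

x = neumann_solve(A, b)
print(np.allclose(x, np.linalg.solve(A, b), atol=1e-6))
```

In a federated setting, the appeal of this formulation is that each term costs only a matrix–vector multiplication, which (per the abstract) can be carried out on clients, with the server handling the projection step.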