A Geometric Algorithm for Contrastive Principal Component Analysis in High Dimension

Rung Sheng Lu, Shao Hsuan Wang, Su Yun Huang

Research output: Contribution to journal › Article › peer-review

Abstract

Principal component analysis (PCA) is widely used in exploratory data analysis. Contrastive PCA (Abid et al.), a generalization of PCA, is a recent tool that captures features of a target dataset relative to a background dataset while preserving as much of the information contained in the data as possible. For high-dimensional data, contrastive PCA becomes impractical because of the cost of forming the contrastive covariance matrix and of the associated eigenvalue decomposition used to extract the leading components. In this article, we propose a geometric curvilinear-search method that solves this problem, and we provide a convergence analysis. Our approach offers significant computational savings. Specifically, it reduces the time complexity from (Formula presented.) to a more manageable (Formula presented.), where n and m are the sample sizes of the target and background data, respectively, p is the data dimension, and r is the number of leading components. Additionally, we reduce the space complexity from (Formula presented.), necessary for storing the contrastive covariance matrix, to a more economical (Formula presented.), sufficient for storing the data alone. Numerical examples illustrate the merits of the proposed algorithm. Supplementary materials for this article are available online.
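For orientation, the baseline contrastive-PCA computation that the article accelerates can be sketched as follows. This is a minimal illustration, not the authors' code: the function name, the contrast parameter `alpha`, and all defaults are assumptions, and it deliberately uses the expensive route (explicit p × p contrastive covariance plus a full eigendecomposition) whose cost motivates the paper.

```python
import numpy as np

def contrastive_pca(X, Y, alpha=1.0, r=2):
    """Naive contrastive PCA (Abid et al.): eigendecompose the
    contrastive covariance C = Cov(X) - alpha * Cov(Y).
    X: (n, p) target data; Y: (m, p) background data.
    Forming C costs O((n+m) p^2) time and O(p^2) memory, and the
    full eigendecomposition adds O(p^3) -- the bottleneck that the
    article's geometric algorithm avoids."""
    Xc = X - X.mean(axis=0)                      # center target data
    Yc = Y - Y.mean(axis=0)                      # center background data
    C = Xc.T @ Xc / len(X) - alpha * (Yc.T @ Yc / len(Y))   # p x p
    w, V = np.linalg.eigh(C)                     # ascending eigenvalues
    return V[:, ::-1][:, :r]                     # top-r contrastive directions

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))   # toy target data (n=100, p=5)
Y = rng.normal(size=(200, 5))   # toy background data (m=200)
V = contrastive_pca(X, Y, alpha=1.0, r=2)
print(V.shape)                  # (5, 2)
```

The returned columns are orthonormal eigenvectors of the contrastive covariance; when p is large, both the p × p matrix and the eigendecomposition become prohibitive, which is exactly the regime the proposed curvilinear-search method targets.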

Original language: English
Journal: Journal of Computational and Graphical Statistics
State: Accepted/In press - 2024

Keywords

  • Cayley retraction mapping
  • Contrastive PCA
  • Curvilinear-search
  • High dimension
  • Principal component analysis
  • Projected gradient
  • Stiefel manifold
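The keywords above (projected gradient, Cayley retraction mapping, Stiefel manifold, curvilinear-search) suggest the flavor of the algorithm: optimize tr(VᵀCV) over matrices V with orthonormal columns while touching the contrastive covariance C only through products with the data. A hedged sketch under those assumptions, using a Wen–Yin-style Cayley retraction with a fixed step size — the article itself uses a curvilinear search with a convergence proof, and every name, default, and step-size choice here is illustrative:

```python
import numpy as np

def cayley_step(V, G, tau):
    """One Cayley retraction step on the Stiefel manifold, in the
    low-rank Sherman-Morrison-Woodbury form: only a (2r x 2r) linear
    system is solved instead of inverting a (p x p) matrix.
    V: (p, r), orthonormal columns; G: (p, r), Euclidean gradient of
    the function being minimized."""
    p, r = V.shape
    U = np.hstack([G, V])                 # p x 2r
    Z = np.hstack([V, -G])                # p x 2r; A = U @ Z.T is skew
    B = np.eye(2 * r) + (tau / 2) * (Z.T @ U)
    return V - tau * U @ np.linalg.solve(B, Z.T @ V)

def contrastive_pca_geometric(X, Y, alpha=1.0, r=2, tau=0.01, iters=1000):
    """Illustrative matrix-free curvilinear iteration for contrastive
    PCA: maximizes tr(V' C V), C = Cov(X) - alpha * Cov(Y), without
    ever forming the p x p matrix C.  Each product C @ V is computed
    from the centered data as Xc'(Xc V)/n - alpha * Yc'(Yc V)/m,
    costing O((n+m) p r) per iteration and O((n+m) p) memory."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    n, m = len(X), len(Y)
    rng = np.random.default_rng(0)
    V = np.linalg.qr(rng.normal(size=(X.shape[1], r)))[0]  # random start
    for _ in range(iters):
        CV = Xc.T @ (Xc @ V) / n - alpha * (Yc.T @ (Yc @ V) / m)
        G = -2.0 * CV              # gradient of -tr(V' C V): descend it
        V = cayley_step(V, G, tau)
    return V

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 8)) @ np.diag([3, 2] + [1] * 6)  # 2 enriched axes
Y = rng.normal(size=(200, 8))                              # plain background
V = contrastive_pca_geometric(X, Y, alpha=1.0, r=2)
print(V.shape)                     # (8, 2)
```

Because the Cayley transform is orthogonal, each iterate stays exactly on the Stiefel manifold, and the skew-symmetric low-rank structure is what turns the update into small r-sized linear algebra; the fixed step size stands in for the curvilinear (step-size) search of the actual algorithm.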

