Spectral Regression: A Regression Framework for Regularized Subspace Learning


Introduction

Spectral methods have recently emerged as a powerful tool for dimensionality reduction and manifold learning. These methods use the information contained in the eigenvectors of a data affinity (i.e., item-item similarity) matrix to reveal low-dimensional structure in high-dimensional data. The most popular manifold learning algorithms include Locally Linear Embedding, Isomap, and Laplacian Eigenmaps. However, these algorithms only provide embeddings for the training samples.
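To make this concrete, here is a minimal Python sketch of a Laplacian Eigenmaps-style spectral embedding. It builds a k-nearest-neighbor affinity graph with heat-kernel weights and solves the generalized eigenproblem L y = lambda D y. The function name, the k-NN and heat-kernel choices, and the parameter values are illustrative assumptions, not part of any released code; note that the result is defined only for the training samples, which is exactly the limitation mentioned above.

    import numpy as np
    from scipy.spatial.distance import cdist
    from scipy.linalg import eigh

    def laplacian_embedding(X, k=10, sigma=1.0, dim=2):
        """Embed the rows of X using the bottom eigenvectors of the graph Laplacian."""
        n = X.shape[0]
        D2 = cdist(X, X, 'sqeuclidean')            # pairwise squared distances
        idx = np.argsort(D2, axis=1)[:, 1:k + 1]   # k nearest neighbors, excluding self
        W = np.zeros((n, n))
        for i in range(n):
            W[i, idx[i]] = np.exp(-D2[i, idx[i]] / (2 * sigma ** 2))
        W = np.maximum(W, W.T)                     # symmetrize the affinity matrix
        D = np.diag(W.sum(axis=1))                 # degree matrix
        L = D - W                                  # graph Laplacian
        # generalized eigenproblem L y = lambda D y; drop the trivial constant eigenvector
        vals, vecs = eigh(L, D)
        return vecs[:, 1:dim + 1]                  # embedding of the training samples only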

Many extensions of these approaches attempt to solve the out-of-sample extension problem by seeking an embedding function in a reproducing kernel Hilbert space. A disadvantage of all these approaches is that their computation usually involves the eigen-decomposition of dense matrices, which is expensive in both time and memory.

Spectral Regression (SR) [1], [2] is a novel regression framework for efficient regularized subspace learning. Some interesting aspects of SR include:

  * It casts the problem of learning an embedding function into a regression framework, which avoids the eigen-decomposition of dense matrices.
  * With different choices of regularizer, SR yields regularized, sparse, or kernel subspace learning algorithms.
  * It can be performed in supervised, semi-supervised, and unsupervised settings.
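To illustrate how the regression step sidesteps the dense eigenproblem, the sketch below fits projection vectors by regularized least squares against spectral responses. It is a minimal sketch under simple assumptions: the responses Y are taken to be graph eigenvectors (for instance, from the embedding sketch above) and a plain ridge penalty is used as the regularizer; the function name and parameters are hypothetical and are not the released SR code.

    import numpy as np

    def spectral_regression(X, Y, alpha=0.1):
        """Regression step of SR: find A so that X @ A approximates the spectral
        responses Y, with a ridge (Tikhonov) regularizer.

        X : (n_samples, n_features) data matrix
        Y : (n_samples, dim) spectral responses, e.g. eigenvectors of the graph
            eigenproblem, built in a supervised, semi-supervised, or unsupervised way
        """
        d = X.shape[1]
        # regularized normal equations: (X^T X + alpha I) A = X^T Y
        A = np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ Y)
        return A  # columns are projection vectors; embed new data with X_new @ A

    # hypothetical usage, reusing the embedding sketch above for the responses:
    # Y = laplacian_embedding(X, k=10, dim=2)
    # A = spectral_regression(X, Y, alpha=0.1)
    # Z_new = X_new @ A   # out-of-sample embedding of unseen data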


Codes


If you find these algorithms and data sets useful, we would appreciate it very much if you cite our related works:


Papers

  1. Deng Cai, Xiaofei He, and Jiawei Han. "Semi-Supervised Regression using Spectral Techniques", Department of Computer Science Technical Report No. 2749, University of Illinois at Urbana-Champaign (UIUCDCS-R-2006-2749), July 2006. ( pdf )

  2. Deng Cai, Xiaofei He, and Jiawei Han. "Spectral Regression for Dimensionality Reduction", Department of Computer Science Technical Report No. 2856, University of Illinois at Urbana-Champaign (UIUCDCS-R-2007-2856), May 2007. ( pdf )

  3. Deng Cai, Xiaofei He, and Jiawei Han. "SRDA: An Efficient Algorithm for Large Scale Discriminant Analysis", Department of Computer Science Technical Report No. 2857, University of Illinois at Urbana-Champaign (UIUCDCS-R-2007-2857), May 2007. ( pdf )

  4. Deng Cai, Xiaofei He, and Jiawei Han. "Isometric Projection", Proc. 22nd Conference on Artifical Intelligence (AAAI'07), Vancouver, Canada, July 2007. ( pdf )

  5. Deng Cai, Xiaofei He, and Jiawei Han. "Efficient Kernel Discriminant Analysis via Spectral Regression", Department of Computer Science Technical Report No. 2888, University of Illinois at Urbana-Champaign (UIUCDCS-R-2007-2888), August 2007. ( pdf )

  6. Deng Cai, Xiaofei He, and Jiawei Han. "Spectral Regression: A Unified Subspace Learning Framework for Content-Based Image Retrieval", ACM Multimedia 2007, Augsburg, Germany, Sep. 2007. ( pdf )

  7. Deng Cai, Xiaofei He, and Jiawei Han. "Spectral Regression for Efficient Regularized Subspace Learning", IEEE International Conference on Computer Vision (ICCV'07), Rio de Janeiro, Brazil, Oct. 2007. ( pdf )

  8. Deng Cai, Xiaofei He, and Jiawei Han. "Efficient Kernel Discriminant Analysis via Spectral Regression", Proc. 2007 Int. Conf. on Data Mining (ICDM'07), Omaha, NE, Oct. 2007. ( pdf )

  9. Deng Cai, Xiaofei He, and Jiawei Han. "Spectral Regression: A Unified Approach for Sparse Subspace Learning", Proc. 2007 Int. Conf. on Data Mining (ICDM'07), Omaha, NE, Oct. 2007. ( pdf )

  10. Deng Cai, Xiaofei He, Wei Vivian Zhang, and Jiawei Han. "Regularized Locality Preserving Indexing via Spectral Regression", Proc. 2007 ACM Int. Conf. on Information and Knowledge Management (CIKM'07), Lisboa, Portugal, Nov. 2007. ( pdf )

  11. Deng Cai, Xiaofei He, and Jiawei Han. "SRDA: An Efficient Algorithm for Large Scale Discriminant Analysis", IEEE Transactions on Knowledge and Data Engineering, vol. 20, no. 1, pp. 1-12, January, 2008. ( pdf )

  12. Deng Cai, Xiaofei He, and Jiawei Han. "Speed Up Kernel Discriminant Analysis", The VLDB Journal, vol. 20, no. 1, pp. 21-33, January, 2011. ( pdf )
