1. Ng, A. (n.d.). CS229 Lecture Notes: Principal Component Analysis. Stanford University. Retrieved from http://cs229.stanford.edu/notes/cs229-notes10.pdf. In Section 1, "Application 1: Dimensionality reduction" (p. 2), the notes explicitly state: "By far the most common application of PCA is for dimensionality reduction."
2. Shlens, J. (2014). A Tutorial on Principal Component Analysis. arXiv:1404.1100 [cs.LG]. In the Abstract (p. 1), the author states: "Principal component analysis (PCA) is a mainstay of modern data analysis - a black box that is widely used but poorly understood. The goal of this paper is to dispel the magic behind this black box. This manuscript focuses on building a solid intuition for how and why principal component analysis works. This is accomplished by forging connections between the underlying statistics and geometry of PCA. This tutorial avoids the use of heavy mathematical notation and instead focuses on building a visual, intuitive framework for what PCA is actually doing. With minimal additional effort, PCA provides a roadmap for how to reduce a complex data set to a lower dimension to reveal the sometimes hidden, simplified dynamics that often underlie it."
3. Scikit-learn Developers. (n.d.). 2.5. Decomposing signals in components (matrix factorization problems). Scikit-learn 1.4.2 documentation. In Section 2.5.1, "Principal component analysis (PCA)," the documentation introduces PCA as a method to "...decompose a multivariate dataset in a set of successive orthogonal components that explain a maximum amount of the variance." This decomposition is the basis for its use in dimensionality reduction.
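The scikit-learn behavior described in entry 3 can be illustrated with its `PCA` estimator. The sketch below uses a synthetic dataset (the data and the choice of `n_components=2` are illustrative, not taken from the cited documentation): it reduces 3-D points to 2-D and inspects how much variance each orthogonal component explains.

```python
# Minimal sketch of dimensionality reduction with scikit-learn's PCA,
# on made-up data with most of its variance along the first two axes.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# 100 samples in 3-D; axis scales make the third dimension nearly flat
X = rng.normal(size=(100, 3)) * np.array([5.0, 2.0, 0.1])

pca = PCA(n_components=2)          # keep the top two orthogonal components
X_reduced = pca.fit_transform(X)   # project the data onto those components

print(X_reduced.shape)                 # (100, 2)
print(pca.explained_variance_ratio_)  # fraction of variance per component
```

Because the components are ordered by explained variance, dropping the trailing ones (here, the nearly flat third axis) discards as little information as possible for the chosen dimensionality.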