Principal Component Analysis (PCA) traditionally relies on the eigenvalue decomposition of the covariance matrix, a computation that is susceptible to numerical instability, particularly for high-dimensional or noisy datasets. In this paper, we review a numerically stable PCA framework that circumvents explicit construction of the covariance matrix by applying Singular Value Decomposition (SVD) directly to the centered data matrix. This approach mitigates the risks associated with ill-conditioned covariance computations, reduces computational overhead, and enhances robustness under finite-precision arithmetic. By leveraging the orthogonality and optimal low-rank approximation properties inherent to the SVD, the framework enables reliable dimensionality reduction while preserving the essential characteristics of the data.
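A minimal sketch of this idea in NumPy, assuming a data matrix X with samples as rows; the function name, its parameters, and the synthetic data below are illustrative and not taken from the paper:

```python
import numpy as np

def pca_via_svd(X, k):
    """Project X (n_samples x n_features) onto its top-k principal components
    via SVD of the centered data matrix, without forming the covariance matrix."""
    # Center each feature (column) at zero mean.
    Xc = X - X.mean(axis=0)
    # Thin SVD of the centered data: Xc = U @ diag(s) @ Vt.
    # Rows of Vt are the principal directions; s**2 / (n - 1) equal the
    # eigenvalues an eigendecomposition of the covariance matrix would give.
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:k]                 # top-k principal directions
    scores = Xc @ components.T          # data projected onto those directions
    explained_variance = s[:k] ** 2 / (X.shape[0] - 1)
    return scores, components, explained_variance

# Example usage on synthetic data.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))
    scores, components, var = pca_via_svd(X, k=3)
    print(scores.shape, components.shape, var)
```

Because the SVD is computed on the centered data itself rather than on its Gram or covariance matrix, the squaring of the condition number that accompanies forming the covariance matrix is avoided, which is the source of the improved numerical behavior described above.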