- pca: https://www.mathworks.com/help/stats/pca.html
- Orthogonal Projection: https://textbooks.math.gatech.edu/ila/projections.html
Choose subset of basis vectors for best approximation of a vector
6 views (last 30 days)
Good morning,
I'd like to expand a given vector v in a different basis B', with coefficients that I receive after a basis transformation. Now my task is to approximate v with a subset of those basis vectors. For a given k, how can I find/select the k basis vectors from B' that approximate v best? I read that singular value decomposition might help, but I do not know if this is really the case.
Thank you very much for your help!
0 comments
Answers (1)
Saarthak Gupta
on 15 Dec 2023
Hi,
I understand that you are trying to obtain an approximation of an n-dimensional vector v in a k-dimensional subspace (spanned by a k-subset of the n basis vectors in B').
Different k-subsets of the basis B' may span different subspaces. For example, for k = 2, the subspace spanned by (b1', b2') will generally differ from the subspace spanned by (b3', b4').
For very small values of n and k, the following approach would work:
1. Take a k-combination of basis vectors in B', assemble them as the columns of a matrix A, and let W = Col(A).
2. Calculate the orthogonal projection of v onto W. Since the columns of A are linearly independent, the orthogonal projection theorem gives v_W = A(A'A)^(-1)A'v, where v_W is the said orthogonal projection.
3. ||v - v_W|| gives the "distance" or "error" of the projection.
4. Repeat these steps for all k-combinations of B' and choose the combination with the least error.
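The steps above can be sketched in MATLAB as follows. This is a minimal illustration, not a definitive implementation: the basis matrix Bp, the vector v, and the dimensions n and k are placeholder examples, and Bp is assumed to hold the basis B' as columns.

```matlab
% Brute-force search for the k-subset of basis vectors that best
% approximates v. Illustrative sketch with random data.
n = 4; k = 2;
Bp = orth(randn(n));            % example basis: columns are b1',...,bn'
v  = randn(n, 1);               % vector to approximate

combos  = nchoosek(1:n, k);     % all k-combinations of basis indices
bestErr = inf;
bestIdx = [];
for i = 1:size(combos, 1)
    A   = Bp(:, combos(i, :));          % candidate subset; W = Col(A)
    vW  = A * ((A' * A) \ (A' * v));    % orthogonal projection of v onto W
    err = norm(v - vW);                 % projection error ||v - vW||
    if err < bestErr
        bestErr = err;
        bestIdx = combos(i, :);
    end
end
% bestIdx now holds the indices of the best k-subset, bestErr its error
```

Note the use of the backslash operator to solve the normal equations (A'A)x = A'v rather than forming the matrix inverse explicitly, which is both faster and numerically safer.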
However, this approach does not scale well with increasing values of n and k.
If your goal is reducing the dimensionality of the data, you may use PCA (or SVD) instead. PCA (Principal Component Analysis) finds the lower-dimensional representation of the data with the smallest reconstruction error.
The “pca” function in the Statistics and Machine Learning Toolbox can be used to achieve the same.
In MATLAB, coeff = pca(X) returns the principal component coefficients for the n-by-p data matrix X. Each column of coeff contains the coefficients for one principal component, and the columns are in descending order of component variance. By default, pca centers the data and uses the singular value decomposition (SVD) algorithm.
Please refer to the following code:
load hald
coeff = pca(ingredients, "NumComponents", 4);
% Using the "NumComponents" argument, you can specify a suitable value of k.
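To judge how good a given k is, you can also request the scores and rebuild a rank-k approximation of the data. A minimal sketch using the same hald data set (the variable names approx and reconErr are illustrative):

```matlab
load hald                                 % ingredients: 13-by-4 data matrix
k = 2;                                    % number of components to keep
[coeff, score, ~, ~, explained, mu] = pca(ingredients, "NumComponents", k);

approx   = score * coeff' + mu;           % rank-k approximation of the data
reconErr = norm(ingredients - approx, "fro");  % Frobenius reconstruction error
pctVar   = sum(explained(1:k));           % percent variance explained by k components
```

Increasing k decreases reconErr, so you can pick the smallest k whose reconstruction error (or explained variance) is acceptable.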
Please refer to the documentation linked at the top of this post for further reference.
Hope this helps!
Best Regards,
Saarthak
0 comments