NG week8 PCA


Answers: 1. AB  2. D  3. C  4. BD  5. AD

Principal Component Analysis

5 questions

1. 

Consider the following 2D dataset:

Which of the following figures correspond to possible values that PCA may return for  u(1)  (the first eigenvector / first principal component)? Check all that apply (you may have to check more than one figure).
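The figures did not survive extraction, but the key fact behind this question can be checked numerically: the first principal component is only determined up to sign, so if $u^{(1)}$ is a valid answer, so is $-u^{(1)}$ (which is why two figures are correct). A minimal NumPy sketch on illustrative 2D data lying along the line $y = x$:

```python
import numpy as np

# Illustrative 2D dataset lying roughly along the line y = x.
rng = np.random.default_rng(0)
t = rng.normal(size=(100, 1))
X = np.hstack([t, t]) + rng.normal(scale=0.1, size=(100, 2))

Xc = X - X.mean(axis=0)                      # PCA assumes centered data
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
u1 = Vt[0]                                   # first principal component (unit vector)
print(u1)                                    # points along (1,1)/sqrt(2), up to sign
```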

2. 

Which of the following is a reasonable way to select the number of principal components  k ?

(Recall that  n  is the dimensionality of the input data and  m  is the number of input examples.)

Choose  k  to be 99% of  n  (i.e.,  k=0.99n , rounded to the nearest integer).

Choose  k  to be the smallest value so that at least 1% of the variance is retained.

Choose the value of $k$ that minimizes the approximation error $\frac{1}{m}\sum_{i=1}^{m}\|x^{(i)} - x^{(i)}_{\mathrm{approx}}\|^2$.

Choose  k  to be the smallest value so that at least 99% of the variance is retained.
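The recommended procedure (smallest $k$ retaining at least 99% of the variance) can be sketched with NumPy's SVD; the data and the helper name `choose_k` below are illustrative assumptions, not from the course code:

```python
import numpy as np

def choose_k(X, retain=0.99):
    """Smallest k such that at least `retain` of the variance is kept."""
    Xc = X - X.mean(axis=0)             # center the data first
    Sigma = Xc.T @ Xc / len(X)          # n x n covariance matrix
    _, S, _ = np.linalg.svd(Sigma)      # singular values = eigenvalues of Sigma
    ratio = np.cumsum(S) / np.sum(S)    # cumulative fraction of variance retained
    return int(np.argmax(ratio >= retain)) + 1

# Toy data with one dominant direction, one moderate, one tiny.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3)) @ np.diag([10.0, 2.0, 0.5])
print(choose_k(X))                      # keeps the two largest directions
```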

3. 

Suppose someone tells you that they ran PCA in such a way that "95% of the variance was retained." What is an equivalent statement to this?

$\dfrac{\frac{1}{m}\sum_{i=1}^{m}\|x^{(i)} - x^{(i)}_{\mathrm{approx}}\|^2}{\frac{1}{m}\sum_{i=1}^{m}\|x^{(i)}\|^2} \geq 0.95$

$\dfrac{\frac{1}{m}\sum_{i=1}^{m}\|x^{(i)} - x^{(i)}_{\mathrm{approx}}\|^2}{\frac{1}{m}\sum_{i=1}^{m}\|x^{(i)}\|^2} \geq 0.05$

$\dfrac{\frac{1}{m}\sum_{i=1}^{m}\|x^{(i)} - x^{(i)}_{\mathrm{approx}}\|^2}{\frac{1}{m}\sum_{i=1}^{m}\|x^{(i)}\|^2} \leq 0.05$

$\dfrac{\frac{1}{m}\sum_{i=1}^{m}\|x^{(i)} - x^{(i)}_{\mathrm{approx}}\|^2}{\frac{1}{m}\sum_{i=1}^{m}\|x^{(i)}\|^2} \leq 0.95$
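The equivalence behind this question is that the normalized approximation error equals one minus the fraction of variance retained. A NumPy sketch on assumed random data verifies this numerically for a top-$k$ projection:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 4)) @ rng.normal(size=(4, 4))  # correlated toy data
X = X - X.mean(axis=0)                    # PCA assumes centered data

U, S, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
Z = X @ Vt[:k].T                          # compress: project onto top-k PCs
X_approx = Z @ Vt[:k]                     # reconstruct in the original space

err_ratio = np.sum((X - X_approx) ** 2) / np.sum(X ** 2)
retained = np.sum(S[:k] ** 2) / np.sum(S ** 2)
print(err_ratio, 1.0 - retained)          # the two quantities agree
```

So "95% of the variance retained" is the same as saying the error ratio is at most 0.05.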

4. 

Which of the following statements are true? Check all that apply.

Feature scaling is not useful for PCA, since the eigenvector calculation (such as using Octave's  svd(Sigma)  routine) takes care of this automatically.

Given an input $x \in \mathbb{R}^n$, PCA compresses it to a lower-dimensional vector $z \in \mathbb{R}^k$.

PCA can be used only to reduce the dimensionality of data by 1 (such as 3D to 2D, or 2D to 1D).

If the input features are on very different scales, it is a good idea to perform feature scaling before applying PCA.
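Why scaling matters for PCA can be seen on illustrative data with two correlated features on wildly different scales: without scaling, the first component simply tracks the large-scale feature.

```python
import numpy as np

# Two correlated features on very different scales (illustrative data).
rng = np.random.default_rng(2)
x1 = rng.normal(size=300)
x2 = x1 + rng.normal(scale=0.5, size=300)      # correlated with x1
X = np.column_stack([1000.0 * x1, x2])         # wildly different scales

def first_pc(X):
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[0]                               # unit-length first PC

print(first_pc(X))                             # ~[±1, ~0]: scale dominates
X_scaled = (X - X.mean(axis=0)) / X.std(axis=0)
print(first_pc(X_scaled))                      # both features now contribute
```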

5. 

Which of the following are recommended applications of PCA? Select all that apply.

Data visualization: Reduce data to 2D (or 3D) so that it can be plotted.

To get more features to feed into a learning algorithm.

Clustering: To automatically group examples into coherent groups.

Data compression: Reduce the dimension of your input data  x(i) , which will be used in a supervised learning algorithm (i.e., use PCA so that your supervised learning algorithm runs faster).
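For the data-compression use case, a key practical point is that the PCA mapping is learned on the training set only and then reused for test data. A minimal sketch, with assumed helper names `fit_pca` and `transform`:

```python
import numpy as np

def fit_pca(X_train, k):
    """Learn the PCA mapping (mean and top-k directions) on training data only."""
    mu = X_train.mean(axis=0)
    _, _, Vt = np.linalg.svd(X_train - mu, full_matrices=False)
    return mu, Vt[:k]

def transform(X, mu, Vk):
    return (X - mu) @ Vk.T              # z in R^k, fed to the learning algorithm

rng = np.random.default_rng(3)
X_train = rng.normal(size=(100, 10))
X_test = rng.normal(size=(20, 10))
mu, Vk = fit_pca(X_train, k=3)
print(transform(X_train, mu, Vk).shape, transform(X_test, mu, Vk).shape)
```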


Published 2023-03-27 23:22:00.
Source: https://www.elefans.com/category/jswz/bbd26a7080b7f93feb227488e70db28a.html