@saadmalik946

You definitely deserve more subscribers, mate. Keep up the good work. Absolutely clear and easy to follow.

@wlxxiii

I've been following your channel for a while. I think your videos are really useful, as there aren't many videos around explaining the math involved in various techniques. Please keep doing what you're doing! Thanks!

@gajendrasinghdhaked

Goated, mind-blowing brainstorming, I am crying, what spectacular content, my brother.

@CTT36544

2:06 There is a typo in the dimensions of X and Z: the row and column dimensions are flipped.


2:16 Something is off here when he says S is the covariance matrix of Z.
Note that
Cov(a, b) = E(ab) - E(a)E(b), \forall a, b,
so this S is only the first term of the covariance matrix, not the whole thing. Indeed, the complete covariance matrix is
Cov(Z) = E(ZZ') - uu' = (1/N) ZZ' - (1/N^2) Z 1 1' Z',
where u_i is the mean of z_i (u := [u_1, ..., u_M]') and the 1's are column vectors of ones. For comparison, see 5:40. Note that \sum_i \lambda_i = trace(Cov(Z)) = \sum_i Var(z_i), so his claim that the sum of the eigenvalues equals the total variance is correct. However, again, Cov(Z) \neq (1/N) ZZ' unless Z is mean-centered.
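
For what it's worth, here is a quick numerical check of the point above (a minimal NumPy sketch; the M-features-by-N-samples layout is my assumption about the video's convention):

```python
import numpy as np

rng = np.random.default_rng(0)
M, N = 3, 1000
Z = rng.normal(loc=2.0, size=(M, N))   # non-zero mean on purpose

S_naive = (Z @ Z.T) / N                # the video's S = (1/N) ZZ'
u = Z.mean(axis=1, keepdims=True)      # per-feature means u
cov_full = S_naive - u @ u.T           # Cov(Z) = (1/N) ZZ' - uu'

# The naive S differs from the true covariance when the mean is non-zero...
print(np.allclose(S_naive, cov_full))          # False
# ...but after mean-centering Z, the two coincide.
Zc = Z - u
print(np.allclose((Zc @ Zc.T) / N, cov_full))  # True
```

So the video's formula is correct exactly when Z has been mean-centered first.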

@ssshukla26

Too fast to follow. If I already knew the topic, I could definitely watch this video as a quick reference; otherwise, I'm still in limbo.

@harry5094

Hi,
Can you please give some intuition for the Information equation? Why is it equal to (1/N) Z^T Z at 2:14?

@MrZidane1128

There are lots of concepts from my undergraduate studies that I need to recall and review in order to fully understand this.

@fadydawra

Thanks. Just a question: if you have multiple groups of points that you want to separate, which ML method could solve this problem? I need an unsupervised algorithm.
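
If the goal is to separate unlabeled groups of points, a common unsupervised choice is k-means clustering. A minimal sketch (my suggestion, not something from the video; assumes scikit-learn is installed):

```python
import numpy as np
from sklearn.cluster import KMeans

# Two synthetic groups of 2D points (toy data for illustration).
rng = np.random.default_rng(0)
X = np.vstack([
    rng.normal(loc=(0, 0), scale=0.5, size=(100, 2)),
    rng.normal(loc=(5, 5), scale=0.5, size=(100, 2)),
])

# Fit k-means with k=2; labels_ assigns each point to a group.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(km.labels_[:5])
print(km.cluster_centers_)
```

PCA is often used first to reduce the dimension, with a clustering algorithm like this then run on the projected data.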

@PerpetuityLJW

Beautiful as always, keep up the good work!

@MultiPRAKS

Based on the dimensions of Z, U and X, the expression Z=UX does not hold. Can you clarify this point?

@jasdn93bsad992

At 2:08, UX = (D × M)(N × D), so what is the shape of Z?
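
For what it's worth, the shapes do work out under one plausible convention (an assumption on my part, since the video's layout is exactly what's in question): if X stacks the N samples as columns (D × N) and U holds the M projection directions as rows (M × D), then Z = UX is M × N.

```python
import numpy as np

D, M, N = 5, 2, 100                             # original dim, reduced dim, samples
rng = np.random.default_rng(0)
X = rng.normal(size=(D, N))                     # data: D x N (assumed layout)
U = np.linalg.qr(rng.normal(size=(D, M)))[0].T  # M x D with orthonormal rows

Z = U @ X                                       # (M x D) @ (D x N) -> M x N
print(Z.shape)                                  # (2, 100)
```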

@FirstNameLastName-fv4eu

Very nice video!! Keep making more.

@tiborcamargo5732

Nice, straight to the point.

@rexwinn3934

How do you determine what value of the dimension m you want? And how do you figure out how accurately the new m-dimensional data set reflects the original data set? Do you just compare variances?
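
One common answer (my sketch, not the video's method): choose m from the cumulative explained-variance ratio of the eigenvalues, i.e., keep just enough components to retain, say, 95% of the total variance.

```python
import numpy as np

# Toy data: 500 samples x 5 features with unequal variances.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5)) @ np.diag([3.0, 2.0, 1.0, 0.5, 0.1])
Xc = X - X.mean(axis=0)
eigvals = np.linalg.eigvalsh(Xc.T @ Xc / len(Xc))[::-1]  # descending

# Cumulative fraction of total variance captured by the first m components.
ratio = np.cumsum(eigvals) / eigvals.sum()
m = int(np.searchsorted(ratio, 0.95)) + 1
print(ratio)
print("choose m =", m)
```

Comparing retained to total variance is indeed the standard accuracy measure: the discarded eigenvalues are exactly the variance (and hence reconstruction error) you give up.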

@rajatshrivastav

I am confused: if S is the covariance matrix of Z, which itself has M dimensions, then how could we end up selecting the first M eigenvalue/eigenvector pairs? It should be that S is the covariance matrix of X, which has D dimensions, and then, after diagonalizing S, we select M pairs.
Could you please explain this, @CodeEmporium?
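
For what it's worth, the standard PCA recipe matches this reading (a sketch under my own assumptions about the conventions): S is the D × D covariance of X, and the top-M eigenvectors of S define the projection that produces the M-dimensional Z.

```python
import numpy as np

# X: N samples x D features; PCA eigendecomposes the D x D covariance of X.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
Xc = X - X.mean(axis=0)
S = Xc.T @ Xc / len(Xc)              # D x D covariance of X (not of Z)

eigvals, eigvecs = np.linalg.eigh(S) # ascending eigenvalues
order = np.argsort(eigvals)[::-1]    # sort descending
M = 2
U = eigvecs[:, order[:M]]            # top-M eigenvectors: D x M

Z = Xc @ U                           # projected data: N x M
print(S.shape, Z.shape)              # (4, 4) (200, 2)
```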

@haris525

Nice! Very clear information.

@MrFischvogel

Great work! Thanks a lot!

@driver13g27

Maybe good for intuition, but YouTube videos are generally a bad choice as sources for scientific papers, theses, etc., especially given the uncorrected mistakes and the lack of sources for where the PCA calculations come from (not the eigenvector and diagonalization reference material). I recommend Google Scholar, e.g., I Jolliffe - 2011 - Springer, "Principal component analysis", for a good mathematical treatment that one can understand after building intuition from, e.g., StatQuest's explanation here on YouTube. Better sources exist for both the mathematical formulas and the intuition; this video, while not awful, doesn't do either really well, tbh.

@ccuuttww

The same process applies in SVD, which transforms a hypersphere into a 2D ellipse, but I cannot understand the feature-mapping part. Will you make a video with an example and show the calculation?
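
In the meantime, here is a tiny SVD calculation to play with (my illustration, not the video's): the unit circle maps to an ellipse under A, and the ellipse's semi-axes are the left singular vectors scaled by the singular values.

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
U, s, Vt = np.linalg.svd(A)          # A = U @ diag(s) @ Vt

# Points on the unit circle map to an ellipse under A.
theta = np.linspace(0, 2 * np.pi, 8, endpoint=False)
circle = np.stack([np.cos(theta), np.sin(theta)])
ellipse = A @ circle

print(s)                             # singular values = semi-axis lengths
print(ellipse.T)                     # mapped points
```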

@cat-chemometricagiletool4500

Great video, keep it up! If you want to try an agile software tool to perform PCA, use CAT (Chemometric Agile Tool)!