This page is a portal linking to many of the webpages I read and learned from for the course “Advanced Machine Learning”. I may write other posts about AML and will link them back to this master page for reference.
Unsupervised Learning
Projection
Principal component analysis
Variance, standard deviation, and the covariance matrix are important concepts for understanding PCA. Check this page written in Chinese; it provides the math equations as well as Python code for demonstration. One thing to be careful about is Bessel’s correction and how to toggle the bias argument in numpy.
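As a minimal sketch of the Bessel’s correction point: numpy exposes the population-vs-sample choice through `ddof` in `np.var` and through `bias` (or `ddof`) in `np.cov`. The data values here are made up for illustration.

```python
import numpy as np

# Toy sample: 5 observations of one variable (mean is 4)
x = np.array([2.0, 4.0, 4.0, 4.0, 6.0])

# Population variance divides by N; the sample variance applies
# Bessel's correction and divides by N - 1. ddof toggles this.
pop_var = np.var(x, ddof=0)     # sum of squared deviations / N   -> 1.6
sample_var = np.var(x, ddof=1)  # sum of squared deviations / (N-1) -> 2.0

# np.cov applies Bessel's correction by default (bias=False);
# pass bias=True to divide by N instead.
y = np.array([1.0, 3.0, 5.0, 5.0, 6.0])
X = np.vstack([x, y])           # rows are variables, columns are observations
cov_sample = np.cov(X)          # N - 1 in the denominator
cov_pop = np.cov(X, bias=True)  # N in the denominator

print(pop_var, sample_var)
print(cov_sample)
```

Note that `np.var` defaults to the biased (population) estimate while `np.cov` defaults to the unbiased (sample) one, which is exactly the inconsistency worth being careful about.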
Based on this page, PCA is used to reduce the dimension of the feature space while keeping the dominant directions of variation. An animation on the page shows how the projection of a group of data points changes as the new axis is rotated: it is clear that the best choice of axis is the one that maximizes the variance of the projected data. For higher-dimensional reduction, one can think recursively: once the primary axis is found, the question becomes how to find the secondary axis, which is the direction orthogonal to the first that captures the next-largest variance.
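The recursive idea above can be sketched with an eigendecomposition of the covariance matrix: its eigenvectors, sorted by eigenvalue, give the primary axis, then the orthogonal secondary axis, and so on. The synthetic correlated data below is my own illustration, not from the linked page.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic correlated 2-D data: y roughly follows x
x = rng.normal(size=200)
y = 0.8 * x + 0.2 * rng.normal(size=200)
X = np.column_stack([x, y])

# Center the data, then eigendecompose the covariance matrix.
Xc = X - X.mean(axis=0)
C = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(C)  # eigh returns ascending eigenvalues

# Sort descending: the first eigenvector is the axis that maximizes
# projected variance; the second is the orthogonal direction with the
# next-largest variance (the "secondary axis" found recursively).
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Project onto the first principal axis: dimension reduction 2 -> 1
Z = Xc @ eigvecs[:, :1]

# The principal axes are orthogonal to each other
assert abs(eigvecs[:, 0] @ eigvecs[:, 1]) < 1e-10
print("explained variances:", eigvals)
```

The variance of the projected data `Z` equals the largest eigenvalue, which is exactly why rotating the axis in the animation changes the spread of the projections.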