{"id":4950,"date":"2020-02-27T12:38:18","date_gmt":"2020-02-27T12:38:18","guid":{"rendered":"https:\/\/prwatech.in\/blog\/?p=4950"},"modified":"2024-03-15T11:53:09","modified_gmt":"2024-03-15T11:53:09","slug":"principal-component-analysis-tutorial","status":"publish","type":"post","link":"https:\/\/prwatech.in\/blog\/machine-learning\/machine-learning-modules\/principal-component-analysis-tutorial\/","title":{"rendered":"Principle Component Analysis Tutorial"},"content":{"rendered":"<h1 style=\"text-align: center;\">Principal Component Analysis Tutorial<\/h1>\n<p><strong>Principal Component Analysis Tutorial<\/strong>, in this Tutorial one, can learn types of principal component analysis. Are you the one who is looking for the best platform which provides information about know working principle of PCA, Applications of Principal Component Analysis? Or the one who is looking forward to taking the advanced <a href=\"https:\/\/prwatech.in\/data-science-certification-course-in-bangalore\/\" target=\"_blank\" rel=\"noopener noreferrer\">Data Science Certification Course<\/a> with Machine Learning from India\u2019s Leading <a href=\"https:\/\/prwatech.in\/data-science-training-institutes-in-bangalore\/\" target=\"_blank\" rel=\"noopener noreferrer\">Data Science Training institute<\/a>? Then you\u2019ve landed on the Right Path.<\/p>\n<p>With the advancements within the field of Machine Learning and computer science, it&#8217;s become essential to grasp the basics behind such technologies. 
This blog on Principal Component Analysis will help you understand the concepts behind dimensionality reduction and how it can be used to handle high-dimensional data.<\/p>\n<p>The below-mentioned Principal Component Analysis Tutorial will help you understand in detail what PCA in machine learning is, so just follow all the tutorials of India\u2019s leading <a href=\"https:\/\/prwatech.in\/data-science-training-institutes-in-bangalore\/\" target=\"_blank\" rel=\"noopener noreferrer\">Best Data Science Training institute in Bangalore<\/a> and be a pro <a href=\"https:\/\/prwatech.in\/data-science-certification-course-in-bangalore\/\" target=\"_blank\" rel=\"noopener noreferrer\">Data Scientist<\/a> or Machine Learning Engineer.<\/p>\n<h2>What is PCA in Machine Learning?<\/h2>\n<p>Suppose you have to deal with a dataset based on the GDP (Gross Domestic Product) of a country.\u00a0While processing the data you will come across many fields, which we call features or variables. Now the question arises: should we take all the variables, or select only a few of them for further processing? Using all of them may lead to overfitting the model. So we have to reduce the dimensionality of the data, making it easier to work with and leaving fewer relationships between variables to consider. Reducing the dimensions of the feature space in this way is called \u2018Dimensionality Reduction\u2019.<\/p>\n<h3>Types of Principal Component Analysis<\/h3>\n<p>There are different methods to achieve dimensionality reduction. The most widely used methods fall into two categories:<\/p>\n<p>Feature Elimination<\/p>\n<p>Feature Extraction<\/p>\n<h4>Feature Elimination<\/h4>\n<p>As the name suggests, in this method the features that have very low significance are eliminated. Only the best features, as per the domain requirement, are selected; the rest are eliminated. 
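<\/p>
<p>As a hypothetical illustration (the tutorial itself does not prescribe an API for this), feature elimination can be sketched with the VarianceThreshold utility from scikit-learn, which drops features whose variance falls below a chosen cutoff:<\/p>

```python
import numpy as np
from sklearn.feature_selection import VarianceThreshold

# Toy data with 3 features; the middle column is nearly constant,
# so it carries very little information.
X = np.array([[1.0, 5.0, 0.9],
              [2.0, 5.0, 2.1],
              [3.0, 5.1, 2.9],
              [4.0, 5.0, 4.2]])

# Eliminate features whose variance is below 0.1.
selector = VarianceThreshold(threshold=0.1)
X_reduced = selector.fit_transform(X)

print(X_reduced.shape)  # the near-constant middle feature is dropped
```

<p>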
Simplicity and preserved interpretability are the main advantages of this method.<\/p>\n<h4>Feature Extraction<\/h4>\n<p>Suppose we have N independent variables. In feature extraction, we create N \u201cnew\u201d independent variables, where each newly generated variable is a combination of all N \u201cold\u201d independent variables. The newly created variables are arranged in a specific order based on how well they predict the dependent variable. We then eliminate the features that are \u2018least important\u2019, and this is where \u2018Dimensionality Reduction\u2019 comes into the picture. Since we ordered the new variables by how well they predict the dependent variable, we know which variables are the most and least important. And because each new variable is a combination of the old ones, we keep the most valuable parts of the old variables even when we drop one or more of these \u201cnew\u201d variables.<\/p>\n<h3>When can PCA be used?<\/h3>\n<p>If we want to reduce the number of variables but cannot identify which variables to remove from consideration entirely, then PCA is the best option. It is also suitable when we can accept making the independent variables less interpretable, and when we need the resulting variables to be independent of one another.<\/p>\n<p>Principal Component Analysis (PCA) is a statistical procedure that uses an orthogonal transformation. It converts a set of correlated variables into a set of uncorrelated variables. Exploratory data analysis and predictive modeling use PCA as an effective tool. 
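<\/p>
<p>A brief, hedged check of this property (the data below is invented for illustration): starting from two strongly correlated variables, the principal-component scores come out uncorrelated:<\/p>

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 2.0 * x + rng.normal(scale=0.3, size=200)  # y strongly correlated with x
X = np.column_stack([x, y])

# The original variables are highly correlated...
print(round(np.corrcoef(X.T)[0, 1], 2))

# ...but the PCA scores are uncorrelated (up to floating-point error).
scores = PCA(n_components=2).fit_transform(X)
print(round(np.corrcoef(scores.T)[0, 1], 6))
```

<p>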
It is also called general factor analysis, where regression determines a line of best fit.<\/p>\n<h2>Working Principle of PCA<\/h2>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-4951\" src=\"https:\/\/prwatech.in\/blog\/wp-content\/uploads\/2020\/02\/PCA-Working-principle.png\" alt=\"Working Principle of PCA\" width=\"850\" height=\"353\" srcset=\"https:\/\/prwatech.in\/blog\/wp-content\/uploads\/2020\/02\/PCA-Working-principle.png 878w, https:\/\/prwatech.in\/blog\/wp-content\/uploads\/2020\/02\/PCA-Working-principle-300x125.png 300w, https:\/\/prwatech.in\/blog\/wp-content\/uploads\/2020\/02\/PCA-Working-principle-768x319.png 768w\" sizes=\"auto, (max-width: 850px) 100vw, 850px\" \/><\/p>\n<p>In simple words, PCA takes a dataset with many dimensions and flattens it to 2 or 3 dimensions. It tries to find a meaningful way to flatten the data by focusing on the directions in which the independent variables differ most.<\/p>\n<p>The image above shows an example of transforming high-dimensional (3-dimensional) data to low-dimensional (2-dimensional) data using PCA. Before moving to the actual concept, let\u2019s see some terminology related to PCA.<\/p>\n<p><strong>Dimensionality:<\/strong> \u00a0It is simply the number of features, or the number of columns, present in our dataset. We can think of it as the number of random variables in the dataset.<\/p>\n<p><strong>Correlation:<\/strong> \u00a0It shows how strongly two variables are related to each other. The value ranges from -1 to +1. A positive value indicates that when one variable increases, the other increases as well, while a negative value indicates that when one increases, the other decreases. The absolute value indicates the strength of the relation.<\/p>\n<p><strong>Orthogonal:<\/strong> \u00a0Uncorrelated with each other, i.e., the correlation between any pair of variables is 0.<\/p>\n<p><strong>Eigenvectors:<\/strong> \u00a0Let\u2019s consider a non-zero vector v. 
Let\u2019s take a square matrix A. Then v is an eigenvector of A if Av is a scalar multiple of v. Or simply:<\/p>\n<p style=\"text-align: center;\">Av = \u019bv<\/p>\n<p>Here, v is the eigenvector and<\/p>\n<p>\u019b is the eigenvalue associated with it.<\/p>\n<p><strong>Covariance Matrix:<\/strong> This matrix consists of the covariances between pairs of variables. The (i,j)-th element is the covariance between the i-th and j-th variables.<\/p>\n<h3>Principal Components<\/h3>\n<p>A normalized linear combination of the original predictors in a dataset is called a principal component. In the image above, the principal components are indicated by <em>PC1<\/em>\u00a0and\u00a0<em>PC2<\/em>. We can fit the data onto two axes, which are nothing but these principal components, i.e. PCs.<\/p>\n<p>PC1 is the first principal axis, which spans the most variation, whereas PC2 is the second principal axis, which spans the second-most variation. That is, PC1 captures the direction where most of the variation is present, and PC2 captures the direction with the second-most variation.<\/p>\n<p>The PCs are essentially linear combinations of the original variables; the weight vector of each combination is the corresponding eigenvector, which in turn satisfies the principle of least squares.<\/p>\n<p>The PCs are orthogonal in nature.<\/p>\n<p>As we move from the 1st PC to the last one, the variation present in the PCs decreases.<\/p>\n<p>These least important PCs are sometimes useful in regression and outlier-detection problems.<\/p>\n<h2>Implementing PCA on a 2-D Dataset<\/h2>\n<h3>Step 1: Normalize the data<\/h3>\n<p>The first step is to normalize the data we have so that PCA works properly. This is done by subtracting the respective column mean from every number in that column. Let\u2019s consider two dimensions X and Y: every X value becomes X minus the mean of X, and every Y value becomes Y minus the mean of Y. 
This produces a dataset whose mean is zero.<\/p>\n<h3>Step 2: Calculate the covariance matrix<\/h3>\n<p>Since the dataset we took is 2-dimensional, this will result in a 2&#215;2 covariance matrix.<\/p>\n<p>Please note that<\/p>\n<p>Var[X1] = Cov[X1,X1]<\/p>\n<p>Var[X2] = Cov[X2,X2].<\/p>\n<h3>Step 3: Calculate the eigenvalues and eigenvectors<\/h3>\n<p>The next step is to calculate the eigenvalues and eigenvectors of the covariance matrix. For a matrix A, \u019b is an eigenvalue if it is a solution of the characteristic equation:<\/p>\n<p style=\"text-align: center;\">det( \u019bI &#8211; A ) = 0<\/p>\n<p>Where,<\/p>\n<p>I is the identity matrix of the same dimension as matrix A, which is a required condition for the matrix subtraction, and<\/p>\n<p>\u2018det\u2019 is the determinant of the matrix.<\/p>\n<p>For each eigenvalue \u019b, a corresponding eigenvector v can be found by solving:<\/p>\n<p style=\"text-align: center;\">( \u019bI &#8211; A )v = 0<\/p>\n<h3>Step 4: Selecting components and forming a feature vector<\/h3>\n<p>We order the eigenvalues from largest to smallest so that they give us the components in order of significance. Here comes the dimensionality reduction part. If we have data with n variables, then we have n corresponding eigenvalues and eigenvectors. It turns out that the eigenvector corresponding to the highest eigenvalue is the principal component of the dataset, and it is our call how many eigenvalues we choose to proceed with in our analysis. To reduce dimensions, we choose the first p eigenvalues and ignore the rest. We do lose some information in the process, but if the discarded eigenvalues are small, we do not lose much.<\/p>\n<p>Next, we will form a feature vector, which is a matrix of the eigenvectors we want to proceed with. 
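<\/p>
<p>Steps 1\u20134 can be sketched in NumPy as follows (the ten 2-D points are invented purely for illustration):<\/p>

```python
import numpy as np

# Step 1: a small 2-D dataset, normalized by subtracting the column means.
data = np.array([[2.5, 2.4], [0.5, 0.7], [2.2, 2.9], [1.9, 2.2],
                 [3.1, 3.0], [2.3, 2.7], [2.0, 1.6], [1.0, 1.1],
                 [1.5, 1.6], [1.1, 0.9]])
scaled = data - data.mean(axis=0)

# Step 2: the 2x2 covariance matrix (rowvar=False: columns are variables).
cov = np.cov(scaled, rowvar=False)

# Step 3: eigenvalues and eigenvectors of the symmetric covariance matrix.
eig_vals, eig_vecs = np.linalg.eigh(cov)

# Step 4: order the eigenvalues from largest to smallest; the eigenvector
# belonging to the largest eigenvalue is the first principal component.
order = np.argsort(eig_vals)[::-1]
eig_vals, eig_vecs = eig_vals[order], eig_vecs[:, order]
feature_vector = eig_vecs  # here we keep both eigenvectors
print(eig_vals)
```

<p>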
Since we have just 2 dimensions in the running example, we can either choose the eigenvector corresponding to the greater eigenvalue or simply take both.<\/p>\n<p style=\"text-align: center;\">Feature Vector = (eig1, eig2)<\/p>\n<h3>Step 5: Forming Principal Components<\/h3>\n<p>This is the final step, where we actually form the principal components using all the math we have done so far. For this, we take the transpose of the feature vector and left-multiply it with the transpose of the scaled version of the original dataset.<\/p>\n<p style=\"text-align: center;\"><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-4954\" src=\"https:\/\/prwatech.in\/blog\/wp-content\/uploads\/2020\/02\/Forming-principle-componens-formula.png\" alt=\"Principal Component Analysis Tutorial - Forming Principal Components\" width=\"850\" height=\"131\" srcset=\"https:\/\/prwatech.in\/blog\/wp-content\/uploads\/2020\/02\/Forming-principle-componens-formula.png 344w, https:\/\/prwatech.in\/blog\/wp-content\/uploads\/2020\/02\/Forming-principle-componens-formula-300x46.png 300w\" sizes=\"auto, (max-width: 850px) 100vw, 850px\" \/><\/p>\n<p>Here,<\/p>\n<p>NewData = the matrix consisting of the principal components,<\/p>\n<p>Feature Vector = the matrix we formed using the eigenvectors we chose to keep, and<\/p>\n<p>Scaled Data = the scaled version of the original dataset,<\/p>\n<p>where T denotes the transpose of a matrix.<\/p>\n<p>If we go back to the theory of eigenvalues and eigenvectors, we will see that eigenvectors essentially provide information about patterns in the data. In this example, if we plot the eigenvectors on a scatterplot of the data, we find that the principal eigenvector fits the data well. The other one, being perpendicular to it, does not carry much information, and hence we do not lose much when discarding it, thereby reducing the dimension.<\/p>\n<p>All the eigenvectors of a symmetric matrix, such as the covariance matrix, are orthogonal, i.e. perpendicular to each other. 
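<\/p>
<p>Continuing the same illustrative sketch, Step 5 applies the formula NewData = FeatureVector^T x ScaledData^T; dropping all but the first row of NewData then reduces the data from 2-D to 1-D:<\/p>

```python
import numpy as np

# Centered 2-D data (Step 1) and its eigen-decomposition (Steps 2-4).
data = np.array([[2.5, 2.4], [0.5, 0.7], [2.2, 2.9], [1.9, 2.2],
                 [3.1, 3.0], [2.3, 2.7], [2.0, 1.6], [1.0, 1.1],
                 [1.5, 1.6], [1.1, 0.9]])
scaled = data - data.mean(axis=0)
eig_vals, eig_vecs = np.linalg.eigh(np.cov(scaled, rowvar=False))
order = np.argsort(eig_vals)[::-1]
feature_vector = eig_vecs[:, order]  # eigenvectors as columns

# Step 5: NewData = FeatureVector^T x ScaledData^T.
new_data = feature_vector.T @ scaled.T  # shape: (components, samples)

# Keeping only the first row projects onto the direction of
# greatest variation, reducing the dataset from 2-D to 1-D.
reduced = new_data[0]
print(new_data.shape, reduced.shape)
```

<p>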
So, in PCA, what we do is represent (transform) the original dataset using these orthogonal (perpendicular) eigenvectors instead of representing it on the normal x and y axes. We have now expressed our data points as a combination of contributions from both x and y. The difference arises when we actually disregard one or more eigenvectors, thereby reducing the dimension of the dataset. Otherwise, if we take all the eigenvectors into account, we are just transforming coordinates and not serving any purpose.<\/p>\n<h3>Applications of Principal Component Analysis<\/h3>\n<p>PCA is predominantly used as a dimensionality reduction technique in domains like facial recognition, computer vision, and image compression. It is also used for finding patterns in high-dimensional data in fields such as finance, data mining, bioinformatics, and psychology.<\/p>\n<h3>Step by Step Implementation of PCA using Python<\/h3>\n<h4>Import required libraries<\/h4>\n<p>import pandas as pd<\/p>\n<p>import matplotlib.pyplot as plt<\/p>\n<p>%matplotlib inline<\/p>\n<h4>Import dataset<\/h4>\n<p>from sklearn.datasets import load_breast_cancer<\/p>\n<p>cancer=load_breast_cancer()<\/p>\n<p>df=pd.DataFrame(cancer[&#8216;data&#8217;],columns=cancer[&#8216;feature_names&#8217;])<\/p>\n<p>df.head()<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-4957\" src=\"https:\/\/prwatech.in\/blog\/wp-content\/uploads\/2020\/02\/Dataset-1024x249.png\" alt=\"Step by Step Implementation of PCA using Python\" width=\"850\" height=\"207\" srcset=\"https:\/\/prwatech.in\/blog\/wp-content\/uploads\/2020\/02\/Dataset-1024x249.png 1024w, https:\/\/prwatech.in\/blog\/wp-content\/uploads\/2020\/02\/Dataset-300x73.png 300w, https:\/\/prwatech.in\/blog\/wp-content\/uploads\/2020\/02\/Dataset-768x187.png 768w, https:\/\/prwatech.in\/blog\/wp-content\/uploads\/2020\/02\/Dataset.png 1047w\" sizes=\"auto, (max-width: 850px) 100vw, 850px\" \/><\/p>\n<h4>Normalize Dataset using StandardScaler<\/h4>\n<p>from sklearn.preprocessing import 
StandardScaler<\/p>\n<p>scl=StandardScaler()<\/p>\n<p>scl.fit(df)<\/p>\n<p>scl_data=scl.transform(df)<\/p>\n<h4>Import and implement PCA<\/h4>\n<p>from sklearn.decomposition import PCA<\/p>\n<p>pca=PCA(n_components=2)<\/p>\n<p>pca.fit(scl_data)<\/p>\n<p>x_pca=pca.transform(scl_data)<\/p>\n<p>x_pca.shape<\/p>\n<h4>Plot the principal components<\/h4>\n<p># a larger plot<\/p>\n<p>plt.figure(figsize=(8,6))<\/p>\n<p>plt.scatter(x_pca[:,0],x_pca[:,1],c=cancer[&#8216;target&#8217;],cmap=&#8216;viridis&#8217;)<\/p>\n<p># Label the axes<\/p>\n<p>plt.xlabel(&#8216;First Principal Component&#8217;)<\/p>\n<p>plt.ylabel(&#8216;Second Principal Component&#8217;)<\/p>\n<p><strong>Output:<\/strong><\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-4958\" src=\"https:\/\/prwatech.in\/blog\/wp-content\/uploads\/2020\/02\/import-n-implement-PCA-oputput.png\" alt=\"Principal Component Analysis Tutorial - Import and implement PCA\" width=\"850\" height=\"648\" srcset=\"https:\/\/prwatech.in\/blog\/wp-content\/uploads\/2020\/02\/import-n-implement-PCA-oputput.png 831w, https:\/\/prwatech.in\/blog\/wp-content\/uploads\/2020\/02\/import-n-implement-PCA-oputput-300x229.png 300w, https:\/\/prwatech.in\/blog\/wp-content\/uploads\/2020\/02\/import-n-implement-PCA-oputput-768x586.png 768w\" sizes=\"auto, (max-width: 850px) 100vw, 850px\" \/><\/p>\n<h3>Display the principal components<\/h3>\n<p>pca.components_<\/p>\n<p><strong>Output:<\/strong><\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-4959\" src=\"https:\/\/prwatech.in\/blog\/wp-content\/uploads\/2020\/02\/PCA.png\" alt=\"Principal Component Analysis Tutorial - Display Dimension reduced\" width=\"850\" height=\"290\" srcset=\"https:\/\/prwatech.in\/blog\/wp-content\/uploads\/2020\/02\/PCA.png 994w, https:\/\/prwatech.in\/blog\/wp-content\/uploads\/2020\/02\/PCA-300x102.png 300w, https:\/\/prwatech.in\/blog\/wp-content\/uploads\/2020\/02\/PCA-768x262.png 768w\" sizes=\"auto, (max-width: 850px) 100vw, 850px\" \/><\/p>\n<p>We 
hope you understand Principal Component Analysis Tutorial. Get success in your career as a Data Scientist by being a part of the <a href=\"https:\/\/prwatech.com\/\" target=\"_blank\" rel=\"noopener noreferrer\">Prwatech<\/a>, India&#8217;s leading <a href=\"https:\/\/prwatech.in\/data-science-training-institutes-in-bangalore\/\" target=\"_blank\" rel=\"noopener noreferrer\">Data Science training institute in Bangalore<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Principal Component Analysis Tutorial Principal Component Analysis Tutorial, in this Tutorial one, can learn types of principal component analysis. Are you the one who is looking for the best platform which provides information about know working principle of PCA, Applications of Principal Component Analysis? Or the one who is looking forward to taking the advanced [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[32,1696],"tags":[463,464,459,461,460,462],"class_list":["post-4950","post","type-post","status-publish","format-standard","hentry","category-machine-learning","category-machine-learning-modules","tag-applications-of-principal-component-analysis","tag-implementation-of-pca-using-python","tag-principal-component-analysis-tutorial","tag-types-of-principal-component-analysis","tag-what-is-pca-in-machine-learning","tag-working-principle-of-pca"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v25.7 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Principal Component Analysis Tutorial - Prwatech<\/title>\n<meta name=\"description\" content=\"Explore Principal Component Analysis Tutorial &amp; learn what is pca in machine learning, types, Applications of principal component analysis.\" \/>\n<meta name=\"robots\" content=\"noindex, follow, max-snippet:-1, max-image-preview:large, 
max-video-preview:-1\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Principal Component Analysis Tutorial - Prwatech\" \/>\n<meta property=\"og:description\" content=\"Explore Principal Component Analysis Tutorial &amp; learn what is pca in machine learning, types, Applications of principal component analysis.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/prwatech.in\/blog\/machine-learning\/machine-learning-modules\/principal-component-analysis-tutorial\/\" \/>\n<meta property=\"og:site_name\" content=\"Prwatech\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/prwatech.in\/\" \/>\n<meta property=\"article:published_time\" content=\"2020-02-27T12:38:18+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2024-03-15T11:53:09+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/prwatech.in\/blog\/wp-content\/uploads\/2020\/02\/PCA-Working-principle.png\" \/>\n<meta name=\"author\" content=\"Prwatech\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@Eduprwatech\" \/>\n<meta name=\"twitter:site\" content=\"@Eduprwatech\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Prwatech\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"9 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"WebPage\",\"@id\":\"https:\/\/prwatech.in\/blog\/machine-learning\/machine-learning-modules\/principal-component-analysis-tutorial\/\",\"url\":\"https:\/\/prwatech.in\/blog\/machine-learning\/machine-learning-modules\/principal-component-analysis-tutorial\/\",\"name\":\"Principal Component Analysis Tutorial - Prwatech\",\"isPartOf\":{\"@id\":\"https:\/\/prwatech.in\/blog\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/prwatech.in\/blog\/machine-learning\/machine-learning-modules\/principal-component-analysis-tutorial\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/prwatech.in\/blog\/machine-learning\/machine-learning-modules\/principal-component-analysis-tutorial\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/prwatech.in\/blog\/wp-content\/uploads\/2020\/02\/PCA-Working-principle.png\",\"datePublished\":\"2020-02-27T12:38:18+00:00\",\"dateModified\":\"2024-03-15T11:53:09+00:00\",\"author\":{\"@id\":\"https:\/\/prwatech.in\/blog\/#\/schema\/person\/db90baff7744090b2288bbc98fea87f3\"},\"description\":\"Explore Principal Component Analysis Tutorial & learn what is pca in machine learning, types, Applications of principal component 
analysis.\",\"breadcrumb\":{\"@id\":\"https:\/\/prwatech.in\/blog\/machine-learning\/machine-learning-modules\/principal-component-analysis-tutorial\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/prwatech.in\/blog\/machine-learning\/machine-learning-modules\/principal-component-analysis-tutorial\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/prwatech.in\/blog\/machine-learning\/machine-learning-modules\/principal-component-analysis-tutorial\/#primaryimage\",\"url\":\"https:\/\/prwatech.in\/blog\/wp-content\/uploads\/2020\/02\/PCA-Working-principle.png\",\"contentUrl\":\"https:\/\/prwatech.in\/blog\/wp-content\/uploads\/2020\/02\/PCA-Working-principle.png\",\"width\":878,\"height\":365},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/prwatech.in\/blog\/machine-learning\/machine-learning-modules\/principal-component-analysis-tutorial\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/prwatech.in\/blog\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Principle Component Analysis Tutorial\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/prwatech.in\/blog\/#website\",\"url\":\"https:\/\/prwatech.in\/blog\/\",\"name\":\"Prwatech\",\"description\":\"Share Ideas, Start Something 
Good.\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/prwatech.in\/blog\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Person\",\"@id\":\"https:\/\/prwatech.in\/blog\/#\/schema\/person\/db90baff7744090b2288bbc98fea87f3\",\"name\":\"Prwatech\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/prwatech.in\/blog\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/c00bafc1b04045f31eda917de39891456c44fa47c092b9bb6be0f860a3a30a2f?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/c00bafc1b04045f31eda917de39891456c44fa47c092b9bb6be0f860a3a30a2f?s=96&d=mm&r=g\",\"caption\":\"Prwatech\"},\"url\":\"https:\/\/prwatech.in\/blog\/author\/prwatech123\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Principal Component Analysis Tutorial - Prwatech","description":"Explore Principal Component Analysis Tutorial & learn what is pca in machine learning, types, Applications of principal component analysis.","robots":{"index":"noindex","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"og_locale":"en_US","og_type":"article","og_title":"Principal Component Analysis Tutorial - Prwatech","og_description":"Explore Principal Component Analysis Tutorial & learn what is pca in machine learning, types, Applications of principal component 
analysis.","og_url":"https:\/\/prwatech.in\/blog\/machine-learning\/machine-learning-modules\/principal-component-analysis-tutorial\/","og_site_name":"Prwatech","article_publisher":"https:\/\/www.facebook.com\/prwatech.in\/","article_published_time":"2020-02-27T12:38:18+00:00","article_modified_time":"2024-03-15T11:53:09+00:00","og_image":[{"url":"https:\/\/prwatech.in\/blog\/wp-content\/uploads\/2020\/02\/PCA-Working-principle.png","type":"","width":"","height":""}],"author":"Prwatech","twitter_card":"summary_large_image","twitter_creator":"@Eduprwatech","twitter_site":"@Eduprwatech","twitter_misc":{"Written by":"Prwatech","Est. reading time":"9 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"WebPage","@id":"https:\/\/prwatech.in\/blog\/machine-learning\/machine-learning-modules\/principal-component-analysis-tutorial\/","url":"https:\/\/prwatech.in\/blog\/machine-learning\/machine-learning-modules\/principal-component-analysis-tutorial\/","name":"Principal Component Analysis Tutorial - Prwatech","isPartOf":{"@id":"https:\/\/prwatech.in\/blog\/#website"},"primaryImageOfPage":{"@id":"https:\/\/prwatech.in\/blog\/machine-learning\/machine-learning-modules\/principal-component-analysis-tutorial\/#primaryimage"},"image":{"@id":"https:\/\/prwatech.in\/blog\/machine-learning\/machine-learning-modules\/principal-component-analysis-tutorial\/#primaryimage"},"thumbnailUrl":"https:\/\/prwatech.in\/blog\/wp-content\/uploads\/2020\/02\/PCA-Working-principle.png","datePublished":"2020-02-27T12:38:18+00:00","dateModified":"2024-03-15T11:53:09+00:00","author":{"@id":"https:\/\/prwatech.in\/blog\/#\/schema\/person\/db90baff7744090b2288bbc98fea87f3"},"description":"Explore Principal Component Analysis Tutorial & learn what is pca in machine learning, types, Applications of principal component 
analysis.","breadcrumb":{"@id":"https:\/\/prwatech.in\/blog\/machine-learning\/machine-learning-modules\/principal-component-analysis-tutorial\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/prwatech.in\/blog\/machine-learning\/machine-learning-modules\/principal-component-analysis-tutorial\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/prwatech.in\/blog\/machine-learning\/machine-learning-modules\/principal-component-analysis-tutorial\/#primaryimage","url":"https:\/\/prwatech.in\/blog\/wp-content\/uploads\/2020\/02\/PCA-Working-principle.png","contentUrl":"https:\/\/prwatech.in\/blog\/wp-content\/uploads\/2020\/02\/PCA-Working-principle.png","width":878,"height":365},{"@type":"BreadcrumbList","@id":"https:\/\/prwatech.in\/blog\/machine-learning\/machine-learning-modules\/principal-component-analysis-tutorial\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/prwatech.in\/blog\/"},{"@type":"ListItem","position":2,"name":"Principle Component Analysis Tutorial"}]},{"@type":"WebSite","@id":"https:\/\/prwatech.in\/blog\/#website","url":"https:\/\/prwatech.in\/blog\/","name":"Prwatech","description":"Share Ideas, Start Something 
Good.","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/prwatech.in\/blog\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Person","@id":"https:\/\/prwatech.in\/blog\/#\/schema\/person\/db90baff7744090b2288bbc98fea87f3","name":"Prwatech","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/prwatech.in\/blog\/#\/schema\/person\/image\/","url":"https:\/\/secure.gravatar.com\/avatar\/c00bafc1b04045f31eda917de39891456c44fa47c092b9bb6be0f860a3a30a2f?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/c00bafc1b04045f31eda917de39891456c44fa47c092b9bb6be0f860a3a30a2f?s=96&d=mm&r=g","caption":"Prwatech"},"url":"https:\/\/prwatech.in\/blog\/author\/prwatech123\/"}]}},"_links":{"self":[{"href":"https:\/\/prwatech.in\/blog\/wp-json\/wp\/v2\/posts\/4950","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/prwatech.in\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/prwatech.in\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/prwatech.in\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/prwatech.in\/blog\/wp-json\/wp\/v2\/comments?post=4950"}],"version-history":[{"count":11,"href":"https:\/\/prwatech.in\/blog\/wp-json\/wp\/v2\/posts\/4950\/revisions"}],"predecessor-version":[{"id":10917,"href":"https:\/\/prwatech.in\/blog\/wp-json\/wp\/v2\/posts\/4950\/revisions\/10917"}],"wp:attachment":[{"href":"https:\/\/prwatech.in\/blog\/wp-json\/wp\/v2\/media?parent=4950"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/prwatech.in\/blog\/wp-json\/wp\/v2\/categories?post=4950"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/prwatech.in\/blog\/wp-json\/wp\/v2\/tags?post=4950"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}