The principle of parsimony, also referred to as Occam's razor, says that when we have more than one explanation to choose from, we should select the simplest one that fits the evidence. When we apply the principle of parsimony, we tend to select the explanation involving the fewest entities. However, the principle is not about simplicity alone: it asks for the simplest explanation that is still relevant. So we can say that "the assumption which is simplest, yet carries all the information necessary to explain the experiment at hand" satisfies the principle of parsimony.

We can use the principle of parsimony in many scenarios in our day-to-day life, including Data Science model predictions. Let us assume two cases: Case 1, in which there are 8 supporting pieces of evidence to explain an event, and Case 2, in which there are 5. According to the principle of parsimony, we tend to select Case 2, provided all the evidence in it is important and relevant.

Let us have a look at examples from specific fields.

Principle of Parsimony in route selection: In Data Structures, we come across the theory of the minimum spanning tree for the simplest route selection. This route selection can be made using many of the algorithms available in data structures, for example Prim's algorithm, Kruskal's algorithm, etc. Example: if we have to reach Delhi from Haridwar, the wise way would be to select the simplest and safest path rather than a complex route that takes a huge amount of time and fuel. So, before we construct any algorithm, we ought to look for the approach that provides the shortest and best path without inflating the time and cost it takes to reach the destination.

Principle of Parsimony in the Regression technique under the Machine Learning domain: When it comes to model building using linear regression, we tend to use the coefficient of determination, R², as the accuracy measure of the model built. For example, consider a large dataset that has 8 attributes and 1 target variable. There can be cases where collinearity exists between multiple variables; in such scenarios, the accuracy measure of the model can fall. After multiple comparisons and deletion of the unnecessary variables, we may be able to increase the accuracy of the model.
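The route-selection idea above can be sketched in code. Below is a minimal Python sketch of Kruskal's algorithm, one of the minimum-spanning-tree algorithms mentioned; the four-node graph and its edge weights are invented purely for illustration:

```python
def kruskal_mst(n, edges):
    """Return (total weight, edge list) of a minimum spanning tree.

    n: number of vertices, labelled 0..n-1
    edges: list of (weight, u, v) tuples
    """
    parent = list(range(n))  # union-find forest for cycle detection

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    mst, total = [], 0
    for w, u, v in sorted(edges):       # consider cheapest edges first
        ru, rv = find(u), find(v)
        if ru != rv:                    # edge joins two components: no cycle
            parent[ru] = rv
            mst.append((u, v, w))
            total += w
    return total, mst

# Hypothetical example: 4 towns, 5 weighted roads.
edges = [(1, 0, 1), (4, 0, 2), (3, 1, 2), (2, 1, 3), (5, 2, 3)]
total, mst = kruskal_mst(4, edges)
print(total, mst)  # total weight 6, using 3 of the 5 roads
```

The parsimony connection: out of all possible route networks, the algorithm keeps only the fewest, cheapest edges that still connect everything.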
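To make the regression discussion concrete, here is a minimal NumPy sketch. The dataset is synthetic and the coefficients are invented for illustration: only 2 of the 8 attributes actually drive the target, so deleting the other 6 keeps the fit while simplifying the model. Adjusted R² is used rather than plain R², because plain R² can never decrease when more attributes are added and therefore cannot reward parsimony:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Synthetic dataset: 8 attributes, but only x0 and x1 truly drive the target.
X = rng.normal(size=(n, 8))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.5, size=n)

def adjusted_r2(X, y):
    """Fit OLS with an intercept and return the adjusted R^2."""
    A = np.column_stack([np.ones(len(y)), X])        # add intercept column
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)     # least-squares fit
    resid = y - A @ beta
    ss_res = resid @ resid
    ss_tot = ((y - y.mean()) ** 2).sum()
    r2 = 1 - ss_res / ss_tot
    p = X.shape[1]                                   # number of predictors
    return 1 - (1 - r2) * (len(y) - 1) / (len(y) - p - 1)

full = adjusted_r2(X, y)             # model with all 8 attributes
pruned = adjusted_r2(X[:, :2], y)    # model with only the 2 relevant ones
print(full, pruned)
```

Both models explain the data almost equally well, so the principle of parsimony tells us to keep the 2-attribute model.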