There is always a need to validate the stability of your machine learning model. You just can’t fit the model to your training data and hope it will perform accurately on real data it has never seen before.
K-Fold Cross Validation
Stratified K-Fold Cross Validation
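The difference between the two techniques can be sketched with scikit-learn (an assumption on my part; the article names the techniques but not a library). Plain K-Fold splits the data into k folds regardless of the labels, while Stratified K-Fold keeps roughly the same class ratio in every fold, which matters for imbalanced data:

```python
import numpy as np
from sklearn.model_selection import KFold, StratifiedKFold

X = np.arange(20).reshape(10, 2)   # 10 samples, 2 features (toy data)
y = np.array([0] * 7 + [1] * 3)    # imbalanced labels: 70% class 0

# Plain K-Fold: folds are built from row positions only; a fold may
# end up with no minority-class samples at all.
kf = KFold(n_splits=5, shuffle=True, random_state=42)
kfold_splits = list(kf.split(X))

# Stratified K-Fold: each test fold preserves the class proportions
# of y, so every fold here gets exactly one class-1 sample.
skf = StratifiedKFold(n_splits=3, shuffle=True, random_state=42)
strat_splits = list(skf.split(X, y))
```

With 3 minority samples and 3 stratified folds, each test fold receives exactly one class-1 sample, whereas plain K-Fold gives no such guarantee.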
The validation techniques explained above are also referred to as non-exhaustive cross validation methods: they do not compute all possible ways of splitting the original sample, so you only have to decide how many subsets to make. They are approximations of the methods explained below, called exhaustive methods, which compute every possible way the data can be split into training and test sets.
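To make the exhaustive/non-exhaustive distinction concrete, here is a small sketch (again assuming scikit-learn) using Leave-One-Out, a common exhaustive method. It enumerates every possible single-sample test set, so an n-sample dataset yields exactly n splits, while K-Fold with k < n evaluates only k of the possible splits:

```python
import numpy as np
from sklearn.model_selection import KFold, LeaveOneOut

X = np.arange(12).reshape(6, 2)   # 6 samples, 2 features (toy data)

# Exhaustive: train on n-1 samples, test on the one left out,
# repeated for every sample — n splits in total.
loo = LeaveOneOut()
loo_splits = list(loo.split(X))

# Non-exhaustive: K-Fold evaluates only k splits, not all of them.
kf = KFold(n_splits=3)
kfold_splits = list(kf.split(X))
```

Exhaustive methods cost more (n model fits instead of k), which is why the non-exhaustive variants above are the usual default on larger datasets.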