In machine learning, feature selection is the process of choosing which features of the dataset to use for training. It matters because training a model on every available feature often leads to overfitting.
One basic method for feature selection is to compute the correlation matrix and check which variables are most strongly correlated with the target variable, then train the model using only those variables.
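As a minimal sketch of this idea on synthetic data (the 0.5 threshold is an arbitrary cutoff you would tune for your own dataset):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)   # strongly related to the target
x2 = rng.normal(size=n)   # weakly related
x3 = rng.normal(size=n)   # pure noise
y = 2.0 * x1 + 0.1 * x2 + rng.normal(scale=0.5, size=n)

X = np.column_stack([x1, x2, x3])
# Correlation of each feature column with the target
corrs = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
threshold = 0.5   # arbitrary cutoff; adjust for your data
selected = np.where(np.abs(corrs) > threshold)[0]
print(selected)   # indices of features passing the cutoff
```

Here only the first feature should survive the cutoff, since the other two contribute little or nothing to the target.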
We can also rank features using a model's feature-importance metric. Tree-based classifiers (such as random forests) expose feature importances out of the box, so computing these scores shows which features are most relevant to the target variable.
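A short sketch of this approach, assuming scikit-learn is available, using synthetic data where the target depends on only one column:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 500
informative = rng.normal(size=n)
noise = rng.normal(size=(n, 3))
X = np.column_stack([informative, noise])
y = (informative > 0).astype(int)   # target depends only on column 0

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
importances = clf.feature_importances_   # sums to 1 across features
print(importances)   # column 0 should dominate
```

The importances sum to 1, so you can rank features directly or keep only those above a chosen share of the total.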