While implementing a Naive Bayes classifier, I noticed that with some feature selection I got 30% test accuracy and 45% training accuracy. Although this is an improvement, I want the best possible results.
I have also tried AdaBoost with NB, but the results weren't very good. Can anyone suggest other extensions to NB that may give better accuracy?

1 Answer


Naive Bayes is astonishingly accurate, but if you still want to improve the accuracy of your classifier, you can try the following:

1. Take a look at the data fed to the classifier.

2. Try tuning your classifier's parameters.

3. Use a classifier-combining technique (e.g., boosting or ensembling); a short sketch follows below.
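Below is a minimal sketch of points 2 and 3, assuming a scikit-learn setup with hypothetical training variables X_train and y_train (a vectorized feature matrix and its labels): tune the Naive Bayes smoothing parameter with a grid search, then combine the tuned model with another classifier in a soft-voting ensemble.

```python
from sklearn.naive_bayes import MultinomialNB
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.ensemble import VotingClassifier

# 2. Tune the classifier: search over the Laplace/Lidstone smoothing parameter.
grid = GridSearchCV(MultinomialNB(), {"alpha": [0.01, 0.1, 0.5, 1.0]}, cv=5)
grid.fit(X_train, y_train)          # X_train, y_train are assumed inputs
best_nb = grid.best_estimator_

# 3. Combine classifiers: soft-voting ensemble of the tuned NB and a
#    logistic regression model.
ensemble = VotingClassifier(
    estimators=[("nb", best_nb), ("lr", LogisticRegression(max_iter=1000))],
    voting="soft",
)
ensemble.fit(X_train, y_train)
```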

You should also focus on your data, namely the quality of pre-processing and feature selection.

Pre-processing:

It transforms raw text strings into a structured vector through processes like stemming, synonym finding, and removal of neutral (stop) words.
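A minimal sketch of that pre-processing step, assuming NLTK is installed and its stopwords corpus has been downloaded: lower-case the text, drop neutral (stop) words, and reduce each remaining token to its stem before vectorization.

```python
from nltk.corpus import stopwords
from nltk.stem import PorterStemmer

stemmer = PorterStemmer()
stop_words = set(stopwords.words("english"))

def preprocess(text):
    # Tokenize on whitespace, remove neutral/stop words, and stem the rest.
    tokens = text.lower().split()
    return [stemmer.stem(t) for t in tokens if t not in stop_words]

print(preprocess("The classifiers were running quickly on the training texts"))
# e.g. ['classifi', 'run', 'quickli', 'train', 'text']
```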

Feature selection:

This is the process by which you automatically or manually select the features that contribute most to the prediction variable or output you are interested in. It filters out irrelevant features that may decrease the accuracy of the model.
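A minimal sketch of automatic feature selection, assuming scikit-learn and hypothetical lists `texts` and `labels`: keep only the k terms with the strongest chi-squared association with the class label before fitting Naive Bayes.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import Pipeline

model = Pipeline([
    ("vectorize", CountVectorizer()),        # raw text -> term-count vectors
    ("select", SelectKBest(chi2, k=1000)),   # drop irrelevant terms
    ("classify", MultinomialNB()),
])
model.fit(texts, labels)                     # texts, labels are assumed inputs
```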

The Fisher Method:

The Fisher method is one of the best ways to optimize a Naive Bayes classifier. A Naive Bayes classifier uses feature probabilities to construct a whole-document probability. The Fisher method instead calculates the probability of each category for each feature, combines these feature probabilities, and compares the result with the probability of a random set of features.
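A minimal sketch of that combination step (not taken from this answer's author; it follows the Fisher-method variant popularized in "Programming Collective Intelligence"). Here `cat_prob(feature, category)` is a hypothetical function returning the probability of the category given the feature, normalized across categories.

```python
import math

def inv_chi2(chi, df):
    # Inverse chi-square: the probability that a chi-square statistic this
    # large would arise from a random (uninformative) set of features.
    m = chi / 2.0
    term = math.exp(-m)
    total = term
    for i in range(1, df // 2):
        term *= m / i
        total += term
    return min(total, 1.0)

def fisher_prob(features, category, cat_prob):
    # Multiply the per-feature category probabilities, then feed the log of
    # the product into the inverse chi-square function.
    p = 1.0
    for f in features:
        p *= cat_prob(f, category)   # cat_prob is an assumed helper
    score = -2.0 * math.log(p)
    return inv_chi2(score, len(features) * 2)
```

A low Fisher probability means the features are unlikely to be a random set, i.e. the document genuinely looks like it belongs to that category.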

If you wish to learn what Machine Learning is, visit this Machine Learning Course.
