0 votes
1 view
in Data Science by (17.6k points)

Suppose I build a classification model and then, to improve, say, precision, I simply raise the probability threshold for predicting the positive class. Does this make sense? I am not changing the model itself, just the decision threshold, to get better results. Is that OK? Thanks

1 Answer

0 votes
by (38.2k points)

Yes, it is perfectly fine to do this kind of parameter tuning, such as setting a decision threshold that suits your data.

Things to keep in mind:

1. You should make a train-test split of your data.

2. The test data should only be used once, at the very end, when you want to measure how well your algorithm performs.

3. The training data is used to fit your parameters.

4. If you need a second dataset to tune a parameter such as the threshold, split your training dataset again into a training set and a validation set.

If you wish to learn Data Science, visit this Data Science Course.


