Firstly, AdaBoost is a meta-algorithm: it is used in conjunction with (on top of) your favorite base classifier. It combines many weak classifiers to form a single strong classifier. Any one weak classifier may classify the objects poorly, but by reweighting the training set at every iteration and assigning each classifier the right amount of weight in the final vote, we can achieve good accuracy for the overall classifier. Secondly, classifiers that work well in one problem domain often don't work well in another.
For an up-to-date comparison of state-of-the-art classification algorithms, you can refer to the following link: https://www.sciencedirect.com/science/article/pii/S0957417417302397