Weak classifiers (or weak learners) are classifiers that perform only slightly better than random guessing. They have some ability to predict the right labels, but not as much as strong classifiers such as Naive Bayes, neural networks, or SVMs.
The AdaBoost classifier combines many weak classifiers into a single strong classifier. A single weak algorithm may classify objects poorly on its own. But if we combine multiple classifiers, choosing the training set (via sample weights) at every iteration and assigning the right amount of weight to each classifier in the final vote, the combined classifier can achieve good accuracy.
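As a quick illustration of this idea (a sketch, not from the original text): in scikit-learn, a depth-1 decision tree (a "stump") is a classic weak learner, and boosting many of them usually beats the lone stump on the same data. The dataset and hyperparameters below are arbitrary choices for demonstration.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# A synthetic binary classification problem, just for demonstration.
X, y = make_classification(n_samples=2000, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

stump = DecisionTreeClassifier(max_depth=1)      # a single weak learner
boosted = AdaBoostClassifier(estimator=stump,    # called "base_estimator" before sklearn 1.2
                             n_estimators=100,
                             random_state=0)

print("single stump:", stump.fit(X_train, y_train).score(X_test, y_test))
print("AdaBoost    :", boosted.fit(X_train, y_train).score(X_test, y_test))
```

On data like this you should see the boosted ensemble clearly outperform the single stump, which is exactly the point: many weak learners, properly weighted, add up to a strong one.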
In short, AdaBoost:
It retrains the algorithm iteratively, choosing the training set at each iteration based on the accuracy of the previous round.
The weight of each trained classifier at any given iteration depends on the accuracy it achieved (see the sketch below).
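Both points can be seen in a single boosting round. The following is a minimal sketch of the standard AdaBoost update (assuming labels in {-1, +1}; the function name `boosting_round` is mine, not from the text): the classifier's weight `alpha` grows as its weighted error `eps` shrinks, and misclassified samples get their weights boosted so the next round focuses on them.

```python
import numpy as np

def boosting_round(sample_weights, y_true, y_pred):
    """One AdaBoost-style update; labels are assumed to be in {-1, +1}."""
    # Weighted error of the current weak classifier.
    miss = (y_pred != y_true)
    eps = np.sum(sample_weights[miss]) / np.sum(sample_weights)
    # Classifier weight: lower error -> larger alpha, i.e. a bigger
    # say in the final weighted vote.
    alpha = 0.5 * np.log((1 - eps) / eps)
    # Re-weight the training set: misclassified points (y_true * y_pred
    # = -1) are scaled up, correct ones scaled down, then normalize.
    new_weights = sample_weights * np.exp(-alpha * y_true * y_pred)
    return new_weights / new_weights.sum(), alpha

w = np.full(5, 0.2)                    # uniform initial weights
y = np.array([ 1, -1,  1,  1, -1])
h = np.array([ 1, -1, -1,  1, -1])     # weak classifier: one mistake
w, alpha = boosting_round(w, y, h)
# eps = 0.2, alpha ~ 0.69; the misclassified third sample now carries
# weight 0.5 while each correct one drops to 0.125.
```

(The `eps == 0` edge case is left out for brevity; a real implementation would guard against division by zero.)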