In Data Science, optimization algorithms are used to find the best parameters for the model you are building. A good way to understand optimization is through one of the most popular algorithms, Stochastic Gradient Descent (SGD). SGD finds the parameters, or weights, that minimize the cost function; the cost function measures how well the model predicts the target variable for a particular set of parameters. The algorithm randomly initializes the parameter values and then repeatedly moves them in the direction of steepest descent until it reaches a convergence point. The size of each step is controlled by a hyperparameter called the learning rate, which you must set before training. It should not be too large (the algorithm may overshoot and never reach the convergence point) or too small (reaching convergence takes a very long time).
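To make this concrete, here is a minimal sketch of SGD for a simple linear regression with a mean-squared-error cost (the function name, learning rate, and data here are illustrative assumptions, not a reference implementation):

```python
import numpy as np

def sgd_linear_regression(X, y, lr=0.1, epochs=200, seed=0):
    """Fit y ≈ X @ w + b with stochastic gradient descent."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = rng.normal(scale=0.01, size=d)   # random initialization of weights
    b = 0.0
    for _ in range(epochs):
        for i in rng.permutation(n):     # one randomly chosen sample per update
            err = X[i] @ w + b - y[i]    # prediction error for this sample
            w -= lr * err * X[i]         # step along the negative gradient
            b -= lr * err                # lr controls the step size
    return w, b

# Usage: recover the line y = 2x + 1 from 50 points
X = np.linspace(0, 1, 50).reshape(-1, 1)
y = 2 * X[:, 0] + 1
w, b = sgd_linear_regression(X, y)
```

Try varying `lr`: a very large value makes the updates oscillate or diverge, while a very small one needs far more epochs, which is exactly the learning-rate trade-off described above.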