The two are essentially the same procedure. Gradient ascent maximizes an objective instead of minimizing a loss; everything else is identical. Gradient ascent on a function f is exactly gradient descent on -f: negating the function flips the sign of the gradient, so the update steps coincide.
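To see this concretely, here's a minimal sketch (the toy function f(x) = -(x - 3)² and the step size are my own illustrative choices) showing that ascent on f and descent on -f take literally the same steps:

```python
# Toy objective: f(x) = -(x - 3)^2, which has its maximum at x = 3.
# Gradient ascent on f should behave identically to gradient descent on -f.

def grad_f(x):
    # derivative of f(x) = -(x - 3)^2
    return -2.0 * (x - 3.0)

lr = 0.1
x_ascent = 0.0
x_descent = 0.0
for _ in range(100):
    x_ascent += lr * grad_f(x_ascent)        # ascent: step *along* the gradient of f
    x_descent -= lr * (-grad_f(x_descent))   # descent: step *against* the gradient of -f

print(x_ascent, x_descent)  # both converge to 3.0, the maximizer of f
```

Both updates reduce to the same arithmetic, so the two trajectories are identical at every iteration, not just in the limit.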
Hope this helps!