Gaussian smoothing uses a sigma and a window size, and it blurs the image to reduce noise. On the other hand, a mean filter also blurs the image and removes noise. What is the basic difference in the results?
A Gaussian filter is a linear filter. It is basically used to blur the image or to reduce noise. If you apply two Gaussian filters with different sigmas and subtract the results (a difference of Gaussians), you can use it for "unsharp masking" (edge detection). A Gaussian filter on its own will blur edges and reduce contrast.
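A minimal sketch of the difference-of-Gaussians idea mentioned above, using `scipy.ndimage` (my choice of library, not something stated in the answer): blur with a small and a large sigma, subtract, and the response concentrates on edges.

```python
import numpy as np
from scipy import ndimage

# Synthetic image: a bright square on a dark background.
img = np.zeros((64, 64))
img[16:48, 16:48] = 1.0

# Blur with two different sigmas and subtract (difference of Gaussians).
narrow = ndimage.gaussian_filter(img, sigma=1)
wide = ndimage.gaussian_filter(img, sigma=3)
dog = narrow - wide

# The response is large near the square's border and near zero in flat
# regions (both inside the square and in the background).
edge_strength = np.abs(dog)
print(edge_strength[32, 16] > edge_strength[32, 32])  # edge vs interior
```

The sigmas (1 and 3) are arbitrary; the ratio between them controls the band of edge scales the subtraction picks out.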
A median filter is a non-linear filter that is mostly used as a simple way to reduce noise in an image. Its claim to fame (over a Gaussian filter for noise reduction) is that it removes noise while keeping edges relatively sharp.
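A small sketch of that edge-preservation claim, again using `scipy.ndimage` (an illustration I'm adding, not code from the answer): on a step edge corrupted by impulse noise, the median filter removes the spikes and keeps the step crisp, while the Gaussian filter smears both.

```python
import numpy as np
from scipy import ndimage

# 1-D "image": a sharp step edge plus a few isolated impulse-noise spikes.
signal = np.zeros(100)
signal[50:] = 1.0
noisy = signal.copy()
noisy[[10, 30, 70]] = 5.0  # salt noise: isolated outliers

gauss = ndimage.gaussian_filter1d(noisy, sigma=2)
median = ndimage.median_filter(noisy, size=5)

# The median of a 5-sample window around an isolated spike ignores the
# outlier entirely, so the spike vanishes...
print(median[10])  # back to 0.0
# ...and the step stays a one-sample transition (median[49] == 0.0,
# median[50] == 1.0), whereas the Gaussian spreads both the spike and
# the edge over several samples.
```

The window size of 5 is a typical small choice; any odd window wider than the spikes but narrower than the flat regions shows the same behaviour.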
I guess the one advantage a Gaussian filter has over a median filter is speed: multiplying and adding is probably faster than sorting.