During forward propagation, a series of calculations is performed to generate a prediction and to compute the cost, the function we wish to minimize. Backpropagation then computes the gradient, that is, the derivatives of the cost with respect to the parameters. These derivatives are useful during the optimization phase because when they are close or equal to 0, the parameters (approximately) minimize the cost function.
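To make this concrete, here is a minimal sketch (not from the text above) of one forward pass, a cost, and backpropagation for a single linear neuron. The names `w`, `b`, `x`, `y` and the squared-error cost are illustrative assumptions, not the tutorial's own setup.

```python
# Single training example: input and target (illustrative values)
x, y = 2.0, 4.0
# Parameters we want to optimize
w, b = 0.5, 0.0

# Forward propagation: compute the prediction and the cost
y_hat = w * x + b                 # prediction
cost = 0.5 * (y_hat - y) ** 2     # squared-error cost to minimize

# Backpropagation: derivatives of the cost w.r.t. each parameter
d_yhat = y_hat - y                # dC/dy_hat
dw = d_yhat * x                   # dC/dw
db = d_yhat                       # dC/db

# One gradient-descent step; at the minimum, dw and db would be (close to) 0
lr = 0.1
w -= lr * dw
b -= lr * db
print(cost, dw, db)
```

Running repeated steps of the last three lines would drive `dw` and `db` toward 0, which is exactly the "derivatives close or equal to 0" condition described above.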
In a feed-forward network, propagation happens layer by layer: the Layer 1 neurons fire first, followed by Layers 2, 3, and so on. Propagation is thus one neuron's activation stimulating activation in the neurons that take it as input.
Alternatively, we can think of propagation as follows: at any given point in time, the neurons whose inputs are active are the ones that fire. So if at time t=0 the Layer 1 neurons are active, then at the next time t=1 the Layer 2 neurons will activate, since the neurons in Layer 2 take the neurons in Layer 1 as input.
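The layer-by-layer picture above can be sketched as a simple loop, where each step t hands one layer's activations to the next layer. The layer sizes and the sigmoid activation here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative architecture: 3 input neurons, 4 hidden, 2 output
layer_sizes = [3, 4, 2]
weights = [rng.standard_normal((n_out, n_in))
           for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:])]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# t=0: Layer 1 neurons are active (the input)
a = rng.standard_normal(layer_sizes[0])

# At each time step t, the next layer fires because its inputs
# (the previous layer's activations) are now active
for t, W in enumerate(weights, start=1):
    a = sigmoid(W @ a)

print(a)  # activations of the final (output) layer
```

Each iteration of the loop corresponds to one tick of the t=0, t=1, ... timeline: a layer activates only once the layer feeding it has activated.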