There are several techniques that can help us ensure the correctness of a neural network:

The first step in ensuring your neural network performs well on the testing data is to verify that it does not overfit. Overfitting happens when your model starts to memorize the training data instead of learning patterns that generalize.
How do you identify whether your model is overfitting? Cross-check the training accuracy against the testing accuracy: if the training accuracy is much higher than the testing accuracy, you can posit that your model has overfitted. There are some techniques to avoid overfitting:
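This cross-check can be expressed as a tiny helper (a hypothetical sketch; the 0.1 gap threshold is an arbitrary assumption, not a standard value):

```python
def overfit_gap(train_acc, test_acc, threshold=0.1):
    """Flag a model as likely overfitting when the train/test accuracy
    gap exceeds `threshold` (0.1 here is an arbitrary choice)."""
    gap = train_acc - test_acc
    return gap, gap > threshold

gap, overfit = overfit_gap(0.98, 0.82)
# a 16-point gap exceeds the threshold, so the model is flagged
```

The right threshold depends on the task and dataset size; the point is to compare the two numbers rather than look at training accuracy alone.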
Regularisation (L1 or L2) — Penalising large weights, which discourages the network from fitting noise in the training data.
Dropouts — Randomly dropping connections between neurons, forcing the network to find new paths and generalize.
Early Stopping — Halts training once the error on a held-out validation set stops improving, preventing the network from continuing to fit noise in the training data.
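Two of these techniques, dropout and early stopping, can be sketched in plain Python/NumPy (a minimal illustration not tied to any framework; the dropout rate and patience values below are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, rate=0.5, training=True):
    """Inverted dropout: zero a random fraction `rate` of units during
    training and rescale the survivors so the expected activation is
    unchanged; at inference time the layer is a no-op."""
    if not training or rate == 0.0:
        return activations
    mask = rng.random(activations.shape) >= rate
    return activations * mask / (1.0 - rate)

def early_stopping_epoch(val_losses, patience=3):
    """Return (stop_epoch, best_epoch): stop once validation loss has
    failed to improve for `patience` consecutive epochs."""
    best, best_epoch, wait = float("inf"), 0, 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best, best_epoch, wait = loss, epoch, 0
        else:
            wait += 1
            if wait >= patience:
                return epoch, best_epoch
    return len(val_losses) - 1, best_epoch

h = np.ones(8)
print(dropout(h))  # each entry is either 0.0 or 2.0
print(early_stopping_epoch([0.9, 0.7, 0.6, 0.62, 0.63, 0.65, 0.7]))
# stops at epoch 5; the best weights came from epoch 2
```

In practice you would use the equivalents built into your framework, but the mechanics are exactly these: a random mask with rescaling, and a patience counter on validation loss.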
Ensemble of Algorithms
If individual neural networks are not as accurate as you would like, you can create an ensemble of neural networks and combine their predictive power: choose different architectures, train them on different parts of the data, and combine their predictions to get higher accuracy on the test data.
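One common way to combine predictions is soft voting: average each model's predicted class probabilities and take the argmax (a minimal sketch, assuming each model outputs an (n_samples, n_classes) probability array):

```python
import numpy as np

def ensemble_predict(prob_arrays):
    """Soft voting: average the class-probability arrays of several
    independently trained models, then pick the most probable class."""
    mean_probs = np.mean(prob_arrays, axis=0)
    return mean_probs.argmax(axis=1)

# Three hypothetical models; they disagree on the second sample,
# and the ensemble resolves it by the averaged probabilities.
m1 = np.array([[0.9, 0.1], [0.4, 0.6]])
m2 = np.array([[0.8, 0.2], [0.7, 0.3]])
m3 = np.array([[0.7, 0.3], [0.6, 0.4]])
print(ensemble_predict([m1, m2, m3]))  # [0 0]
```

Averaging probabilities tends to smooth out the individual models' errors, which is why ensembles usually beat their single best member on test data.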
More generally, you check the correctness of a neural network by training it on one set of data, verifying it on held-out data, and keeping a feedback loop in between that tells you whether the network is functioning appropriately.
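The first step in setting up that loop is holding data out. A minimal sketch of a shuffled train/validation split (hypothetical helper; the 20% fraction is an arbitrary assumption):

```python
import random

def train_val_split(data, val_fraction=0.2, seed=0):
    """Shuffle a dataset and split it into a training part and a
    held-out validation part that provides feedback on generalization."""
    data = data[:]  # copy so the caller's list is untouched
    random.Random(seed).shuffle(data)
    cut = int(len(data) * (1 - val_fraction))
    return data[:cut], data[cut:]

train, val = train_val_split(list(range(10)))
# 8 examples for training, 2 held out for validation
```

Metrics computed on `val` (never touched during training) are what drive decisions like early stopping or flagging overfitting.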