TensorFlow originally used static graphs. You first define the entire computation graph and only then execute it, so you can't run or inspect it partway through. TensorFlow 1.x did offer workarounds like InteractiveSession and eager execution, but they didn't work all that well.
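Here is a minimal sketch of that old TensorFlow 1.x static-graph style (run through the tf.compat.v1 API so it still works on TensorFlow 2.x):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # emulate TF 1.x static-graph behaviour

# Define the graph first; nothing is computed at this point.
a = tf.compat.v1.placeholder(tf.float32, name="a")
b = tf.compat.v1.placeholder(tf.float32, name="b")
c = a * b  # just a node in the graph, not a value yet

with tf.compat.v1.Session() as sess:
    # Only now is the graph actually executed, end to end.
    print(sess.run(c, feed_dict={a: 3.0, b: 4.0}))  # 12.0
```

Notice that `c` has no value until the session runs the whole graph, which is exactly why debugging intermediate steps was awkward.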
PyTorch uses a dynamic graph. A dynamic graph behaves like a normal program: the graph is built as the code runs, so you can execute and inspect it at any point in the code. That is why PyTorch is easier to customize and debug.
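A small sketch of this define-by-run style in PyTorch: every operation runs immediately, so you can print intermediate results and use ordinary Python control flow in the middle of the computation.

```python
import torch

x = torch.tensor([3.0], requires_grad=True)
y = x * 2
print(y)            # the value is already available here, mid-computation

if y.item() > 5:    # plain Python branching on an intermediate result
    z = y ** 2
else:
    z = y + 1

z.backward()        # gradients flow through whichever branch actually ran
print(x.grad)       # tensor([24.]), since z = (2x)^2 and dz/dx = 8x
```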
But Google changed this with the release of TensorFlow 2.0: the default TensorFlow graph is now dynamic (eager execution). You can check out this TensorFlow Tutorial by Intellipaat to learn the TensorFlow framework.
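For comparison, a quick sketch of the same multiplication in TensorFlow 2.x, where eager execution is the default: operations return concrete values immediately, with no session required.

```python
import tensorflow as tf

a = tf.constant(3.0)
b = tf.constant(4.0)
c = a * b           # executes right away, just like PyTorch
print(c)            # tf.Tensor(12.0, shape=(), dtype=float32)
print(c.numpy())    # 12.0
```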