
What is TensorFlow?

The term ‘TensorFlow’ is derived from ‘tensor’ and ‘flow’, representing the flow of tensors through a computation. A tensor can be thought of as a multi-dimensional array of data; all the values in a tensor share the same data type, and the dimensions of the array are known.

A tensor can be derived from a given input dataset.


To recognize the structure of a tensor, we use three parameters:

  • Rank: Rank is used to identify the number of dimensions of a tensor. It is known as the order of a tensor.
  • Shape: It is the number of elements along each dimension of the tensor; for a matrix, this is the number of rows and columns.
  • Type: It is the data type assigned to the tensors.

Representation of a Tensor

For instance, we have a 4×4 matrix with values from 1 to 16:


In TensorFlow, we will represent it as:

[ [1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12], [13, 14, 15, 16] ]
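To make these parameters concrete, here is a minimal sketch (the variable name matrix is just for illustration) that builds the 4×4 matrix above as a TensorFlow constant and inspects its rank, shape, and type:

import tensorflow as tf

# The 4x4 matrix above as a constant tensor
matrix = tf.constant([[1, 2, 3, 4],
                      [5, 6, 7, 8],
                      [9, 10, 11, 12],
                      [13, 14, 15, 16]])

print(matrix.shape)        # (4, 4)           -> shape
print(matrix.shape.ndims)  # 2                -> rank
print(matrix.dtype)        # <dtype: 'int32'> -> type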

Representation of a 3-dimensional tensor with values from 10 to 90:


In TensorFlow, we will represent it as:

[ [[10, 20, 30], [40, 50, 60], [70, 80, 90]] ]


Why TensorFlow and why is it so popular?

TensorFlow is an open-source Deep Learning library whose popularity owes much to its graph-based computation. Building a computational graph first helps in visualizing the structure of a neural network and the sequence of its operations in TensorBoard, which in turn helps in debugging and resolving errors.

Thus, data flow graphs are the basic building blocks of a deep neural network built with TensorFlow.

Now, we will move further into this TensorFlow tutorial and look into data flow graphs.


Graphs in TensorFlow

The first step toward writing a TensorFlow program is building a graph so that we can visualize the sequence of operations. Graphs in TensorFlow also make the dependencies between operations explicit.

A graph consists of operations that depend on one another. These operations are called op nodes, and the edges connecting the nodes represent the tensors that flow between them.
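To see how operations become nodes in a graph, here is a small sketch, assuming the TensorFlow 1.x API used throughout this tutorial (the names a, b, and c are only for illustration), that defines two constants and a multiplication and then lists the op nodes registered in the default graph along with their input tensors:

import tensorflow as tf

# Two constant op nodes and one multiplication op node
a = tf.constant(2, name="a")
b = tf.constant(3, name="b")
c = tf.multiply(a, b, name="c")

# Every operation defined above is registered as a node in the default graph;
# the inputs of each node are the tensors (edges) flowing into it
for op in tf.get_default_graph().get_operations():
    print(op.name, "<-", [inp.name for inp in op.inputs])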

Programs in TensorFlow involve two steps:

  1. Building a computational graph: A computational graph is a graphical representation of TensorFlow operations, where each operation is a node. It is the structure we follow while creating TensorFlow programs for Deep Learning models.

Example: In this TensorFlow example, we need to calculate the volume and area of a cuboid. For that we will build a computational graph that consists of nodes, and these nodes perform certain operations. Tensors are passed as inputs to these nodes:

Here, L = length, B = breadth, and H = height; V is the output tensor, which is the volume of the cuboid.

After giving length, breadth, and height as inputs, the graph produces the volume V as an output tensor.

It’s just a representation; no computation is done here. According to this data flow graph, we will write the code for calculating the area and volume of the cuboid.

Code:

import tensorflow as tf

# Inputs as constant tensors
L = tf.constant(10)
B = tf.constant(20)
H = tf.constant(30)

# Op nodes of the graph
V = L * B * H                  # volume of the cuboid
A = 2 * (L*B + B*H + H*L)      # surface area of the cuboid


  2. Running a computational graph: Previously in this TensorFlow tutorial, we built a computational graph and defined all the operations that need to be performed. Now, we need to execute it, and for that we will use a TensorFlow session.

Session: A session places the graph operations onto devices and defines the sequence in which they run. It encapsulates the control and state of the TensorFlow runtime. Now, we will see how to run a graph within a session.

Code:

# First, we create a session object
session = tf.Session()

# Second, we run the graph within the session; the result is stored in Out
Out = session.run(V)   # V is the volume of the cuboid from the previous example

# Now, we can print the output
print(Out)

# After printing the output, close the session to free up resources
session.close()

Output:

6000

Now that we have seen graphs, the basic building blocks of neural networks in TensorFlow, we will look into constants, variables, and placeholders.



Constants, Variables, and Placeholders

TensorFlow Constants

A TensorFlow constant is the simplest category of tensor in TensorFlow. It is not trainable, and its value cannot be changed after creation; its shape is optional and can be inferred from the value. Constants are used to store fixed values and are created using the tf.constant() function.

Declaration of a TensorFlow constant:

tf.constant(value, dtype=None, shape=None, name='Const', verify_shape=False)

Here, value is the constant value to be stored; dtype is the data type of the value (float32/64, int8/16, etc.); shape optionally defines the shape of the constant; name is the optional name of the tensor; and verify_shape is a Boolean that, if True, verifies that the value matches the given shape.

Now, let’s look at a TensorFlow example to illustrate the use of a TensorFlow constant.

In this example, we have the length (L) and the breadth (B) of a rectangle, and we need to calculate the area (A) of the rectangle.

Code:

import tensorflow as tf

L = tf.constant(10, name="length", dtype=tf.int32)
B = tf.constant(20, name="breadth", dtype=tf.int32)
A = L * B

# Run the graph in a session to get the value of A
print(tf.Session().run(A))

Output:

200


TensorFlow Placeholders

A TensorFlow placeholder is used to feed data into the computation graph at runtime; it acts as an input parameter that is supplied while the graph is running. The data is fed to placeholder tensors through the feed_dict argument of session.run().

Declaration of TensorFlow Placeholder:

tf.placeholder(dtype, shape=None, name=None)

Here, dtype is the data type of the tensor; shape defines the shape of the tensor, and name will have the optional name of the tensor.

Now, let’s look at a TensorFlow example for demonstrating the use of TensorFlow placeholder.

Here, we will calculate the area of a trapezium.

L1 = the length of the shorter side, L2 = the length of the longer side, H = the height of the trapezium, and A = the area of the trapezium.

Code:

import tensorflow as tf

# First, create TensorFlow placeholders
L1 = tf.placeholder(tf.float32)
L2 = tf.placeholder(tf.float32)
H = tf.placeholder(tf.float32)

# Define the operation for calculating the area of a trapezium
A = 0.5 * (L1 + L2) * H

# Now, create a session object
session = tf.Session()

# Run the session, feeding the values of L1, L2, and H through feed_dict
output = session.run(A, {L1: [10, 20], L2: [20, 30], H: [12, 16]})

print('Area of Trapezium:', output)

Output:

[ 180.  400. ]


TensorFlow Variables

So far in this TensorFlow tutorial, we have seen TensorFlow constants and placeholders, but they are not enough for building neural networks. TensorFlow is widely used for building deep neural networks, and the data we feed into a network is not constant: its values keep changing, and the network has to be trained on large amounts of data to give good results. We therefore need TensorFlow variables, whose values can be modified at a later stage, so that the model has parameters it can learn. In short, variables hold parameters and allow them to be updated later.

Declaration of TensorFlow Variable: If we want to initialize a variable with some starting values that will be updated while training the model, we can declare it in the following way:

Var = tf.Variable(tf.zeros([1]), dtype=tf.float32, name="Var")

If we are using a non-trainable variable in TensorFlow, then we will declare it as follows:

Var = tf.Variable(tf.add(x, y), trainable=False)

Constants are initialized when they are created with tf.constant(), whereas variables must be initialized explicitly:

init = tf.global_variables_initializer()
session.run(init)

Note: Before running a graph, we must initialize all of its variables.
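As a small illustration of this pattern (the counter name and increment value below are just for this sketch), the following snippet creates a variable, initializes it, and then updates it with tf.assign, which is the same mechanism a training loop uses to update model parameters:

import tensorflow as tf

# A variable initialized to zero
counter = tf.Variable(0, dtype=tf.int32, name="counter")

# An op that assigns a new value to the variable
increment = tf.assign(counter, counter + 1)

init = tf.global_variables_initializer()

with tf.Session() as session:
    session.run(init)                  # variables must be initialized before use
    for _ in range(3):
        print(session.run(increment))  # prints 1, 2, 3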

Let’s now check out a TensorFlow example.

Linear Regression Model Using TensorFlow

Linear regression helps in understanding the linear relationship between dependent and independent variables.


Linear regression is a supervised Machine Learning algorithm that finds the linear relationship between two variables: we model how the dependent (unknown) variable changes with respect to the independent (known) variable.

Y = m*X + C

Now, we build a linear model using TensorFlow.

Code:

import tensorflow as tf

# Parameters of the model
m = tf.Variable([2.7], dtype=tf.float32)
C = tf.Variable([-2.0], dtype=tf.float32)

# Input placeholder
X = tf.placeholder(tf.float32)

# Equation of the line
Y = m * X + C

# Initializing all the variables
sess = tf.Session()
init = tf.global_variables_initializer()
sess.run(init)

# Running the linear regression model
print(sess.run(Y, {X: [10, 20, 30, 40]}))

Output:

[ 25.0  52.0  79.0  106.0]
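In the model above, the values of m and C were fixed by hand. Because they are variables, they can also be learned from data. The sketch below is only an illustration of that idea, not part of the original example; the toy data, learning rate, and number of steps are arbitrary choices. It fits m and C with gradient descent on a squared-error loss:

import tensorflow as tf

# Trainable parameters, started at arbitrary values
m = tf.Variable([0.0], dtype=tf.float32)
C = tf.Variable([0.0], dtype=tf.float32)

X = tf.placeholder(tf.float32)
Y_true = tf.placeholder(tf.float32)

Y_pred = m * X + C
loss = tf.reduce_mean(tf.square(Y_pred - Y_true))
train_step = tf.train.GradientDescentOptimizer(0.05).minimize(loss)

# Toy data generated from Y = 2.7*X - 2.0
x_data = [1.0, 2.0, 3.0, 4.0]
y_data = [0.7, 3.4, 6.1, 8.8]

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(1000):
        sess.run(train_step, {X: x_data, Y_true: y_data})
    print(sess.run([m, C]))  # values close to [2.7] and [-2.0]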

So, to summarize this TensorFlow tutorial, writing TensorFlow programs involves three components:

  • Graph: It is the basic building block of TensorFlow that helps in understanding the flow of operations.
  • Tensor: It represents the data that flows between the operations.
  • Session: A session is used to execute the operations.

This is how TensorFlow works.

