
Artificial Intelligence Engineer Master's Course

Master Program

Intellipaat offers a comprehensive Artificial Intelligence Master's course to help you become a certified Artificial Intelligence Engineer. This training covers key aspects of AI, such as Machine Learning, Deep Learning with TensorFlow, Artificial Neural Networks, Statistics, Data Science, SAS Advanced Analytics, Tableau Business Intelligence, Python and R programming, and MS Excel, through hands-on projects. As part of the online classroom training, you will receive five additional self-paced courses co-created with IBM, namely Machine Learning with Python, Deep Learning with TensorFlow, Build Chatbots with Watson Assistant, R for Data Science, and Python. You will also get exclusive access to the IBM cloud platforms, namely Cognitive Classes and the IBM Watson Cloud Lab.

In Collaboration with IBM
  • 8+

    Courses

  • 22+

    Projects

  • 166

    Hours

  • Online Classroom Training

    • Data Science with R
    • Python
    • Machine Learning
    • Deep Learning with TensorFlow
    • Natural Language Processing
  • Self Paced Training

    • SAS 
    • Excel 
    • Tableau 

Key Features

166 Hrs Instructor Led Training
178 Hrs Self-paced Videos
286 Hrs Project work & Exercises
Certification and Job Assistance
Flexible Schedule
Lifetime Free Upgrade
24 x 7 Lifetime Support & Access

Course Fees

Self Paced Training

  • 178 Hrs e-learning videos
  • Lifetime Free Upgrade
  • 24 x 7 Lifetime Support & Access
  • Flexi-scheduling
$562

Online Classroom (Preferred)

  • Everything in self-paced, plus
  • 166 Hrs of instructor-led training
  • 1:1 doubt resolution sessions
  • Attend as many batches as you want for a lifetime
  • Flexible Schedule
  • 02 Jun
  • TUE - FRI
  • 07:00 AM TO 09:00 AM IST (GMT +5:30)
  • 06 Jun
  • SAT - SUN
  • 08:00 PM TO 11:00 PM IST (GMT +5:30)
  • 14 Jun
  • SAT - SUN
  • 08:00 PM TO 11:00 PM IST (GMT +5:30)
  • 21 Jun
  • SAT - SUN
  • 08:00 PM TO 11:00 PM IST (GMT +5:30)
$808 (10% off)

Corporate Training

  • Customized Learning
  • Enterprise grade learning management system (LMS)
  • 24x7 support
  • Strong Reporting

About Course

This Artificial Intelligence Engineer master’s course offers a comprehensive learning path for mastering Artificial Intelligence, Data Science, Business Analytics, Business Intelligence, Python programming, and Deep Learning with TensorFlow. Upon completing the training, you will be able to take on challenging roles in the Artificial Intelligence domain.

What will you learn in this Artificial Intelligence Engineer training?

  1. Introduction to Artificial Intelligence domain
  2. How Data science and Artificial Intelligence overlap
  3. Importance of Python coding for data analytics
  4. Efficient design of Machine Learning systems
  5. SAS tool for data analytics, modeling and visualization
  6. Working with Tableau interactive dashboard and reports
  7. MS Excel calculations, tables and formulae
  8. R Statistical computing for Data Science
  9. Building of Artificial Neural Networks and Statistical Models
  10. Deep Learning techniques and working with TensorFlow
Who should take up this Artificial Intelligence Engineer training?

  • Software Engineers and Data Analysts
  • Business Intelligence professionals
  • SAS Developers who want to learn open-source technologies
  • Anyone aspiring to a career in Data Science

There are no prerequisites for taking this Artificial Intelligence master’s course.

Artificial Intelligence is one of the hottest domains today, widely heralded for its ability to disrupt companies across industry sectors. This Intellipaat Artificial Intelligence Engineer master’s course will equip you with the skills needed to take on challenging and exciting roles in Artificial Intelligence, Data Science, Business Analytics, and Python and R statistical computing, and to land the best jobs in the industry at top salaries.



Testimonials

John Chioles

Ritesh Bhagwat

Mr Yoga

Dileep & Ajay

Sagar

Ashok Guntupalli

Gajraj Singh

Artificial Intelligence Programmer at Accenture

Intellipaat offered fully updated learning material for the Artificial Intelligence training. The video quality and other course materials were excellent. I would definitely recommend Intellipaat AI training to my friends.

Alison Fischer

Artificial Intelligence Engineer at Capgemini

This Intellipaat AI master’s course is a training program created with extensive input from the industry. Absolutely happy to learn from Intellipaat!

Ruchika Siyal

Data Scientist at Infosys

I liked the way Intellipaat has created the master program learning path for AI Engineer. Anybody looking to excel in Artificial Intelligence must take this course from Intellipaat.

Chithra Priya Srinivasan

Student

I do not understand how Intellipaat gets their hands on such wonderful trainers. My trainer was beyond awesome. Ask him a doubt, and he will clear it. He made us laugh during sessions, and he was extremely knowledgeable and interactive. Wonderful experience.

Rakesh Ramteke

Sr. Software Engineer at GlobalLogic

Initially, I did not believe in e-learning, but the Intellipaat training course proved me wrong. I had a wonderful learning experience with the trainer, and the support team was always ready to address my concerns. Thanks for the course and support.

Course Content

Module 01 - Introduction to Data Science with R

1.1 What is Data Science
1.2 Significance of Data Science in today’s digitally-driven world, applications of Data Science, lifecycle of Data Science, components of the Data Science lifecycle
1.3 Introduction to big data and Hadoop, introduction to Machine Learning and Deep Learning
1.4 Introduction to R programming and R Studio

Hands-on Exercise

1. Installation of R Studio
2. Implementing simple mathematical operations and logic using R operators, loops, if statements and switch cases.

Module 02 - Data Exploration

2.1 Introduction to data exploration
2.2 Importing and exporting data to/from external sources
2.3 What is exploratory data analysis, data importing, dataframes
2.4 Working with dataframes: accessing individual elements, vectors and factors, operators, in-built functions, conditional and looping statements, user-defined functions, matrices, lists and arrays.

Hands-on Exercise

1. Accessing individual elements of customer churn data
2. Modifying and extracting the results from the dataset using user-defined functions in R.

Module 03 - Data Manipulation

3.1 Need for Data Manipulation
3.2 Introduction to dplyr package
3.3 Selecting one or more columns with the select() function, filtering records on the basis of a condition with the filter() function, adding new columns with the mutate() function, sampling and counting
3.4 Combining different functions with the pipe operator, implementing SQL-like operations with sqldf.

Hands-on Exercise

1. Implementing dplyr
2. Performing various operations for abstracting over how data is manipulated and stored.

Module 04 - Data Visualization

4.1 Introduction to visualization
4.2 Different types of graphs, introduction to the grammar of graphics and the ggplot2 package, understanding categorical distributions with the geom_bar() function, understanding numerical distributions with the geom_histogram() function, building frequency polygons with geom_freqpoly(), making a scatter plot with the geom_point() function
4.3 Multivariate analysis with geom_boxplot
4.4 Univariate analysis with bar plots, histograms and density plots, multivariate distributions
4.5 Bar plots for categorical variables using geom_bar(), adding themes with the theme() layer
4.6 Visualization with the plotly package and building web applications with shinyR, frequency plots with geom_freqpoly(), multivariate distributions with scatter plots and smooth lines, continuous vs categorical with box plots, subgrouping the plots
4.7 Working with coordinates and themes to make graphs more presentable, introduction to plotly and various plots, visualization with the ggvis package
4.8 Geographic visualization with ggmap() and building web applications with shinyR.

Hands-on Exercise

1. Creating data visualizations with ggplot2 to understand the customer churn ratio
2. Using plotly to import and analyze data into grids
3. Visualizing tenure, monthly charges, total charges and other individual columns using scatter plots.

Module 05 - Introduction to Statistics

5.1 Why do we need Statistics?
5.2 Categories of Statistics, Statistical Terminologies, Types of Data, Measures of Central Tendency, Measures of Spread
5.3 Correlation & Covariance, Standardization & Normalization, Probability & Types of Probability, Hypothesis Testing, Chi-Square testing, ANOVA, normal distribution, binomial distribution.

Hands-on Exercise

1. Building a statistical analysis model that uses quantification, representation and experimental data gathering
2. Reviewing, analyzing and drawing conclusions from data.

Module 06 - Machine Learning

6.1 Introduction to Machine Learning
6.2 Introduction to Linear Regression, predictive modeling with Linear Regression, simple Linear and multiple Linear Regression, concepts and formulas, assumptions and residual diagnostics in Linear Regression, building simple linear model
6.3 Predicting results and finding p-value, introduction to logistic regression
6.4 Comparing linear regression and logistic regression, bivariate and multivariate logistic regression
6.5 Confusion matrix and accuracy of the model, threshold evaluation with ROCR, Linear Regression concepts and detailed formulas, various assumptions of Linear Regression, residuals, qqnorm(), qqline(), understanding the fit of the model, building a simple linear model, predicting results and finding the p-value
6.6 Understanding the summary results with the Null Hypothesis, p-value and F-statistic, building linear models with multiple independent variables.

Hands-on Exercise

1. Modeling the relationship within the data using linear predictor functions.
2. Implementing Linear and Logistic Regression in R by building a model with ‘tenure’ as the dependent variable and multiple independent variables.

Module 07 - Logistic Regression

7.1 Introduction to Logistic Regression
7.2 Logistic Regression Concepts, Linear vs Logistic regression, math behind Logistic Regression
7.3 Detailed formulas, logit function and odds, Bi-variate logistic Regression, Poisson Regression
7.4 Building simple “binomial” model and predicting result, confusion matrix and Accuracy, true positive rate, false positive rate, and confusion matrix for evaluating built model, threshold evaluation with ROCR
7.5 Finding the right threshold by building the ROC plot, cross validation & multivariate logistic regression, building logistic models with multiple independent variables
7.6 Real-life applications of Logistic Regression.

Hands-on Exercise

1. Implementing predictive analytics by describing the data
2. Explaining the relationship between one dependent binary variable and one or more independent variables
3. Using glm() to build a model with ‘Churn’ as the dependent variable.

Module 08 - Decision Trees & Random Forest

8.1 What is classification and different classification techniques
8.2 Introduction to Decision Tree
8.3 Algorithm for decision tree induction, building a decision tree in R
8.4 Creating a perfect Decision Tree, Confusion Matrix, Regression trees vs Classification trees
8.5 Introduction to ensemble of trees and bagging
8.6 Random Forest concept, implementing Random Forest in R
8.7 What is Naive Bayes, computing probabilities, impurity function – entropy, understanding the concept of information gain for the right split of a node
8.8 Impurity function – information gain, understanding the concept of the Gini index for the right split of a node
8.9 Impurity function – Gini index, overfitting and pruning, pre-pruning, post-pruning, cost-complexity pruning, pruning a decision tree and predicting values, finding the right number of trees and evaluating performance metrics.

Hands-on Exercise

1. Implementing Random Forest for both regression and classification problems
2. Building a tree, pruning it using ‘churn’ as the dependent variable, and building a Random Forest with the right number of trees
3. Using ROCR for performance metrics.

Module 09 - Unsupervised learning

9.1 What is Clustering and its use cases, what is K-means Clustering
9.2 What is Canopy Clustering
9.3 What is Hierarchical Clustering
9.4 Introduction to Unsupervised Learning
9.5 Feature extraction & clustering algorithms, k-means clustering algorithm
9.6 Theoretical aspects of k-means, the k-means process flow, k-means in R, implementing k-means on the dataset and finding the right number of clusters using a scree plot
9.7 Hierarchical clustering and dendrograms, understanding hierarchical clustering, implementing it in R and examining dendrograms
9.8 Principal Component Analysis (PCA), detailed explanation of PCA, and implementing PCA in R.

Hands-on Exercise

1. Deploying unsupervised learning with R to achieve clustering and dimensionality reduction
2. K-means clustering for visualizing and interpreting results for the customer churn data.

Module 10 - Association Rule Mining & Recommendation Engine

10.1 Introduction to association rule Mining & Market Basket Analysis
10.2 Measures of Association Rule Mining: Support, Confidence, Lift, Apriori algorithm & implementing it in R
10.3 Introduction to Recommendation Engine
10.4 User-based collaborative filtering & Item-Based Collaborative Filtering, implementing Recommendation Engine in R, user-Based and item-Based
10.5 Recommendation Use-cases

Hands-on Exercise

1. Deploying association analysis as a rule-based machine learning method
2. Identifying strong rules discovered in databases using measures of interestingness.

Self Paced

Module 11 - Introduction to Artificial Intelligence

11.1 Introducing Artificial Intelligence and Deep Learning
11.2 What is an Artificial Neural Network, TensorFlow as a computational framework for building AI models
11.3 Fundamentals of building an ANN using TensorFlow, working with TensorFlow in R.

Module 12 - Time Series Analysis

12.1 What is Time Series
12.2 Techniques and applications, components of Time Series, moving average, smoothing techniques, exponential smoothing
12.3 Univariate time series models, multivariate time series analysis
12.4 Arima model
12.5 Time Series in R, sentiment analysis in R (Twitter sentiment analysis), text analysis.

Hands-on Exercise

1. Analyzing time series data
2. Working with a sequence of measurements that follow a non-random order to identify the nature of the phenomenon and to forecast future values in the series.

Module 13 - Support Vector Machine - (SVM)

13.1 Introduction to Support Vector Machine (SVM)
13.2 Data classification using SVM
13.3 SVM Algorithms using Separable and Inseparable cases
13.4 Linear SVM for identifying margin hyperplane.

Module 14 - Naïve Bayes

14.1 What is Bayes theorem
14.2 What is Naïve Bayes Classifier
14.3 Classification Workflow
14.4 How Naive Bayes classifier works, Classifier building in Scikit-learn
14.5 Building a probabilistic classification model using Naïve Bayes, Zero Probability Problem.

Module 15 - Text Mining

15.1 Introduction to concepts of Text Mining
15.2 Text Mining use cases, understanding and manipulating text with the ‘tm’ and ‘stringr’ packages
15.3 Text Mining Algorithms, Quantification of Text
15.4 Term Frequency-Inverse Document Frequency (TF-IDF), After TF-IDF.

Case Study

01 – The Market Basket Analysis (MBA) case study

1.1 This case study is associated with the modeling technique of Market Basket Analysis where you will learn about loading of data, various techniques for plotting the items and running the algorithms.
1.2 It includes finding out what are the items that go hand in hand and hence can be clubbed together.
1.3 This is used for various real world scenarios like a supermarket shopping cart and so on.

02 – Logistic Regression Case Study

2.1 In this case study, you will develop a detailed understanding of a company’s advertisement spend and how it can help drive more sales
2.2 You will deploy logistic regression to forecast future trends
2.3 You will detect patterns and uncover insights through the power of R programming
2.4 Based on this, future advertisement spend can be decided and optimized for higher revenues.

03 – Multiple Regression Case Study

3.1 You will understand how to compare the miles per gallon (MPG) of a car based on the various parameters.
3.2 You will deploy multiple regression and note down the MPG for car make, model, speed, load conditions, etc.
3.3 It includes the model building, model diagnostic, checking the ROC curve, among other things.

04 – Receiver Operating Characteristic (ROC) case study

4.1 You will work with various data sets in R,
4.2 Deploy data exploration methodologies,
4.3 Build scalable models
4.4 Predict the outcome with highest precision, diagnose the model that you have created with various real world data, check the ROC curve and more.

What projects will I be working on in this Data Science certification course?

Project 01 – Market Basket Analysis

Domain – Inventory Management

Problem Statement – As a new manager in the company, you are assigned the task of increasing cross selling

Topics  – Association Rule Mining, Data Extraction, Data Manipulation

Highlights

  • Performing association rule mining
  • Understanding where to implement Apriori Algorithm
  • Setting association rules with respect to confidence

Project 02  – Credit Card Fraud Detection

Domain – Banking

Problem Statement – Analysis of probability of being involved in a fraudulent operation

Topics – Algorithms, V17 Predictor, Data Visualization, R Language

Highlights

  • Understanding working with the credit card dataset
  • Performing data analysis on various labels in the data
  • Making use of V17 as predictor and using V14 for analysis
  • Plotting score performance with respect to variables

Project 03 – Data Cleaning using Census Dataset

Domain – Government

Problem Statement – Performing Data Cleansing operation on a raw dataset

Topics – Data Analysis, Data preprocessing, Cleaning Ops, Data Visualization, R Language

Highlights

  • Understanding working with the census dataset
  • Modifying various values with respect to a label to perform analysis
  • Creating functions to eliminate values that are not required
  • Verifying the completion of data cleansing operation

Project 04 – Loan Approval Prediction

Domain -Banking

Problem Statement – Prediction of approval rate of a loan by using multiple labels

Topics – Data Analysis, Data preprocessing, Cleaning Ops, Data Visualization, R Language

Highlights

  • Performing Data Preprocessing
  • Building a model and applying PCA
  • Building a Naïve Bayes model on the training dataset
  • Prediction of values after performing analysis

Project 05 – Book Recommendation System

Domain – E-Commerce

Problem Statement – Creating a model that can recommend books based on user interest

Topics – Data Cleaning, Data Visualization, User Based Collaborative Filtering

Highlights

  • Finding the most popular books using various techniques
  • Creating a Book Recommender model using User Based Collaborative Filtering

Project 06 – Netflix Recommendation System

Domain – E-Commerce

Problem Statement – Simulating the Netflix Recommendation System

Topics – Data Cleaning, Data Visualization, Distribution, Recommender Lab

Highlights

  • Working with raw data
  • Using the Recommender Lab library in R
  • Making use of real data from Netflix

Project 07 – Creating a Pokemon Game using Machine Learning

Domain – Gaming

Problem Statement – Creating a game engine for Pokemon using Machine Learning

Topics – Decision Trees, Regression, Data Cleaning, Data Visualization

Highlights

  • Predicting which Pokemon will win based on Attack vs. Defense
  • Finding whether a Pokemon is legendary using Decision Trees
  • Understanding the dynamics of decision making in Machine Learning

Case Study 01 – Introduction to R Programming

Problem Statement – Working with various operators in R

Topics – Arithmetic Operators, Relational Operators, Logical Operators

Highlights

  • Working with Arithmetic Operators
  • Working with Relational Operators
  • Working with Logical Operators

Case Study 02 – Solving Customer Churn using Data Exploration

Problem Statement – Understanding what to do to reduce customer churn using Data Exploration

Topics – Data Exploration

Highlights

  • Extracting Individual columns
  • Creating and applying filters to manipulate data
  • Using loops for redundant operations

Case Study 03 – Creating Data Structures in R

Problem Statement – Implementing various Data Structures in R for various scenarios

Topics – Vectors, list, Matrix, Array

Highlights

  • Creating and Implementing Vectors
  • Understanding Lists
  • Using Arrays to store Matrices
  • Creating and implementing Matrices

Case Study 04 – Implementing SVD in R

Problem Statement – Understanding the use of Singular Value Decomposition (SVD) in R by making use of the MovieLens dataset

Topics – 5-fold cross validation, Real Rating Matrix

Highlights

  • Creating a custom  recommended movie set for each user
  • Creating User Based Collaborative Filtering Model
  • Creating RealRatingMatrix for Movie recommendation

Case Study 05 – Time Series Analysis

Problem Statement – Performing TSA and understanding concepts of ARIMA for a given scenario

Topics – Time Series Analysis, R Language, Data Visualization, ARIMA model

Highlights

  • Understand how to fit an ARIMA model
  • Plotting PACF charts and finding optimal parameters
  • Building the ARIMA model
  • Prediction of values after performing analysis

Module 01 - Python Environment Setup and Essentials

1.1 Introduction to Python Language
1.2 Features, the advantages of Python over other programming languages
1.3 Python installation – Windows, Mac & Linux distribution for Anaconda Python
1.4 Deploying Python IDE
1.5 Basic Python commands, data types, variables, keywords and more

Hands-on Exercise – Installing Anaconda Python on Windows, Linux and Mac.

Module 02 - Python language Basic Constructs

2.1 Built-in data types in Python
2.2 Learn about classes, modules, Str (String), the Ellipsis object, the Null (None) object and Debug
2.3 Basic operators, comparison, arithmetic, slicing and slice operator, logical, bitwise
2.4 Loop and control statements while, for, if, break, else, continue.

Hands-on Exercise –
1. Write your first Python program
2. Write a Python Function (with and without parameters)
3. Use Lambda expression
4. Write a class
5. Create a member function and a variable
6. create an object
7. Write a for loop
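
To give a feel for these tasks, here is a minimal, illustrative Python sketch (not part of the official courseware) covering a function with and without parameters, a lambda expression, a small class with a member function and variable, an object, and a for loop:

```python
def greet():                      # function without parameters
    return "Hello from Python"

def add(a, b):                    # function with parameters
    return a + b

square = lambda x: x * x          # lambda expression

class Account:                    # a simple class
    def __init__(self, owner):
        self.owner = owner        # member variable

    def describe(self):           # member function
        return f"Account held by {self.owner}"

acct = Account("Asha")            # creating an object

for value in [greet(), add(2, 3), square(4), acct.describe()]:
    print(value)                  # a for loop over the results
```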

Module 03 - OOP concepts in Python

3.1 How to write object-oriented programs in Python
3.2 Connecting to a database
3.3 Classes and objects in Python
3.4 OOPs paradigm, important concepts in OOP like polymorphism, inheritance, encapsulation
3.5 Python functions, return types and parameters
3.6 Lambda expressions

Hands-on Exercise –
1. Creating an application that helps check balance, deposit money and withdraw money using the concepts of OOP.

Module 04 - Database connection

4.1 Understanding the Database, need of database
4.2 Installing MySQL on windows
4.3 Understanding Database connection using Python.

Hands-on Exercise – Demo on Database Connection using python and pulling the data.

Module 05 - NumPy for mathematical computing

5.1 Introduction to arrays and matrices
5.2 Broadcasting of array math, indexing of array
5.3 Standard deviation, conditional probability, correlation and covariance.

Hands-on Exercise –
1. How to import NumPy module
2. Creating an array using ndarray
3. Calculating standard deviation on array of numbers
4. Calculating correlation between two variables.
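
As a quick preview of this exercise, the following is a small illustrative NumPy sketch (the values are made up) showing array creation, standard deviation, correlation and covariance:

```python
import numpy as np

heights = np.array([150.0, 160.0, 165.0, 170.0, 180.0])    # creating an ndarray
weights = np.array([55.0, 62.0, 66.0, 72.0, 80.0])

print("std of heights:", heights.std())                     # standard deviation
print("correlation:", np.corrcoef(heights, weights)[0, 1])  # Pearson correlation
print("covariance:", np.cov(heights, weights)[0, 1])        # covariance
```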

Module 06 - SciPy for scientific computing

6.1 Introduction to SciPy
6.2 Functions building on top of NumPy, cluster, linalg, signal, optimize, integrate, subpackages, SciPy with Bayes Theorem.

Hands-on Exercise –
1. Importing of SciPy
2. Applying the Bayes theorem on the given dataset.
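
For a quick preview, here is a small illustrative sketch (not from the courseware) that combines SciPy with a hand-worked application of Bayes’ theorem; the diagnostic-test probabilities are invented for the example:

```python
from scipy import optimize, stats

# minimize a simple function with scipy.optimize
result = optimize.minimize_scalar(lambda x: (x - 3) ** 2)
print("minimum found at x =", result.x)

# Bayes' theorem for a hypothetical diagnostic test: P(disease | positive)
p_disease = 0.01                       # prior probability of disease
p_pos_given_disease = 0.95             # sensitivity
p_pos_given_healthy = 0.05             # false positive rate
p_pos = p_pos_given_disease * p_disease + p_pos_given_healthy * (1 - p_disease)
print("P(disease | positive) =", p_pos_given_disease * p_disease / p_pos)

# a probability distribution from scipy.stats
print("P(Z < 1.96) =", stats.norm.cdf(1.96))
```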

Module 07 - Matplotlib for data visualization

7.1 How to plot graph and chart with Python
7.2 Various aspects of line, scatter, bar, histogram and 3D plots, the Matplotlib API, and subplots.

Hands-on Exercise –
1. Deploying Matplotlib to create pie, scatter, line and histogram charts.
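
As a preview of this exercise, here is a short illustrative Matplotlib sketch (synthetic data) that draws pie, scatter, line and histogram charts as subplots:

```python
import matplotlib.pyplot as plt
import numpy as np

x = np.arange(10)
y = x ** 2
data = np.random.randn(500)

fig, axes = plt.subplots(2, 2, figsize=(8, 6))
axes[0, 0].pie([30, 45, 25], labels=["A", "B", "C"])   # pie chart
axes[0, 1].scatter(x, y)                                # scatter plot
axes[1, 0].plot(x, y)                                   # line plot
axes[1, 1].hist(data, bins=20)                          # histogram
plt.tight_layout()
plt.show()
```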

Module 08 - Pandas for data analysis and machine learning

8.1 Introduction to Python dataframes
8.2 Importing data from JSON, CSV, Excel, SQL database, NumPy array to dataframe
8.3 Various data operations like selecting, filtering, sorting, viewing, joining, combining

Hands-on Exercise –
1. Working on importing data from JSON files
2. Selecting record by a group
3. Applying filter on top, viewing records
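
For a feel of the pandas workflow above, here is a small illustrative sketch that uses an in-memory JSON string in place of an external file; the column names are made up for the example:

```python
import io
import pandas as pd

raw_json = io.StringIO(
    '[{"name": "Asha", "dept": "Sales", "salary": 50000},'
    ' {"name": "Ravi", "dept": "IT", "salary": 65000},'
    ' {"name": "Meena", "dept": "Sales", "salary": 58000}]'
)

df = pd.read_json(raw_json)                       # importing JSON into a DataFrame
print(df.head())                                  # viewing records
print(df[df["salary"] > 55000])                   # applying a filter
print(df.groupby("dept")["salary"].mean())        # selecting records by group
print(df.sort_values("salary", ascending=False))  # sorting
```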

Module 09 - Exception Handling

9.1 Introduction to Exception Handling
9.2 Scenarios in Exception Handling with its execution
9.3 Arithmetic exception
9.4 RAISE of Exception
9.5 What is Random List, running a Random list on Jupyter Notebook
9.6 Value Error in Exception Handling.

Hands-on Exercise –
1. Demo on Exception Handling with an Industry-based Use Case.
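
As a short preview, the following illustrative sketch (not tied to a specific industry use case) shows an arithmetic exception, a ValueError, and explicitly raising an exception:

```python
def safe_divide(a, b):
    try:
        return a / b
    except ZeroDivisionError:                 # arithmetic exception
        print("Cannot divide by zero")
        return None

def parse_age(text):
    try:
        age = int(text)                       # may raise ValueError
    except ValueError:
        print(f"'{text}' is not a valid number")
        return None
    if age < 0:
        raise ValueError("Age cannot be negative")   # RAISE of an exception
    return age

print(safe_divide(10, 0))
print(parse_age("abc"))
print(parse_age("25"))
```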

Module 10 - Multi Threading & Race Condition

10.1 Introduction to Thread, need of threads
10.2 What are thread functions
10.3 Performing various operations on thread like joining a thread, starting a thread, enumeration in a thread
10.4 Creating a Multithread, finishing the multithreads.
10.5 Understanding Race Condition, lock and Synchronization.

Hands-on Exercise –
1. Demo on Starting a Thread and a Multithread and then perform multiple operations on them.
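
Here is a compact illustrative sketch of the exercise above: starting several threads and using a Lock so a shared counter is updated without a race condition:

```python
import threading

counter = 0
lock = threading.Lock()

def increment(n):
    global counter
    for _ in range(n):
        with lock:               # synchronization: only one thread updates at a time
            counter += 1

threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()                    # starting each thread
for t in threads:
    t.join()                     # joining (waiting for) each thread

print("final counter:", counter) # 400000 with the lock in place
```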

Module 11 - Packages and Functions

11.1 Intro to modules in Python, need of modules
11.2 How to import modules in python
11.3 Locating a module, namespace and scoping
11.4 Arithmetic operations on Modules using a function
11.5 Intro to Search path, Global and local functions, filter functions
11.6 Python Packages, import in packages, various ways of accessing the packages
11.7 Decorators, pointer assignments and xlrd.

Hands-on Exercise –
1. Demo on Importing the modules and performing various operation on them using arithmetic functions
2. Importing various packages and accessing them and then performing different operations on them.

Module 12 - Web scraping with Python

12.1 Introduction to web scraping in Python
12.2 Installing BeautifulSoup
12.3 Installing the Python parser lxml
12.4 Various web scraping libraries, beautifulsoup, Scrapy Python packages
12.5 Creating soup object with input HTML
12.6 Searching of tree, full or partial parsing, output print

Hands-on Exercise –
1. Installing BeautifulSoup and the lxml Python parser
2. Making a soup object with an input HTML file
3. Navigating Python objects in the soup tree.
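
As a preview, the following illustrative sketch builds a soup object from an inline HTML string (instead of a downloaded page) so it runs offline; it assumes the beautifulsoup4 and lxml packages are installed:

```python
from bs4 import BeautifulSoup

html = """
<html><body>
  <h1>Course Catalog</h1>
  <ul>
    <li class="course">Machine Learning</li>
    <li class="course">Deep Learning</li>
  </ul>
</body></html>
"""

soup = BeautifulSoup(html, "lxml")            # building the soup object with the lxml parser
print(soup.h1.text)                           # navigating the tree
for item in soup.find_all("li", class_="course"):
    print(item.get_text())                    # searching the tree by tag and class
print(soup.prettify()[:80])                   # partial, formatted output
```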

What projects will I be working on in this Python certification course?

Project 01 : Analyzing the Naming Pattern Using Python

Industry : General

Problem Statement : How to analyze the trends and the most popular baby names

Topics: In this Python project, you will work with data from the United States Social Security Administration (SSA), which has made the frequency of baby names from 1880 to 2016 available. The project requires analyzing the data using different methods. You will visualize the most frequent names, determine naming trends and come up with the most popular names for a certain year.

Highlights :

  • Analyzing data using Pandas Library
  • Deploying Data Frame Manipulation
  • Bar and box plots with Matplotlib

Project 02 – Python Web Scraping for Data Science

In this project, you will be introduced to the process of web scraping using Python. It involves installation of Beautiful Soup, web scraping libraries, working on common data and page format on the web, learning the important kinds of objects, Navigable String, deploying the searching tree, navigation options, parser, search tree, searching by CSS class, list, function and keyword argument.

Project 03 : Predicting Customer Churn in Telecom Company

Industry – Telecommunications

Problem Statement – How to increase the profitability of a telecom major by reducing the churn rate

Topics: In this project, you will work with a telecom company’s customer dataset. This dataset includes details of subscribing telephone customers. Each column has data on phone numbers, call minutes during various times of the day, the charges incurred, lifetime account duration and whether the customer has churned by unsubscribing from the service. The goal is to predict whether a customer will eventually churn or not.

Highlights :

  • Deploy Scikit-Learn ML library
  • Develop code with Jupyter Notebook
  • Build a model and evaluate it using performance metrics

Module 01 - Introduction to Machine Learning

1.1 Need of Machine Learning
1.2 Introduction to Machine Learning
1.3 Types of Machine Learning, such as supervised, unsupervised and reinforcement learning, why Machine Learning with Python and applications of Machine Learning.

Module 02 - Supervised Learning and Linear Regression

2.1 Introduction to supervised learning and its types, such as regression and classification
2.2 Introduction to regression
2.3 Simple linear regression
2.4 Multiple linear regression, assumptions in linear regression
2.5 Math behind linear regression.

Hands-on Exercise

1. Implementing linear regression from scratch with Python
2. Using Python library Scikit-learn to perform simple linear regression and multiple linear regression
3. Implementing train–test split and predicting the values on the test set.
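
For a quick preview of this exercise, here is a minimal illustrative sketch using scikit-learn and a synthetic dataset: a train–test split followed by simple linear regression:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.uniform(0, 10, size=(200, 1))             # one explanatory variable
y = 3.5 * X[:, 0] + 2.0 + rng.normal(0, 1, 200)   # linear relationship plus noise

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

model = LinearRegression().fit(X_train, y_train)   # fitting simple linear regression
print("slope:", model.coef_[0], "intercept:", model.intercept_)
print("R^2 on the test set:", model.score(X_test, y_test))
print("predictions:", model.predict(X_test[:5]))   # predicting values on the test set
```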

Module 03 - Classification and Logistic Regression

3.1 Introduction to classification
3.2 Linear regression vs Logistic regression
3.3 Math behind logistic regression, detailed formulas, the logit function and odds, confusion matrix and accuracy, true positive rate, false positive rate, and threshold evaluation with ROCR.

Hands-on Exercise

1. Implementing logistic regression from scratch with Python
2. Using Python library Scikit-learn to perform simple logistic regression and multiple logistic regression
3. Building a confusion matrix to find out accuracy, true positive rate, and false positive rate.
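
As a preview, the following illustrative sketch builds a logistic regression model with scikit-learn and derives accuracy, true positive rate and false positive rate from a confusion matrix; it uses the library’s built-in breast cancer dataset as stand-in data:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, confusion_matrix
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=5000).fit(X_train, y_train)   # logistic regression model
pred = clf.predict(X_test)

tn, fp, fn, tp = confusion_matrix(y_test, pred).ravel()          # confusion matrix counts
print("accuracy:", accuracy_score(y_test, pred))
print("true positive rate:", tp / (tp + fn))
print("false positive rate:", fp / (fp + tn))
```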

Module 04 - Decision Tree and Random Forest

4.1 Introduction to tree-based classification
4.2 Understanding a decision tree, impurity functions and entropy, understanding the concept of information gain for the right split of a node
4.3 Impurity functions and the Gini index, understanding the concept of the Gini index for the right split of a node, understanding overfitting, pruning, pre-pruning, post-pruning and cost-complexity pruning
4.4 Introduction to ensemble techniques, understanding bagging, introduction to random forests, and finding the right number of trees in a random forest.

Hands-on Exercise

1. Implementing a decision tree from scratch in Python
2. Using Python library Scikit-learn to build a decision tree and a random forest.
3. Visualizing the tree and changing the hyperparameters in the random forest.
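
Here is a compact illustrative sketch of the exercise above using scikit-learn’s built-in iris data: a decision tree, its text visualization, and a random forest with the number of trees varied:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

tree = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)
print(export_text(tree))                          # text visualization of the fitted tree
print("tree accuracy:", tree.score(X_test, y_test))

for n_trees in (10, 100, 300):                    # varying one random forest hyperparameter
    forest = RandomForestClassifier(n_estimators=n_trees, random_state=0)
    forest.fit(X_train, y_train)
    print(n_trees, "trees ->", forest.score(X_test, y_test))
```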

Module 05 - Naïve Bayes and Support Vector Machine (self paced)

5.1 Introduction to probabilistic classifiers,
5.2 Understanding Naïve Bayes, math behind the Bayes theorem
5.3 Understanding a support vector machine (SVM)
5.4 Kernel functions in SVM, and math behind SVM.

Hands-on Exercise

1. Using Python library Scikit-learn to build a Naïve Bayes classifier and a support vector classifier.
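
As a quick preview, the following illustrative sketch builds a Gaussian Naïve Bayes classifier and an RBF-kernel support vector classifier with scikit-learn on the built-in wine dataset:

```python
from sklearn.datasets import load_wine
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

nb = GaussianNB().fit(X_train, y_train)                    # probabilistic classifier
print("Naïve Bayes accuracy:", nb.score(X_test, y_test))

svm = make_pipeline(StandardScaler(), SVC(kernel="rbf"))    # SVM with an RBF kernel
svm.fit(X_train, y_train)
print("SVM accuracy:", svm.score(X_test, y_test))
```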

Module 06 - Unsupervised Learning

6.1 Types of unsupervised learning, such as clustering and dimensionality reduction, types of clustering
6.2 Introduction to k-means clustering
6.3 Math behind k-means
6.4 Dimensionality reduction with PCA.

Hands-on Exercise

1. Using Python library Scikit-learn to implement K-means clustering
2. Implementing PCA (principal component analysis) on top of a dataset.
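
For a feel of this exercise, here is a short illustrative sketch that runs k-means clustering and PCA with scikit-learn on the iris measurements:

```python
from sklearn.cluster import KMeans
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X, _ = load_iris(return_X_y=True)

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)   # k-means clustering
print("cluster sizes:", [list(kmeans.labels_).count(c) for c in range(3)])

pca = PCA(n_components=2).fit(X)                                   # dimensionality reduction
X_2d = pca.transform(X)
print("explained variance ratio:", pca.explained_variance_ratio_)
print("reduced shape:", X_2d.shape)
```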

Module 07 - Natural Language Processing and Text Mining (self paced)

7.1 Introduction to Natural Language Processing (NLP)
7.2 Introduction to text mining
7.3 Importance and applications of text mining
7.4 How NLP works with text mining
7.5 Writing to and reading from word files
7.6 OS modules, the Natural Language Toolkit (NLTK) environment
7.7 Text mining: cleaning, pre-processing and text classification.

Hands-on Exercise

1. Learning Natural Language Toolkit and NLTK Corpora
2. Reading and writing .txt files from/to a local drive
3. Reading and writing .docx files from/to a local drive.
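
As a preview, here is a small illustrative NLTK sketch: downloading tokenizer models, tokenizing text, and reading and writing a local .txt file (newer NLTK releases may also require the ‘punkt_tab’ resource, downloaded below):

```python
import nltk

nltk.download("punkt", quiet=True)        # tokenizer models
nltk.download("punkt_tab", quiet=True)    # needed by newer NLTK releases; harmless otherwise

text = "Natural Language Processing makes text mining practical. It also powers chatbots."

print(nltk.sent_tokenize(text))           # sentence tokenization
print(nltk.word_tokenize(text)[:8])       # word tokenization

with open("sample.txt", "w", encoding="utf-8") as f:   # writing a .txt file to the local drive
    f.write(text)
with open("sample.txt", encoding="utf-8") as f:        # reading it back
    print(f.read())
```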

Module 08 - Introduction to Deep Learning

8.1 Introduction to Deep Learning with neural networks
8.2 Biological neural network vs artificial neural network
8.3 Understanding the perceptron learning algorithm, introduction to Deep Learning frameworks, and TensorFlow constants, variables and placeholders.
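
For a quick preview, here is a minimal illustrative sketch using the TensorFlow 2.x API; note that the placeholders of the older TensorFlow 1.x API are replaced by ordinary function arguments here:

```python
import tensorflow as tf

a = tf.constant([[1.0, 2.0], [3.0, 4.0]])   # a constant tensor
w = tf.Variable(tf.ones((2, 2)))            # a trainable variable

@tf.function
def forward(x):
    # x plays the role a placeholder played in the TF 1.x API
    return tf.matmul(x, w) + a

print(forward(tf.constant([[0.5, 0.5], [1.0, 1.0]])).numpy())
```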

Module 09 - Time Series Analysis (Self-paced)

9.1 What is a time series? Its techniques and applications
9.2 Time series components
9.3 Moving average, smoothing techniques, exponential smoothing
9.4 Univariate time series models
9.5 Multivariate time series analysis
9.6 ARIMA model, time series in Python
9.7 Sentiment analysis in Python (Twitter sentiment analysis), and text analysis.

Hands-on Exercise

1. Analyzing time series data
2. Working with a sequence of measurements that follow a non-random order to recognize the nature of the phenomenon
3. Forecasting future values in the series.

What projects and case studies will I be working on in this Machine Learning certification course?

Project 01: Analyzing the trends of COVID-19 with Python

Industry: Analytics

Problem Statement: Understanding the trend of the COVID-19 spread and whether the restrictions imposed by governments around the world have helped curb COVID-19 cases, and to what degree

Topics: In this project, we will use Data Science and Python to perform data visualization and understand the data. We will use a COVID-19 dataset to do Time Series Analysis in order to predict future cases if the current trend continues.

Highlights:

  • Using pandas to accumulate data from multiple data files
  • Using plotly (visualization library) to create interactive visualizations
  • Using Facebook’s Prophet library to build time-series models
  • Visualizing the prediction by combining these technologies

Project 02 – Customer Churn Classification

Topics – This is a real-world project that gives you hands-on experience in working with most of the Machine Learning algorithms.

The main components of the project include the following:

  • Manipulating data in order to gain meaningful insights.
  • Visualizing data to figure out trends and patterns among different factors.
  • Implementing these algorithms: linear regression, decision tree, and Naïve Bayes.

Project 03 – Movie Recommendation System

Topics – This is a real-world project that gives you hands-on experience in working with a movie recommender system. Depending on what movies are liked by a particular user, you will be in a position to provide data-driven recommendations. This project requires you to deeply understand information filtering, recommender systems, user ‘preference’, and more. You will exclusively work on data related to user details, movie details, and others.

The main components of the project include the following:

  • Recommendation for movies
  • Two types of predictions: Rating prediction and item prediction
  • Important approaches: Memory-based and model-based
  • Knowing user-based methods in K-Nearest Neighbor
  • Understanding the item-based method
  • Matrix factorization
  • Decomposition of singular value
  • Data Science project discussion
  • Collaborative filtering
  • Business variables overview

Case Study 01 – Decision Tree

Topics – To understand the structure of a dataset (PIMA Indians Diabetes database) and create a decision tree model based on it by using Scikit-learn

Case Study 02 – Insurance Cost Prediction (Linear Regression)

Topics – To understand the structure of a medical insurance dataset, implement both simple and multiple linear regression, and predict values

Case Study 03 – Diabetes Classification (Logistic Regression)

Topics – To understand the structure of a dataset (PIMA Indians Diabetes dataset), implement multiple logistic regression, and classify. Fit your model on the train and test data for prediction, evaluate your model using a confusion matrix, and then visualize it

Case Study 04 – Random Forest

Topics – To create a model that would help in classifying whether a patient is ‘Normal’, ‘Suspected to have disease,’ or in actuality ‘Has the disease’ on the ‘Cardiotocography’ dataset

Case Study 05 – Principal Component Analysis (PCA)

Topics – Read the sample iris dataset given to you, use PCA to figure out the number of most important principal features, and then reduce the number of features using PCA. Train and test the Random Forest Classifier algorithm to check if reducing the number of dimensions is causing the model to perform poorly. Figure out the most optimal number that produces good quality results and predicts accuracy

Case Study 06 – K-means Clustering

Topics

  • Analyze data
  • Extract useful columns from the dataset
  • Visualize data
  • Find out the appropriate number of groups or clusters for data to be segmented into (using the elbow method)
  • Using k-means clustering, segment data into k groups (k is found in the previous step)
  • Visualize a scatter plot of clusters, and a lot more

Module 01 - Introduction to Deep Learning and Neural Networks

1.1 The field of machine learning and its impact on the field of artificial intelligence
1.2 The benefits of machine learning compared with traditional methodologies
1.3 Introduction to deep learning and how it differs from all other machine learning methods
1.4 Classification and regression in supervised learning
1.5 Clustering and association in unsupervised learning, algorithms used in these categories
1.6 Introduction to AI and neural networks
1.7 Machine learning concepts
1.8 Supervised learning with neural networks
1.9 Fundamentals of statistics, hypothesis testing, probability distributions and Hidden Markov Models.

Module 02 - Multi-layered Neural Networks

2.1 Multi-layer network introduction, regularization, deep neural networks
2.2 Multi-layer perceptron
2.3 Overfitting and capacity
2.4 Neural network hyperparameters, logic gates
2.5 Different activation functions used in neural networks, including ReLU, softmax, sigmoid and hyperbolic functions
2.6 Backpropagation, forward propagation, convergence, hyperparameters and overfitting.

Module 03 - Artificial Neural Networks and Various Methods

3.1 Various methods that are used to train artificial neural networks
3.2 Perceptron learning rule, gradient descent rule, tuning the learning rate, regularization techniques, optimization techniques
3.3 Stochastic processes, vanishing gradients, transfer learning, regression techniques
3.4 Lasso (L1) and Ridge (L2) regularization, unsupervised pre-training, Xavier initialization.

Module 04 - Deep Learning Libraries

4.1 Understanding how deep learning works
4.2 Activation functions, illustrating a perceptron, perceptron training
4.3 Multi-layer perceptron, key parameters of a perceptron
4.4 Introduction to TensorFlow, the open-source software library used to design, create and train deep learning models
4.5 Google’s Tensor Processing Unit (TPU) as programmable AI hardware
4.6 Python libraries in TensorFlow, code basics, variables, constants and placeholders
4.7 Graph visualization, use-case implementation, Keras and more.

Module 05 - Keras API

5.1 Keras as a high-level neural network API working on top of TensorFlow
5.2 Defining complex multi-output models
5.3 Composing models using Keras
5.4 Sequential and functional composition, batch normalization
5.5 Deploying Keras with TensorBoard, and customizing the neural network training process.
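
As an illustration of sequential vs. functional composition, here is a small Keras sketch (synthetic data, TensorFlow 2.x assumed) that builds the same network both ways and adds batch normalization:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Sequential composition with batch normalization
seq_model = keras.Sequential([
    keras.Input(shape=(10,)),
    layers.Dense(32, activation="relu"),
    layers.BatchNormalization(),
    layers.Dense(1, activation="sigmoid"),
])

# The same network expressed with the functional API
inputs = keras.Input(shape=(10,))
x = layers.Dense(32, activation="relu")(inputs)
x = layers.BatchNormalization()(x)
outputs = layers.Dense(1, activation="sigmoid")(x)
func_model = keras.Model(inputs, outputs)

func_model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
X = np.random.rand(256, 10).astype("float32")        # synthetic features
y = (X.sum(axis=1) > 5).astype("float32")            # synthetic binary target
func_model.fit(X, y, epochs=2, batch_size=32, verbose=0)
func_model.summary()
```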

Module 06 - TFLearn API for TensorFlow

6.1 Using the TFLearn API to implement neural networks
6.2 Defining and composing models, and deploying TensorBoard

Module 07 - DNNs (Deep Neural Networks)

7.1 Mapping the human mind with deep neural networks (DNNs)
7.2 Several building blocks of artificial neural networks (ANNs)
7.3 The architecture of a DNN and its building blocks
7.4 Reinforcement learning in DNN concepts, various parameters, layers and optimization algorithms in DNNs, and activation functions.

Module 08 - CNNs (Convolutional Neural Networks)

8.1 What is a convolutional neural network?
8.2 Understanding the architecture and use cases of CNNs
8.3 What is a pooling layer? How to visualize using a CNN
8.4 How to fine-tune a convolutional neural network
8.5 What is transfer learning?
8.6 Understanding recurrent neural networks, kernel filters, feature maps and pooling, and deploying convolutional neural networks in TensorFlow.
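
For a feel of these building blocks, here is a compact illustrative Keras sketch of a CNN with convolutional and pooling layers; it downloads the MNIST digits on first run and trains for a single epoch:

```python
from tensorflow import keras
from tensorflow.keras import layers

(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train = x_train[..., None].astype("float32") / 255.0   # add channel dimension, scale to [0, 1]
x_test = x_test[..., None].astype("float32") / 255.0

cnn = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),
    layers.Conv2D(16, kernel_size=3, activation="relu"),  # kernel filters produce feature maps
    layers.MaxPooling2D(pool_size=2),                     # pooling layer
    layers.Conv2D(32, kernel_size=3, activation="relu"),
    layers.MaxPooling2D(pool_size=2),
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),
])
cnn.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
cnn.fit(x_train, y_train, epochs=1, batch_size=128, validation_split=0.1)
print("test accuracy:", cnn.evaluate(x_test, y_test, verbose=0)[1])
```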

Module 09 - RNNs (Recurrent Neural Networks)

9.1 Introduction to the RNN model
9.2 Use cases of RNNs, modeling sequences
9.3 RNNs with backpropagation
9.4 Long short-term memory (LSTM)
9.5 Recursive neural tensor network theory, the basic RNN cell, unfolded RNNs, dynamic RNNs
9.6 Time-series predictions.
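
As a quick preview, here is a brief illustrative Keras sketch of a recurrent network (LSTM) on synthetic sequence data:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

X = np.random.rand(200, 20, 1).astype("float32")   # 200 synthetic sequences, 20 timesteps each
y = X.sum(axis=1).squeeze()                        # target: the sum over each sequence

rnn = keras.Sequential([
    keras.Input(shape=(20, 1)),
    layers.LSTM(16),          # a long short-term memory layer
    layers.Dense(1),
])
rnn.compile(optimizer="adam", loss="mse")
rnn.fit(X, y, epochs=3, batch_size=32, verbose=0)
print("prediction for the first sequence:", rnn.predict(X[:1], verbose=0)[0, 0])
```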

Module 10 - GPUs in Deep Learning

10.1 Gpu’s introduction, ‘how are they different from cpus?,’ the significance of gpus
10.2 Deep learning networks, forward pass and backward pass training techniques
10.3 Gpu constituent with simpler core and concurrent hardware.

Module 11 - Autoencoders and Restricted Boltzmann Machine (RBM)

11.1 Introduction to RBMs and autoencoders
11.2 Deploying RBMs for deep neural networks, using RBMs for collaborative filtering
11.3 Autoencoder features and applications of autoencoders.

Module 12 - Deep Learning Applications

12.1 Image processing
12.2 Natural language processing (NLP), speech recognition, and video analytics.

Module 13 - Chatbots

13.1 Automated conversation bots leveraging any of the following descriptive techniques: IBM Watson, Microsoft LUIS, open–closed domain bots
13.2 Generative models, and the sequence-to-sequence model (LSTM).

What projects will I be working on during this AI online course?

Project 01: Image Recognition with TensorFlow

Industry: Internet Search

Problem Statement: Creating a Deep Learning model to identify the right object on the Internet as per the user search for the corresponding image

Description: In this project, you will learn how to build a convolutional neural network using Google TensorFlow. You will do the visualization of images using training, providing input images, losses, and distributions of activations and gradients. You will learn to break each image into manageable tiles and input them to the convolutional neural network for the desired result.

Highlights:

  • Constructing a convolutional neural network using TensorFlow
  • Convolutional, dense, and pooling layers of CNNs
  • Filtering images based on user queries

Project 02: Building an AI-based Chatbot using the IBM Watson Lab

Industry: Ecommerce

Problem Statement: Building a chatbot using Artificial Intelligence

Description: In this project, by understanding the customer needs, you will be able to offer the right services through Artificial Intelligence chatbots. You will learn how to create the right artificial neural network with the right amount of layers to ensure that the customer queries are comprehensible to the Artificial Intelligence chatbot. This will help you understand Natural Language Processing, going beyond keywords, data parsing, and providing the right solutions.

Highlights:

  • Breaking user queries into components
  • Building neural networks with TensorFlow
  • Understanding Natural Language Processing

Project 03: Ecommerce Product Recommendation

Industry: Ecommerce

Problem Statement: Recommending the right products to customers using Artificial Intelligence with TensorFlow

Description: This project involves working with recommender systems to provide the right product recommendation to customers with TensorFlow. You will learn how to use Artificial Intelligence to check for users’ past buying habits, find out the products that go hand-in-hand, and recommend the best products that can be bought together with a particular product.

Highlights:

  • Building neural networks with TensorFlow
  • Looking at huge amounts of data and gaining insights
  • Building a recommendation engine with TensorFlow Graph

Overview of Natural Language Processing and Text Mining

Introduction to Natural Language Processing (NLP), introduction to Text Mining, importance and applications of Text Mining, how NLP works with Text Mining, writing to and reading from word files, the OS module, and the Natural Language Toolkit (NLTK) environment.

Hands-on Exercise: Learning Natural Language Toolkit and NLTK Corpora, reading and writing .txt files to/from local drive, reading and writing .docx Files to/from local drive.

Text Mining, Cleaning, and Pre-processing

Various tokenizers, tokenization, frequency distribution, stemming, POS tagging, lemmatization, bigrams, trigrams and n-grams, and entity recognition.

Hands-on Exercise: Learning Word Tokenization with Python regular expressions, Sentence Tokenizers, Stopword Removal, Bigrams, Trigrams, and Ngrams, Named Entity Recognition, and POS Tagging.

Text Classification

Overview of Machine Learning, Words, Term Frequency, Count Vectorizer, Inverse Document Frequency, Text conversion, Confusion Matrix, Naïve Bayes Classifier.

Hands-on Exercise: Demonstration of Count Vectorizer, Words, Term Frequency, Inverse Document Frequency, Text conversion, text classification, and Confusion Matrix.
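
To make the pipeline above concrete, here is a minimal illustrative scikit-learn sketch: count vectorization, TF-IDF weighting, a Naïve Bayes classifier and a confusion matrix on a tiny made-up set of reviews:

```python
from sklearn.feature_extraction.text import CountVectorizer, TfidfTransformer
from sklearn.metrics import confusion_matrix
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

docs = ["great movie, loved it", "terrible plot and bad acting",
        "wonderful performance", "boring and bad", "loved the soundtrack",
        "awful, a waste of time"]
labels = [1, 0, 1, 0, 1, 0]          # 1 = positive, 0 = negative

# count vectorization -> TF-IDF weighting -> Naïve Bayes classifier
model = make_pipeline(CountVectorizer(), TfidfTransformer(), MultinomialNB())
model.fit(docs, labels)

test_docs = ["bad and boring", "loved every minute"]
pred = model.predict(test_docs)
print(pred)
print(confusion_matrix([0, 1], pred))   # confusion matrix on the two test reviews
```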

Sentence Structure, Sequence Tagging, Sequence Tasks, and Language Modeling

Language Modeling, Sequence Tagging, Sequence Tasks, Predicting Sequence of Tags, Syntax Trees, Context-Free Grammars, Chunking, Automatic Paraphrasing of Texts, Chinking.

Hands-on Exercise: Demonstration of Syntax Trees, Chunking, Automatic Paraphrasing of Texts, and Chinking.

Introduction to Semantics and Vector Space Models

Distributional Semantics, Traditional Models, Tools for sentence and word embeddings, an overview of Topic Models.

Hands-on Exercise: Embedding word and sentence.

Dialog Systems

Introduction to task-oriented Dialog Systems, Natural Language Understanding, Dialog Manager.

Hands-on Exercise: Design your own Dialog System.

What projects will I be working on in this NLP Training Using Python?

Project: Analyze Movie Review Data with NLP

Industry: Entertainment

Problem Statement: Perform sentiment analysis on a given dataset to analyze movie reviews

Project Description: In this project, as an NLP engineer, your job is to pre-process the data using tokenization and lemmatization, and then develop an understanding of the different components of the data by identifying the different parts of speech and named entities in the text. After gaining a sufficient understanding of the attributes and syntactic structure of the text, perform a sentiment analysis task on the data by classifying whether movie reviews are positive or negative.

Highlights:

In this project, you will work with the “Movie Reviews” dataset, which is included in the NLTK Corpus. The dataset contains multiple positive and negative reviews retrieved from IMDb.com.

Introduction to SAS

Installation of and introduction to SAS, how to get started with SAS, understanding different SAS windows, how to work with data sets, various SAS windows like output, search, editor, log and explorer, and understanding SAS functions, various library types and programming files

SAS Enterprise Guide

How to import and export raw data files, how to read and subset the data sets, different statements like SET, MERGE and WHERE

Hands-on Exercise: How to import the Excel file in the workspace and how to read data and export the workspace to save data

SAS Operators and Functions

Different SAS operators like logical, comparison and arithmetic, deploying different SAS functions like Character, Numeric, Is Null, Contains, Like and Input/Output, along with the conditional statements like If/Else, Do While, Do Until and so on

Hands-on Exercise: Performing operations using the SAS functions and logical and arithmetic operations

Compilation and Execution

Understanding the input buffer and the PDV (program data vector), and learning what MISSOVER is

Using Variables

Defining and using KEEP and DROP statements, and applying these statements along with formats and labels in SAS

Hands-on Exercise: Use KEEP and DROP statements

Creation and Compilation of SAS Data Sets

Understanding the delimiter, dataline rules, DLM, delimiter DSD, raw data files and execution and list input for standard data

Hands-on Exercise: Use delimiter rules on raw data files

SAS Procedures

Various SAS standard procedures built-in for popular programs: PROC SORT, PROC FREQ, PROC SUMMARY, PROC RANK, PROC EXPORT, PROC DATASET, PROC TRANSPOSE, PROC CORR, etc.

Hands-on Exercise: Use SORT, FREQ, SUMMARY, EXPORT and other procedures

Input Statement and Formatted Input

Reading standard and non-standard numeric inputs with formatted inputs, column pointer controls, controlling while a record loads, line pointer control/absolute line pointer control, single trailing, multiple IN and OUT statements, dataline statement and rules, list input method and comparing single trailing and double trailing

Hands-on Exercise:  Read standard and non-standard numeric inputs with formatted inputs, control while a record loads, control a line pointer and write multiple IN and OUT statements

SAS Format

SAS Format statements: standard and user-written, associating a format with a variable, working with SAS Format, deploying it on PROC data sets and comparing ATTRIB and Format statements

Hands-on Exercise: Format a variable, deploy format rule on PROC data set and use ATTRIB statement

SAS Graphs

Understanding PROC GCHART, various graphs, bar charts: pie, bar and 3D and plotting variables with PROC GPLOT

Hands-on Exercise: Plot graphs using PROC GPLOT and display charts using PROC GCHART

Interactive Data Processing

SAS advanced data discovery and visualization, point-and-click analytics capabilities and powerful reporting tools

Data Transformation Function

Character functions, numeric functions and converting variable type

Hands-on Exercise: Use functions in data transformation

Output Delivery System (ODS)

Introduction to ODS, data optimization and how to generate files (rtf, pdf, html and doc) using SAS

Hands-on Exercise: Optimize data and generate rtf, pdf, html and doc files

SAS Macros

Macro Syntax, macro variables, positional parameters in a macro and macro step

Hands-on Exercise: Write a macro and use positional parameters

PROC SQL

SQL statements in SAS, SELECT, CASE, JOIN and UNION and sorting data

Hands-on Exercise: Create SQL query to select and add a condition and use a CASE in select query

Advanced Base SAS

Base SAS web-based interface and ready-to-use programs, advanced data manipulation, storage and retrieval and descriptive statistics

Hands-on Exercise: Use web UI to do statistical operations

Summarization Reports

Report enhancement, global statements, user-defined formats, PROC SORT, ODS destinations, ODS listing, PROC FREQ, PROC Means, PROC UNIVARIATE, PROC REPORT and PROC PRINT

Hands-on Exercise: Use PROC SORT to sort the results, list ODS, find mean using PROC Means and print using PROC PRINT

What projects I will be working on this SAS training?

Project 1: Categorization of Patients Based on the Count of Drugs for Their Therapy

Domain: Healthcare

Objective: This project aims to find descriptive statistics and subsets for specific clinical data problems. It will give you a brief insight into Base SAS procedures and data steps.

Problem Statement:

Count the number of patients,

  1. Who were ever on at least one of the four drugs
  2. Who were ever on each of the four drugs
  3. Who had never been on any drug

Output should be four datasets

  1. TYPA – Contains the list of patients from problem 1
  2. TYPB – Contains the list of patients from problem 2
  3. TYPC – Contains the list of patients from problem 3
  4. SUMMARY – Contains the summary of counts for each of three problems

Project 2: Build Revenue Projections Reports

Domain: Sales

Objective: This project will give you hands-on experience in working with the SAS data analytics and business intelligence tool. You will be working on the data entered in a business enterprise setup and will aggregate, retrieve and manage that data. You will learn to create insightful reports and graphs and come up with statistical and mathematical analysis to scientifically predict the revenue projection for a particular future time frame. Upon the completion of the project, you will be well-versed in the practical aspects of data analytics, predictive modeling and data mining.

Project 3: Impact of Pre-paid Plans on the Preferences of Investors

Domain: Finance Market

Objective: The project aims to find the most impactful factors in the preference for the pre-paid model; it also identifies the variables that are highly correlated with those factors.

Problem Statement:

  • The project aims to identify various reasons for pre-paid model preference and non-preference among investors, to understand the penetration of the pre-paid model in brokerage firms, to identify the advantages and disadvantages of pre-paid schemes, and to identify brand-wise market share. In addition, the project also looks to identify various insights that would help a newly established brand foray deeper into the market on a large scale.

Project 4: K-Means Cluster Analysis on the Iris Dataset

Domain: Analytics

Objective: K-Means cluster analysis on the Iris dataset to predict the class of a flower using its petal dimensions

Requirements:

  • Using the famous Iris dataset, predict the class of a flower
  • Perform K-Means cluster analysis

Entering Data

Introduction to Excel spreadsheet, learning to enter data, filling of series and custom fill list, editing and deleting fields.

Referencing in Formulas

Learning about relative and absolute referencing, the concept of relative formulae, the issues in relative formulae, creating of absolute and mixed references and various other formulae.

Name Range

Creating names range, using names in new formulae, working with the name box, selecting range, names from a selection, pasting names in formulae, selecting names and working with Name Manager.

Understanding Logical Functions

Learning about the various logical functions in Excel, the IF function for calculating values and displaying text, nested IF functions, and the VLOOKUP and IFERROR functions.

Getting started with Conditional Formatting

Learning about conditional formatting, the options for formatting cells, various operations with icon sets, data bars and color scales, creating and modifying sparklines.

Advanced-level Validation

Multi-level drop-down validation, restricting values to a list only, learning about error messages and cell drop-downs.

Important Formulas in Excel

Introduction to the various formulae in Excel like Sum, SumIF & SumIFs, Count, CountA, CountIF and CountBlank, Networkdays, Networkdays International, Today & Now function, Trim (Eliminating undesirable spaces), Concatenate (Consolidating columns)

Working with Dynamic table

Introduction to dynamic table in Excel, data conversion, table conversion, tables for charts and VLOOKUP.

Data Sorting

Sorting in Excel, various types of sorting including, alphabetical, numerical, row, multiple column, working with paste special, hyperlinking and using subtotal.

Data Filtering

The concept of data filtering, understanding compound filter and its creation, removing of filter, using custom filter and multiple value filters, working with wildcards.

Chart Creation

Creation of Charts in Excel, performing operations in embedded chart, modifying, resizing, and dragging of chart.

Various Techniques of Charting

Introduction to the various types of charting techniques, creating titles for charts, axes, learning about data labels, displaying data tables, modifying axes, displaying gridlines and inserting trendlines, textbox insertion in a chart, creating a 2-axis chart, creating combination chart.

Pivot Tables in Excel

The concept of Pivot tables in Excel, report filtering, shell creation, working with Pivot for calculations, formatting of reports, dynamic range assigning, the slicers and creating of slicers.

Ensuring Data and File Security

Data and file security in Excel, protecting row, column, and cell, the different safeguarding techniques.

Getting started with VBA Macros

Learning about VBA macros in Excel, executing macros, macro shortcuts and applications, and the concept of relative references in macros.

Core concepts of VBA

In-depth understanding of Visual Basic for Applications: the VBA Editor, inserting and deleting modules, performing actions with Sub procedures, and exiting a Sub when a condition is not met.

Ranges and Worksheet in VBA

Learning about the concepts of workbooks and worksheets in Excel, protection of macro codes, range coding, declaring a variable, the concept of Pivot Table in VBA, introduction to arrays, user forms, getting to know how to work with databases within Excel.

IF condition

Learning how the IF condition works and how to apply it in various scenarios, and working with multiple IFs in a macro.

Loops in VBA

Understanding the concept of looping, deploying looping in VBA Macros.

Debugging in VBA

Studying debugging in VBA: the various debugging steps such as running, breaking, and resetting; understanding breakpoints and how to mark them; debugging code and code commenting.

Messaging in VBA

The concept of message box in VBA, learning to create the message box, various types of message boxes, the IF condition as related to message boxes.

Practical Projects in VBA

Mastering the various tasks and functions using VBA, understanding data separation, auto filtering, formatting of report, combining multiple sheets into one, merging multiple files together.

Best Practices of Dashboards Visualization

Introduction to powerful data visualization with Excel Dashboard, important points to consider while designing the dashboards like loading the data, managing data and linking the data to tables and charts, creating Reports using dashboard features.

Principles of Charting

Learning to create charts in Excel, the various charts available, the steps to successfully build a chart, personalization of charts, formatting and updating features, various special charts for Excel dashboards, understanding how to choose the right chart for the right data.

Getting started with Pivot Tables

Creation of Pivot Tables in Excel, learning to change the Pivot Table layout, generating Reports, the methodology of grouping and ungrouping of data.

Creating Dashboards

Learning to create dashboards, the rules to follow while creating them, creating dynamic dashboards, understanding data layout, introduction to the thermometer chart and its creation, and how to use alerts in the dashboard setup.

Creation of Interactive Components

Inserting a scroll bar into a data window, the concept of option buttons in a chart, using a combo box drop-down, using a list box control, and using a checkbox control.

Data Analysis

Understanding data quality issues in Excel, linking of data, consolidating and merging data, working with dashboards for Excel Pivot Tables.

What projects will I be working on during this Excel certification training?

Project – IF Function

Data – Employee

Problem Statement – This project covers the IF function and how to implement it. It includes the following actions (a rough pandas analogue is sketched after the list):

  • Calculate bonuses for all employees at 10% of their salary using the IF function
  • Rate each salesman based on sales and the rating scale
  • Find the number of times “3” is repeated in the table, and the number of values greater than 5, using the COUNT functions
  • Use operators and nested IF functions
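
A rough pandas analogue of these IF-function tasks is sketched below; the employee table and the rating thresholds are hypothetical, not the project's actual workbook.

```python
import numpy as np
import pandas as pd

# Hypothetical employee data standing in for the Excel sheet
emp = pd.DataFrame({
    "name":   ["Anu", "Bala", "Carl", "Dina"],
    "salary": [30000, 45000, 52000, 28000],
    "sales":  [3, 7, 5, 9],
})

# IF-style bonus: 10% of salary for every employee
emp["bonus"] = emp["salary"] * 0.10

# Nested-IF-style rating on an assumed rating scale
emp["rating"] = np.where(emp["sales"] >= 8, "Excellent",
                np.where(emp["sales"] >= 5, "Good", "Average"))

# COUNTIF-style checks
times_three    = (emp["sales"] == 3).sum()  # how many times 3 appears
greater_than_5 = (emp["sales"] > 5).sum()   # how many values exceed 5

print(emp)
print(times_three, greater_than_5)
```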

Introduction to Data Visualization and Power of Tableau

What is data visualization? Comparison and benefits against reading raw numbers, real use cases from various business domains, some quick and powerful examples using Tableau without going into its technical details, installing Tableau, the Tableau interface, connecting to data sources, Tableau data types, and data preparation.

Architecture of Tableau

Installation of Tableau Desktop, the architecture of Tableau, the Tableau interface (layout, toolbars, Data Pane, Analytics Pane, etc.), how to get started with Tableau, and the ways to share and export work done in Tableau.

Hands-on Exercise: Explore Tableau Desktop, learn about the interface, and share and export existing work.

Working with Metadata and Data Blending

Connection to Excel, cubes, and PDFs, management of metadata and extracts, data preparation, Joins (Left, Right, Inner, and Outer) and Union, dealing with NULL values, cross-database joining, data extraction, data blending, refresh extraction, incremental extraction, how to build an extract, etc.

Hands-on Exercise: Connect to an Excel sheet to import data, use metadata and extracts, manage NULL values, clean up data before using it, perform the join techniques, execute data blending from multiple sources, etc.

Creation of Sets

Mark, highlight, sort, group, and use sets (creating and editing sets, IN/OUT, sets in hierarchies), constant sets, computed sets, bins, etc.

Hands-on Exercise: Use marks to create and edit sets, highlight the desired items, make groups, apply sorting on results, and make hierarchies among the created sets.

Working with Filters

Filters (addition and removal), filtering continuous dates, dimensions, and measures, interactive filters, marks card, hierarchies, how to create folders in Tableau, sorting in Tableau, types of sorting, filtering in Tableau, types of filters, filtering the order of operations, etc.

Hands-on Exercise: Use the data set by date/dimensions/measures to add filter, use interactive filter to view the data, customize/remove filters to view the result, etc.

Organizing Data and Visual Analytics

Using Formatting Pane to work with menu, fonts, alignments, settings, and copy-paste; formatting data using labels and tooltips, edit axes and annotations, k-means cluster analysis, trend and reference lines, visual analytics in Tableau, forecasting, confidence interval, reference lines, and bands.

Hands-on Exercise: Apply labels and tooltips to graphs, annotations, edit axes’ attributes, set the reference line, and perform k-means cluster analysis on the given dataset.

Working with Mapping

Working with coordinate points, plotting longitude and latitude, editing unrecognized locations, custom geocoding, polygon maps, WMS (Web Map Service), working with a background image (including adding an image, plotting points on images, and generating coordinates from them), map visualization, custom territories, Mapbox, WMS maps, how to create map projects in Tableau, creating dual-axis maps, and editing locations.

Hands-on Exercise: Plot longitude and latitude on a geo map, edit locations on the geo map, custom geocoding, use images of the map and plot points, find coordinates, create a polygon map, and use WMS.

Working with Calculations and Expressions

Calculation syntax and functions in Tableau; various types of calculations, including Table, String, Date, Aggregate, Logic, and Number; LOD expressions, including concept and syntax; aggregation and replication with LOD expressions; nested LOD expressions; levels of detail: fixed, lower, and higher levels; quick table calculations; the creation of calculated fields; predefined calculations; and how to validate them.
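
For intuition only: a Tableau fixed LOD expression such as { FIXED [Region] : SUM([Sales]) } aggregates at a level of detail independent of the view. A rough pandas analogue, on hypothetical data, is a group-wise transform:

```python
import pandas as pd

# Hypothetical order data; column names are illustrative only
orders = pd.DataFrame({
    "Region":  ["East", "East", "West", "West", "West"],
    "Product": ["A", "B", "A", "B", "C"],
    "Sales":   [100, 150, 80, 120, 60],
})

# Comparable in spirit to { FIXED [Region] : SUM([Sales]) }:
# every row receives its region's total, regardless of row-level detail.
orders["RegionSales"]   = orders.groupby("Region")["Sales"].transform("sum")
orders["ShareOfRegion"] = orders["Sales"] / orders["RegionSales"]
print(orders)
```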

Working with Parameters

Creating parameters, parameters in calculations, using parameters with filters, column selection parameters, chart selection parameters, how to use parameters in the filter session, how to use parameters in calculated fields, how to use parameters in reference line, etc.

Hands-on Exercise: Creating new parameters to apply on a filter, passing parameters to filters to select columns, passing parameters to filters to select charts, etc.

Charts and Graphs

Dual axes graphs, histograms: single and dual axes; box plot; charts: motion, Pareto, funnel, pie, bar, line, bubble, bullet, scatter, and waterfall charts; maps: tree and heat maps; market basket analysis (MBA), using Show me; and text table and highlighted table.

Hands-on Exercise: Plot a histogram, tree map, heat map, funnel chart, and more using the given dataset and also perform market basket analysis (MBA) on the same dataset.

Dashboards and Stories

Building and formatting a dashboard using size, objects, views, filters, and legends; best practices for making creative as well as interactive dashboards using actions; creating stories, including the intro of story points; creating and updating story points, adding catchy visuals to stories, adding annotations with descriptions; dashboards and stories: what is a dashboard?, highlight actions, URL actions, and filter actions, selecting and clearing values, best practices to create dashboards, dashboard examples; using the Tableau workspace and Tableau interface; learning about Tableau joins and their types; Tableau field types, saving and publishing a data source, live vs. extract connection, and various file types.

Hands-on Exercise: Create a Tableau dashboard view, include legends, objects, and filters, make the dashboard interactive, and use visual effects, annotations, and descriptions to create and edit a story.

Tableau Prep

Introduction to Tableau Prep, how Tableau Prep helps quickly combine, join, shape, and clean data for analysis, creating smart examples with Tableau Prep, getting deeper insights into the data with a great visual experience, making data preparation simpler and more accessible, integrating Tableau Prep with the Tableau analytical workflow, and understanding the seamless process from data preparation to analysis with Tableau Prep.

Integration of Tableau with R and Hadoop

Introduction to R language, applications and use cases of R, deploying R on the Tableau platform, learning R functions in Tableau, and the integration of Tableau with Hadoop.

Hands-on Exercise: Deploy R on Tableau, create a line graph using R interface, and also connect Tableau with Hadoop to extract data.

What are the projects I will be working on during this Tableau certification training?

Project 1: Working with Tableau Interactive Dashboard

Domain: Sales

Problem Statement: How to make an interactive dashboard with Tableau?

Description:

Upon the completion of this project, you will understand how to create a single point of access for all your sales data, dissect and analyze sales from multiple angles, and come up with a sales strategy for improved business revenues.

Project 2: Tableau for Crime Statistics Analysis

Domain: Crime Statistics (Public Domain)

Problem Statement: Showing the types of crimes and their frequency in the District of Columbia and providing the details of the crimes

Description: Police authorities are often called on to “put more feet on the street” to prevent crime and ensure order. However, due to limited resources, it is almost impossible to deploy them anytime, anywhere. In this project, you will work on the crime data of the District of Columbia and analyze it using Tableau. During visualization, you can use crime categories and days of the week to see when and where crimes have occurred. You will also analyze the details of the crimes, such as the area/location and the day of the week on which each crime happened. This project will help the local police get insightful information on where to focus their crime prevention efforts.

Highlights:

  • A map should be plotted at the block site address level
  • Shows the offense, the location, and the date of each crime
  • Shows the frequency (in %) of each type of crime (see the sketch after this list)
  • Shows the crime details of every month by week/weekday/offense type (the dashboard should have various filters that apply to all three sheets in the dashboard)
  • An action from the map should filter the other two sheets accordingly
  • An action from the tree map and the bar chart should highlight the remaining two sheets according to the selection
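
As a rough illustration of the frequency summaries behind these views, here is a hypothetical pandas sketch; the records are made up, and the actual project is built entirely in Tableau on the District of Columbia crime data.

```python
import pandas as pd

# Made-up crime records; the real data comes from the District of Columbia
crimes = pd.DataFrame({
    "offense": ["Theft", "Assault", "Theft", "Burglary", "Theft", "Assault"],
    "weekday": ["Mon", "Sat", "Fri", "Sat", "Mon", "Sun"],
})

# Frequency (in %) of each offense type
offense_pct = crimes["offense"].value_counts(normalize=True) * 100
print(offense_pct.round(1))

# Crime counts by weekday and offense type (the dashboard slices this with filters)
by_day = pd.crosstab(crimes["weekday"], crimes["offense"])
print(by_day)
```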

Project 3: Analyzing Economic Data

Domain: Government

Problem Statement: How is unemployment affecting global malnutrition?

Description: In this Tableau project, you will work on large amounts of data and analyze it to come up with trends, insights, and correlations. The datasets include global unemployment figures for multiple years, world population statistics across several years, and worldwide nutritional data. By analyzing this data in Tableau, you will correlate the malnutrition problem with unemployment rates. (A rough sketch of the correlation step appears after the highlights below.)

Highlights:

  • Cleaning up Excel data and connecting with Tableau
  • Using data blending and pivot tables
  • Comparative analysis with the Tableau dashboard
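
Outside Tableau, the core correlation step could be sketched in pandas as follows; the figures below are hypothetical placeholders, not the project's real unemployment or nutrition data.

```python
import pandas as pd

# Hypothetical country-year figures standing in for the real datasets
unemployment = pd.DataFrame({
    "country": ["A", "B", "C", "D"],
    "year": [2019] * 4,
    "unemployment_rate": [4.2, 11.5, 7.8, 15.1],
})
malnutrition = pd.DataFrame({
    "country": ["A", "B", "C", "D"],
    "year": [2019] * 4,
    "malnutrition_rate": [2.1, 9.8, 5.5, 12.3],
})

# Blend the two sources on country and year (akin to data blending in Tableau)
merged = unemployment.merge(malnutrition, on=["country", "year"])

# A simple linear correlation between unemployment and malnutrition rates
corr = merged["unemployment_rate"].corr(merged["malnutrition_rate"])
print(f"correlation: {corr:.2f}")
```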

Project 4: Analyzing Market Performance

Domain: Retail

Problem Statement: Using Consumer Packaged Goods data in Tableau Desktop to analyze which markets are performing well for a particular retail enterprise

Description: This Tableau Desktop project involves working with complex Consumer Packaged Goods data to come up with a brand performance analysis, identify the regions contributing most to revenues and where more discounts are needed to spur sales, and perform an in-depth budget vs. spend analysis for any particular year.

Highlights:

  • Combining data sources, adding filters, and drilling down data
  • Building an interactive dashboard and reports for detailed analysis
  • Deriving real-time visualization of data for business insights

Certification

This course is designed to help you clear the following industry certifications.

  • Tableau Desktop Qualified Associate Exam
  • SAS Certified Base Programmer Exam

Furthermore, you will also be recognized as an “Artificial Intelligence Professional” for completing the following learning paths that are co-created with IBM:

  • Machine Learning with Python
  • Deep Learning with TensorFlow
  • Build Chatbots with Watson Assistant
  • R for Data Science
  • Python

The complete course is created and delivered in association with IBM to help you get top jobs in the world’s best organizations. The entire training includes real-world projects and case studies that are highly valuable.

Upon the completion of the training, you will take quizzes that will help you prepare for the above-mentioned certification exams and score top marks.

The Intellipaat certification is awarded upon successful completion of the project work, after it has been reviewed by experts. Intellipaat certification is recognized in some of the biggest companies, such as Cisco, Cognizant, Mu Sigma, TCS, Genpact, Hexaware, Sony, and Ericsson, among others.

Our alumni work at 3,000+ top companies


Course Advisor

Suresh Paritala

Solutions Architect at Microsoft, USA

A Senior Software Architect at NextGen Healthcare who has previously worked with IBM Corporation, Suresh Paritala has worked on Big Data, Data Science, Advanced Analytics, the Internet of Things, and Azure, along with AI domains like Machine Learning and Deep Learning. He has successfully implemented high-impact projects in these domains.

David Callaghan

Big Data Solutions Architect, USA

An experienced Blockchain Professional who has been bringing integrated Blockchain, particularly Hyperledger and Ethereum, and Big Data solutions to the cloud, David Callaghan has previously worked on Hadoop, AWS Cloud, Big Data and Pentaho projects that have had major impact on revenues of marquee brands around the world.

Samanth Reddy

Data Team Lead at Sony, USA

A renowned Data Scientist who has worked with Google and is currently working at ASCAP, Samanth Reddy has a proven ability to develop Data Science strategies that have a high impact on the revenues of various organizations. He comes with strong Data Science expertise and has created decisive Data Science strategies for Fortune 500 corporations.

Frequently Asked Questions

Why Should I Take the Artificial Intelligence Engineer Master's Course from Intellipaat?

Intellipaat provides the best Artificial Intelligence Engineer training that gives you all the skills needed to work in the domains of AI, Machine Learning, Deep Learning, Data Science with R Statistical computing and Python to give the professionals an added advantage. Upon the completion of the training, you will be awarded the Intellipaat Artificial Intelligence Engineer certification.

You will be working on real-time projects and step-by-step assignments that have high relevance in the corporate world, and the curriculum is designed by industry experts. Upon the completion of the training course, you can apply for some of the best jobs in top MNCs around the world at top salaries. Intellipaat offers lifetime access to videos, course materials, 24/7 support and course material upgrading to the latest version at no extra fee. Hence, it is clearly a one-time investment.

Intellipaat offers self-paced training and online instructor-led training. Apart from that, we also provide corporate training for enterprises. All our trainers have over 12 years of industry experience in relevant technologies, and they are subject matter experts working as consultants. You can check the quality of our trainers in the sample videos provided.

If you have any queries, you can contact our 24/7 dedicated support team to raise a ticket. We provide email support and solutions to your queries. If a query is not resolved by email, we can arrange a one-on-one session with our trainers. The best part is that you can contact Intellipaat even after the completion of the training to get support and assistance. There is also no limit on the number of queries you can raise when it comes to doubt clearance and query resolution.

The Intellipaat self-paced training is for people who want to learn at their own pace. As part of this program, we provide you with one-on-one sessions, doubt clearance over email, 24/7 live support, lifetime LMS access, and upgrades to the latest version at no extra cost. The price of self-paced training can be 75% lower than that of online instructor-led training. Should you face any unexpected challenges while studying, we will arrange a virtual live session with the trainer.

We provide you with the opportunity to work on real-world projects wherein you can apply the knowledge and skills that you acquired through our training. We have multiple projects that thoroughly test your skills and knowledge of various aspects and components, making you industry-ready. These projects could be in exciting and challenging fields like banking, insurance, retail, social networking, e-commerce, marketing, sales, high technology, and so on. The Intellipaat projects are equivalent to six months of relevant experience in the corporate world.

Yes, Intellipaat does provide you with placement assistance. We have tie-ups with 80+ organizations, including Ericsson, Cisco, Cognizant, and TCS, among others, that are looking for skilled, quality professionals, and we would be happy to assist you with the process of preparing yourself for the interview and the job.

Yes, if you want to upgrade from the self-paced training to instructor-led training, you can easily do so by paying the difference in fees and joining the next batch of classes, which will be separately notified to you.

Upon successful completion of the training, you have to take a set of quizzes and complete the projects; upon review, and on scoring over 60% marks in the qualifying quiz, the official Intellipaat verified certificate is awarded. The Intellipaat certification is a seal of approval and is highly recognized in 80+ corporations around the world, including many in the Fortune 500 list of companies.

No. Our job assistance program is aimed at helping you land your dream job. It offers a potential opportunity for you to explore various competitive openings in the corporate world and assists you in finding a well-paid job matching your profile. The final decision on your hiring will always be based on your performance in the interview and the requirements of the recruiter.
