Intellipaat’s Data Analytics certification online course includes Data Science with R, Tableau, SAS, MS Excel, and Qlik Sense courses. Through this Data Analytics course, you will master the Data Analytics lifecycle, perform statistical analysis, generate BI reports, and extract business insights by working on real-world projects.
Online Classroom Training
Self Paced Training
Intellipaat Data Analytics Training has been created to help you master the domain of Data Analytics. In these online Data Analytics classes, you will learn about Data Science with R, Tableau, Qlik Sense, and SAS analytics. As part of this Data Analytics online training, you will master topics like data mining, data visualization, statistical analysis, Tableau integration with R, regression modeling, and more through hands-on projects and case studies.
Online instructor-led courses
In this Data Analytics online course, you will learn about
Data Analyst is among the most sought-after career options in today’s technologically advanced world. The large number of job opportunities available in this domain is one of the main reasons to opt for this career.
As per IBM, jobs in this domain will rise by 15% by 2020, leading to the creation of more than 2.72 million jobs for Data Analytics professionals.
There are no prerequisites for taking this online Data Analytics course, although a basic knowledge of data analysis, statistics, and probability is beneficial.
According to Glassdoor, the average income of a Data Analyst in the United States is about US$62,453 per annum. This may increase to US$95,000 per annum with more experience and better work quality.
In India, the average salary of these professionals is approximately ₹503,000 per annum and with more experience, it can rise to ₹1,005,000 per annum.
Here are a few differences between Data Scientists, Data Analysts, and Business Analysts:
Today, data analytics is one of the top domains since we live in a data-driven world. If you want to get ahead in your career, you need to learn data analytics, as it is deployed in every organization regardless of the industry vertical. The Intellipaat Data Analytics certification courses have been created to give you an edge in this data-driven world. Through this Data Analytics training, you will work on real-world data analytics projects and case studies to gain hands-on experience in the domain. Upon completion of this Data Analytics online course, you can apply for the best jobs in the data analytics domain and command top-notch salaries.
1.1 What is Data Science?
1.2 Significance of Data Science in today’s data-driven world, applications of Data Science, lifecycle of Data Science, and its components
1.3 Introduction to Big Data Hadoop, Machine Learning, and Deep Learning
1.4 Introduction to R programming and RStudio
1. Installation of RStudio
2. Implementing simple mathematical operations and logic using R operators, loops, if statements, and switch cases
2.1 Introduction to data exploration
2.2 Importing and exporting data to/from external sources
2.3 What are exploratory data analysis and data importing?
2.4 DataFrames, working with them, accessing individual elements, vectors, factors, operators, in-built functions, conditional and looping statements, user-defined functions, and data types
1. Accessing individual elements of customer churn data
2. Modifying and extracting results from the dataset using user-defined functions in R
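The hands-on steps above are performed in R on data frames. Purely as a language-neutral sketch of the same access pattern, here is a minimal Python version using plain records (the churn rows and helper names below are made up for illustration):

```python
# Hypothetical customer-churn records standing in for an R data frame.
# In R you would write df$tenure or df[2, "tenure"]; here we use plain dicts.
churn = [
    {"customer": "C001", "tenure": 12, "churned": False},
    {"customer": "C002", "tenure": 2,  "churned": True},
    {"customer": "C003", "tenure": 48, "churned": False},
]

def column(rows, name):
    """Extract one column, like df$name in R."""
    return [row[name] for row in rows]

def cell(rows, i, name):
    """Access a single element, like df[i, name] in R (0-based here)."""
    return rows[i][name]

print(column(churn, "tenure"))   # [12, 2, 48]
print(cell(churn, 1, "tenure"))  # 2
```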
3.1 Need for data manipulation
3.2 Introduction to the dplyr package
3.3 Selecting one or more columns with select(), filtering records on the basis of a condition with filter(), adding new columns with mutate(), sampling, and counting
3.4 Combining different functions with the pipe operator and implementing SQL-like operations with sqldf
1. Implementing dplyr
2. Performing various operations for manipulating data and storing it
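The dplyr verbs covered in this module are R functions. Purely as an illustration of what select(), filter(), and mutate() do, here is a rough Python sketch with hypothetical data; the chained calls play the role of R’s pipe operator:

```python
# dplyr-style verbs sketched in plain Python (the course implements these in R).
rows = [
    {"customer": "C001", "tenure": 12, "monthly": 70.0},
    {"customer": "C002", "tenure": 2,  "monthly": 95.5},
    {"customer": "C003", "tenure": 48, "monthly": 20.0},
]

def select(rows, *cols):          # like dplyr::select()
    return [{c: r[c] for c in cols} for r in rows]

def filter_rows(rows, pred):      # like dplyr::filter()
    return [r for r in rows if pred(r)]

def mutate(rows, **new):          # like dplyr::mutate()
    return [{**r, **{k: f(r) for k, f in new.items()}} for r in rows]

# The %>% pipe in R simply chains these calls:
result = select(
    mutate(filter_rows(rows, lambda r: r["tenure"] > 5),
           total=lambda r: r["tenure"] * r["monthly"]),
    "customer", "total",
)
print(result)  # [{'customer': 'C001', 'total': 840.0}, {'customer': 'C003', 'total': 960.0}]
```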
4.1 Introduction to visualization
4.2 Different types of graphs, the grammar of graphics, the ggplot2 package, categorical distribution with geom_bar(), numerical distribution with geom_histogram(), building frequency polygons with geom_freqpoly(), and making a scatter plot with geom_point()
4.3 Multivariate analysis with geom_boxplot
4.4 Univariate analysis with a barplot, a histogram and a density plot, and multivariate distribution
4.5 Creating barplots for categorical variables using geom_bar(), and adding themes with the theme() layer
4.6 Visualization with plotly, frequency plots with geom_freqpoly(), multivariate distribution with scatter plots and smooth lines, continuous vs categorical distribution with box plots, and subgrouping plots
4.7 Working with coordinates and themes to make graphs more presentable, understanding plotly and various plots, and visualization with ggvis
4.8 Geographic visualization with ggmap() and building web applications with Shiny
1. Creating data visualization to understand the customer churn ratio using ggplot2 charts
2. Using plotly for importing and analyzing data
3. Visualizing tenure, monthly charges, total charges, and other individual columns using a scatter plot
5.1 Why do we need statistics?
5.2 Categories of statistics, statistical terminology, types of data, measures of central tendency, and measures of spread
5.3 Correlation and covariance, standardization and normalization, probability and the types, hypothesis testing, chi-square testing, ANOVA, normal distribution, and binary distribution
1. Building a statistical analysis model that uses quantification, representations, and experimental data
2. Reviewing, analyzing, and drawing conclusions from the data
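The measures covered in this module follow directly from their definitions. Below is a small Python illustration (the numbers are made up) of central tendency, spread, and the Pearson correlation:

```python
import statistics as st

x = [2.0, 4.0, 6.0, 8.0]
y = [1.0, 3.0, 5.0, 9.0]

print(st.mean(x))    # 5.0  (a measure of central tendency)
print(st.median(x))  # 5.0
print(st.stdev(x))   # sample standard deviation (a measure of spread)

# Pearson correlation from its definition: cov(x, y) / (sd_x * sd_y)
mx, my = st.mean(x), st.mean(y)
cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
r = cov / (st.stdev(x) * st.stdev(y))
print(round(r, 3))   # 0.983
```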
6.1 Introduction to Machine Learning
6.2 Introduction to linear regression, predictive modeling, simple linear regression vs multiple linear regression, concepts, formulas, assumptions, and residuals in Linear Regression, and building a simple linear model
6.3 Predicting results and finding the p-value and an introduction to logistic regression
6.4 Comparing linear regression with logistic regression and bivariate logistic regression with multivariate logistic regression
6.5 Confusion matrix and the accuracy of a model, understanding the fit of the model, threshold evaluation with ROCR, and using qqnorm() and qqline()
6.6 Understanding the summary results with the null hypothesis and F-statistic, and building linear models with multiple independent variables
1. Modeling the relationship within data using linear predictor functions
2. Implementing linear and logistic regression in R by building a model with ‘tenure’ as the dependent variable
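The module builds linear models with R. As an illustration of what a simple linear regression fit computes, here is a Python sketch of the closed-form least-squares formulas on made-up data:

```python
# Least-squares fit of y = b0 + b1*x from the closed-form formulas
# covered in the module (illustrative data, not the course's churn set).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 4.0, 6.1, 8.0, 9.9]   # roughly y = 2x

n = len(x)
mx = sum(x) / n
my = sum(y) / n
b1 = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)
b0 = my - b1 * mx
print(round(b1, 2), round(b0, 2))  # 1.96 0.14

# Residuals: observed minus fitted values
residuals = [b - (b0 + b1 * a) for a, b in zip(x, y)]
```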
7.1 Introduction to logistic regression
7.2 Logistic regression concepts, linear vs logistic regression, and math behind logistic regression
7.3 Detailed formulas, logit function and odds, bivariate logistic regression, and Poisson regression
7.4 Building a simple binomial model and predicting the result, making a confusion matrix for evaluating the accuracy, true positive rate, false positive rate, and threshold evaluation with ROCR
7.5 Finding out the right threshold by building the ROC plot, cross validation, multivariate logistic regression, and building logistic models with multiple independent variables
7.6 Real-life applications of logistic regression
1. Implementing predictive analytics by describing data
2. Explaining the relationship between one dependent binary variable and one or more independent variables
3. Using glm() to build a model, with ‘Churn’ as the dependent variable
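The module uses R’s glm() and ROCR. To illustrate the underlying ideas, here is a Python sketch of the sigmoid, a classification threshold, and a confusion matrix; the coefficients and data are hypothetical:

```python
import math

def sigmoid(z):  # inverse of the logit function: p = 1 / (1 + e^-z)
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical fitted coefficients (in R these would come from glm()).
b0, b1 = -3.0, 0.5
def predict(x, threshold=0.5):
    return sigmoid(b0 + b1 * x) >= threshold

actual    = [True, True, False, False, True]
x_values  = [10, 8, 2, 5, 4]
predicted = [predict(x) for x in x_values]

# Confusion matrix counts for threshold evaluation
tp = sum(a and p for a, p in zip(actual, predicted))
tn = sum(not a and not p for a, p in zip(actual, predicted))
fp = sum(not a and p for a, p in zip(actual, predicted))
fn = sum(a and not p for a, p in zip(actual, predicted))
accuracy = (tp + tn) / len(actual)
print(tp, tn, fp, fn, accuracy)  # 2 2 0 1 0.8
```

Raising or lowering the threshold trades false positives against false negatives, which is exactly what the ROC curve summarizes.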
8.1 What is classification? Different classification techniques
8.2 Introduction to decision trees
8.3 Algorithm for decision tree induction and building a decision tree in R
8.4 Confusion matrix and regression trees vs classification trees
8.5 Introduction to bagging
8.6 Random forest and implementing it in R
8.7 What is Naive Bayes? Computing probabilities
8.8 Understanding the concepts of the impurity function, entropy, the Gini index, and information gain for the right split of a node
8.9 Overfitting, pruning, pre-pruning, post-pruning, and cost-complexity pruning, pruning a decision tree and predicting values, finding out the right number of trees, and evaluating performance metrics
1. Implementing random forest for both regression and classification problems
2. Building a tree, pruning it using ‘churn’ as the dependent variable, and building a random forest with the right number of trees
3. Using ROCR for performance metrics
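The split criteria in 8.8 can be computed directly from their formulas. Here is a small Python illustration of the Gini index, entropy, and information gain on a made-up node:

```python
import math

def gini(labels):
    """Gini index: 1 - sum(p_k^2). 0 means a pure node."""
    n = len(labels)
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def entropy(labels):
    """Entropy in bits: -sum(p_k * log2 p_k)."""
    n = len(labels)
    return -sum((labels.count(c) / n) * math.log2(labels.count(c) / n)
                for c in set(labels))

node = ["churn", "churn", "stay", "stay"]
print(gini(node), entropy(node))   # 0.5 1.0  (maximally impure 50/50 split)
print(gini(["stay"] * 4))          # 0.0      (pure node)

# Information gain of a candidate split = parent entropy
# minus the weighted entropy of the children.
left, right = ["churn", "churn"], ["stay", "stay"]
gain = entropy(node) - (len(left)/4 * entropy(left) + len(right)/4 * entropy(right))
print(gain)  # 1.0 (a perfect split)
```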
9.1 What is Clustering? Its use cases
9.2 What is k-means clustering? What is canopy clustering?
9.3 What is hierarchical clustering?
9.4 Introduction to unsupervised learning
9.5 Feature extraction, clustering algorithms, and the k-means clustering algorithm
9.6 Theoretical aspects of k-means, k-means process flow, k-means in R, implementing k-means, and finding out the right number of clusters using a scree plot
9.7 Dendrograms, understanding hierarchical clustering, and implementing it in R
9.8 Explanation of Principal Component Analysis (PCA) in detail and implementing PCA in R
1. Deploying unsupervised learning with R to achieve clustering and dimensionality reduction
2. K-means clustering for visualizing and interpreting results for the customer churn data
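The module implements clustering with R’s kmeans(). To show the assignment/update loop itself, here is a minimal one-dimensional Python sketch on made-up points:

```python
# A minimal k-means loop in one dimension (the course implements this in R;
# the data and starting centers here are illustrative).
def kmeans_1d(points, centers, iters=10):
    for _ in range(iters):
        # Assignment step: each point joins its nearest center.
        clusters = [[] for _ in centers]
        for p in points:
            idx = min(range(len(centers)), key=lambda i: abs(p - centers[i]))
            clusters[idx].append(p)
        # Update step: each center moves to its cluster's mean.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

points = [1.0, 1.2, 0.8, 10.0, 10.4, 9.6]
centers, clusters = kmeans_1d(points, centers=[0.0, 5.0])
print(centers)  # ≈ [1.0, 10.0]
```

The scree-plot heuristic mentioned above amounts to rerunning this loop for increasing k and plotting the within-cluster sum of squares.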
10.1 Introduction to association rule mining and MBA
10.2 Measures of association rule mining: support, confidence, and lift; the Apriori algorithm and implementing it in R
10.3 Introduction to recommendation engines
10.4 User-based collaborative filtering and item-based collaborative filtering, and implementing a recommendation engine in R
10.5 Recommendation engine use cases
1. Deploying association analysis as a rule-based Machine Learning method
2. Identifying strong rules discovered in databases using measures of interestingness
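Support, confidence, and lift follow directly from their definitions. Here is a small Python illustration on a made-up transaction list (the course computes these in R):

```python
# Support, confidence, and lift computed from their definitions
# on a toy transaction list (hypothetical basket data).
transactions = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"bread", "milk", "butter"},
    {"milk"},
]

def support(itemset):
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(lhs, rhs):
    return support(lhs | rhs) / support(lhs)

def lift(lhs, rhs):
    return confidence(lhs, rhs) / support(rhs)

rule = ({"bread"}, {"milk"})
print(support({"bread", "milk"}))  # 0.5
print(confidence(*rule))           # ≈ 0.667
print(lift(*rule))                 # ≈ 0.889 (< 1: no positive association)
```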
11.1 Introducing Artificial Intelligence and Deep Learning
11.2 What is an artificial neural network? TensorFlow: The computational framework for building AI models
11.3 Fundamentals of building ANN using TensorFlow and working with TensorFlow in R
12.1 What is a time series? The techniques, applications, and components of time series
12.2 Moving average, smoothing techniques, and exponential smoothing
12.3 Univariate time series models and multivariate time series analysis
12.4 ARIMA model
12.5 Time series in R, sentiment analysis in R (Twitter sentiment analysis), and text analysis
1. Analyzing time series data
2. Analyzing the sequence of measurements that follow a non-random order to identify the nature of phenomenon and forecast the future values in the series
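The smoothing techniques in 12.2 are short formulas. As an illustration, here is a Python sketch of a simple moving average and exponential smoothing on a made-up series:

```python
# Simple moving average and exponential smoothing from the formulas in
# the module (illustrative series, not a real dataset).
series = [10.0, 12.0, 11.0, 13.0, 15.0, 14.0]

def moving_average(xs, window):
    return [sum(xs[i - window + 1 : i + 1]) / window
            for i in range(window - 1, len(xs))]

def exp_smooth(xs, alpha):
    """s_t = alpha * x_t + (1 - alpha) * s_{t-1}"""
    s = [xs[0]]
    for x in xs[1:]:
        s.append(alpha * x + (1 - alpha) * s[-1])
    return s

print(moving_average(series, 3))  # [11.0, 12.0, 13.0, 14.0]
print(exp_smooth(series, 0.5))    # [10.0, 11.0, 11.0, 12.0, 13.5, 13.75]
```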
13.1 Introduction to Support Vector Machine (SVM)
13.2 Data classification using SVM
13.3 SVM algorithms using separable and inseparable cases
13.4 Linear SVM for identifying margin hyperplane
14.1 What is the Bayes theorem?
14.2 What is Naïve Bayes Classifier?
14.3 Classification Workflow
14.4 How Naive Bayes classifier works and classifier building in Scikit-Learn
14.5 Building a probabilistic classification model using Naïve Bayes and the zero probability problem
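Bayes’ theorem and the zero-probability fix can be shown in a few lines. Here is a Python illustration with hypothetical numbers (the course builds the full classifier in scikit-learn):

```python
# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B), with a classic
# diagnostic-test example (all probabilities below are made up).
p_disease = 0.01
p_pos_given_disease = 0.99
p_pos_given_healthy = 0.05

p_pos = p_pos_given_disease * p_disease + p_pos_given_healthy * (1 - p_disease)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 3))  # 0.167

# The zero-probability problem: if a feature value never occurs with a
# class, its likelihood is 0 and wipes out the whole product.
# Laplace (add-one) smoothing avoids this:
def laplace(count, total, n_values):
    return (count + 1) / (total + n_values)

print(laplace(0, 10, 2))  # ≈ 0.083 (1/12) instead of 0
```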
15.1 Introduction to the concepts of text mining
15.2 Text mining use cases and understanding and manipulating text with ‘tm’ and ‘stringr’
15.3 Text mining algorithms and the quantification of the text
15.4 TF-IDF and beyond
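TF-IDF can be computed straight from its definition. Here is a small Python sketch on a made-up corpus (the course quantifies text in R):

```python
import math

# TF-IDF computed from its definition on a toy corpus.
docs = [
    ["data", "science", "with", "r"],
    ["data", "mining", "with", "r"],
    ["tableau", "dashboards"],
]

def tf(term, doc):
    return doc.count(term) / len(doc)

def idf(term):
    df = sum(term in d for d in docs)           # document frequency
    return math.log(len(docs) / df)

def tfidf(term, doc):
    return tf(term, doc) * idf(term)

# "data" appears in 2 of 3 docs, so it is weighted down;
# "science" appears in only 1, so it scores higher.
print(round(tfidf("data", docs[0]), 3))     # 0.101
print(round(tfidf("science", docs[0]), 3))  # 0.275
```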
Case Study 01: Market Basket Analysis (MBA)
1.1 This case study is associated with the modeling technique of Market Basket Analysis, where you will learn about loading data, plotting items, and running algorithms.
1.2 It includes finding out the items that go hand in hand and can be clubbed together.
1.3 This is used for various real-world scenarios like a supermarket shopping cart and so on.
Case Study 02: Logistic Regression
2.1 In this case study, you will get a detailed understanding of the advertisement spends of a company that will help drive more sales.
2.2 You will deploy logistic regression to forecast future trends.
2.3 You will detect patterns and uncover insight using the power of R programming.
2.4 Due to this, the future advertisement spends can be decided and optimized for higher revenues.
Case Study 03: Multiple Regression
3.1 You will understand how to compare the miles per gallon (MPG) of a car based on various parameters.
3.2 You will deploy multiple regression and note down the MPG for car make, model, speed, load conditions, etc.
3.3 The case study includes model building, model diagnostic, and checking the ROC curve, among other things.
Case Study 04: Receiver Operating Characteristic (ROC)
4.1 In this case study, you will work with various datasets in R.
4.2 You will deploy data exploration methodologies.
4.3 You will also build scalable models.
4.4 Besides, you will predict the outcome with highest precision, diagnose the model that you have created with real-world data, and check the ROC curve.
Project 01: Market Basket Analysis
Domain: Inventory Management
Problem Statement: As a new manager in the company, you are assigned the task of increasing cross-selling
Topics: Association rule mining, data extraction, and data manipulation
Project 02: Credit Card Fraud Detection
Problem Statement: Analyze the probability of being involved in a fraudulent operation
Topics: Algorithms, V17 predictor, data visualization, and R
Project 03: Data Cleaning Using the Census Dataset
Problem Statement: Perform data cleansing on the raw dataset
Topics: Data analysis, data preprocessing, cleaning ops, data visualization, and R
Project 04: Loan Approval Prediction
Problem Statement: Predict the approval rate of a loan by using multiple labels
Topics: Data analysis, data preprocessing, cleaning ops, data visualization, and R
Project 05: Designing a Book Recommendation System
Problem Statement: Create a model that can recommend books based on user interest
Topics: Data cleaning, data visualization, and user-based collaborative filtering
Project 06: Netflix Recommendation System
Problem Statement: Simulate the Netflix recommendation system
Topics: Data cleaning, data visualization, distribution, and Recommender Lab
Project 07: Creating a Pokemon Game Using Machine Learning
Problem Statement: Create a game engine for Pokemon using Machine Learning
Topics: Decision trees, regression, data cleaning, and data visualization
Case Study 01: Introduction to R Programming
Problem Statement: Working with various operators in R
Topics: Arithmetic operators, relational operators, and logical operators
Case Study 02: Solving Customer Churn Using Data Exploration
Problem Statement: Understanding what to do to reduce customer churn using data exploration
Topics: Data Exploration
Case Study 03: Creating Data Structures in R
Problem Statement: Implementing various data structures in R for various scenarios
Topics: Vectors, lists, matrices, and arrays
Case Study 04: Implementing SVD in R
Problem Statement: Understanding the use of singular value decomposition in R by making use of the MovieLense dataset
Topics: 5-fold cross validation and realRatingMatrix
Case Study 05: Time Series Analysis
Problem Statement: Performing TSA and understanding the concepts of ARIMA for a given scenario
Topics: Time series analysis, R language, data visualization, and the ARIMA model
What is data visualization?, its benefits over reading raw numbers, real use cases from various business domains, some quick and powerful examples using Tableau without going into its technical details, installing Tableau, the Tableau interface, connecting to a data source, Tableau data types, and data preparation.
Installation of Tableau Desktop, the architecture of Tableau, the interface of Tableau (layout, toolbars, Data Pane, Analytics Pane, etc.), how to start with Tableau, and the ways to share and export the work done in Tableau.
Hands-on Exercise: Play with Tableau desktop, learn about the interface, and share and export existing works.
Connection to Excel, cubes, and PDFs, management of metadata and extracts, data preparation, Joins (Left, Right, Inner, and Outer) and Union, dealing with NULL values, cross-database joining, data extraction, data blending, refresh extraction, incremental extraction, how to build extracts, etc.
Hands-on Exercise: Connect to an Excel sheet to import data, use metadata and extracts, manage NULL values, clean up data before using it, perform the join techniques, execute data blending from multiple sources, etc.
Mark, highlight, sort, group, and use sets (creating and editing sets, IN/OUT, sets in hierarchies), constant sets, computed sets, bins, etc.
Hands-on Exercise: Use marks to create and edit sets, highlight the desired items, make groups, apply sorting on results, and make hierarchies among the created sets.
Filters (addition and removal), filtering continuous dates, dimensions, and measures, interactive filters, marks card, hierarchies, how to create folders in Tableau, sorting in Tableau, types of sorting, filtering in Tableau, types of filters, filtering the order of operations, etc.
Hands-on Exercise: Use the data set by date/dimensions/measures to add filter, use interactive filter to view the data, customize/remove filters to view the result, etc.
Using Formatting Pane to work with menu, fonts, alignments, settings, and copy-paste; formatting data using labels and tooltips, edit axes and annotations, k-means cluster analysis, trend and reference lines, visual analytics in Tableau, forecasting, confidence interval, reference lines, and bands.
Hands-on Exercise: Apply labels and tooltips to graphs, annotations, edit axes’ attributes, set the reference line, and perform k-means cluster analysis on the given dataset.
Working on coordinate points, plotting longitude and latitude, editing unrecognized locations, customizing geocoding, polygon maps, WMS: web mapping services, working on the background image, including add image, plotting points on images and generating coordinates from them; map visualization, custom territories, map box, WMS map; how to create map projects in Tableau, creating dual axes maps, and editing locations.
Hands-on Exercise: Plot longitude and latitude on a geo map, edit locations on the geo map, custom geocoding, use images of the map and plot points, find coordinates, create a polygon map, and use WMS.
Calculation syntax and functions in Tableau, various types of calculations, including Table, String, Date, Aggregate, Logic, and Number; LOD expressions, including concept and syntax; aggregation and replication with LOD expressions, nested LOD expressions; levels of details: fixed level, lower level, and higher level; quick table calculations, the creation of calculated fields, predefined calculations, and how to validate.
Creating parameters, parameters in calculations, using parameters with filters, column selection parameters, chart selection parameters, how to use parameters in the filter session, how to use parameters in calculated fields, how to use parameters in reference line, etc.
Hands-on Exercise: Creating new parameters to apply on a filter, passing parameters to filters to select columns, passing parameters to filters to select charts, etc.
Dual axes graphs, histograms: single and dual axes; box plot; charts: motion, Pareto, funnel, pie, bar, line, bubble, bullet, scatter, and waterfall charts; maps: tree and heat maps; market basket analysis (MBA), using Show me; and text table and highlighted table.
Hands-on Exercise: Plot a histogram, tree map, heat map, funnel chart, and more using the given dataset and also perform market basket analysis (MBA) on the same dataset.
Building and formatting a dashboard using size, objects, views, filters, and legends; best practices for making creative as well as interactive dashboards using actions; creating stories, including an introduction to story points; creating and updating story points, adding catchy visuals in stories, and adding annotations with descriptions; dashboards and stories: what is a dashboard?, highlight actions, URL actions, filter actions, selecting and clearing values, best practices to create dashboards, and dashboard examples; using the Tableau workspace and interface; learning about Tableau joins and their types; Tableau field types, saving and publishing a data source, live vs extract connections, and various file types.
Hands-on Exercise: Create a Tableau dashboard view, include legends, objects, and filters, make the dashboard interactive, and use visual effects, annotations, and descriptions to create and edit a story.
Introduction to Tableau Prep, how Tableau Prep helps you quickly combine, shape, and clean data for analysis, creating smart examples with Tableau Prep, getting deeper insights into the data with a great visual experience, making data preparation simpler and more accessible, integrating Tableau Prep with the Tableau analytical workflow, and understanding the seamless process from data preparation to analysis with Tableau Prep.
Introduction to R language, applications and use cases of R, deploying R on the Tableau platform, learning R functions in Tableau, and the integration of Tableau with Hadoop.
Hands-on Exercise: Deploy R on Tableau, create a line graph using R interface, and also connect Tableau with Hadoop to extract data.
Project 1: Working with Tableau Interactive Dashboard
Problem Statement: How to make an interactive dashboard with Tableau?
Upon the completion of this project, you will understand how to create a single point of access for all your sales data, dissect and analyze sales from multiple angles, and come up with a sales strategy for improved business revenues.
Project 2: Tableau for Crime Statistics Analysis
Domain: Crime Statistics (Public Domain)
Problem Statement: Showing the types of crimes and their frequency in the District of Columbia and providing the details of the crimes
Description: Police authorities are often called on to “put more feet on the street” to prevent crime and ensure order. However, due to limited resources, it is almost impossible to deploy them anytime, anywhere. In this project, you will work on crime data from the District of Columbia and analyze it using Tableau. During visualization, you can use ‘crime categories’ and ‘days of the week’ as data types to see when and where crimes have occurred. You will also analyze the details of the crimes, such as the area/location and the day of the week on which each occurred. This project will help the local police get insightful information on where to focus their crime prevention efforts.
Project 3: Analyzing Economic Data
Problem Statement: How is unemployment affecting global malnutrition?
Description: In this Tableau project, you will be working on vast amounts of data and analyzing it to come up with trends, insights, and correlations. Datasets include the global unemployment figures for multiple years, world population statistics across several years, and the worldwide nutritional data. By analyzing this data, you will correlate the malnutrition problem with the unemployment rates using Tableau.
Project 4: Analyzing Market Performance
Problem Statement: Using Consumer Packaged Goods data to analyze which markets are performing well for a particular retail enterprise using Tableau Desktop
Description: This Tableau Desktop project involves working with complex Consumer Packaged Goods data to come up with a brand performance analysis, identify the regions contributing most to revenues and those where more discounts are needed to spur sales, and perform an in-depth budget vs. spend analysis for a particular year.
Various types of databases, introduction to Structured Query Language, the distinction between client-server and file-server databases, understanding SQL Server Management Studio, SQL table basics, data types and functions, Transact-SQL, Windows authentication, data control language, and the identification of keywords in T-SQL, such as DROP TABLE.
Data anomalies: update, insertion, and deletion anomalies; types of dependencies: functional, fully functional, partial functional, transitive functional, and multi-valued functional dependencies; decomposition of tables: lossy and lossless decomposition; what is normalization?; First Normal Form, Second Normal Form, Third Normal Form, Boyce-Codd Normal Form (BCNF), and Fourth Normal Form; the Entity-Relationship model: entities and entity sets, attributes and their types, relationship sets, degree of relationship, and mapping cardinalities (one-to-one, one-to-many, many-to-one, and many-to-many); and symbols used in E-R notation.
Introduction to relational databases, fundamental concepts of relational rows, tables, and columns; several operators (such as logical and relational), constraints, domains, indexes, stored procedures, primary and foreign keys, understanding group functions, the unique key, etc.
Advanced concepts of SQL tables, SQL functions, operators & queries, table creation, data retrieval from tables, combining rows from tables using inner, outer, cross, and self joins, deploying operators such as ‘intersect,’ ‘except,’ ‘union,’ temporary table creation, set operator rules, table variables, etc.
Understanding SQL functions – what do they do?, scalar functions, aggregate functions, functions that can be used on different data types, such as numbers, characters, strings, and dates, inline SQL functions, general functions, and duplicate functions.
Understanding SQL subqueries, their rules; statements and operators with which subqueries can be used, using the set clause to modify subqueries, understanding different types of subqueries, such as where, select, insert, update, delete, etc., and methods to create and view subqueries.
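The subquery patterns described here can be tried in any SQL engine. Below is an illustrative Python snippet using an in-memory SQLite database (SQLite stands in for SQL Server, and the tables are made up; the IN and correlated EXISTS syntax shown is portable):

```python
import sqlite3

# Toy schema for demonstrating subqueries (hypothetical data).
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'Asha'), (2, 'Ben'), (3, 'Chen');
    INSERT INTO orders VALUES (1, 1, 120.0), (2, 1, 80.0), (3, 3, 40.0);
""")

# IN subquery: customers who have placed at least one order
rows_in = con.execute("""
    SELECT name FROM customers
    WHERE id IN (SELECT customer_id FROM orders)
    ORDER BY name
""").fetchall()
print(rows_in)       # [('Asha',), ('Chen',)]

# Correlated EXISTS subquery: customers with any order over 100
rows_exists = con.execute("""
    SELECT name FROM customers c
    WHERE EXISTS (SELECT 1 FROM orders o
                  WHERE o.customer_id = c.id AND o.amount > 100)
""").fetchall()
print(rows_exists)   # [('Asha',)]
```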
Learning SQL views, methods of creating, using, altering, renaming, dropping, and modifying views; understanding stored procedures and their key benefits, working with stored procedures, studying user-defined functions, and error handling.
User-defined functions; types of UDFs, such as scalar, inline table value, multi-statement table, stored procedures and when to deploy them, what is rank function?, triggers, and when to execute triggers?
SQL Server Management Studio, using pivot in MS Excel and MS SQL Server, differentiating between Char, Varchar, and NVarchar, XL path, indexes and their creation, records grouping, advantages, searching, sorting, modifying data; clustered indexes creation, use of indexes to cover queries, common table expressions, and index guidelines.
Creating Transact-SQL queries, querying multiple tables using joins, implementing functions and aggregating data, modifying data, determining the results of DDL statements on supplied tables and data, and constructing DML statements using the output statement.
Querying data using subqueries and APPLY, querying data using table expressions, grouping and pivoting data using queries, querying temporal data and non-relational data, constructing recursive table expressions to meet business requirements, and using windowing functions to group and rank the results of a query.
Creating database programmability objects by using T-SQL, implementing error handling and transactions, implementing transaction control in conjunction with error handling in stored procedures, and implementing data types and NULL.
Designing and implementing relational database schema; designing and implementing indexes, learning to compare between indexed and included columns, implementing clustered index, and designing and deploying views and column store views.
Explaining foreign key constraints, using T-SQL statements, usage of Data Manipulation Language (DML), designing the components of stored procedures, implementing input and output parameters, applying error handling, executing control logic in stored procedures, and designing trigger logic, DDL triggers, etc.
Applying transactions, using the transaction behavior to identify DML statements, learning about implicit and explicit transactions, isolation levels management, understanding concurrency and locking behavior, and using memory-optimized tables.
Accuracy of statistics, formulating statistics maintenance tasks, dynamic management objects management, identifying missing indexes, examining and troubleshooting query plans, consolidating the overlapping indexes, the performance management of database instances, and SQL server performance monitoring.
Correlated subqueries, grouping sets, ROLLUP, and CUBE
Implementing correlated subqueries, using EXISTS with a correlated subquery, using a UNION query, using a GROUPING SETS query, using ROLLUP, using CUBE to generate four grouping sets, and performing a partial CUBE.
Project 1: Writing Complex Subqueries
Problem Statement: How to create subqueries using SQL?
Topics: This project will give you hands-on experience in working with SQL subqueries and utilizing them in various scenarios. Some of the subqueries that you will be working with and gaining hands-on experience in are: IN or NOT IN, ANY or ALL, EXISTS or NOT EXISTS, and other major queries.
Project 2: Querying a Large Relational Database
Problem Statement: How to get details about customers by querying the database?
Topics: In this project, you will work on downloading a database and restoring it on the server. You will then query the database to get customer details like name, phone number, email ID, sales made in a particular month, increase in month-on-month sales, and even the total sales made to a particular customer.
Project 3: Relational Database Design
Problem Statement: How to convert a relational design into a table in SQL Server?
Topics: In this project, you will work on converting a relational design that has enlisted within it various users, user roles, user accounts, and their statuses. You will create a table in SQL Server and insert data into it. With at least two rows in each of the tables, you will ensure that you have created respective foreign keys.
Installation and introduction to SAS, how to get started with SAS, understanding different SAS windows, how to work with data sets, various SAS windows such as Output, Search, Editor, Log, and Explorer, and understanding SAS functions, library types, and programming files
How to import and export raw data files, how to read and subset the data sets, different statements like SET, MERGE and WHERE
Hands-on Exercise: How to import the Excel file in the workspace and how to read data and export the workspace to save data
Different SAS operators like logical, comparison and arithmetic, deploying different SAS functions like Character, Numeric, Is Null, Contains, Like and Input/Output, along with the conditional statements like If/Else, Do While, Do Until and so on
Hands-on Exercise: Performing operations using the SAS functions and logical and arithmetic operations
Understanding the input buffer and the PDV (program data vector), and learning what MISSOVER is
Defining and using KEEP and DROP statements, and applying these statements, formats, and labels in SAS
Hands-on Exercise: Use KEEP and DROP statements
Understanding the delimiter, dataline rules, DLM, delimiter DSD, raw data files and execution and list input for standard data
Hands-on Exercise: Use delimiter rules on raw data files
Various SAS standard procedures built-in for popular programs: PROC SORT, PROC FREQ, PROC SUMMARY, PROC RANK, PROC EXPORT, PROC DATASET, PROC TRANSPOSE, PROC CORR, etc.
Hands-on Exercise: Use SORT, FREQ, SUMMARY, EXPORT and other procedures
Reading standard and non-standard numeric inputs with formatted inputs, column pointer controls, controlling while a record loads, line pointer control/absolute line pointer control, single trailing, multiple IN and OUT statements, dataline statement and rules, list input method and comparing single trailing and double trailing
Hands-on Exercise: Read standard and non-standard numeric inputs with formatted inputs, control while a record loads, control a line pointer and write multiple IN and OUT statements
SAS Format statements: standard and user-written, associating a format with a variable, working with SAS Format, deploying it on PROC data sets and comparing ATTRIB and Format statements
Hands-on Exercise: Format a variable, deploy format rule on PROC data set and use ATTRIB statement
Understanding PROC GCHART, various graphs and charts (pie, bar and 3D) and plotting variables with PROC GPLOT
Hands-on Exercise: Plot graphs using PROC GPLOT and display charts using PROC GCHART
SAS advanced data discovery and visualization, point-and-click analytics capabilities and powerful reporting tools
Character functions, numeric functions and converting variable type
Hands-on Exercise: Use functions in data transformation
Introduction to ODS, data optimization and how to generate files (rtf, pdf, html and doc) using SAS
Hands-on Exercise: Optimize data and generate rtf, pdf, html and doc files
Macro Syntax, macro variables, positional parameters in a macro and macro step
Hands-on Exercise: Write a macro and use positional parameters
SQL statements in SAS, SELECT, CASE, JOIN and UNION and sorting data
Hands-on Exercise: Create SQL query to select and add a condition and use a CASE in select query
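The SQL constructs in this module (SELECT, CASE, JOIN) are written inside PROC SQL in SAS; since the SQL itself carries over, here is a hedged sketch using Python's built-in sqlite3 with made-up patient data.

```python
# SELECT / JOIN / CASE sketched with sqlite3; the tables and values
# are invented for illustration, not course data.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE patients (id INTEGER, name TEXT);
    CREATE TABLE visits   (patient_id INTEGER, cost REAL);
    INSERT INTO patients VALUES (1, 'Asha'), (2, 'Ravi');
    INSERT INTO visits   VALUES (1, 120.0), (1, 80.0), (2, 300.0);
""")

# JOIN the tables, aggregate, and bucket totals with CASE
rows = conn.execute("""
    SELECT p.name,
           SUM(v.cost) AS total,
           CASE WHEN SUM(v.cost) >= 250 THEN 'high' ELSE 'low' END AS band
    FROM patients p
    JOIN visits v ON v.patient_id = p.id
    GROUP BY p.name
    ORDER BY p.name
""").fetchall()

print(rows)  # [('Asha', 200.0, 'low'), ('Ravi', 300.0, 'high')]
```

In SAS the same query would sit between `PROC SQL;` and `QUIT;` with the result written to an output data set.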
Base SAS web-based interface and ready-to-use programs, advanced data manipulation, storage and retrieval and descriptive statistics
Hands-on Exercise: Use web UI to do statistical operations
Report enhancement, global statements, user-defined formats, PROC SORT, ODS destinations, ODS listing, PROC FREQ, PROC Means, PROC UNIVARIATE, PROC REPORT and PROC PRINT
Hands-on Exercise: Use PROC SORT to sort the results, list ODS, find mean using PROC Means and print using PROC PRINT
Project 1: Categorization of Patients Based on the Count of Drugs for Their Therapy
Objective: This project aims to find descriptive statistics and subsets for specific clinical data problems. It will give you a brief insight into Base SAS procedures and data steps.
Count the number of patients; the output should be four datasets
Project 2: Build Revenue Projections Reports
Objective: This project will give you hands-on experience in working with the SAS data analytics and business intelligence tool. You will be working on the data entered in a business enterprise setup and will aggregate, retrieve and manage that data. You will learn to create insightful reports and graphs and come up with statistical and mathematical analysis to scientifically predict the revenue projection for a particular future time frame. Upon the completion of the project, you will be well-versed in the practical aspects of data analytics, predictive modeling and data mining.
Project 3: Impact of Pre-paid Plans on the Preferences of Investors
Domain: Finance Market
Objective: This project aims to find the factors with the most impact on the preference for pre-paid plans; it also identifies the variables that are highly correlated with those impacting factors.
Project 4: K-Means Cluster Analysis on Iris Dataset
Objective: K-Means cluster analysis on the Iris dataset to predict the class of a flower from its petal dimensions
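The algorithm this project applies can be sketched in a few lines of stdlib Python; the two-blob data and initial centres below are made up to mimic petal length/width pairs, not the actual Iris project data.

```python
# A minimal k-means (Lloyd's algorithm) sketch; data is illustrative.
import math

def kmeans(points, centers, iters=10):
    """Assign each point to its nearest centre, then move each centre
    to the mean of its assigned points; repeat."""
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for p in points:
            nearest = min(range(len(centers)),
                          key=lambda i: math.dist(p, centers[i]))
            clusters[nearest].append(p)
        centers = [tuple(sum(c) / len(c) for c in zip(*cl)) if cl else ctr
                   for cl, ctr in zip(clusters, centers)]
    return centers, clusters

# Two obvious blobs, mimicking petal (length, width) measurements
pts = [(1.4, 0.2), (1.5, 0.2), (1.3, 0.3),
       (4.7, 1.4), (4.5, 1.5), (4.9, 1.5)]
centers, clusters = kmeans(pts, centers=[(1.0, 0.0), (5.0, 2.0)])
print(centers)
```

In the project itself you would run the equivalent clustering in SAS and then read off which cluster each flower falls into.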
How Qlik Sense differs from QlikView, the need for self-service Business Intelligence/Business Analytics tools, Qlik Sense data discovery, an intuitive tool for dynamic dashboards and personalized reports, and the installation of Qlik Sense and Qlik Sense Desktop
Hands-on Exercise: Install Qlik Sense and Qlik Sense Desktop
Drag-and-drop visualization, Qlik Data indexing engine, data dimensions relationships, connect to multiple data sources, creating your own dashboards, data visualization, visual analytics and the ease of collaboration
Hands-on Exercise: Connect to a database or load data from an Excel file and create a dashboard
Understand data modeling, best practices, turning data columns into rows, converting data rows into fields, hierarchical-level data loading, loading new or updated data from database, using a common field to combine data from two tables and handling data inconsistencies
Hands-on Exercise: Turn data columns into rows, convert data rows into fields, load the data in hierarchical level, load new or updated data from database and use a common field to combine data from two tables
Qlik Sense data architecture, understanding QVD layer, converting QlikView files to Qlik Sense files and working on synthetic keys and circular references
Hands-on Exercise: Convert QlikView files to Qlik Sense files and resolve synthetic keys and circular references
Qlik Sense star schema, link table, dimensions table, master calendar, QVD files and optimizing data modeling
Hands-on Exercise: Create a Qlik Sense star schema, create link table, dimensions table, master calendar and QVD files
Qlik Sense enterprise class tools, Qlik Sense custom app, embedding visuals, rapid development, powerful open APIs, enterprise-class architecture, Big Data integration, enterprise security and elastic scaling
Learning about Qlik Sense visualization tools, charts and maps creation, rich data storytelling and sharing analysis visually with compelling visualizations
Hands-on Exercise: Create charts and maps, create a story around dataset and share analysis
Understanding set analysis in Qlik Sense, various parts of a set expression like identifiers, operators, modifiers and comparative analysis
Hands-on Exercise: Do Set Analysis in Qlik Sense, use set expression like identifiers, operators, modifiers and comparative analysis
Learning about set analysis, which is a way of defining a set of data values different from the normal set, and deploying comparison sets and point-in-time analysis
Hands-on Exercise: Deploy comparison sets and perform point-in-time analysis
Introduction to various charts in Qlik Sense like line chart, bar chart, pie chart, table chart and pivot table chart and the characteristics of various charts
Hands-on Exercise: Plot charts in Qlik Sense like line chart, bar chart, pie chart, table chart and pivot table chart
Understanding what is a KPI chart, gauge chart, scatter plots chart and map chart/geo map
Hands-on Exercise: Plot a KPI chart, gauge chart, scatter plots chart and map chart/geo map
Introduction to the Qlik Sense Master Library, its benefits, distinct features and user-friendly applications
Hands-on Exercise: Explore and use Qlik Sense Master Library
Understanding how to do storytelling in Qlik Sense and the creation of storytelling and story playback
Hands-on Exercise: Use the storytelling feature of Qlik Sense, create a story and playback the story
Understanding mashups in Qlik Sense, creating a single graphical interface from more than one source, deploying the mashup flowchart, testing mashups and the various mashup scenarios like simple and normal
Hands-on Exercise: Create a single graphical interface from more than one source, deploy the mashup flowchart and test mashups
Understanding the Qlik Sense Extension and working with it, various templates in Qlik Sense Extension and testing them, making Hello World dynamic, learning how it works and adding a preview image
Hands-on Exercise: Work with Qlik Sense Extension, use a template in Qlik Sense Extension and test it, make Hello World dynamic and add a preview image
Various security aspects of Qlik Sense, content security, security rules, various components of security rules and understanding data reductions and dynamic data reductions and the user access workflow
Hands-on Exercise: Create security rules in Qlik Sense and understand data reductions and dynamic data reductions and the user access workflow
Objective: This project involves building a Qlik Sense dashboard that displays sales details order-wise, year-wise, customer-wise and product-wise, performing comparative analysis and a rolling six-month analysis that displays the sales trend, and placing the worksheets in a user story and publishing it.
Domain: Data Analytics
Objective: To see the current salary values in one column and historical values in another, in a chart that combines a bar chart and a trend chart
Objective: Visual mapping between the vaccination rate and measles outbreaks
Introduction to Excel spreadsheet, learning to enter data, filling of series and custom fill list, editing and deleting fields.
Learning about relative and absolute referencing, the concept of relative formulae, the issues in relative formulae, creating of absolute and mixed references and various other formulae.
Creating names range, using names in new formulae, working with the name box, selecting range, names from a selection, pasting names in formulae, selecting names and working with Name Manager.
Understanding the various logical functions in Excel: the IF function for calculating values and displaying text, nested IF functions, and the VLOOKUP and IFERROR functions.
Learning about conditional formatting, the options for formatting cells, various operations with icon sets, data bars and color scales, creating and modifying sparklines.
Multi-level drop-down validation, restricting values to a list only, learning about error messages and cell drop-downs.
Introduction to various formulae in Excel like SUM, SUMIF and SUMIFS; COUNT, COUNTA, COUNTIF and COUNTBLANK; NETWORKDAYS and NETWORKDAYS.INTL; the TODAY and NOW functions; TRIM (eliminating undesirable spaces); and CONCATENATE (consolidating columns).
Introduction to dynamic table in Excel, data conversion, table conversion, tables for charts and VLOOKUP.
Sorting in Excel, various types of sorting including alphabetical, numerical, row-based and multiple-column sorting, working with Paste Special, hyperlinking and using Subtotal.
The concept of data filtering, understanding compound filter and its creation, removing of filter, using custom filter and multiple value filters, working with wildcards.
Creation of charts in Excel, performing operations in an embedded chart, and modifying, resizing and dragging charts.
Introduction to the various charting techniques, creating titles for charts and axes, learning about data labels, displaying data tables, modifying axes, displaying gridlines, inserting trendlines, inserting a textbox in a chart, creating a 2-axis chart and creating a combination chart.
The concept of Pivot tables in Excel, report filtering, shell creation, working with Pivot for calculations, formatting of reports, dynamic range assigning, the slicers and creating of slicers.
Data and file security in Excel, protecting row, column, and cell, the different safeguarding techniques.
Learning about VBA macros in Excel, executing macros in Excel, the macro shortcuts, applications, the concept of relative reference in macros.
In-depth understanding of Visual Basic for Applications, the VBA Editor, module insertion and deletion, performing action with Sub and ending Sub if condition not met.
Learning about the concepts of workbooks and worksheets in Excel, protection of macro codes, range coding, declaring a variable, the concept of Pivot Table in VBA, introduction to arrays, user forms, getting to know how to work with databases within Excel.
Learning how the If condition works and knowing how to apply it in various scenarios, working with multiple Ifs in Macro.
Understanding the concept of looping, deploying looping in VBA Macros.
Studying about debugging in VBA, the various steps of debugging like running, breaking, resetting, understanding breakpoints and way to mark it, the code for debugging and code commenting.
The concept of message box in VBA, learning to create the message box, various types of message boxes, the IF condition as related to message boxes.
Mastering the various tasks and functions using VBA, understanding data separation, auto filtering, formatting of report, combining multiple sheets into one, merging multiple files together.
Introduction to powerful data visualization with Excel Dashboard, important points to consider while designing the dashboards like loading the data, managing data and linking the data to tables and charts, creating Reports using dashboard features.
Learning to create charts in Excel, the various charts available, the steps to successfully build a chart, personalization of charts, formatting and updating features, various special charts for Excel dashboards, understanding how to choose the right chart for the right data.
Creation of Pivot Tables in Excel, learning to change the Pivot Table layout, generating Reports, the methodology of grouping and ungrouping of data.
Learning to create Dashboards, the various rules to follow while creating Dashboards, creation of dynamic dashboards, knowing what is data layout, introduction to thermometer chart and its creation, how to use alerts in the Dashboard setup.
Inserting a scroll bar into a data window, the concept of option buttons in a chart, using a combo box drop-down, using a list box control and using a checkbox control.
Understanding data quality issues in Excel, linking of data, consolidating and merging data, working with dashboards for Excel Pivot Tables.
Project – IF Function
Data – Employee
Problem Statement – It describes the IF function and how to implement it. It includes the following actions:
Calculate the bonus for all employees at 10% of their salary using the IF function, rate the salesmen based on sales and the rating scale, find the number of times “3” is repeated in the table and the number of values greater than 5 using the COUNT function, and use operators and nested IF functions
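The IF/COUNTIF logic of this project, sketched in Python for illustration; the 10% bonus rate comes from the problem statement, while the sample salaries and sales figures below are made up.

```python
# Hypothetical worked example of the project's Excel logic; in Excel
# these would be IF and COUNTIF formulas over the employee table.
salaries = [30000, 45000, 52000]   # made-up sample data
sales    = [3, 7, 5, 3, 9, 3, 2]   # made-up sample data

# Bonus at 10% of salary (the IF in Excel guards against blanks)
bonuses = [s * 0.10 for s in salaries]

# COUNTIF(range, 3) and COUNTIF(range, ">5") analogues
times_three    = sum(1 for x in sales if x == 3)
greater_than_5 = sum(1 for x in sales if x > 5)

print(bonuses)
print(times_three)     # 3
print(greater_than_5)  # 2
```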
What is statistics, how is it useful and what is this course for?
Converting data into useful information, Collecting the data, Understand the data, Finding useful information in the data, Interpreting the data, Visualizing the data
Descriptive statistics, understanding some terms in statistics and what a variable is
Dot plots, histograms, stemplots, box-and-whisker plots and outlier detection from box plots
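Outlier detection from a box plot uses the interquartile range (IQR): values below Q1 − 1.5·IQR or above Q3 + 1.5·IQR are flagged as outliers. A stdlib Python sketch on made-up data:

```python
# IQR-based outlier detection, as read off a box-and-whisker plot;
# the data below is invented for illustration.
import statistics

data = [12, 13, 14, 14, 15, 15, 16, 17, 18, 42]

q1, _, q3 = statistics.quantiles(data, n=4)  # Q1, median, Q3
iqr = q3 - q1
lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
outliers = [x for x in data if x < lo or x > hi]
print(outliers)  # [42]
```

Note that `statistics.quantiles` defaults to the exclusive method, so the exact fences can differ slightly from other quartile conventions.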
What is probability, sets and the rules of probability, and Bayes’ theorem
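Bayes’ theorem, P(A|B) = P(B|A)·P(A)/P(B), is easiest to see on a worked example; the numbers below are a classic illustrative scenario (a 99% sensitive, 95% specific test for a disease with 1% prevalence), not course data.

```python
# Bayes' theorem on a made-up diagnostic-test example.
p_disease = 0.01
p_pos_given_disease = 0.99   # sensitivity
p_pos_given_healthy = 0.05   # 1 - specificity

# Law of total probability: P(positive)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Posterior: P(disease | positive)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 3))  # 0.167
```

Despite the accurate test, a positive result implies only about a 1-in-6 chance of disease, because the prior is so low.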
Probability distributions with a few examples: the Student’s t-distribution, sampling distributions and the Poisson distribution
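Of the distributions listed above, the Poisson is simple enough to compute by hand: P(X = k) = e^(−λ)·λ^k / k! for counts of rare events. A stdlib sketch with an assumed rate of λ = 3 events per interval:

```python
# Poisson probability mass function; lambda = 3 is an assumed example rate.
import math

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

lam = 3.0
for k in range(5):
    print(k, round(poisson_pmf(k, lam), 4))
```

The probabilities over all k sum to 1, which is a quick sanity check on the formula.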
Stratified sampling, proportionate sampling, systematic sampling and the p-value
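Two of the sampling schemes above can be sketched in a few lines of stdlib Python; the population of 20 ids and the even/odd strata are made up for illustration.

```python
# Systematic and proportionate stratified sampling on made-up ids.
import random

population = list(range(20))

# Systematic sampling: every k-th element after a random start
k = 5
start = random.randrange(k)
systematic = population[start::k]     # always 4 elements here

# Stratified (proportionate) sampling: sample each stratum separately
strata = {"even": [x for x in population if x % 2 == 0],
          "odd":  [x for x in population if x % 2 == 1]}
stratified = [x for group in strata.values()
              for x in random.sample(group, 2)]   # 2 per stratum

print(systematic, stratified)
```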
Cross tables, bivariate analysis, multivariate analysis, dependence and independence tests (chi-square), analysis of variance and correlation between nominal variables
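The chi-square independence test above compares the observed counts in a cross table with the counts expected if the two variables were independent; a stdlib sketch on a made-up 2x2 table:

```python
# Chi-square statistic for a cross table; the counts are invented.
observed = [[30, 10],
            [20, 40]]

row_totals = [sum(r) for r in observed]          # [40, 60]
col_totals = [sum(c) for c in zip(*observed)]    # [50, 50]
grand = sum(row_totals)                          # 100

chi2 = 0.0
for i, row in enumerate(observed):
    for j, obs in enumerate(row):
        expected = row_totals[i] * col_totals[j] / grand
        chi2 += (obs - expected) ** 2 / expected

print(round(chi2, 2))  # 16.67; large values suggest dependence
```

In practice the statistic is compared against a chi-square critical value for the table’s degrees of freedom; in SAS, PROC FREQ with the CHISQ option does all of this.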
Project – Data Analysis Project
Data – Sales
Problem Statement – It includes the following actions:
Understand the business solutions, Discussion with the warehouse team, Data Collection & Storage, Data Cleaning, Build a Hypothesis Tree around the business problem, Produce the final result.
Free Career Counselling
This is a comprehensive Data Analytics Certification course that is designed to help you clear multiple certifications.
The complete Data Analytics course is created by industry experts to help professionals get the top jobs in the best organizations. The entire Data Analytics online course includes real-world projects and case studies that are highly valuable.
Upon completion of this online Data Analytics training, you will take quizzes that will help you prepare for the respective certification exams and score top marks.
The Intellipaat Data Analytics Certification is awarded upon successful completion of the project work, after it is reviewed by experts. The Intellipaat certification is recognized in some of the biggest companies like Cisco, Cognizant, Mu Sigma, TCS, Genpact, Hexaware, Sony and Ericsson, among others.
Our alumni work at 3,000+ top companies
Intellipaat Data Analytics Online Certification Course is an industry-designed course, so you can fast-track your career in the domain of data analytics. If you do not want to get into the nitty-gritty of programming and spend lengthy hours coding, then this Data Analytics course is for you.
Also, in this Online Data Analytics Course,
A career in the Data Analytics domain is not just a good career option but one of the most popular careers today. By pursuing a master’s in data analytics, you can find jobs in this domain across a diverse range of industries and companies around the globe.
As per the Bureau of Labor, the estimated growth rate for Data Analytics professionals will shoot up to 23% by 2026.
In order to become a Data Analyst, you must have the following qualifications:
In order to become a Data Analyst, you must meet the following criteria:
You can attain all the necessary skills, gain real-time experience, and receive a certification with Intellipaat’s Data Analytics masters course.
Having a college degree in the field of mathematics, probability, or computer science can definitely be beneficial. However, it is not mandatory for you to have the same. The main requirement of becoming a Data Analyst is that you need to possess the necessary skills. Although having a degree can help you immensely, it is still a secondary requirement.
At Intellipaat, you can enroll in either the instructor-led online training or self-paced training. Apart from this, Intellipaat also offers corporate training for organizations to upskill their workforce. All trainers at Intellipaat have 12+ years of relevant industry experience, and they have been actively working as consultants in the same domain, which has made them subject matter experts. Go through the sample videos to check the quality of our trainers.
Intellipaat offers 24/7 query resolution, and you can raise a ticket with the dedicated support team at any time. You can avail yourself of email support for all your queries. If your query does not get resolved through email, we can also arrange one-on-one sessions with our trainers.
You would be glad to know that you can contact Intellipaat support even after the completion of the training. We also do not put a limit on the number of tickets you can raise for query resolution and doubt clearance.
Intellipaat offers self-paced training to those who want to learn at their own pace. This training also gives you the benefits of query resolution through email, live sessions with trainers, round-the-clock support, and access to the learning modules on LMS for a lifetime. Also, you get the latest version of the course material at no added cost.
Intellipaat’s self-paced training is priced 75 percent lower than the online instructor-led training. If you face any problems while learning, we can always arrange a virtual live class with the trainers.
Intellipaat offers you the most updated, relevant and high-value real-world projects as part of the training program. This way, you can implement what you have learned in a real-world industry setup. All training comes with multiple projects that thoroughly test your skills, learning and practical knowledge, making you completely industry-ready.
You will work on highly exciting projects in the domains of high technology, ecommerce, marketing, sales, networking, banking, insurance, etc. After completing the projects successfully, your skills will be equal to 6 months of rigorous industry experience.
Intellipaat actively provides placement assistance to all learners who have successfully completed the training. For this, we are exclusively tied up with over 80 top MNCs from around the world. This way, you can be placed in outstanding organizations such as Sony, Ericsson, TCS, Mu Sigma, Standard Chartered, Cognizant and Cisco, among other equally great enterprises. We also help you with job interview and résumé preparation.
You can definitely make the switch from self-paced training to online instructor-led training by simply paying the extra amount. You can join the very next batch, which will be duly notified to you.
Once you complete Intellipaat’s training program, work on real-world projects, quizzes and assignments, and score at least 60 percent in the qualifying exam, you will be awarded Intellipaat’s course completion certificate. This certificate is well recognized in Intellipaat-affiliated organizations, including over 80 top MNCs from around the world and some of the Fortune 500 companies.
No. Our job assistance program is aimed at helping you land your dream job. It offers a potential opportunity for you to explore various competitive openings in the corporate world and find a well-paid job matching your profile. The final decision on hiring will always be based on your performance in the interview and the requirements of the recruiter.