Business Intelligence Architect Master's Course

Master Program

Our Business Intelligence Architect master's course helps you gain proficiency in Business Intelligence. You will work on real-world projects in Informatica, Tableau, MSBI, Power BI, MS SQL, data warehousing, Erwin, Azure Data Factory, SQL DBA, and more. The program covers 9 courses and 32 industry-based projects.

  • 9+

    Courses

  • 32+

    Projects

  • 150

    Hours

What You Will Learn: 9 Courses

  • Online Classroom Training

    • Course 1
      MS SQL
    • Course 2
      MSBI 
    • Course 3
      Tableau Desktop 10
    • Course 4
      Power BI
    • Course 5
      Informatica Developer & Admin
    • Course 6
      Azure Data Factory
  • Self Paced Training

    • Course 7
      Data Warehousing
    • Course 8
      Qlik Sense 
    • Course 9
      SQL DBA
  • Get Master's Certificate

Key Features

150 Hrs Instructor Led Training
178 Hrs Self-paced Videos
300 Hrs Project work & Exercises
Certification and Job Assistance
Flexible Schedule
Lifetime Free Upgrade
24 x 7 Lifetime Support & Access

Course Fees

Self Paced Training

  • 178 Hrs e-learning videos
  • Lifetime Free Upgrade
  • 24 x 7 Lifetime Support & Access
  • Flexi-scheduling
$799

Online Classroom preferred

  • Everything in self-paced, plus
  • 150 Hrs of Instructor-led Training
  • 1:1 Doubt Resolution Sessions
  • Attend as many batches as you want, for a lifetime
  • Flexible Schedule
  • Upcoming batches (all classes SAT & SUN, 08:00 PM to 11:00 PM IST, GMT +5:30):
  • 03 Oct
  • 11 Oct
  • 17 Oct
  • 25 Oct
$2,878 $999 (10% off)

Corporate Training

  • Customized Learning
  • Enterprise grade learning management system (LMS)
  • 24x7 support
  • Strong Reporting

Overview

Intellipaat’s Business Intelligence Architect master’s course will provide you with in-depth knowledge of Business Intelligence and data warehousing. You will master how to design and develop enterprise-class data warehouses, build reporting solutions, write SQL, and perform tuning in data warehouses. The program is designed by industry experts and includes 9 courses with 32 industry-based projects.

List of Courses Included

Online Instructor-led Courses:

  • MS SQL
  • MSBI
  • Tableau Desktop 10
  • Power BI
  • Informatica Developer & Admin
  • Azure Data Factory

Self-paced Courses:

  • Data Warehousing & Data Modeling
  • Qlik Sense
  • SQL DBA
What will you learn in this program?

  • Introduction to Business Intelligence
  • Extract, transform, and load (ETL) steps
  • Working with data discovery tools
  • Creating charts, reports, and graphs
  • Extracting data from multiple data sources
  • Data modeling, analysis, and data warehousing

Who should take up this training?

  • Data Science and Big Data professionals and Software Developers
  • Business Intelligence professionals, Information Architects, and Project Managers
  • Those aspiring to become a Business Intelligence Architect

What are the prerequisites?

There are no prerequisites for taking up this training program.

  • Worldwide Business Intelligence and Analytics Market to grow to $22.8 billion in the next 2 years – Gartner
  • The average US salary for a Microsoft BI Professional is $107,000 – Indeed

Today, there is an urgent need for Business Intelligence professionals who are well-versed in both the front end and the back end of BI. Intellipaat’s master’s course in BI architecture has been created to help you gain complete proficiency in ETL steps and BI reporting techniques. This training will put you in a different league and help you land top jobs.



Testimonials

Ramyasri Mandepudi

Recruiter at Goodwill Technologies

Thank you Intellipaat for the wonderful Informatica training. I am completely satisfied with the training and was able to grab my dream job. Thanks to this wonderful training. Great work!

Nikhil Verma

Senior BI Consultant at Tech Mahindra

This Intellipaat Informatica online training and certification course is all you will need to work with the latest Informatica PowerCenter 9.X tool. I took this training course and I am certainly elated.

Ayush Gupta

Co-Founder at Fleurish

The Intellipaat MSBI training was outstanding. The wide variety and rich quality training material, videos, PPTs and PDFs that accompanied the training were really good and helped me master this technology.

Course Content

Introduction to SQL

Various types of databases, introduction to Structured Query Language, distinction between client server and file server databases, understanding SQL Server Management Studio, SQL Table basics, data types and functions, Transaction-SQL, authentication for Windows, data control language, and the identification of the keywords in T-SQL, such as Drop Table.
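The table basics above can be sketched with a small runnable example. This uses SQLite via Python's `sqlite3` module as a stand-in for SQL Server Management Studio (the syntax differs slightly from T-SQL), and the table and data are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Create a table with typed columns (SQLite analogues of T-SQL data types).
cur.execute("""
    CREATE TABLE Employees (
        EmpID    INTEGER PRIMARY KEY,   -- numeric key column
        Name     TEXT NOT NULL,         -- NVARCHAR equivalent
        HireDate TEXT,                  -- dates stored as ISO-8601 strings
        Salary   REAL
    )
""")
cur.execute("INSERT INTO Employees VALUES (1, 'Asha', '2023-04-01', 72000)")
cur.execute("INSERT INTO Employees VALUES (2, 'Ravi', '2024-01-15', 65000)")

rows = cur.execute(
    "SELECT Name, Salary FROM Employees ORDER BY Salary DESC").fetchall()
print(rows)

# DROP TABLE removes the table definition and all of its data.
cur.execute("DROP TABLE Employees")
```

The same CREATE/INSERT/SELECT/DROP cycle carries over to SQL Server, with T-SQL-specific types such as NVARCHAR and DATETIME.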

Database Normalization and Entity Relationship Model

Data Anomalies, Update Anomalies, Insertion Anomalies, Deletion Anomalies, Types of Dependencies, Functional Dependency, Fully functional dependency, Partial functional dependency, Transitive functional dependency, Multi-valued functional dependency, Decomposition of tables, Lossy decomposition, Lossless decomposition, What is Normalization?, First Normal Form, Second Normal Form, Third Normal Form, Boyce-Codd Normal Form(BCNF), Fourth Normal Form, Entity-Relationship Model, Entity and Entity Set, Attributes and types of Attributes, Entity Sets, Relationship Sets, Degree of Relationship, Mapping Cardinalities, One-to-One, One-to-Many, Many-to-one, Many-to-many, Symbols used in E-R Notation.
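The decomposition ideas above can be seen in a runnable sketch (SQLite via Python's `sqlite3`; tables and rows are invented). Storing DeptName directly on Employees would repeat it for every employee in a department (a transitive dependency); splitting it out is a lossless decomposition, because the join reconstructs the original flat view:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Normalized design: each non-key attribute depends only on its table's key.
cur.executescript("""
    CREATE TABLE Departments (DeptID INTEGER PRIMARY KEY, DeptName TEXT);
    CREATE TABLE Employees   (EmpID  INTEGER PRIMARY KEY, Name TEXT,
                              DeptID INTEGER REFERENCES Departments);
    INSERT INTO Departments VALUES (10, 'Sales'), (20, 'Finance');
    INSERT INTO Employees   VALUES (1, 'Asha', 10), (2, 'Ravi', 10),
                                   (3, 'Meena', 20);
""")

# Lossless decomposition: joining the two tables reproduces the flat view,
# with no spurious or missing rows.
joined = cur.execute("""
    SELECT e.Name, d.DeptName
    FROM Employees e JOIN Departments d ON e.DeptID = d.DeptID
    ORDER BY e.EmpID
""").fetchall()
print(joined)
```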

SQL Operators

Introduction to relational databases, fundamental concepts of relational rows, tables, and columns; several operators (such as logical and relational), constraints, domains, indexes, stored procedures, primary and foreign keys, understanding group functions, the unique key, etc.

Working with SQL: Join, Tables, and Variables

Advanced concepts of SQL tables, SQL functions, operators & queries, table creation, data retrieval from tables, combining rows from tables using inner, outer, cross, and self joins, deploying operators such as ‘intersect,’ ‘except,’ ‘union,’ temporary table creation, set operator rules, table variables, etc.
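A minimal sketch of inner vs. outer joins, assuming invented Customers/Orders tables in SQLite (Python's `sqlite3`):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE Customers (CustID INTEGER, Name TEXT);
    CREATE TABLE Orders    (OrderID INTEGER, CustID INTEGER, Amount REAL);
    INSERT INTO Customers VALUES (1, 'Asha'), (2, 'Ravi'), (3, 'Meena');
    INSERT INTO Orders    VALUES (101, 1, 250.0), (102, 1, 90.0), (103, 2, 40.0);
""")

# INNER JOIN: only customers with at least one matching order.
inner = cur.execute("""
    SELECT c.Name, o.Amount FROM Customers c
    JOIN Orders o ON c.CustID = o.CustID ORDER BY o.OrderID
""").fetchall()

# LEFT OUTER JOIN: all customers; unmatched rows get NULL (None in Python).
left = cur.execute("""
    SELECT c.Name, o.Amount FROM Customers c
    LEFT JOIN Orders o ON c.CustID = o.CustID ORDER BY c.CustID, o.OrderID
""").fetchall()

print(inner)
print(left)   # Meena appears with Amount = None
```

SQL Server adds RIGHT and FULL joins on top of these; SQLite supports UNION, INTERSECT, and EXCEPT for the set operators named above.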

Deep Dive into SQL Functions

Understanding SQL functions – what do they do?, scalar functions, aggregate functions, functions that can be used on different datasets, such as numbers, characters, strings, and dates, inline SQL functions, general functions, and duplicate functions.
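The scalar-vs-aggregate distinction can be shown in a few lines (SQLite via `sqlite3`; the sales data is made up). Scalar functions compute a value per row; aggregate functions collapse a group of rows into one value:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE Sales (Region TEXT, Amount REAL, SoldOn TEXT);
    INSERT INTO Sales VALUES ('East', 100.0, '2024-03-05'),
                             ('East', 250.5, '2024-03-09'),
                             ('West', 80.0,  '2024-04-01');
""")

# Scalar functions operate row by row (string, number, and date functions).
scalar = cur.execute(
    "SELECT UPPER(Region), ROUND(Amount), strftime('%m', SoldOn) FROM Sales"
).fetchall()

# Aggregate functions summarize each group into a single row.
agg = cur.execute(
    "SELECT Region, COUNT(*), SUM(Amount) FROM Sales "
    "GROUP BY Region ORDER BY Region"
).fetchall()
print(scalar)
print(agg)
```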

Working with Subqueries

Understanding SQL subqueries, their rules; statements and operators with which subqueries can be used, using the set clause to modify subqueries, understanding different types of subqueries, such as where, select, insert, update, delete, etc., and methods to create and view subqueries.
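A small runnable sketch of a subquery in a WHERE clause (SQLite via `sqlite3`; the product list is invented). The inner query runs once and its result feeds the outer filter:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE Products (ProdID INTEGER, Name TEXT, Price REAL);
    INSERT INTO Products VALUES (1, 'Pen', 2.0), (2, 'Book', 12.0),
                                (3, 'Bag', 30.0), (4, 'Lamp', 18.0);
""")

# The subquery computes the average price (15.5 here); the outer query
# keeps only products priced above it.
above_avg = cur.execute("""
    SELECT Name FROM Products
    WHERE Price > (SELECT AVG(Price) FROM Products)
    ORDER BY ProdID
""").fetchall()
print(above_avg)
```

The same pattern extends to subqueries inside SELECT, INSERT, UPDATE, and DELETE statements, and to IN/EXISTS forms.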

SQL Views, Functions, and Stored Procedures

Learning SQL views, methods of creating, using, altering, renaming, dropping, and modifying views; understanding stored procedures and their key benefits, working with stored procedures, studying user-defined functions, and error handling.
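The view part of this module can be sketched directly (SQLite via `sqlite3`; SQLite has views but not T-SQL stored procedures, so only the view is shown, and the data is invented). A view is a saved query that behaves like a virtual table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE Orders (OrderID INTEGER, Region TEXT, Amount REAL);
    INSERT INTO Orders VALUES (1, 'East', 120.0), (2, 'West', 75.0),
                              (3, 'East', 60.0);

    -- The view stores the query, not the data; it stays current as
    -- the underlying table changes.
    CREATE VIEW RegionTotals AS
        SELECT Region, SUM(Amount) AS Total FROM Orders GROUP BY Region;
""")
totals = cur.execute("SELECT * FROM RegionTotals ORDER BY Region").fetchall()
print(totals)

# DROP VIEW removes the definition, not the underlying data.
cur.execute("DROP VIEW RegionTotals")
```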

Deep Dive into User-defined Functions

User-defined functions; types of UDFs, such as scalar, inline table value, multi-statement table, stored procedures and when to deploy them, what is rank function?, triggers, and when to execute triggers?
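The rank function mentioned above can be demonstrated with window functions (available in SQLite 3.25+, which ships with recent Python builds; the scores are invented). RANK() leaves gaps after ties; DENSE_RANK() does not:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE Scores (Student TEXT, Marks INTEGER);
    INSERT INTO Scores VALUES ('Asha', 92), ('Ravi', 85),
                              ('Meena', 92), ('Kiran', 78);
""")

# Two students tie at 92: both get rank 1, then RANK() skips to 3
# while DENSE_RANK() continues with 2.
ranked = cur.execute("""
    SELECT Student, Marks,
           RANK()       OVER (ORDER BY Marks DESC) AS rnk,
           DENSE_RANK() OVER (ORDER BY Marks DESC) AS drnk
    FROM Scores ORDER BY Marks DESC, Student
""").fetchall()
print(ranked)
```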

SQL Optimization and Performance

SQL Server Management Studio, using pivot in MS Excel and MS SQL Server, differentiating between Char, Varchar, and NVarchar, XL path, indexes and their creation, records grouping, advantages, searching, sorting, modifying data; clustered indexes creation, use of indexes to cover queries, common table expressions, and index guidelines.
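The effect of an index on a query plan can be observed directly. This sketch uses SQLite's EXPLAIN QUERY PLAN (SQL Server's equivalent is the graphical execution plan); the table and data are made up, and the exact plan text varies slightly between SQLite versions:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE Orders (OrderID INTEGER, CustID INTEGER, Amount REAL)")
cur.executemany("INSERT INTO Orders VALUES (?, ?, ?)",
                [(i, i % 100, float(i)) for i in range(10_000)])

# Without an index, this filter scans every row of the table.
before = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM Orders WHERE CustID = 42").fetchall()

# With an index on the filtered column, the engine can seek instead of scan.
cur.execute("CREATE INDEX idx_orders_cust ON Orders (CustID)")
after = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM Orders WHERE CustID = 42").fetchall()

print(before[0][-1])  # a SCAN over Orders
print(after[0][-1])   # a SEARCH using idx_orders_cust
```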

Managing Data with Transact-SQL

Creating Transact-SQL queries, querying multiple tables using joins, implementing functions and aggregating data, modifying data, determining the results of DDL statements on supplied tables and data, and constructing DML statements using the output statement.

Querying Data with Advanced Transact-SQL Components

Querying data using subqueries and APPLY, querying data using table expressions, grouping and pivoting data using queries, querying temporal data and non-relational data, constructing recursive table expressions to meet business requirements, and using windowing functions to group and rank the results of a query.
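The recursive table expression named above can be sketched with a classic org-chart walk (SQLite via `sqlite3`; the hierarchy is invented). A recursive CTE has an anchor member and a recursive member joined back to the CTE:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE Staff (EmpID INTEGER, Name TEXT, ManagerID INTEGER);
    INSERT INTO Staff VALUES (1, 'CEO', NULL), (2, 'VP', 1),
                             (3, 'Lead', 2), (4, 'Dev', 3);
""")

# Anchor: the row with no manager. Recursive step: everyone whose
# manager is already in the result, one level deeper each pass.
chain = cur.execute("""
    WITH RECURSIVE OrgChart(EmpID, Name, Level) AS (
        SELECT EmpID, Name, 0 FROM Staff WHERE ManagerID IS NULL
        UNION ALL
        SELECT s.EmpID, s.Name, o.Level + 1
        FROM Staff s JOIN OrgChart o ON s.ManagerID = o.EmpID
    )
    SELECT Name, Level FROM OrgChart ORDER BY Level
""").fetchall()
print(chain)
```

T-SQL uses the same WITH ... AS shape (the RECURSIVE keyword is implied there).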

Programming Databases Using Transact-SQL

Creating database programmability objects by using T-SQL, implementing error handling and transactions, implementing transaction control in conjunction with error handling in stored procedures, and implementing data types and NULL.

Designing and Implementing Database Objects

Designing and implementing relational database schema; designing and implementing indexes, learning to compare between indexed and included columns, implementing clustered index, and designing and deploying views and column store views.

Implementing Programmability Objects

Explaining foreign key constraints, using T-SQL statements, usage of Data Manipulation Language (DML), designing the components of stored procedures, implementing input and output parameters, applying error handling, executing control logic in stored procedures, and designing trigger logic, DDL triggers, etc.

Managing Database Concurrency

Applying transactions, using the transaction behavior to identify DML statements, learning about implicit and explicit transactions, isolation levels management, understanding concurrency and locking behavior, and using memory-optimized tables.
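Transaction behavior can be demonstrated with a simulated transfer that fails midway (SQLite via `sqlite3`; the accounts are invented). Because the update runs inside an open transaction, ROLLBACK undoes the partial change:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE Accounts (AcctID INTEGER, Balance REAL)")
cur.executemany("INSERT INTO Accounts VALUES (?, ?)", [(1, 500.0), (2, 100.0)])
conn.commit()

try:
    # Debit succeeds inside an (implicitly opened) transaction...
    cur.execute("UPDATE Accounts SET Balance = Balance - 200 WHERE AcctID = 1")
    # ...but something fails before the matching credit is committed.
    raise RuntimeError("simulated failure mid-transfer")
except RuntimeError:
    conn.rollback()  # the debit is undone; the transfer is atomic

balances = cur.execute(
    "SELECT Balance FROM Accounts ORDER BY AcctID").fetchall()
print(balances)  # both balances unchanged
```

In T-SQL the same logic is written with explicit BEGIN TRANSACTION / COMMIT / ROLLBACK, typically inside TRY...CATCH.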

Optimizing Database Objects

Accuracy of statistics, formulating statistics maintenance tasks, dynamic management objects management, identifying missing indexes, examining and troubleshooting query plans, consolidating the overlapping indexes, the performance management of database instances, and SQL server performance monitoring.

Advanced Topics

Corelated Subquery, Grouping Sets, Rollup, Cube

Hands-on Exercise

Implementing Corelated Subqueries, Using EXISTS with a Correlated subquery, Using Union Query, Using Grouping Set Query, Using Rollup, Using CUBE to generate four grouping sets, Perform a partial CUBE.
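The grouping-sets idea can be made concrete with a sketch. SQLite does not support GROUP BY ROLLUP or CUBE natively, so this emulates ROLLUP(Region, Product) with UNION ALL over its three grouping sets; in T-SQL you would write GROUP BY ROLLUP(Region, Product) directly (the sales rows are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE Sales (Region TEXT, Product TEXT, Amount REAL);
    INSERT INTO Sales VALUES ('East', 'Pen', 10), ('East', 'Book', 20),
                             ('West', 'Pen', 5);
""")

# ROLLUP(Region, Product) = grouping sets (Region, Product), (Region), ().
# NULL in a grouping column marks the subtotal / grand-total rows.
rollup = cur.execute("""
    SELECT Region, Product, SUM(Amount) FROM Sales GROUP BY Region, Product
    UNION ALL
    SELECT Region, NULL, SUM(Amount) FROM Sales GROUP BY Region
    UNION ALL
    SELECT NULL, NULL, SUM(Amount) FROM Sales
    ORDER BY 1, 2
""").fetchall()
print(rollup)  # grand total sorts first because NULL sorts lowest here
```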

Microsoft Courses: Study Material

  • Performance Tuning and Optimizing SQL Databases
  • Querying Data with Transact-SQL

Writing Complex Subqueries

In this project, you will be working with SQL subqueries and utilizing them in various scenarios. You will learn to use IN or NOT IN, ANY or ALL, EXISTS or NOT EXISTS, and other major queries. You will be required to access and manipulate datasets, operate and control statements in SQL, execute queries in SQL against databases.

Querying a Large Relational Database

This project is about how to get details about customers by querying the database. You will be working with Table basics and data types, various SQL operators, and SQL functions. The project will require you to download a database and restore it on the server, query the database for customer details and sales information.

Relational Database Design

In this project, you will learn to convert a relational design that has enlisted within its various users, user roles, user accounts, and their statuses into a table in SQL Server. You will have to define relations/attributes, primary keys, and create respective foreign keys with at least two rows in each of the tables.

Introduction to Data Visualization and Power of Tableau

What is data visualization?, comparison and benefits against reading raw numbers, real use cases from various business domains, some quick and powerful examples using Tableau without going into the technical details of Tableau, installing Tableau, Tableau interface, connecting to DataSource, Tableau data types, and data preparation.

Architecture of Tableau

Installation of Tableau Desktop, architecture of Tableau, interface of Tableau (Layout, Toolbars, Data Pane, Analytics Pane, etc.) how to start with Tableau, and the ways to share and export the work done in Tableau.

Hands-on Exercise: Play with Tableau desktop, learn about the interface, and share and export existing works.

Working with Metadata and Data Blending

Connection to Excel, cubes and PDFs, management of metadata and extracts, data preparation, Joins (Left, Right, Inner, and Outer) and Union, dealing with NULL values, cross-database joining, data extraction, data blending, refresh extraction, incremental extraction, how to build extracts, etc.

Hands-on Exercise: Connect to Excel sheet to import data, use metadata and extracts, manage NULL values, clean up data before using, perform the join techniques, execute data blending from multiple sources, etc.

Creation of Sets

Mark, highlight, sort, group, and use sets (creating and editing sets, IN/OUT, sets in hierarchies), constant sets, computed sets, bins, etc.

Hands-on Exercise: Use marks to create and edit sets, highlight the desired items, make groups, apply sorting on results, and make hierarchies among the created sets.

Working with Filters

Filters (addition and removal), filtering continuous dates, dimensions, and measures, interactive filters, marks card, hierarchies, how to create folders in Tableau, sorting in Tableau, types of sorting, filtering in Tableau, types of filters, filtering the order of operations, etc.

Hands-on Exercise: Use the data set by date/dimensions/measures to add filter, use interactive filter to view the data, customize/remove filters to view the result, etc.

Organizing Data and Visual Analytics

Using Formatting Pane to work with menu, fonts, alignments, settings, and copy-paste; formatting data using labels and tooltips, edit axes and annotations, k-means cluster analysis, trend and reference lines, visual analytics in Tableau, forecasting, confidence interval, reference lines, and bands.

Hands-on Exercise: Apply labels and tooltips to graphs, annotations, edit axes’ attributes, set the reference line, and perform k-means cluster analysis on the given dataset.

Working with Mapping

Working on coordinate points, plotting longitude and latitude, editing unrecognized locations, customizing geocoding, polygon maps, WMS: web mapping services, working on the background image, including add image, plotting points on images and generating coordinates from them; map visualization, custom territories, map box, WMS map; how to create map projects in Tableau, creating dual axes maps, and editing locations.

Hands-on Exercise: Plot longitude and latitude on a geo map, edit locations on the geo map, custom geocoding, use images of the map and plot points, find coordinates, create a polygon map, and use WMS.

Working with Calculations and Expressions

Calculation syntax and functions in Tableau, various types of calculations, including Table, String, Date, Aggregate, Logic, and Number; LOD expressions, including concept and syntax; aggregation and replication with LOD expressions, nested LOD expressions; levels of details: fixed level, lower level, and higher level;  quick table calculations, the creation of calculated fields, predefined calculations, and how to validate.

Working with Parameters

Creating parameters, parameters in calculations, using parameters with filters, column selection parameters, chart selection parameters, how to use parameters in the filter session, how to use parameters in calculated fields, how to use parameters in reference line, etc.

Hands-on Exercise: Creating new parameters to apply on a filter, passing parameters to filters to select columns, passing parameters to filters to select charts, etc.

Charts and Graphs

Dual axes graphs, histograms: single and dual axes; box plot; charts: motion, Pareto, funnel, pie, bar, line, bubble, bullet, scatter, and waterfall charts; maps: tree and heat maps; market basket analysis (MBA), using Show me; and text table and highlighted table.

Hands-on Exercise: Plot a histogram, tree map, heat map, funnel chart, and more using the given dataset and also perform market basket analysis (MBA) on the same dataset.

Dashboards and Stories

Building and formatting a dashboard using size, objects, views, filters, and legends; best practices for making creative as well as interactive dashboards using the actions; creating stories, including the intro of story points; creating as well as updating the story points, adding catchy visuals in stories, adding annotations with descriptions; dashboards and stories: what is dashboard?, highlight actions, URL actions, and filter actions, selecting and clearing values, best practices to create dashboards, dashboard examples; using Tableau workspace and Tableau interface; learning about Tableau joins, types of joins; Tableau field types, saving as well as publishing data source, live vs extract connection, and various file types.

Hands-on Exercise: Create a Tableau dashboard view, include legends, objects, and filters, make the dashboard interactive, and use visual effects, annotations, and descriptions to create and edit a story.

Tableau Prep

Introduction to Tableau Prep, how Tableau Prep helps quickly combine join, shape, and clean data for analysis, creation of smart examples with Tableau Prep, getting deeper insights into the data with great visual experience, making data preparation simpler and accessible, integrating Tableau Prep with Tableau analytical workflow, and understanding the seamless process from data preparation to analysis with Tableau Prep.

Integration of Tableau with R and Hadoop

Introduction to R language, applications and use cases of R, deploying R on the Tableau platform, learning R functions in Tableau, and the integration of Tableau with Hadoop.

Hands-on Exercise: Deploy R on Tableau, create a line graph using R interface, and also connect Tableau with Hadoop to extract data.

Tableau Projects Covered

Understanding Global COVID-19 Mortality Rates

Analyze and develop a dashboard to understand global COVID-19 cases. Compare global confirmed vs. death cases on a world map. Compare country-wise cases using logarithmic axes. The dashboard should display both a log-axis chart and a default-axis chart in an alternating, interactive way. Create a parameter to dynamically view the top N WHO regions based on the ratio of cumulative new cases to death cases. The dashboard should have a drop-down menu to view WHO region-wise data as a bar chart, line chart, or map, as per the user’s requirement.

Understand the UK bank customer data

Analyze and develop a dashboard to understand the customer data of a UK bank. Create an asymmetric drop-down of regions with their respective customer names and balances, color-coded by gender. Build a region-wise bar chart that displays the count of customers with high and low balances. Create a parameter that lets users dynamically set the balance threshold that categorizes balances as high or low. Include interactive filters for job classification and highlighters for region in the final dashboard.

Understand Financial Data

Create an interactive map to analyze worldwide sales and profit. Include map layers and map styles to enhance the visualization. Build an interactive analysis to display the average gross sales of a product under each segment, allowing only one segment’s data to be displayed at a time. Create a motion chart to compare sales and profit through the years. Annotate the day-wise profit line chart to indicate the peaks, and also enable drop lines. Add go-to-URL actions in the final dashboard that direct the user to the respective country’s Wikipedia page.

Understand Agriculture Data

Create an interactive tree map to display district-wise data. Tree maps should have state labels; hovering over a particular state displays the corresponding districts’ data. Add URL actions that direct users to a Google search page for the selected crop; the web page is to be displayed on the final dashboard. Create a hierarchy of seasons, crop categories, and the list of crops under each. Add highlighters for season. One major sheet in the final dashboard should be unaffected by any action applied; use the view in this major sheet to filter data in the others. Using parameters, color-code the seasons with high and low yield based on their crop categories. Rank the crops based on their yield.

Introduction to Power BI

Introduction to Microsoft Power BI, the key features of Power BI workflow, Desktop application, BI service, and file data sources, sourcing data from web (OData, Azure), building dashboard, data visualization, publishing to cloud, DAX data computation, row context, filter context, Analytics Pane, creating columns and measures, data drill down and drill up, creating tables, binned tables, data modeling and relationships, the Power BI components like Power View, Map, Query, Pivot, Power Q & A, understanding advanced visualization.

Hands-on Exercise – Demo of building a Power BI dashboard, Source data from web, Publish to cloud, Create power tables

Extracting Data

Learning about Power Query for self-service ETL functionalities, introduction to data mashup, working with Excel data, learning about Power BI Personal Gateway, extracting data from files, folders and databases, working with Azure SQL database and database source, connecting to Analysis Services, SaaS functionalities of Power BI.

Hands-on Exercise – Connect to a database, Import data from an excel file, Connect to SQL Server, Analysis Service, Connect to Power Query, Connect to SQL Azure, Connect to Hadoop

Power Query for Data Transformation

Installing Power BI, the various requirements and configuration settings, the Power Query, introduction to Query Editor, data transformation – column, row, text, data type, adding & filling columns and number column, column formatting, transpose table, appending, splitting, formatting data, Pivot and UnPivot, Merge Join,  relational operators, date, time calculations, working with M functions, lists, records, tables, data types, and generators, Filters & Slicers, Index and Conditional Columns, Summary Tables, writing custom functions and error handling, M advanced data transformations.

Hands-on Exercise – Install PowerBI Desktop and configure the settings, Use Query editor, Write a power query, Transpose a table

Power Pivot for Data Modeling and Data Analysis Expression - DAX Queries

Introduction to Power Pivot, learning about the xVelocity engine, advantages of Power Pivot, various versions and relationships, strongly typed datasets, Data Analysis Expressions, Measures, Calculated Members, Row, Filter & Evaluation Context, Context Interactions, Context over Relations, Schema Relations, learning about Table, Information, Logical, Text, Iterator, and Time Intelligence Functions, Cumulative Charts, Calculated Tables, ranking and rank over groups, Power Pivot advanced functionalities, date and time functions, DAX advanced features, embedding Power Pivot in Power BI Desktop.

Hands-on Exercise – Create a Power Pivot, apply filters, use advanced functionalities like date and time functions, embed Power Pivot in Power BI Desktop, create DAX queries for calculated columns, tables, and measures

Data Visualization with Analytics

Deep dive into Power BI data visualization, understanding Power View and Power Map, Power BI Desktop visualization, formatting and customizing visuals, visualization interaction, SandDance visualization, deploying Power View on SharePoint and Excel, top down and bottom up analytics, comparing volume and value-based analytics, working with Power View to create Reports, Charts, Scorecards, and other visually rich formats, categorizing, filtering and sorting data using Power View, Hierarchies, mastering the best practices, Custom Visualization, Authenticate a Power BI web application, Embedding dashboards in applications

Hands-on Exercise – Create a Power View and a Power Map, Format and customize visuals, Deploy Power View on SharePoint and Excel, Implement top-down and bottom-up analytics, Create Power View reports, Charts, Scorecards, Add a custom visual to report, Authenticate a Power BI web application, Embed dashboards in applications, Categorize, filter and sort data using Power View, Create hierarchies, Use date hierarchies, use business hierarchies, resolve hierarchy issues

Power Q & A

Introduction to Power Q & A, intuitive tool to answer tough queries using natural language, getting answers in the form of charts, graphs and data discovery methodologies, ad hoc analytics building, Power Q & A best practices, integrating with SaaS applications

Hands-on Exercise – Write queries using natural language, Get answers in the form of charts, graphs, Build ad hoc analytics, Pin a tile and a range to dashboard

Power BI Desktop & Administration

Getting to understand the Power BI Desktop, aggregating data from multiple data sources, how Power Query works in Power BI Desktop environment, learning about data modeling and data relationships, deploying data gateways, scheduling data refresh, managing groups and row level security, datasets, reports and dashboards, working with calculated measures, Power Pivot on Power BI Desktop ecosystem, mastering data visualization, Power View on Power BI Desktop, creating real world solutions using Power BI

Hands-on Exercise – Configure security for dashboards, deploy data gateways, aggregate data from multiple data sources, schedule data refresh, manage groups and row-level security, datasets, reports and dashboards, work with calculated measures

Microsoft Course

Analyzing Data with Power BI.

Power BI Projects Covered

Report on Student Survey

A survey was conducted across many stores on how much students spend on different kinds of purchases, such as video games, indoor games, toys, books, and gadgets. You have to create a Power BI report. You will get hands-on experience with tabular visualization, matrix visualization, funnel charts, pie charts, scatter plots, and SandDance plots.

Case Study 1 - Power BI Desktop, Cloud Service, and End to End Workflow

The case study deals with ways to design a dashboard with a basic set of visualizations and deploy it to the Power BI Cloud service. Further, a top-level overview of Transport Corp data is shown using aggregated Key Performance Indicators (KPIs), trends, geo distributions, and filters.

Case Study 2 - Visualizations, Configuring Extended Properties, and Data Calculations DAX - Introduction

This case study explains the way to design a dashboard and perform calculations by making use of Power BI DAX formulas. The scheduled deliveries of loads are analyzed using correlation across measures. Moreover, Drill Up/Drill Down’s capabilities and reference lines are implemented.

Case Study 3 - Combination Visualizations for Correlated Value Columns

Here, the Dashboard is designed by making use of Power BI DAX formulas to perform calculations. Bucketed Categories are created to represent value measures on the categories axis. Furthermore, a scatter plot is used to identify outliers or outperformers.

Case Study 4 - Data Transformations

The case study involves designing an audit dashboard by making use of Power Query, Query Editor to perform data modeling by applying Data transformations, in turn, by managing relationships.

Case Study 5 - Data Transformations - Contd.

Here, the Dashboard is designed to analyze the trend of admissions into a State University. Query Editor is used to perform data modeling by applying transformations like append data, split data, column formatting, transpose table, pivot/unpivot, fill columns, merge join, conditional columns, index columns, and summary tables.

Case Study 6 - Advanced Visualizations

Design a dashboard to analyze the trend of admissions into state universities for analyzing advanced visualizations. And also, make use of expressions and filters to build custom visualizations.

Case Study 7 - Advanced Features of Power BI Cloud Service

Knowledge check on Ad hoc analytics with Power BI Q&A, exploring Dashboard Notification and Alerts, acquiring the data from Google Analytics, and customizing the pre-loaded visualizations appropriately.

Case Study 8 - Advanced Features of Power BI Desktop Client

Perform knowledge check on advanced features of Power BI Desktop Client by means of integrating custom visualizations and creating sand dance visualizations.

Case Study 9 - Top-Down and Bottom-Up Analysis to Identify Shipping Cost Leakages

Build a set of visualizations to identify underlying outliers and flip the same set of visualizations to perform top-down and bottom-up analysis of Power BI Dashboard.

Case Study 10 - Value & Volume-based analysis on hospital records to analyze Charges vs Patients Inflow

In this case study, candidates will build a set of visualizations linked to dynamic measures to flip value and volume-based analytics on the demand from the user.

Data Warehousing and Cleansing Concepts

What is data warehousing, understanding the extract, transform and load processes, what is data aggregation, data scrubbing and data cleansing and the importance of Informatica PowerCenter ETL
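The extract-transform-load cycle described above can be sketched in a few lines of plain Python standing in for an ETL tool like Informatica PowerCenter (the field names, data, and cleansing rules are all invented for illustration):

```python
import csv
import io
import sqlite3

# Extract: raw source data, as it might arrive from a flat file.
raw = io.StringIO("id,name,amount\n1, asha ,100\n2,RAVI,\n3,meena,250\n")
rows = list(csv.DictReader(raw))

# Transform: scrub whitespace, standardize case, and apply a cleansing
# rule that rejects rows with missing amounts.
clean = [
    (int(r["id"]), r["name"].strip().title(), float(r["amount"]))
    for r in rows
    if r["amount"].strip()
]

# Load: write the cleansed rows into the warehouse target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact_sales (id INTEGER, name TEXT, amount REAL)")
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)", clean)
loaded = conn.execute("SELECT * FROM fact_sales ORDER BY id").fetchall()
print(loaded)  # row 2 was rejected by the cleansing rule
```

PowerCenter expresses the same pipeline declaratively, as source qualifiers, transformations, and targets in a mapping.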

Informatica Installation and Configuration

Configuring the Informatica tool, installing Informatica, and setting up operational administration activities and integration services

Hands-on Exercise: Step-by-step process for the installation of Informatica PowerCenter

Working with Active and Passive Transformations

Understanding the difference between active and passive transformations and the highlights of each transformation

Working with Expression Transformation

Learning about expression transformation and connected passive transformation to calculate value on a single row

Hands-on Exercise: Calculate value on a single row using connected passive transformation

Working with Sorter, Sequence Generator and Filter Transformation

Different types of transformations like sorter, sequence generator and filter, the characteristics of each and where they are used

Hands-on Exercise: Transform data using the filter technique, use a sequence generator and use a sorter

Working with Joiner Transformation

Working with joiner transformation to bring data from heterogeneous data sources

Hands-on Exercise: Use joiner transformation to bring data from heterogeneous data sources

Working with Ranking and Union Transformation

Understanding the ranking and union transformation, the characteristics and deployment

Hands-on Exercise: Perform ranking and union transformation

Syntax for Rank and Dense Rank

Learn the rank and dense rank functions and the syntax for them

Hands-on Exercise: Perform rank and dense rank functions

Router Transformation

Understanding how router transformation works and its key features

Hands-on Exercise: Perform router transformation

Lookup Transformation

Lookup transformation overview and different types of lookup transformations: connected, unconnected, dynamic and static

Hands-on Exercise: Perform lookup transformations: connected, unconnected, dynamic and static

Slowly Changing Dimension in Informatica

What is SCD, processing in xml, learn how to handle a flat file, list and define various transformations, implement ‘for loop’ in PowerCenter, the concepts of pushdown optimization and partitioning, what is constraint-based loading and what is incremental aggregation

Hands-on Exercise: Load data from a flat file, implement ‘for loop’ in PowerCenter, use pushdown optimization and partitioning, do constraint-based data loading and use incremental aggregation technique to aggregate data
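A minimal Type 2 SCD sketch in plain SQL (SQLite here; an ETL tool like PowerCenter generates equivalent logic, and the column names are invented). On a change, the current dimension row is expired and a new current version is inserted, preserving history:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE dim_customer (
        CustID INTEGER, City TEXT,
        ValidFrom TEXT, ValidTo TEXT, IsCurrent INTEGER
    );
    INSERT INTO dim_customer VALUES (1, 'Pune', '2023-01-01', NULL, 1);
""")

def apply_scd2(cust_id, new_city, change_date):
    """Expire the current row, then insert a new current version."""
    cur.execute("""UPDATE dim_customer
                   SET ValidTo = ?, IsCurrent = 0
                   WHERE CustID = ? AND IsCurrent = 1""",
                (change_date, cust_id))
    cur.execute("INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
                (cust_id, new_city, change_date))

apply_scd2(1, 'Mumbai', '2024-06-01')
history = cur.execute(
    "SELECT City, ValidFrom, ValidTo, IsCurrent FROM dim_customer "
    "ORDER BY ValidFrom"
).fetchall()
print(history)  # old Pune row kept, expired; Mumbai row is now current
```

Type 1 would instead overwrite City in place, losing the history.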

Mapplets and Loading to Multiple Targets

Different types of designers: Mapplet and Worklet, target load plan, loading to multiple targets and linking property

Hands-on Exercise: Create a mapplet and a worklet, plan a target load and load multiple targets

Performance Tuning in Informatica

Objectives of performance tuning, defining performance tuning and learning the sequence for tuning

Hands-on Exercise: Do performance tuning by following different techniques

Repository Manager

Managing repository, Repository Manager: the client tool, functionalities of previous versions and important tasks in Repository Manager

Hands-on Exercise: Manage tasks in Repository Manager

Best Practices in Informatica

Understanding and adopting best practices for managing repository

Workflow Informatica

Common tasks in workflow manager, creating dependencies and the scope of workflow monitor

Hands-on Exercise: Create workflow with dependencies of nodes

Parameters and Variables

Define the variable and parameter in Informatica, parameter files and their scope, the parameter of mapping, worklet and session parameters, workflow and service variables and basic development errors

Hands-on Exercise: Define variables and parameters in functions, use the parameter of mapping, use worklet and session parameters and use workflow and service variables

Error Handling and Recovery in Informatica

Session and workflow log, using debuggers, error-handling framework in Informatica and failover and high availability in Informatica

Hands-on Exercise: Debug development errors, read workflow logs and use the error-handling framework

High Availability and Failover in Informatica

Configurations and mechanisms in recovery and checking health of PowerCenter environment

Hands-on Exercise: Configure recovery options and check health of PowerCenter environment

Working with Different Utilities in Informatica

Using the infacmd, pmrep and infasetup command-line utilities, and processing a flat file

Hands-on Exercise: Use commands: infacmd, pmrep and infasetup

Flat File Processing (Advanced Transformations)

Fixed-length and delimited flat files; expression transformations: sequence numbers and dynamic targeting using transaction control

Hands-on Exercise: Perform expression transformations: sequence numbers and dynamic targeting using transaction control
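The two flat-file layouts can be sketched in plain Python (the field widths and sample data below are assumptions for illustration):

```python
import csv
import io

# Delimited layout: fields separated by a delimiter, here a comma
delimited = "id,name,amount\n1,Asha,250\n2,Ben,40\n"
delim_rows = list(csv.DictReader(io.StringIO(delimited)))

# Fixed-length layout: every field occupies a fixed character range.
# Assumed widths: id = 3 chars, name = 10 chars, amount = 4 chars.
fixed = "001Asha      0250\n002Ben       0040\n"
fields = [("id", 0, 3), ("name", 3, 13), ("amount", 13, 17)]
fixed_rows = [
    {name: line[a:b].strip() for name, a, b in fields}
    for line in fixed.splitlines()
]

print(delim_rows[0]["name"], fixed_rows[1]["amount"])  # → Asha 0040
```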

Dynamic Targeting

Dynamic target with the use of transaction control and indirect loading

Hands-on Exercise: Use of transaction control with dynamic target and indirect loading

Working with Java Transformations

Importance of Java transformations for extending PowerCenter capabilities, transforming data in active and passive modes

Hands-on Exercise: Use Java transformations to extend PowerCenter capabilities

Unconnected Stored Procedure Usage

Understanding the unconnected stored procedure in Informatica and different scenarios of unconnected stored procedure usage

Hands-on Exercise: Use the unconnected stored procedure in Informatica in different scenarios

Advanced Concepts in SCD

Using SQL transformation (active and passive)

Hands-on Exercise: Use SQL transformation (active and passive)

Incremental Data Loading and Aggregation

Understanding incremental loading and aggregation and comparison between them

Hands-on Exercise: Do incremental loading and aggregation
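Incremental loading is typically driven by a high-water mark: each run extracts only the rows changed since the last recorded mark. A minimal Python sketch (the column names and dates are illustrative, not any specific tool's API):

```python
source = [
    {"id": 1, "updated": "2024-01-01"},
    {"id": 2, "updated": "2024-01-05"},
    {"id": 3, "updated": "2024-01-09"},
]

def incremental_extract(rows, high_water_mark):
    """Return rows changed after the mark, plus the advanced mark."""
    fresh = [r for r in rows if r["updated"] > high_water_mark]
    new_mark = max((r["updated"] for r in fresh), default=high_water_mark)
    return fresh, new_mark

batch, mark = incremental_extract(source, "2024-01-03")
print(len(batch), mark)  # → 2 2024-01-09
```

A second run with the advanced mark picks up nothing, which is exactly the point: already-loaded rows are never re-extracted.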

Constraint-based Loading

Working with database constraints using PowerCenter and understanding constraint-based loading and target load order

Hands-on Exercise: Perform constraint-based loading in a given order
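Constraint-based loading boils down to ordering the targets so that a parent table is loaded before any table whose foreign key points at it. That ordering is a topological sort, sketched here with Python's standard graphlib and hypothetical table names:

```python
from graphlib import TopologicalSorter

# Map each table to the parent tables its foreign keys reference (assumed schema)
fk_parents = {
    "orders":      {"customers", "products"},
    "order_items": {"orders", "products"},
    "customers":   set(),
    "products":    set(),
}

# static_order yields every table after all of its parents
load_order = list(TopologicalSorter(fk_parents).static_order())
print(load_order)  # parents always precede their children
```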

XML Transformation and Active Lookup

Various types of XML transformation in Informatica and configuring a lookup as active

Hands-on Exercise: Perform XML transformation and configure a lookup as active

Profiling in PowerCenter

Understanding what data profiling in Informatica is, its significance in validating content and ensuring quality and structure of data as per business requirements

Hands-on Exercise: Create data profiling in Informatica and validate the content

Workflow Creation and Deletion

Understanding workflow as a group of instructions/commands for integration services and learning how to create and delete workflow in Informatica

Hands-on Exercise: Create and delete workflow in Informatica

Database Connection

Understanding the database connection, creating a new database connection in Informatica and understanding various steps involved

Hands-on Exercise: Create a new database connection in Informatica

Relational Database Tables

Working with relational database tables in Informatica and mapping for loading data from flat files into relational database tables

Hands-on Exercise: Create a mapping to load data from flat files into relational database tables

LinkedIn Connection

Understanding how to deploy Informatica PowerCenter for seamless LinkedIn connectivity

Hands-on Exercise: Deploy Informatica PowerCenter for seamless LinkedIn connectivity

Connection with Sources

Connecting Informatica PowerCenter with various data sources like social media channels such as Facebook, Twitter, etc.

Hands-on Exercise: Connect Informatica PowerCenter with various data sources like social media channels such as Facebook, Twitter, etc.

Pushdown Optimization and Partitioning

Pushdown optimization for load-balancing on the server for better performance and various types of partitioning for optimizing performance

Hands-on Exercise: Optimize using pushdown technique for load-balancing on the server for better performance and create various types of partitioning for optimizing performance

Cache Management

Understanding session cache, the importance of cache creation, implementing session cache and calculating cache requirement

Hands-on Exercise: Implement cache creation and work with session cache

What projects will I be working on in this Informatica training?

Project 1: Admin Console

Problem Statement: It includes the following actions:

  • Creation of users
  • Building roles
  • Forming groups
  • Collaboration of users, roles and groups
  • Lock handling
  • Creating sessions, workflow and worklets

Project 2: Deploying Informatica ETL for Business Intelligence

Industry: General

Problem Statement: Disparate data needs to be converted into insights using Informatica

Topics: In this Informatica project, you have access to all environments like dev, QA, UAT and production. You will first configure all the repositories in various environments. You will receive the requirement from client through source to target mapping sheet. You will extract data from various source systems and fetch it into staging. From staging, it will go to the operational data store; from there, the data will go to the enterprise data warehouse, and from there it will be directly deployed for generating reports and deriving business insights.

Highlights:

  • Access data from multiple sources
  • Manage current and historic data with SCD
  • Import source and target tables

Project 3: Deploying the ETL Transactions on Healthcare Data

Industry: Healthcare

Problem Statement: How to systematically load data within a hospital scenario so that it is easily available

Topics: In this Clinical Research Data Warehouse (CRDW) Informatica project, you will work on various types of data coming from diverse sources. The warehouse contains remitted claims, both approved and disapproved, for end-user reporting. You will create CRDW load schedules on daily, weekly and monthly bases.

Highlights:

  • Extracting data from multiple sources
  • Cleansing data and putting it in the right format
  • Loading the data into the CRDW

Case Study 1

Project: Banking Products Augmentation

Industry: Banking

Problem Statement: How to improve the profits of a bank by customizing the products and adding new products based on customer needs

Topics: In this Informatica case study, you will construct a multidimensional model for the bank. You will create a set of diagrams depicting the star-join schemas needed to streamline the products as per customer requirements. You will implement slowly changing dimensions, understand the customer–account relationships and create diagrams describing the hierarchies. You will also recommend heterogeneous products for the customers of the bank.

Highlights:

  • Deploy a star-join schema
  • Create demographic mini-dimensions
  • Informatica aggregator transformations

Case Study 2

Project: Employee Data Integration

Industry: General

Problem Statement: How to load a table with employee data using Informatica

Topics: In this Informatica case study, you will create a design for a common framework that can be used for loading and updating the employee ID and other details lookup for multiple shared tables. Your design will address the regular loading of shared tables. You will also keep track of when the regular load runs, when the lookup requests run, prioritization of requests if needed and so on.

Highlights:

  • Creating multiple shared tables
  • Plug-and-play capability of the framework
  • Code and framework reusability

Module 01 - Non-Relational Data Stores and Azure Data Lake Storage

1.1 Document data stores
1.2 Columnar data stores
1.3 Key/value data stores
1.4 Graph data stores
1.5 Time series data stores
1.6 Object data stores
1.7 External index
1.8 Why NoSQL or Non-Relational DB?
1.9 When to Choose NoSQL or Non-Relational DB?

  • Best Uses
  • Scenarios

1.10 Azure Data Lake Storage

  • Definition
  • Azure Data Lake: Key Components
  • How does it store data?
  • Azure Data Lake Storage Gen2
  • Why Data Lake?
  • Data Lake Architecture

Module 02 - Data Lake and Azure Cosmos DB

2.1 Data Lake Key Concepts
2.2 Azure Cosmos DB
2.3 Why Azure Cosmos DB?
2.4 Azure Blob Storage
2.5 Why Azure Blob Storage?
2.6 Data Partitioning

  • Horizontal partitioning
  • Vertical partitioning
  • Functional partitioning
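The first two strategies can be sketched in plain Python (the record layout and shard count are assumptions for illustration): horizontal partitioning splits whole rows across shards by key, while vertical partitioning splits columns of the same rows.

```python
records = [
    {"id": 7, "name": "Asha", "bio": "long text..."},
    {"id": 8, "name": "Ben",  "bio": "long text..."},
    {"id": 9, "name": "Chen", "bio": "long text..."},
]

# Horizontal partitioning: each shard holds complete rows for a subset of keys
SHARDS = 2
horizontal = {n: [r for r in records if r["id"] % SHARDS == n]
              for n in range(SHARDS)}

# Vertical partitioning: frequently read columns split from bulky ones
hot  = [{"id": r["id"], "name": r["name"]} for r in records]
cold = [{"id": r["id"], "bio": r["bio"]}  for r in records]

print(sorted(len(v) for v in horizontal.values()), len(hot[0]))
```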

2.7 Why Partitioning Data?
2.8 Consistency Levels in Azure Cosmos DB

  • Semantics of the five consistency levels

Hands-on:
1. Load Data from Amazon S3 to ADLS Gen2 with Data Factory
2. Working with Azure Cosmos DB

Module 03 - Relational Data Stores

3.1 Introduction to Relational Data Stores
3.2 Azure SQL Database

  • Deployment Models
  • Service Tiers

Hands-on:
1. Create a Single Database Using Azure Portal
2. Create a managed instance
3. Create an elastic pool

3.3 Why SQL Database Elastic Pool?

Hands-on:
1. Create a SQL virtual machine
2. Configure active geo-replication for Azure SQL Database in the Azure portal and initiate failover.

Module 04 - Why Azure SQL?

4.1 Azure SQL Security Capabilities
4.2 High-Availability and Azure SQL Database

  • Standard Availability Model
  • Premium Availability Model

4.3 Azure Database for MySQL

Hands-on:
1. Design an Azure Database for MySQL database using the Azure portal
2. Connect using MySQL Workbench

4.4 Azure Database for PostgreSQL
Hands-on:
1. Design an Azure Database for PostgreSQL – Single Server

4.5 Azure Database For MariaDB
Hands-on:
1. Create an Azure Database for MariaDB server by using the Azure portal

4.6 What is PolyBase?

  • Why PolyBase?

4.7 What is Azure Synapse Analytics (formerly SQL DW)?

  • SQL Analytics and SQL pool in Azure Synapse
  • Key component of a big data solution
  • SQL Analytics MPP architecture components

Hands-on:
1. Import Data From Blob Storage to Azure Synapse Analytics by Using PolyBase

Module 05 - Azure Batch

5.1 What is Azure Batch?
5.2 Intrinsically Parallel Workloads
5.3 Tightly Coupled Workloads
5.4 Additional Batch Capabilities
5.5 Working of Azure Batch

Hands-on:
1. Run a batch job using Azure Portal
2. Parallel File Processing with Azure Batch using the .NET API
3. Render a Blender Scene using Batch Explorer
4. Parallel R Simulation with Azure Batch

Module 06 - Azure Data Factory

6.1 Flow Process of Data Factory
6.2 Why Azure Data Factory
6.3 Integration Runtime in Azure Data Factory
6.4 Mapping Data Flows

Hands-on:
1. Transform data using Mapping data flows

Module 07 - Azure Data Bricks

7.1 What is Azure Databricks?
7.2 Azure Spark-based Analytics Platform
7.3 Apache Spark in Azure Databricks

Hands-on:
1. Run a Spark Job on Azure Databricks using the Azure portal
2. ETL Operation by using Azure Databricks
3. Stream data into Azure Databricks using Event Hubs

Module 08 - Azure Stream Analytics

8.1 Working of Stream Analytics
8.2 Key capabilities and benefits
Hands-on:
1. Analyze phone call data with Stream Analytics and visualize results in a Power BI dashboard

8.3 Stream Analytics Windowing Functions

  • Tumbling window
  • Hopping Window
  • Sliding Window
  • Session Window
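A tumbling window, the simplest of the four, splits the stream into fixed, non-overlapping intervals, and every event lands in exactly one window. A few lines of Python simulate the idea (the event times are made up; this is the concept, not the Stream Analytics query language):

```python
from collections import defaultdict

events = [(1, "call"), (4, "call"), (12, "call"), (15, "call"), (27, "call")]
WINDOW = 10  # window length in seconds

counts = defaultdict(int)
for t, _ in events:
    window_start = (t // WINDOW) * WINDOW  # floor the timestamp to its window
    counts[window_start] += 1

print(dict(counts))  # → {0: 2, 10: 2, 20: 1}
```

A hopping window would differ only in that each event could fall into several overlapping windows; a sliding window is event-driven rather than clock-driven.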

Module 09 - Monitoring & Security

9.1 What is Azure Monitor?

  • Metrics
  • Logs
  • Metrics vs. Logs

9.2 What data does Azure Monitor collect?
9.3 What can you Monitor?

  • Insights and Core Solutions

9.4 Alerts in Azure

  • Flow of Alerts
  • Key Attributes of an Alert Rule
  • What can you set alert on?
  • Manage alerts
  • Alert States
  • How to create an alert?

Hands-on:
1. Create, View, and Manage Metric alerts using Azure Monitor
2. Monitor your Azure Data Factory Pipelines proactively with Alerts

9.5 Azure Security Logging & Auditing

  • Types of Logs in Azure
  • Azure SQL Database Auditing
  • Server-level vs. Database-level Auditing Policy

Hands-on:
1. Azure SQL Database Auditing

What projects will I be working on in this Azure Data Factory training?

In this Azure Data Factory project, you are supposed to automate the transformation of the real-time video list from a YouTube channel. You will store multiple files at dynamic locations in Azure Data Lake Store, and the same data needs to be transformed and copied to a data store. The list of channels should be displayed dynamically in Power BI.
Project 01: Fetch the list of videos from the attached dataset of the YouTube channel with the highest views and likes to promote advertisements on the channel with maximum traffic.

Topics: Azure Data Factory, Azure Data Lake, Triggers, SQL, Power BI

Highlights:
1.1 Creating Azure Data Factory
1.2 Creating Pipelines
1.3 Creating a trigger that runs a pipeline on a schedule
1.4 Transforming data using SQL
1.5 Connecting Azure Data Lake to Power BI

Project 02: Working with Azure Data Factory, Data Lake and Azure SQL

Industry: General

Problem Statement: You are working as an Azure Architect for Zendrix Corp., a service-based company that earns most of its revenue from sales of its subscription-based service.

The company needs to continuously monitor its lead flow from different countries. This helps it strategize how much to invest in ad marketing for a particular country, which, in turn, helps it achieve its desired sales targets.

Currently, the company has to manually synchronize data from its live SQL database to its BI tool to check the lead flow from different countries.

The company wants an automated solution with which it will be able to see a live dashboard of the lead count. As the Architect, you have suggested the following:

Highlights:
2.1 Use of Power BI Heat maps
2.2 Use of Azure SQL instead of On-Premise SQL
2.3 Use of Data Factory to automate the data lifecycle from SQL to the BI tool.

Help them achieve the above goals.

Project 03: Identify the videos that get maximum traffic in selected YouTube channels

Industry: Marketing

Problem Statement: Getting the real-time list of maximum traffic fetching videos from YouTube channels to promote advertisements in the same channels (traffic should be considered on a weekly basis)

Description: A company, ‘XYZ Pvt. Ltd’, promotes advertisements in the highest-traffic YouTube channels (assessed weekly) to drive profits. To maximize profitability, the marketing team that manages the posting of advertisements requires an interface through which it can get a real-time list of YouTube channels for promoting advertisements and monitor the traffic analytics of those channels.

Objective: As an Azure Data Factory specialist, you are supposed to automate the transformation of the real-time video list from YouTube channels on a weekly basis. This will help the marketing team promote advertisements on the right YouTube videos on targeted channels.

Note: The traffic can be analyzed on various parameters, such as the number of views, likes or comments on a particular day. You can get this publicly available data from the YouTube API.

Case Study 01: Non-Relational Data Stores

Problem Statement: Knowledge check of non-relational databases: Categories and where to use them

Topics: NoSQL or Non-Relational Database, Azure Data Lake Storage and its key components.

Highlights:
1.1 Scenarios where you can use NoSQL or Non-Relational Database.
1.2 Categories of non-relational or NoSQL databases with relevant Azure services.
1.3 Azure Data Lake Storage and its key components.

Case Study 02: Non-Relational Data Stores

Problem Statement: Copy data from Azure Blob Storage to Azure Data Lake Storage Gen2; Create an Azure Cosmos DB account and Demonstrate adding and removing regions from your Database account; Strategies for Partitioning data; Semantics of consistency levels in Cosmos DB

Topics: Azure Cosmos DB, Azure Data Factory, Blob Storage, Strategies for Partitioning Data, Semantics of consistency levels in Cosmos DB

Highlights:
2.1 Azure Blob Storage
2.2 Azure Data Lake Storage Gen2
2.3 Azure Cosmos DB
2.4 Partitioning data
2.5 Consistency levels

Case Study 03: Relational Data Stores

Problem Statement: Knowledge check of Relational databases: Deployment models in Azure SQL; Create an elastic pool, Azure SQL Security Capabilities; Import Data From Blob Storage to Azure Synapse Analytics by Using PolyBase

Topics: Azure SQL, PolyBase, Azure Synapse Analytics

Highlights:
3.1 Deployment models in Azure SQL
3.2 Elastic Pool
3.3 Azure Synapse Analytics
3.4 PolyBase

Case Study 04: Azure Batch, Azure Data Factory

Problem Statement: Working of Azure Batch; Flow Process of Data Factory; Types of Integration Runtime in Azure Data Factory; Transform data using Mapping data flows

Topics: Azure Batch, Data Factory, Integration Runtime, Mapping Data Flows

Highlights:
4.1 Working of Azure Batch
4.2 Integration Runtime in Azure Data Factory
4.3 Transform data using Mapping data flows

Case Study 05: Azure Data Bricks, Azure Stream Analytics

Problem Statement: ETL Operation by using Azure Databricks; Working of Stream Analytics; Stream Analytics Windowing Functions

Topics: Azure Data Bricks, Azure Stream Analytics, Windowing Functions

Highlights:
5.1 ETL operation by using Azure Databricks
5.2 Working of Stream Analytics
5.3 Windowing Functions

Case Study 06: Monitoring & Security

Problem Statement: Create, View, and Manage Metric alerts using Azure Monitor; Azure SQL Database Auditing

Topics: Azure Monitor, Alerts in Azure, Azure Security Logging & Auditing

Highlights:
6.1 Azure Monitor
6.2 Alerts
6.3 Azure SQL Database Auditing

MSBI SSIS Course Content

What is BI?

Introduction to Business Intelligence, understanding the concepts of Data Modeling and Data Cleaning, learning about Data Analysis, Data Representation and Data Transformation.

ETL Overview

Introduction to ETL and the various steps involved: Extract, Transform and Load; reading a flat file of users’ email IDs, extracting the user ID from each email ID and loading the data into a database table.

Working with Connection Managers

Introduction to Connection Managers – the logical representation of a connection, the various types of Connection Managers (flat file, database), understanding how to load faster with OLE DB, comparing the performance of OLE DB and ADO.NET, learning about Bulk Insert, working with Excel Connection Managers and identifying common problems.

Data Transformations

Learning what Data Transformation is and converting data from one format to another; understanding Character Map, Data Column and Copy Column Transformations, Import and Export Column Transformations, Script and OLE DB Command Transformations, row sampling, aggregate and sort transformations, and percentage and row sampling.

Advanced Data Transformation

Understanding Pivot and Unpivot Transformations, Audit and Row Count Transformations, working with Split and Join Transformations, and studying Lookup and Cache Transformations. Integrating with Azure Analysis Services, the elastic nature of MSBI to integrate with the Azure cloud service, the scale-out deployment option for MSBI, working with cloud-borne data sources and query analysis. Scaling out the SSIS package, deploying for tighter windows, working with larger numbers of data sources, SQL Server vNext for enhancing SQL Server features, and more choice of development languages and data types both on-premise and in the cloud.

Slowly Changing Dimensions

Understanding data that slowly changes over time, learning the process of how new data is written over old data, and best practices. Detailed explanation of the three types of SCDs – Type 1, Type 2 and Type 3 – and their differences.
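A Type 2 dimension keeps history by expiring the current row and inserting a new version, rather than overwriting in place. A minimal Python sketch (the column names and sample data are illustrative, not SSIS itself):

```python
from datetime import date

dim = [
    {"cust_id": 1, "city": "Pune", "valid_from": date(2020, 1, 1),
     "valid_to": None, "current": True},
]

def scd2_update(dim, cust_id, new_city, change_date):
    """Expire the current row for cust_id and append the new version."""
    for row in dim:
        if row["cust_id"] == cust_id and row["current"] and row["city"] != new_city:
            row["valid_to"], row["current"] = change_date, False  # end-date old row
            dim.append({"cust_id": cust_id, "city": new_city,
                        "valid_from": change_date, "valid_to": None,
                        "current": True})                         # insert new version
            break
    return dim

scd2_update(dim, 1, "Mumbai", date(2023, 6, 1))
print([(r["city"], r["current"]) for r in dim])  # → [('Pune', False), ('Mumbai', True)]
```

Type 1 would simply overwrite `city`; Type 3 would keep the previous value in an extra column instead of an extra row.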

Overview of Fuzzy Look-up Transformation and Lookup and Term Extraction

Understanding how Fuzzy Lookup Transformation varies from Lookup Transformation, the concept of Fuzzy matching

Concepts of Logging & Configuration

Learning about error rows configuration, package logging, defining package configuration, understanding constraints and event handlers.

MSBI SSRS Course Content

Introduction to SSRS

Get introduced to the SSRS architecture and the components of the SSRS report-building tool, learning about the data flow across the different components.

Matrix and Tablix Overview

Understanding the concepts of Matrix and Tablix, working with Text Boxes, learning about formatting, row/column grouping, understanding sorting, and the concepts of Headers, Footers, Totals, Subtotals and Page Breaks.

Parameters and Expression

Learning about Parameters, filter and visibility expression, understanding drill-through and drill-down, defining variables, custom code.

Reports and Charts creation

Introduction to various aspects of Bar Charts, Line Charts, Combination Charts, Shape Charts and Sub-Reports; integration of Power Query and the M language with SSRS, working with additional data sources in MSBI, adding rich transformation capabilities to MSBI and reusing M functions built for PBIX in SSRS.

Dashboard Building

Learn how to build a Dashboard with Sparklines, Data Bars, Indicators, Map Charts, Gauge Charts and drilling into reports, and the basics of ad hoc reporting.

Reports and Authenticity

Understanding Report Cache, Authorization, Authentication and Report Snapshot, learning about Subscriptions and Site Security.

MSBI SSAS Course Content

Getting started with SSAS

Understanding the concept of multidimensional analysis, the SSAS architecture and its benefits, learning what a Cube is, working with Tables and OLAP databases, understanding Data Sources, working with the Dimension Wizard, understanding Dimension Structure, Attribute Relationships, and flexible and rigid relationships.

Structures and Processes

Learning about Process Dimension, the Process database, creation of Cube, understanding Cube Structure, Cube browsing, defining the various categories, Product Key and Customer Key, Column Naming, processing and deploying a Cube, Report creation with a Cube.

Hands-on Exercise – Create a Cube and name the various columns, deploy the Cube after applying keys and other rules, and create reports with the Cube

Type of Database Relationship

Understanding Data Dimensions and their importance, the various relationships – regular, referenced, many-to-many and fact – and working on Data Partitions and Data Aggregations.

SSAS Cube

Learning about SSAS Cube, the various types of Cubes, the scope of Cube and comparison with Data Warehouse.

Cube: Operations & Limitations

The various operations on Cube, the limitations of OLAP Cubes, the architecture of in-memory analytics and its advantages.

Cube and In-memory Analytics

Deploying cube with existing data warehouse capabilities to get self-service business intelligence, understanding how in-memory analytics works.

Hands-on Exercise – Deploy cube to get self-service business intelligence

Data Source View

Logical model of the schema used by the Cube, components of Cube, understanding Named Queries and Relationships.

Dimensions

An overview of the Dimensions concept, describing the Attributes and Attributes Hierarchies, understanding Key/Value Pairs, Metadata Reload, logical keys and role-based dimensions.

Hands-on Exercise – Create role-based dimensions, use Attribute Hierarchies

Measures & Features of Cube

Understanding the Measure of Cube, analyzing the Measure, exploring the relationship between Measure and Measure Group, Cube features and Dimension usage.

Measures and Features of Cube Cont.

Working with Cube Measures, deploying analytics, understanding the Key Performance Indicators, deploying actions and drill-through actions on data, working on data partitions, aggregations, translations and perspectives.

Hands-on Exercise – Work with Cube Measures, Deploy analytics, Deploy actions and drill-through actions on data, Make data partitions

Working with MDX

Understanding Multidimensional Expressions language, working with MDX queries for data retrieval, working with Clause, Set, Tuple, Filter condition in MDX.

Hands-on Exercise – Apply Clause, Set and filter condition in MDX query to retrieve data

Functions of MDX

Learning about MDX hierarchies, the functions used in MDX, Ancestor, Ascendant and Descendant function, performing data ordering

Hands-on Exercise – Create MDX hierarchies, Perform data ordering in ascending order, in descending order

DAX language

Data Analysis Expressions (DAX): using the EVALUATE and CALCULATE functions, filtering DAX queries, creating calculated measures and performing data analysis by using DAX

Hands-on Exercise – Use the EVALUATE and CALCULATE functions, filter DAX queries, create calculated measures, perform data analysis by using DAX

BI Semantic Model

Designing and publishing a tabular data model; designing measures, relationships, hierarchies, partitions, perspectives and calculated columns

Hands-on Exercise – Design and publish a tabular data model; design measures, relationships, hierarchies, partitions, perspectives and calculated columns

Plan and deploy SSAS

Configuring and maintaining SQL Server Analysis Services (SSAS), Non-Uniform Memory Access (NUMA), monitoring and optimizing performance, the SSAS Tabular model with vNext, Excel portability, importing a model from Power BI Desktop, importing a Power Pivot model, and bidirectional cross-filtering relationships in MSBI.

Hands-on Exercise – Configure and maintain SQL Server Analysis Services (SSAS), Monitor and optimize performance

Analyzing Big Data with Microsoft R

Reading data with R Server from SAS, TXT or Excel formats and converting data to XDF format; summarizing data, rxCrossTabs versus rxCube, extracting quantiles by using rxQuantile; visualizing data (rxSummary and rxCube, rxHistogram and rxLinePlot); processing data with rxDataStep; performing transforms using the transformVars and transformEnvir functions; processing text using RML packages; building predictive models with ScaleR; performing in-database analytics by using SQL Server.

Hands-on Exercise – Read data with R Server from SAS, TXT or Excel formats and convert data to XDF format; summarize data and extract quantiles by using rxQuantile; visualize data (rxSummary, rxCube, rxHistogram and rxLinePlot); perform transforms using the transformVars and transformEnvir functions; build predictive models with ScaleR; perform in-database analytics by using SQL Server.

Microsoft Courses ( Self-Paced Course)

Analyzing Data with SQL Server Reporting Services

What projects will I be working on in this MSBI training?

In this MSBI project, you will be creating data flow tasks in SSIS, creating SSAS Cubes and creating an SSRS report.

Project 1:  SSIS

Problem Statement: Create a data flow task to extract data from the XLS format and store it in the SQL database, storing subcategory- and category-wise sales in a database table. Once you get the output, split the dataset into two other tables: Table 1 should contain the Sales (< 100,000), Category and Subcategory columns; Table 2 should contain the Sales (> 100,000), Subcategory and Category columns. The Sales column should be sorted in both tables. Divide the whole dataset in a 70:30 ratio and store the results in two different tables in the database.

Topics: Data Flow, ODBC Setup and Connection Manager, Flat File Connection, Transformation, Import/Export Transformation, Split and Join Transformation, Merge and Union All Transformation

Highlights:

  • Creating a Data Flow Task
  • ODBC Set up and Connection Manager
  • Transformations

Project 2:  SSRS:

Problem Statement: A survey was conducted among students across many stores in the United States. Using the Student Survey dataset, extract meaningful insights by creating an SSRS report that shows Tabular Visualization, Matrix Visualization, a Funnel chart, a Pie chart, a Scatter plot and Drill-down.

Topics: Report Creation, Deployment, Concepts of Reporting Services, Tablix and Matrix, Expression and Parameters, Charts & Reports

Highlights:

  • Tabular Visualization
  • Matrix Visualization
  • Transformations
  • Funnel chart
  • Pie chart
  • Scatter plot
  • Drill down

Project 3:  SSAS:

Problem Statement: Using the Adventure Works DW 2014 database, build a Cube to show the number of products for each color, the total Sales Amount for each currency and the count of products for each product name. Create a reference relationship between the Product Category and Subcategory tables and show how many subcategories there are for each category. Build partitions for sales amount <= 700 and sales amount > 700. Create aggregations and stop when the performance gain reaches 40%. Build a perspective (a subset of the above Cube) with limited measures and dimension fields.

Topics: Dimensions, Data Dimensions & Cubes, Aggregations, Measures & features of Cube, SSAS Perspectives

Highlights:

  • Data Dimensions & Cubes
  • SSAS Perspectives

Case Study 1: SSIS

Problem Statement: Create an OLE DB connection and load the data into SQL Server from Excel; create a transformation that splits people into age groups; how to create constraints and events in a package; create a project-level parameter and a package-level parameter; how to extract data in incremental order

Topics: Data Flow, ODBC Set up and Connection Manager, Transformation, Split & Join Transformation, Term Extraction and Lookup

Highlights:

  • ODBC Set up and Connection Manager
  • Transformations

Case Study 2: SSRS

Problem Statement: Steps to add a correlated column chart; how to create a report server project; use of Data Connections and the PerformancePoint Content library; steps to create drill-down charts; how to pass a parameter from the main chart to a detail chart (pie); functions of Data Bars and Sparklines; usage of the KPI box in an SSRS dashboard

Topics: Concepts of Reporting Services, Report Creation, Expression and Parameters, Report and Authentication, Deployment

Highlights:

  • Expression and parameters
  • Report and Authentication
  • Deployment

Case Study 3: SSAS

Problem Statement: Knowledge check on data marts, measures, dimensions, Cubes, KPIs and perspectives in SSAS

Topics:  Dimensions, Data Dimensions & Cubes, Measures & features of Cube, SSAS Perspectives

Highlights:

  • Data Dimensions & Cubes
  • SSAS Perspectives

Qlik Sense Introduction and Installation

How Qlik Sense differs from QlikView, the need for self-service Business Intelligence/Business Analytics tools, Qlik Sense data discovery, an intuitive tool for dynamic dashboards and personalized reports, and the installation of Qlik Sense and Qlik Sense Desktop

Hands-on Exercise: Install Qlik Sense and Qlik Sense Desktop

Qlik Sense Features

Drag-and-drop visualization, Qlik Data indexing engine, data dimensions relationships, connect to multiple data sources, creating your own dashboards, data visualization, visual analytics and the ease of collaboration

Hands-on Exercise: Connect to a database or load data from an Excel file and create a dashboard

Qlik Sense Data Model

Understand data modeling, best practices, turning data columns into rows, converting data rows into fields, hierarchical-level data loading, loading new or updated data from database, using a common field to combine data from two tables and handling data inconsistencies

Hands-on Exercise: Turn data columns into rows, convert data rows into fields, load the data in hierarchical level, load new or updated data from database and use a common field to combine data from two tables

Creating a Data Model

Qlik Sense data architecture, understanding QVD layer, converting QlikView files to Qlik Sense files and working on synthetic keys and circular references

Hands-on Exercise: Convert QlikView files to Qlik Sense files and resolve synthetic keys and circular references

Advanced Data Modeling

Qlik Sense star schema, link table, dimensions table, master calendar, QVD files and optimizing data modeling

Hands-on Exercise: Create a Qlik Sense star schema, create link table, dimensions table, master calendar and QVD files

Qlik Sense Enterprise

Qlik Sense enterprise class tools, Qlik Sense custom app, embedding visuals, rapid development, powerful open APIs, enterprise-class architecture, Big Data integration, enterprise security and elastic scaling

Qlik Sense Visualization

Learning about Qlik Sense visualization tools, charts and maps creation, rich data storytelling and sharing analysis visually with compelling visualizations

Hands-on Exercise: Create charts and maps, create a story around dataset and share analysis

Set Analysis

Understanding set analysis in Qlik Sense, various parts of a set expression like identifiers, operators, modifiers and comparative analysis

Hands-on Exercise: Do Set Analysis in Qlik Sense, use set expression like identifiers, operators, modifiers and comparative analysis

Advanced Set Analysis

Learning about set analysis, which is a way of defining a set of data values different from the normal set, deploying comparison sets and point-in-time analysis

Hands-on Exercise: Deploy comparison sets and perform point-in-time analysis

Qlik Sense Charts

Introduction to various charts in Qlik Sense like line chart, bar chart, pie chart, table chart and pivot table chart and the characteristics of various charts

Hands-on Exercise: Plot charts in Qlik Sense like line chart, bar chart, pie chart, table chart and pivot table chart

Advanced Charts

Understanding what is a KPI chart, gauge chart, scatter plots chart and map chart/geo map

Hands-on Exercise: Plot a KPI chart, gauge chart, scatter plots chart and map chart/geo map

Master Library

Introduction to the Qlik Sense Master Library, its benefits, distinct features and user-friendly applications

Hands-on Exercise: Explore and use Qlik Sense Master Library

Qlik Sense Storytelling

Understanding how to do storytelling in Qlik Sense and the creation of storytelling and story playback

Hands-on Exercise: Use the storytelling feature of Qlik Sense, create a story and playback the story

Mashups

Understanding mashups in Qlik Sense, creating a single graphical interface from more than one source, deploying the mashups flowchart, testing mashups and the various mashup scenarios like simple and normal

Hands-on Exercise: Create a single graphical interface from more than one source, deploy the mashups flowchart and test mashups

Extensions

Understanding the Qlik Sense Extension, working with it, various templates in Qlik Sense Extension, testing of it, making Hello World dynamic and learning how it works and adding a preview image

Hands-on Exercise: Work with Qlik Sense Extension, use a template in Qlik Sense Extension and test it, make Hello World dynamic and add a preview image

Security

Various security aspects of Qlik Sense, content security, security rules, various components of security rules and understanding data reductions and dynamic data reductions and the user access workflow

Hands-on Exercise: Create security rules in Qlik Sense and understand data reductions and dynamic data reductions and the user access workflow

What projects will I be working on in this Qlik Sense training?

Project 1

Objective: This project involves building a Qlik Sense dashboard that displays sales details (order-wise, year-wise, customer-wise, product-wise sales and so on), performing comparative analysis, doing a rolling six-month analysis that displays the sales trend, and placing the worksheets in a user story and publishing it.

Project 2

Domain: Data Analytics

Objective: To show the current salary values in one column and the historical values in another, in a chart that combines a bar chart and a trend chart

Project 3

Domain: Healthcare

Objective: Visual Mapping between the vaccination rate and measles outbreak

Installation and Configuration

  • Plan Installation

Evaluate installation requirements; design the installation of SQL Server and its components (drives, service accounts, etc.); plan scale-up vs. scale-out basics; plan for capacity, including if/when to shrink, grow, autogrow, and monitor growth; manage the technologies that influence SQL architecture (e.g., service broker, full text, scale out, etc.); design the storage for new databases (drives, filegroups, partitioning, etc.); design the database infrastructure; configure an SQL Server standby database for reporting purposes; Windows-level security and service-level security; core mode installation; benchmark a server before using it in a production environment (SQLIO, Tests on SQL Instance, etc.); and choose the right hardware

  • Installing SQL Server and Related Services

Test connectivity; enable and disable features; install SQL Server database engine and SSIS (but not SSRS and SSAS); and configure an OS disk

  • Implementing a Migration Strategy

Restore vs. detach/attach; migrate security; migrate from a previous version; migrate to new hardware; and migrate systems and data from other sources

  • Configuring Additional SQL Server Components

Set up and configure all SQL Server components (Engine, AS, RS, and SharePoint integration) in a complex and highly secure environment; configure full-text indexing; SSIS security; filestream; and filetable

  • Manage SQL Server Agent

Create, maintain, and monitor jobs; administer jobs and alerts; and automate setup, maintenance, and monitoring across multiple databases and multiple instances

Managing Instances and Databases

  • Managing and Configuring Databases

Design multiple file groups; database configuration and standardization: autoclose, autoshrink, recovery models; manage file space, including adding new filegroups and moving objects from one filegroup to another; implement and configure contained databases; data compression; configure TDE; partitioning; manage log file growth; DBCC
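
As a quick illustration of the filegroup and standardization topics above, the following T-SQL sketch (the database and file names are placeholders) adds a new filegroup and applies standard autoclose/autoshrink/recovery settings:

```sql
-- Illustrative sketch: SalesDB and file paths are placeholder names.
-- Add a filegroup and a data file to it
ALTER DATABASE SalesDB ADD FILEGROUP Archive;
ALTER DATABASE SalesDB ADD FILE (
    NAME = SalesDB_Archive1,
    FILENAME = 'E:\Data\SalesDB_Archive1.ndf',
    SIZE = 512MB, FILEGROWTH = 128MB
) TO FILEGROUP Archive;

-- Standard configuration: full recovery model, no autoclose/autoshrink
ALTER DATABASE SalesDB SET RECOVERY FULL;
ALTER DATABASE SalesDB SET AUTO_CLOSE OFF;
ALTER DATABASE SalesDB SET AUTO_SHRINK OFF;
```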

  • Configuring SQL Server Instances

Configure and standardize a database: autoclose, autoshrink, recovery models; install default and named instances; configure SQL to use only certain CPUs (affinity masks, etc.); configure server-level settings; configure many databases per instance, many instances per server, virtualization; configure clustered instances including MSDTC; memory allocation; database mail; configure the SQL Server engine: memory, fill factor, sp_configure, default options

  • Implementing an SQL Server Clustered Instance

Install a cluster; manage multiple instances on a cluster; set up subnet clustering; recover from a failed cluster node

  • Managing SQL Server Instances

Install an instance; manage interaction of instances; SQL patch management; install additional instances; manage resource utilization by using Resource Governor; cycle error logs

Optimizing and Troubleshooting

  • Identifying and Resolving Concurrency Problems

Examine deadlocking issues using the SQL Server logs and trace flags; design reporting database infrastructure (replicated databases); monitor via DMVs or other MS products; diagnose blocking, live locking, and deadlocking; diagnose waits; performance detection with built-in DMVs; know what affects performance; and locate and, if necessary, kill processes that are blocking or claiming all resources

  • Collecting, Analyzing, and Troubleshooting Data

Monitor using Profiler; collect performance data by using System Monitor; collect trace data by using SQL Server Profiler; identify transactional replication problems; identify and troubleshoot data access problems; gather performance metrics; identify potential problems before they cause service interruptions; identify performance problems; use XEvents and DMVs; create alerts on critical server conditions; monitor data and server access by creating audits and other controls; identify IO vs. memory vs. CPU bottlenecks; and use the Data Collector tool

  • Auditing SQL Server Instances

Implement a security strategy for auditing and controlling the instance; configure an audit; configure server audits; track who modified an object; monitor elevated privileges as well as unsolicited attempts to connect; and policy-based management
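
The audit workflow described above can be sketched in T-SQL as follows; the audit name and file path are placeholders:

```sql
-- Illustrative only: LoginAudit and C:\AuditLogs are placeholder names.
USE master;
-- 1. Create a server audit that writes events to a file target
CREATE SERVER AUDIT LoginAudit
TO FILE (FILEPATH = 'C:\AuditLogs\');

-- 2. Capture failed login attempts at the server level
CREATE SERVER AUDIT SPECIFICATION FailedLoginSpec
FOR SERVER AUDIT LoginAudit
ADD (FAILED_LOGIN_GROUP);

-- 3. Enable both objects (they are created in a disabled state)
ALTER SERVER AUDIT LoginAudit WITH (STATE = ON);
ALTER SERVER AUDIT SPECIFICATION FailedLoginSpec WITH (STATE = ON);

-- 4. Read the captured events back
SELECT event_time, action_id, server_principal_name
FROM sys.fn_get_audit_file('C:\AuditLogs\*', DEFAULT, DEFAULT);
```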

Managing Data

  • Configuring and Maintaining a Back-up Strategy

Manage different backup models, including point-in-time recovery; protect customer data even if backup media is lost; perform backup/restore based on proper strategies including backup redundancy; recover from a corrupted drive; manage a multi-TB database; implement and test a database implementation and a backup strategy (multiple files for user database and tempdb, spreading database files, backup/restore); back up a SQL Server environment; and back up system databases
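
A minimal backup strategy along these lines might look like the following T-SQL sketch (SalesDB and the disk paths are placeholder names):

```sql
-- Illustrative sketch: database name and backup paths are placeholders.
-- Full database backup (with a checksum to detect media corruption)
BACKUP DATABASE SalesDB
TO DISK = 'D:\Backups\SalesDB_Full.bak'
WITH CHECKSUM, COMPRESSION;

-- Differential backup (changes since the last full backup)
BACKUP DATABASE SalesDB
TO DISK = 'D:\Backups\SalesDB_Diff.bak'
WITH DIFFERENTIAL;

-- Transaction log backup (requires the FULL recovery model;
-- this is what makes point-in-time recovery possible)
BACKUP LOG SalesDB
TO DISK = 'D:\Backups\SalesDB_Log.trn';
```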

  • Restoring Databases

Restore a database secured with TDE; recover data from a damaged DB (several errors in DBCC checkdb); restore to a point in time; file group restore; and page-level restore
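
For example, a point-in-time restore typically chains a full backup restore with a log restore stopped at a given time; the database and path names below are placeholders:

```sql
-- Illustrative sketch: restore SalesDB to just before an accidental delete.
-- Restore the full backup, leaving the database ready for further restores
RESTORE DATABASE SalesDB
FROM DISK = 'D:\Backups\SalesDB_Full.bak'
WITH NORECOVERY;

-- Replay the log up to a specific point in time, then bring the DB online
RESTORE LOG SalesDB
FROM DISK = 'D:\Backups\SalesDB_Log.trn'
WITH STOPAT = '2020-10-03 14:29:00', RECOVERY;
```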

  • Implementing and Maintaining Indexes

Inspect physical characteristics of indexes and perform index maintenance; identify fragmented indexes; identify unused indexes; implement indexes; defrag/rebuild indexes; set up a maintenance strategy for indexes and statistics; optimize indexes (full, filtered index); statistics (full, filtered), force or fix queue; when to rebuild vs. reorganize an index; full-text indexes; and columnstore indexes
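
The rebuild-vs-reorganize decision is commonly driven by the fragmentation levels reported by a DMV; here is a sketch (the index and table names are hypothetical, and the 5%/30% thresholds are common rules of thumb, not hard rules):

```sql
-- Find fragmented indexes in the current database
SELECT i.name, s.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS s
JOIN sys.indexes AS i
  ON i.object_id = s.object_id AND i.index_id = s.index_id
WHERE s.avg_fragmentation_in_percent > 5;

-- Light fragmentation (roughly 5-30%): reorganize in place, always online
ALTER INDEX IX_Orders_CustomerID ON dbo.Orders REORGANIZE;

-- Heavy fragmentation (above ~30%): rebuild the index
ALTER INDEX IX_Orders_CustomerID ON dbo.Orders REBUILD;
```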

  • Importing and Exporting Data

Transfer data; bulk copy; and bulk insert

Implementing Security

  • Managing Logins and Server Roles

Configure server security; secure the SQL Server using Windows Account / SQL Server accounts, server roles; create log in accounts; manage access to the server, SQL Server instance, and databases; create and maintain user-defined server roles; and manage certificate logins

  • Managing Database Security

Configure database security; database-level permissions; protect objects from being modified; auditing; and encryption

  • Managing Users and Database Roles

Create access to the server/database with least privilege; manage security roles for users and administrators; create database user accounts; and contained logins
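
A least-privilege setup of this kind can be sketched like this (the login, database, and role names are placeholders):

```sql
-- Illustrative sketch: names and password are placeholders.
-- Server-level login (SQL authentication)
CREATE LOGIN report_reader WITH PASSWORD = 'Str0ng!Passw0rd';

USE SalesDB;
-- Database user mapped to the login
CREATE USER report_reader FOR LOGIN report_reader;

-- Grant only what is needed: read access via a fixed database role
ALTER ROLE db_datareader ADD MEMBER report_reader;

-- Explicitly deny writes as a safety net (DENY overrides any GRANT)
DENY INSERT, UPDATE, DELETE ON SCHEMA::dbo TO report_reader;
```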

  • Troubleshooting Security

Manage certificates and keys, and endpoints

Implementing High Availability

  • Implementing AlwaysOn
    • Implement AlwaysOn availability groups and AlwaysOn failover clustering
  • Implementing replication
    • Troubleshoot replication problems and identify appropriate replication strategy

What projects will I be working on in this MS SQL Server DBA training?

Project 1: SQL Server Audit

Industry: General

Problem Statement: How to track and log events happening on the database engine

Topics: This project involves implementing an SQL Server audit, including creating the TestDB database, triggering audit events from tables, altering the audit, checking, filtering, etc. You will learn to audit an SQL Server instance by tracking and logging the events on the system. You will work with SQL Server Management Studio and learn about database-level and server-level auditing.

Highlights:

  • SQL Server Management Studio
  • Expanding SQL Server Log Folder
  • Database & Server audit specification

Project 2: Managing SQL Server for a High-tech Company

Industry: Information Technology

Problem Statement: An IT company wants to manage its MS SQL Server database and gain valuable insights from it.

Description: In this project, you will be administering the MS SQL Server database. You will learn about the complete architecture of MS SQL Server. You will be familiarized with the enterprise edition of SQL Server, various tools of SQL Server, and creating and modifying databases in real time.

Highlights:

  • Create database schema in SQL Server
  • Adding, removing, and moving database files
  • Database backup and recovery

Introduction to Data Warehouse

Introducing Data Warehouse and Business Intelligence, understanding the difference between a database and a data warehouse, working with ETL tools, SQL parsing.

Architecture of Data Warehouse

Understanding the Data Warehousing architecture, systems used for reporting and Business Intelligence, understanding OLAP vs. OLTP, introduction to Cubes.

Data Modeling concepts

The various stages from the conceptual model and the logical model to the physical schema, understanding cubes, benefits of a cube, working with OLAP multidimensional cubes, creating a report using a cube.

Data Normalization

Understanding the process of data normalization, rules of normalization for the first, second and third normal forms, BCNF, deploying Erwin for generating SQL scripts.
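
As a brief illustration of normalization, the following sketch (the table and column names are hypothetical) splits repeating customer details out of an order table to reach third normal form:

```sql
-- Unnormalized: customer details repeat on every order row
-- Orders(order_id, customer_name, customer_city, product, price)

-- 3NF: every non-key column depends on the key, the whole key,
-- and nothing but the key
CREATE TABLE Customers (
    customer_id INT PRIMARY KEY,
    customer_name VARCHAR(100),
    customer_city VARCHAR(100)
);

CREATE TABLE Orders (
    order_id INT PRIMARY KEY,
    customer_id INT REFERENCES Customers(customer_id),
    product VARCHAR(100),
    price DECIMAL(10, 2)
);
```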

Dimension & Fact Table

The main components of Business Intelligence – dimensions and fact tables, understanding the difference between fact tables and dimensions, understanding Slowly Changing Dimensions in Data Warehousing.
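
A minimal star-schema sketch of a fact table and a Type 2 slowly changing dimension (all table and column names are hypothetical):

```sql
CREATE TABLE DimProduct (
    product_key INT PRIMARY KEY,     -- surrogate key
    product_name VARCHAR(100),
    category VARCHAR(50),
    valid_from DATE,                 -- Type 2 slowly changing dimension:
    valid_to DATE                    -- keep history by versioning rows
);

CREATE TABLE FactSales (
    product_key INT REFERENCES DimProduct(product_key),
    date_key INT,
    sales_amount DECIMAL(12, 2),     -- additive measure
    quantity INT
);
```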

SQL parsing, Cubes & OLAP

SQL parsing, compilation and optimization, understanding types and scope of cubes, data warehousing vs. cubes, limitations of cubes and the evolution of in-memory analytics.

Erwin Design Layer Architecture (self paced)

Learning the Erwin model, understanding the Design Layer Architecture, data warehouse modeling, creating and designing user-defined domains, managing naming and data type standards.

Forward & Reverse Engineering (self paced)

Understanding forward and reverse engineering and the comparison between the two.

What projects will I be working on in this Data Warehouse training?

Project 1 – Logical & Physical Data Modeling Using Erwin

Data – Invoice Management (Sales)

Topics – This project involves creating a logical and physical data model using the CA Erwin Data Modeler design layer architecture. You will learn about the techniques for turning a logical model into a physical design. With this project, you will be well versed in the process of reverse and forward engineering. You will understand both the top-down and bottom-up design methodologies.

Project 2 – End-to-End Implementation of a Data Warehouse (Retail Store)

Data – Sales

Topics – In this project, you will learn about the process of loading data into a data warehouse using ETL tools. You will learn about the ways to create and deploy a data warehouse. Oracle provides support for multiple physical models but only one logical model in the data warehouse. This project will provide you with extensive experience in integrating, cleansing, customizing and inserting data into the data warehouse for a retail store.

View More

Free Career Counselling

Certification

The Business Intelligence Architect master’s course is a comprehensive course that is designed to help you clear multiple certification exams, such as:

  • Microsoft SQL Server Certification Exam
  • Tableau Desktop Qualified Associate Exam
  • MCSE: Business Intelligence Certification Exam
  • Informatica PowerCenter Developer and Administrator Certification
  • Microsoft 70-761 and 70-762 SQL Server Certification Exam
  • Microsoft 70-778 Certification Exam
  • Microsoft DP-200 Certification Exam

The entire course content is in line with the respective certification programs and helps you clear the requisite certification exams with ease and get the best jobs in top MNCs.

As part of this training, you will be working on real-time projects and assignments that have immense implications in the real-world industry scenarios, thus helping you fast-track your career effortlessly.

At the end of this training program, there will be quizzes that perfectly reflect the type of questions asked in the respective certification exams and help you score better marks.

The Intellipaat Course Completion Certificate will be awarded upon the completion of the project work (after expert review) and upon scoring at least 60% marks in the quiz. Intellipaat certification is well recognized in 80+ top MNCs like Ericsson, Cisco, Cognizant, Sony, Mu Sigma, Saint-Gobain, Standard Chartered, TCS, Genpact, Hexaware, etc.

Our alumni work at 3,000+ top companies


Course Advisor

Suresh Paritala


Solutions Architect at Microsoft, USA

A Senior Software Architect at NextGen Healthcare who has previously worked with IBM Corporation, Suresh Paritala has worked on Big Data, Data Science, Advanced Analytics, Internet of Things and Azure, along with AI domains like Machine Learning and Deep Learning. He has successfully implemented high-impact projects.

David Callaghan


Big Data Solutions Architect, USA

An experienced Blockchain Professional who has been bringing integrated Blockchain, particularly Hyperledger and Ethereum, and Big Data solutions to the cloud, David Callaghan has previously worked on Hadoop, AWS Cloud, Big Data and Pentaho projects that have had major impact on revenues of marquee brands around the world.

Samanth Reddy

Data Team Lead at Sony, USA

A renowned Data Scientist who has worked with Google and is currently working at ASCAP, Samanth Reddy has a proven ability to develop Data Science strategies that have a high impact on the revenues of various organizations. He comes with strong Data Science expertise and has created decisive Data Science strategies for Fortune 500 corporations.

Frequently Asked Questions

What is Intellipaat’s Master’s Course, and how is it different from individual courses?

Intellipaat’s Master’s Course is a structured learning path especially designed by industry experts which ensures that you transform into a Business Intelligence expert. Individual courses at Intellipaat focus on one or two specializations. However, if you have to master Business Intelligence, then this program is for you.


Intellipaat is a pioneer in Business Intelligence Architect training. We provide:

  • Project Work & Assignments – You will work on 32 industry-based projects that will give you hands-on experience with the technology
  • 24/7 Support – Our team works 24/7 to clear all your doubts
  • Free Course Upgrade – Keep yourself updated with the latest version; it is a lifetime investment in one go
  • Flexible Schedule – You can attend as many batches as you want, or if you are busy, you can postpone your classes to our next available batches without any extra charges
  • Resume Preparation & Job Assistance – We will help you prepare your resume and market your profile for jobs. We have more than 80 clients across the globe (India, US, UK, etc.), and we circulate our learners’ profiles to them

Intellipaat offers both self-paced training and online instructor-led training.

MS SQL, MSBI, Tableau, Power BI, Informatica Developer & Admin and Azure Data Factory are online instructor-led courses.

Data Warehousing, Qlik Sense and SQL DBA are self-paced courses.

At Intellipaat, you can enroll in either the instructor-led online training or self-paced training. Apart from this, Intellipaat also offers corporate training for organizations to upskill their workforce. All trainers at Intellipaat have 12+ years of relevant industry experience, and they have been actively working as consultants in the same domain, which has made them subject matter experts. Go through the sample videos to check the quality of our trainers.

Intellipaat offers 24/7 query resolution, and you can raise a ticket with the dedicated support team at any time. You can avail yourself of email support for all your queries. If your query does not get resolved through email, we can also arrange one-on-one sessions with our trainers.

You would be glad to know that you can contact Intellipaat support even after the completion of the training. We also do not put a limit on the number of tickets you can raise for query resolution and doubt clearance.

Intellipaat offers the most updated, relevant, and high-value real-world projects as part of the training program. This way, you can implement the learning that you have acquired in a real-world industry setup. All training comes with multiple projects that thoroughly test your skills, learning, and practical knowledge, making you completely industry-ready.

You will work on highly exciting projects in the domains of high technology, ecommerce, marketing, sales, networking, banking, insurance, etc. After completing the projects successfully, your skills will be equal to 6 months of rigorous industry experience.

Intellipaat actively provides placement assistance to all learners who have successfully completed the training. For this, we are exclusively tied up with over 80 top MNCs from around the world. This way, you can be placed in outstanding organizations such as Sony, Ericsson, TCS, Mu Sigma, Standard Chartered, Cognizant, and Cisco, among other equally great enterprises. We also help you with job interview and résumé preparation.

You can definitely make the switch from self-paced training to online instructor-led training by simply paying the extra amount. You can join the very next batch, which will be duly notified to you.

Once you complete Intellipaat’s training program, working on real-world projects, quizzes, and assignments and scoring at least 60 percent marks in the qualifying exam, you will be awarded Intellipaat’s course completion certificate. This certificate is well recognized in Intellipaat-affiliated organizations, including over 80 top MNCs from around the world and some of the Fortune 500 companies.

No. Our job assistance program is aimed at helping you land your dream job. It offers a potential opportunity for you to explore various competitive openings in the corporate world and find a well-paid job matching your profile. The final decision on hiring will always be based on your performance in the interview and the requirements of the recruiter.

View More

Talk To Us

Select Currency