This Microsoft Azure Data Factory training will equip you with the skills to perform Big Data Engineering on Microsoft Cloud Services. You will learn how to connect Power BI with a data lake and how to use a data factory. The course also covers real-time industry use cases and project work to give you hands-on experience in using Data Factory and Data Lake and deploying them in relevant software pipelines. Learn Azure Data Factory from the best Microsoft-certified Data Factory experts.
Intellipaat's Microsoft Azure DP-200 certification training gives learners the opportunity to become proficient in implementing Azure data solutions. This training ensures that learners improve their skills in Microsoft Azure SQL Data Warehouse, Azure Data Lake Analytics, Azure Data Factory, and Azure Stream Analytics, and then perform data integration and copying using Hive and Spark, respectively. Real-world projects will also be provided, through which learners will learn to design Azure data solutions, data processing, and data security.
This course will prepare you for the DP-200 exam on implementing and designing Azure data solutions, enabling you to design and perform data management, monitoring, security, and privacy using the complete Azure data services stack. You will learn the following topics:
1. Implement Azure Data Solution (DP 200)
Using Azure Stream Analytics, you can build and scale real-time event-processing applications. It delivers greater performance through partitioning, so that complicated queries can be parallelized and run on many streaming nodes.
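The partitioning idea above can be illustrated with a minimal plain-Python sketch (not actual Stream Analytics code; the real service expresses this in a SQL-like query language, and the `deviceId` key here is a made-up example): events sharded by key can be aggregated per partition independently, which is what lets each partition run on its own streaming node.

```python
from collections import defaultdict

def partition_events(events, num_partitions):
    """Shard events by key so each partition can be processed
    independently, mimicking a partitioned Stream Analytics query."""
    partitions = defaultdict(list)
    for event in events:
        partitions[hash(event["deviceId"]) % num_partitions].append(event)
    return partitions

def count_per_partition(partitions):
    # Each partition could run on a separate streaming node in parallel;
    # no partition needs data held by another one.
    return {pid: len(evts) for pid, evts in partitions.items()}

events = [{"deviceId": f"dev-{i % 4}", "reading": i} for i in range(12)]
parts = partition_events(events, 2)
totals = count_per_partition(parts)
```

Because each partition's count depends only on its own events, the per-partition work parallelizes trivially.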
With Azure Databricks, you get the most advanced version of Apache Spark, which you can seamlessly integrate with open-source libraries. Also, with Azure's inherent scalability, which surpasses that of other cloud services, you can easily scale up your Apache Spark clusters.
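As a rough sketch of the programming style you would use in a Databricks notebook, here is a word count written in plain Python; in real PySpark these would be distributed `flatMap`/`reduceByKey` (or DataFrame) transformations running across the cluster, while here everything runs locally for illustration.

```python
# Plain-Python stand-in for a Spark-style pipeline; the input lines are
# made-up sample text, not part of the course material.
lines = ["azure databricks runs spark", "spark scales with the cluster"]

# flatMap: split each line into words
words = [w for line in lines for w in line.split()]

# map: pair each word with a count of 1
pairs = [(w, 1) for w in words]

# reduceByKey: sum the counts per word
counts = {}
for w, n in pairs:
    counts[w] = counts.get(w, 0) + n
```

The value of Spark is that each of these stages can be partitioned across many nodes, which is exactly what a Databricks cluster provides.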
Azure Cosmos DB leverages Jupyter notebooks and Apache Spark to reduce the time to insight by collecting and serving data and performing analysis on local database copies in Azure regions. This DP-200 certification covers the topic in detail.
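A toy sketch of the "local database copies in Azure regions" idea: Cosmos DB replicates data to multiple regions so reads can be served from a nearby copy. The region names, latencies, and document fields below are made up for illustration; they are not real SDK calls or measured numbers.

```python
# Hypothetical replicas of the same document in two Azure regions.
replicas = {
    "East US": {"latency_ms": 120, "data": {"id": "order-1", "total": 42}},
    "West Europe": {"latency_ms": 15, "data": {"id": "order-1", "total": 42}},
}

def read_from_nearest(replicas):
    # Pick the replica with the lowest round-trip latency, analogous to
    # how the Cosmos DB SDK prefers the closest readable region.
    region = min(replicas, key=lambda r: replicas[r]["latency_ms"])
    return region, replicas[region]["data"]

region, doc = read_from_nearest(replicas)
```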
Yes, Azure SQL Database is included in this DP-200 certification course.
This Microsoft Azure Data Factory certification is ideal for candidates who are looking to start their career or are already working in the following roles:
Further, candidates who design analytics solutions and build operationalized solutions on Azure and who are familiar with the features and capabilities of batch data processing, real-time processing, and operationalization technologies, etc., can also opt for this Azure Data Engineering training course.
Relevant work experience in Data Engineering issues with Azure SQL Data Warehouse, Azure Data Lake, Azure Data Factory, and Azure Stream Analytics
Azure data engineers enable companies to convert all their big data from storage systems and relational and non-relational databases into data-driven workflows. This helps companies form concrete strategies, accomplish goals, and improve the market value of the data they own. For this, they require certified professionals.
In the United States, the average salary for a newcomer in this position is about US$85,000, and in India, it is around ₹700,000 (a rough estimate). Experienced candidates with a recognized DP-200 certification can earn up to US$120,000 in the United States and up to ₹1,500,000 in India.
Today, every company is moving towards cloud computing to meet growing customer expectations and gain a competitive advantage. Microsoft Azure is growing at an unprecedented rate. Therefore, there is an urgent need for Azure certified management professionals.
Intellipaat Azure certification training (DP-200 certification) gives you hands-on experience using Azure services, storage, servers, and more. You will work on managing virtual machines, protecting and managing identities. After you get certified, you can apply for the best jobs at top salaries.
1.1 Document data stores
1.2 Columnar data stores
1.3 Key/value data stores
1.4 Graph data stores
1.5 Time series data stores
1.6 Object data stores
1.7 External index
1.8 Why NoSQL or Non-Relational DB?
1.9 When to Choose NoSQL or Non-Relational DB?
1.10 Azure Data Lake Storage
2.1 Data Lake Key Concepts
2.2 Azure Cosmos DB
2.3 Why Azure Cosmos DB?
2.4 Azure Blob Storage
2.5 Why Azure Blob Storage?
2.6 Data Partitioning
2.7 Why Partitioning Data?
2.8 Consistency Levels in Azure Cosmos DB
1. Load Data from Amazon S3 to ADLS Gen2 with Data Factory
2. Working with Azure Cosmos DB
3.1 Introduction to Relational Data Stores
3.2 Azure SQL Database
1. Create a Single Database Using Azure Portal
2. Create a managed instance
3. Create an elastic pool
3.3 Why SQL Database Elastic Pool?
1. Create a SQL virtual machine
2. Configure active geo-replication for Azure SQL Database in the Azure portal and initiate failover.
4.1 Azure SQL Security Capabilities
4.2 High-Availability and Azure SQL Database
4.3 Azure Database for MySQL
1. Design an Azure Database for MySQL database using the Azure portal
2. Connect using MySQL Workbench
4.4 Azure Database for PostgreSQL
1. Design an Azure Database for PostgreSQL – Single Server
4.5 Azure Database For MariaDB
1. Create an Azure Database for MariaDB server by using the Azure portal
4.6 What is PolyBase?
4.7 What is Azure Synapse Analytics (formerly SQL DW)?
1. Import Data From Blob Storage to Azure Synapse Analytics by Using PolyBase
5.1 What is Azure Batch?
5.2 Intrinsically Parallel Workloads
5.3 Tightly Coupled Workloads
5.4 Additional Batch Capabilities
5.5 Working of Azure Batch
1. Run a batch job using Azure Portal
2. Parallel File Processing with Azure Batch using the .NET API
3. Render a Blender Scene using Batch Explorer
4. Parallel R Simulation with Azure Batch
6.1 Flow Process of Data Factory
6.2 Why Azure Data Factory
6.3 Integration Runtime in Azure Data Factory
6.4 Mapping Data Flows
1. Transform data using Mapping data flows
7.1 What is Azure Databricks?
7.2 Azure Spark-based Analytics Platform
7.3 Apache Spark in Azure Databricks
1. Run a Spark Job on Azure Databricks using the Azure portal
2. ETL Operation by using Azure Databricks
3. Stream data into Azure Databricks using Event Hubs
8.1 Working of Stream Analytics
8.2 Key capabilities and benefits
1. Analyze phone call data with Stream Analytics and visualize the results in a Power BI dashboard
8.3 Stream Analytics Windowing Functions
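A tumbling window, one of the windowing functions covered in this module, groups events into fixed, non-overlapping time intervals. Below is a minimal plain-Python sketch of the idea (the real feature is written in Stream Analytics' SQL-like query language, e.g. `TumblingWindow(second, 5)`; the sample events are made up):

```python
from collections import defaultdict

def tumbling_window(events, window_seconds):
    """Group (timestamp, value) events into fixed, non-overlapping
    windows keyed by the window's start time."""
    windows = defaultdict(list)
    for ts, value in events:
        start = ts // window_seconds * window_seconds
        windows[start].append(value)
    return dict(windows)

# Events at t=1, 4, 6, 11 with 5-second windows: [0-5), [5-10), [10-15)
events = [(1, 10), (4, 20), (6, 30), (11, 40)]
w = tumbling_window(events, 5)
```

Each event lands in exactly one window, which is the defining property of a tumbling window as opposed to a hopping or sliding window.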
9.1 What is Azure Monitor?
9.2 What data does Azure Monitor collect?
9.3 What can you Monitor?
9.4 Alerts in Azure
1. Create, View, and Manage Metric alerts using Azure Monitor
2. Monitor your Azure Data Factory Pipelines proactively with Alerts
9.5 Azure Security Logging & Auditing
1. Azure SQL Database Auditing
Free Career Counselling
In this Azure Data Factory project, you will automate the transformation of the real-time video list from a YouTube channel. You will store multiple files at a dynamic location in Azure Data Lake Store, and these files need to be transformed and copied to a data store. The list of channels should be displayed on Power BI dynamically.
Project 01: Fetch the list of videos from the attached dataset of the YouTube channel with the highest views and likes in order to promote advertisements on the channel that has the maximum traffic.
Topics: Azure Data Factory, Azure Data Lake, Triggers, SQL, Power BI
1.1 Creating Azure Data Factory
1.2 Creating Pipelines
1.3 Creating a trigger that runs a pipeline on a schedule
1.4 Transforming data using SQL
1.5 Connecting Azure Data Lake to Power BI
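The ranking step of Project 01 can be sketched in a few lines of Python. This is illustrative only: the records and field names below are made-up stand-ins for the project's actual YouTube dataset, and in the project itself this transformation would run inside a Data Factory pipeline rather than a local script.

```python
# Hypothetical sample records standing in for the project's dataset.
videos = [
    {"title": "Intro to ADF", "views": 1200, "likes": 90},
    {"title": "Data Lake Basics", "views": 5400, "likes": 310},
    {"title": "Triggers Deep Dive", "views": 800, "likes": 45},
]

def top_videos(videos, n=2):
    # Rank by views, breaking ties with likes, matching the project's
    # "highest views and likes" criterion.
    return sorted(videos,
                  key=lambda v: (v["views"], v["likes"]),
                  reverse=True)[:n]

best = top_videos(videos)
```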
Project 02: Working with Azure Data Factory, Data Lake and Azure SQL
Problem Statement: You are working as an Azure Architect for Zendrix Corp. This company is a service-based company and has its major revenue from the sales it makes for its subscription-based service.
The company needs to continuously monitor its lead flow from different countries. This helps it strategize how much to invest in ad marketing for a particular country, which, in turn, helps it achieve its desired sales targets.
Currently, the company has to manually synchronize data from its live SQL database to its BI tool to check the lead flow from different countries.
The company wants an automated solution, using which they will be able to see a live dashboard of the lead count. You as an Architect have suggested the following things:
2.1 Use of Power BI Heat maps
2.2 Use of Azure SQL instead of On-Premise SQL
2.3 Use of Data Factory to automate the data lifecycle from SQL to the BI tool.
Help them achieve the above goals.
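The core aggregation behind the live dashboard, counting leads per country, can be sketched as follows. The lead records are made-up samples; in the project, this data would live in Azure SQL and flow to Power BI through a Data Factory pipeline rather than through local Python.

```python
# Hypothetical lead records standing in for rows in the live SQL database.
leads = [
    {"country": "India", "email": "a@example.com"},
    {"country": "US", "email": "b@example.com"},
    {"country": "India", "email": "c@example.com"},
]

def lead_count_by_country(leads):
    # Tally leads per country, the figure the Power BI heat map would plot.
    counts = {}
    for lead in leads:
        counts[lead["country"]] = counts.get(lead["country"], 0) + 1
    return counts

counts = lead_count_by_country(leads)
```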
Project 03: Identify the videos that get maximum traffic in selected YouTube channels
Problem Statement: Getting the real-time list of maximum traffic fetching videos from YouTube channels to promote advertisements in the same channels (traffic should be considered on a weekly basis)
Description: There is a company ‘XYZ Pvt. Ltd’ that promotes advertisements in the maximum traffic generating YouTube channels (on a weekly basis) to drive profits. To maximize profitability, the marketing team that manages the posting of advertisements requires an interface using which they can get a real-time list of YouTube channels for promoting advertisements and monitoring the analytics of traffic on those channels.
Objective: As an Azure Data Factory specialist, you are supposed to automate the transformation of the real-time video list from YouTube channels on a weekly basis. This will help the marketing team promote advertisements on the right YouTube videos on targeted channels.
Note: The traffic can be analyzed based on various parameters, such as the number of views, likes, or comments on a particular day. You can get this publicly available data from the YouTube API.
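Since Project 03 asks for traffic "on a weekly basis", the key step is rolling daily statistics up into per-week totals. Here is a minimal sketch of that grouping; the records and field names are assumptions for illustration, not the YouTube API's actual response schema.

```python
from datetime import date

# Hypothetical per-day view counts for two videos.
daily_stats = [
    {"video": "v1", "day": date(2023, 1, 2), "views": 100},
    {"video": "v1", "day": date(2023, 1, 3), "views": 150},
    {"video": "v2", "day": date(2023, 1, 2), "views": 80},
]

def weekly_views(stats):
    # Group views by (video, ISO week number) so that traffic can be
    # compared week over week.
    totals = {}
    for s in stats:
        key = (s["video"], s["day"].isocalendar()[1])
        totals[key] = totals.get(key, 0) + s["views"]
    return totals

totals = weekly_views(daily_stats)
```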
Case Study 01: Non-Relational Data Stores
Problem Statement: Knowledge check of non-relational databases: Categories and where to use them
Topics: NoSQL or Non-Relational Database, Azure Data Lake Storage and its key components.
1.1 Scenarios where you can use NoSQL or Non-Relational Database.
1.2 Categories of Non-Relational or NoSQL databases with relevant Azure services
1.3 Azure Data Lake Storage and its key components.
Case Study 02: Non-Relational Data Stores
Problem Statement: Copy data from Azure Blob Storage to Azure Data Lake Storage Gen2; Create an Azure Cosmos DB account and Demonstrate adding and removing regions from your Database account; Strategies for Partitioning data; Semantics of consistency levels in Cosmos DB
Topics: Azure Cosmos DB, Azure Data Factory, Blob Storage, Strategies for Partitioning Data, Semantics of consistency levels in Cosmos DB
2.1 Azure Blob Storage
2.2 Azure Data Lake Storage Gen2
2.3 Azure Cosmos DB
2.4 Partitioning data
2.5 Consistency levels
Case Study 03: Relational Data Stores
Problem Statement: Knowledge check of Relational databases: Deployment models in Azure SQL; Create an elastic pool, Azure SQL Security Capabilities; Import Data From Blob Storage to Azure Synapse Analytics by Using PolyBase
Topics: Azure SQL, PolyBase, Azure Synapse Analytics
3.1 Deployment models in Azure SQL
3.2 Elastic Pool
3.3 Azure Synapse Analytics
Case Study 04: Azure Batch, Azure Data Factory
Problem Statement: Working of Azure Batch; Flow Process of Data Factory; Types of Integration Runtime in Azure Data Factory; Transform data using Mapping data flows
Topics: Azure Batch, Data Factory, Integration Runtime, Mapping Data Flows
4.1 Working of Azure Batch
4.2 Integration Runtime in Azure Data Factory
4.3 Transform data using Mapping data flows
Case Study 05: Azure Databricks, Azure Stream Analytics
Problem Statement: ETL Operation by using Azure Databricks; Working of Stream Analytics; Stream Analytics Windowing Functions
Topics: Azure Databricks, Azure Stream Analytics, Windowing Functions
5.1 ETL operation by using Azure Databricks
5.2 Working of Stream Analytics
5.3 Windowing Functions
Case Study 06: Monitoring & Security
Problem Statement: Create, View, and Manage Metric alerts using Azure Monitor; Azure SQL Database Auditing
Topics: Azure Monitor, Alerts in Azure, Azure Security Logging & Auditing
6.1 Azure Monitor
6.2 Azure SQL Database Auditing
This course is designed to clear the following certifications:
You will also receive the course completion certificate by Microsoft for Performing Big Data Engineering on Microsoft Cloud Services.
The entire course is in line with the above curricula and helps you get the best jobs in top MNCs. As part of this training, you will work on real-time projects and assignments that have immense relevance to real-world industry scenarios, helping you fast-track your career. At the end of this training program, there will be a quiz that reflects the type of questions asked in the certification exam and helps you score better.
The Intellipaat Course Completion Certificate will be awarded upon the completion of the project work (after expert review) and upon scoring at least 60% marks in the quiz. Intellipaat certification is well recognized in 80+ top MNCs such as Ericsson, Cisco, Cognizant, Sony, Mu Sigma, Saint-Gobain, Standard Chartered, TCS, Genpact, and Hexaware.
Intellipaat offers Azure administration certification training that is in line with clearing the Microsoft Azure DP-200 certification exam. This training will equip you with all the skills needed to perform Big Data Engineering on Microsoft Cloud Services. It will also help you take on bigger responsibilities in the Azure administration domain and be awarded the DP-200 certificate.
At Intellipaat, you can enroll in either the instructor-led online training or self-paced training. Apart from this, Intellipaat also offers corporate training for organizations to upskill their workforce. All trainers at Intellipaat have 12+ years of relevant industry experience, and they have been actively working as consultants in the same domain, which has made them subject matter experts. Go through the sample videos to check the quality of our trainers.
Intellipaat offers 24/7 query resolution, and you can raise a ticket with the dedicated support team at any time. You can avail yourself of email support for all your queries. If your query does not get resolved through email, we can also arrange one-on-one sessions with our trainers.
You would be glad to know that you can contact Intellipaat support even after the completion of the training. We also do not put a limit on the number of tickets you can raise for query resolution and doubt clearance.
Intellipaat is offering you the most updated, relevant, and high-value real-world projects as part of the training program. This way, you can implement the learning that you have acquired in real-world industry setup. All training comes with multiple projects that thoroughly test your skills, learning, and practical knowledge, making you completely industry-ready.
You will work on highly exciting projects in the domains of high technology, e-commerce, marketing, sales, networking, banking, insurance, etc. After completing the projects successfully, your skills will be equivalent to 6 months of rigorous industry experience.
Intellipaat actively provides placement assistance to all learners who have successfully completed the training. For this, we are exclusively tied up with over 80 top MNCs from around the world. This way, you can be placed in outstanding organizations such as Sony, Ericsson, TCS, Mu Sigma, Standard Chartered, Cognizant, and Cisco, among other equally great enterprises. We also help you with job interview and résumé preparation.
You can definitely make the switch from self-paced training to online instructor-led training by simply paying the extra amount. You can join the very next batch, which will be duly notified to you.
Once you complete Intellipaat's training program, working on real-world projects, quizzes, and assignments and scoring at least 60 percent marks in the qualifying exam, you will be awarded Intellipaat's course completion certificate. This certificate is well recognized in Intellipaat-affiliated organizations, including over 80 top MNCs from around the world, some of which are Fortune 500 companies.
No. Our job assistance program is aimed at helping you land your dream job. It offers you a potential opportunity to explore various competitive openings in the corporate world and find a well-paid job matching your profile. The final decision on hiring will always be based on your performance in the interview and the requirements of the recruiter.