Introduction to Hadoop and its constituent ecosystem, understanding MapReduce and HDFS, Big Data, factors constituting Big Data, Hadoop and the Hadoop ecosystem, MapReduce: concepts of Map, Reduce, ordering, concurrency, shuffle and reducing, Hadoop Distributed File System (HDFS) concepts and its importance, deep dive into MapReduce, execution framework, partitioner, combiner, data types, key-value pairs, HDFS deep dive: architecture, data replication, NameNode, DataNode, dataflow, parallel copying with distcp and Hadoop archives
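To make the Map, Reduce, and combiner concepts above concrete, here is a minimal word-count sketch using the Hadoop MapReduce Java API; the class names and the decision to show both classes in one listing are illustrative assumptions, not part of the course material:

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

// Mapper: emits (word, 1) for every token in an input line.
public class WordCountMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        StringTokenizer tokens = new StringTokenizer(value.toString());
        while (tokens.hasMoreTokens()) {
            word.set(tokens.nextToken());
            context.write(word, ONE);
        }
    }
}

// Reducer: after the shuffle it receives each word with all of its counts
// and sums them; the same class can also be reused as a combiner.
class WordCountReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        int sum = 0;
        for (IntWritable value : values) {
            sum += value.get();
        }
        context.write(key, new IntWritable(sum));
    }
}
```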
Installing Hadoop in pseudo-distributed mode, understanding important configuration files, their properties and daemon threads, accessing HDFS from the command line, MapReduce: basic exercises, understanding the Hadoop ecosystem, introduction to Sqoop, use cases and installation, introduction to Hive, use cases and installation, introduction to Pig, use cases and installation, introduction to Oozie, use cases and installation, introduction to Flume, use cases and installation, and introduction to YARN
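The module above covers accessing HDFS from the command line; as a complement, the sketch below does the same thing programmatically through the Hadoop FileSystem Java API. The fs.defaultFS URI and the /user path are placeholder assumptions for a pseudo-distributed install:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsBrowse {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Assumed NameNode address for a pseudo-distributed setup.
        conf.set("fs.defaultFS", "hdfs://localhost:9000");

        try (FileSystem fs = FileSystem.get(conf)) {
            // Roughly equivalent to `hdfs dfs -ls /user` on the command line.
            for (FileStatus status : fs.listStatus(new Path("/user"))) {
                System.out.println(status.getPath() + "\t" + status.getLen());
            }
        }
    }
}
```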
Mini Project: Importing MySQL data using Sqoop and querying it using Hive
How to develop a MapReduce application, writing unit tests, and best practices for developing, writing and debugging MapReduce applications
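As one illustration of how a MapReduce application is put together, here is a minimal driver sketch that wires the word-count mapper and reducer from the earlier example into a Job; the input and output paths are assumptions passed on the command line:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCountDriver {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCountDriver.class);

        job.setMapperClass(WordCountMapper.class);
        job.setCombinerClass(WordCountReducer.class); // the combiner reuses the reducer
        job.setReducerClass(WordCountReducer.class);

        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        // Assumed HDFS paths supplied as program arguments.
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```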
What is Pig, Pig's features, Pig use cases, interacting with Pig, basic data analysis with Pig, Pig Latin syntax, loading data, simple data types, field definitions, data output, viewing the schema, filtering and sorting data, and commonly used functions
Hands-on Exercise: Using Pig for ETL processing
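The hands-on exercise above targets ETL processing with Pig; below is a minimal sketch of that kind of load-filter-sort-store step, run through Pig's embedded Java API (PigServer). The input file, schema, and filter condition are illustrative assumptions:

```java
import org.apache.pig.ExecType;
import org.apache.pig.PigServer;

public class PigEtlExample {
    public static void main(String[] args) throws Exception {
        // Local mode keeps the example self-contained; use ExecType.MAPREDUCE on a cluster.
        PigServer pig = new PigServer(ExecType.LOCAL);

        // Load raw comma-separated records, keep only adults, sort by age, store the result.
        pig.registerQuery("people = LOAD 'people.txt' USING PigStorage(',') "
                + "AS (name:chararray, age:int, city:chararray);");
        pig.registerQuery("adults = FILTER people BY age >= 18;");
        pig.registerQuery("sorted = ORDER adults BY age DESC;");
        pig.store("sorted", "adults_out");

        pig.shutdown();
    }
}
```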
What is Hive, Hive schema and data storage, comparing Hive to traditional databases, Hive vs. Pig, Hive use cases, interacting with Hive, relational data analysis with Hive, Hive databases and tables, basic HiveQL syntax, data types, joining data sets, and common built-in functions
Hands-on Exercise: Running Hive queries from the shell, scripts and Hue
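Besides the shell, scripts, and Hue covered in the exercise above, Hive queries can also be issued programmatically; the sketch below goes through the HiveServer2 JDBC driver. The connection URL, credentials, and the customers table are assumptions:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcExample {
    public static void main(String[] args) throws Exception {
        // HiveServer2 JDBC driver, assumed to be on the classpath.
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // Assumed HiveServer2 endpoint for a local pseudo-distributed setup.
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:hive2://localhost:10000/default", "hive", "");
             Statement stmt = conn.createStatement();
             // Hypothetical table, e.g. one populated by a Sqoop import.
             ResultSet rs = stmt.executeQuery(
                     "SELECT city, COUNT(*) FROM customers GROUP BY city")) {
            while (rs.next()) {
                System.out.println(rs.getString(1) + "\t" + rs.getLong(2));
            }
        }
    }
}
```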
Why Hadoop testing is important, unit testing, integration testing, performance testing, diagnostics, nightly QA tests, benchmark and end-to-end tests, functional testing, release certification testing, security testing, scalability testing, testing of commissioning and decommissioning of DataNodes, reliability testing and release testing
Understanding the requirement, preparation of the testing estimation, test cases, test data, test bed creation, test execution, defect reporting, defect retesting, daily status report delivery, test completion, ETL testing at every stage (HDFS, Hive and HBase) while loading the input (logs, files, records, etc.) using Sqoop/Flume, including but not limited to data verification, reconciliation, and user authorization and authentication testing (groups, users, privileges, etc.), reporting defects to the development team or manager and driving them to closure, consolidating all the defects into defect reports, and validating new features and issues in core Hadoop
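As one concrete flavour of the data verification and reconciliation steps above, the hedged sketch below compares the row count of a source MySQL table with the count of the Hive table that a Sqoop load produced; all connection details, credentials, and table names are assumptions for the test bed:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class RowCountReconciliation {

    // Runs a COUNT(*) against whichever JDBC endpoint is supplied.
    private static long countRows(String url, String user, String password, String table)
            throws Exception {
        try (Connection conn = DriverManager.getConnection(url, user, password);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM " + table)) {
            rs.next();
            return rs.getLong(1);
        }
    }

    public static void main(String[] args) throws Exception {
        // JDBC drivers assumed on the classpath; the MySQL driver class name
        // depends on the Connector/J version in use.
        Class.forName("com.mysql.jdbc.Driver");
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // Assumed source and target endpoints; adjust to the actual test bed.
        long sourceCount = countRows("jdbc:mysql://localhost:3306/sales", "test", "secret", "orders");
        long targetCount = countRows("jdbc:hive2://localhost:10000/default", "hive", "", "orders");

        if (sourceCount == targetCount) {
            System.out.println("Reconciliation passed: " + sourceCount + " rows in both systems");
        } else {
            System.out.println("Defect: MySQL has " + sourceCount
                    + " rows but Hive has " + targetCount);
        }
    }
}
```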
Reporting defects to the development team or manager and driving them to closure, consolidating all the defects into defect reports, validating new features and issues in core Hadoop, and being responsible for building a testing framework with MRUnit for the testing of MapReduce programs
Automation testing using Oozie, and data validation using the QuerySurge tool
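For the Oozie-driven automation mentioned above, a workflow can be submitted and polled from Java through the Oozie client API; the sketch below is a minimal, hedged example where the server URL and the HDFS application path are assumptions:

```java
import java.util.Properties;

import org.apache.oozie.client.OozieClient;
import org.apache.oozie.client.WorkflowJob;

public class OozieWorkflowLauncher {
    public static void main(String[] args) throws Exception {
        // Assumed Oozie server endpoint.
        OozieClient oozie = new OozieClient("http://localhost:11000/oozie");

        // Job properties; the workflow.xml is assumed to already exist in HDFS.
        Properties conf = oozie.createConfiguration();
        conf.setProperty(OozieClient.APP_PATH, "hdfs://localhost:9000/user/test/workflows/etl-test");
        conf.setProperty("queueName", "default");

        // Submit and start the workflow, then poll until it finishes.
        String jobId = oozie.run(conf);
        while (oozie.getJobInfo(jobId).getStatus() == WorkflowJob.Status.RUNNING) {
            Thread.sleep(10_000);
        }
        System.out.println("Workflow " + jobId + " finished: " + oozie.getJobInfo(jobId).getStatus());
    }
}
```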
Test plan for HDFS upgrade, test automation and results
How to test installation and configuration
Project 1: Working with MapReduce, Hive and Sqoop
Problem Statement: This project describes how to import MySQL data using Sqoop and query it using Hive, and also how to run a word-count MapReduce job.
Project 2: Testing Hadoop Using MRUnit
Problem Statement: How to test a Hadoop application using MRUnit
Topics: This project involves working with MRUnit to test a Hadoop application without spinning up a cluster. You will learn how to write map and reduce tests for an application.
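Below is a minimal sketch of the kind of MRUnit test Project 2 asks for, driving the word-count mapper and reducer from the earlier examples without spinning up a cluster; the class names are carried over from those illustrative sketches:

```java
import java.util.Arrays;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mrunit.mapreduce.MapDriver;
import org.apache.hadoop.mrunit.mapreduce.ReduceDriver;
import org.junit.Test;

public class WordCountMRUnitTest {

    @Test
    public void mapperEmitsOneCountPerWord() throws Exception {
        // Feed one input line to the mapper and assert the (word, 1) pairs it emits.
        MapDriver.newMapDriver(new WordCountMapper())
                .withInput(new LongWritable(0), new Text("big data big"))
                .withOutput(new Text("big"), new IntWritable(1))
                .withOutput(new Text("data"), new IntWritable(1))
                .withOutput(new Text("big"), new IntWritable(1))
                .runTest();
    }

    @Test
    public void reducerSumsCountsForAKey() throws Exception {
        // Feed one key with its grouped values and assert the summed output.
        ReduceDriver.newReduceDriver(new WordCountReducer())
                .withInput(new Text("big"), Arrays.asList(new IntWritable(1), new IntWritable(1)))
                .withOutput(new Text("big"), new IntWritable(2))
                .runTest();
    }
}
```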
Intellipaat is the pioneer in Hadoop training. This is a comprehensive Hadoop testing training that will provide you with all the requisite skills for detecting, analyzing and rectifying errors in a Hadoop cluster. You will also gain knowledge of the various Hadoop components such as HDFS, MapReduce, Hive, Sqoop, Pig, HBase, Flume and Oozie. You will master various test-case scenarios and PoC implementation. Upon completion of the training, you will be awarded the Intellipaat Hadoop Testing Certification.
Intellipaat offers lifetime access to videos, course materials, 24/7 support and course material upgrades to the latest version at no extra fee. For Hadoop testing training, you get lifetime access to the Intellipaat proprietary virtual machine and 6 months of free cloud access for performing the training exercises. Hence, it is clearly a one-time investment. We are also exclusively partnered with IBM to provide you with IBM Certified Hadoop Professional training.
This course is designed for clearing the Intellipaat Hadoop Testing Certification. The entire training course content has been designed by industry professionals to help you get the best jobs in top MNCs. As part of this training, you will be working on real-time projects and assignments that have immense implications in real-world industry scenarios, thus helping you fast-track your career effortlessly.
At the end of this training program, there will be quizzes that perfectly reflect the type of questions asked in the respective certification exams and help you score better marks.
The certification will be awarded upon the completion of the project work (after expert review) and upon scoring at least 60% marks in the quiz. Intellipaat certification is well recognized in top 80+ MNCs like Ericsson, Cisco, Cognizant, Sony, Mu Sigma, Saint-Gobain, Standard Chartered, TCS, Genpact, Hexaware, etc.
A Senior Software Architect at NextGen Healthcare who has previously worked with IBM Corporation, Suresh Paritala has worked on Big Data, Data Science, Advanced Analytics, Internet of Things and Azure, along with AI domains like Machine Learning and Deep Learning. He has successfully implemented high-impact projects in major corporations around the world.
An experienced Blockchain professional who has been bringing integrated Blockchain, particularly Hyperledger and Ethereum, and Big Data solutions to the cloud, David Callaghan has previously worked on Hadoop, AWS Cloud, Big Data and Pentaho projects that have had a major impact on the revenues of marquee brands around the world.