Big Data Bootcamp

Short course

In London

£ 675 + VAT

Description

  • Type: Short course
  • Level: Beginner
  • Location: London
  • Class hours: 21h
  • Duration: 3 Days
  • Start date: Different dates available

This big data training course provides a technical overview of Apache Hadoop for project managers, business managers and data analysts. Students will gain an understanding of the overall big data space and the technologies involved, along with a detailed overview of Apache Hadoop.

The course uses real-world use cases to demonstrate the capabilities of Apache Hadoop. Students will also learn about YARN and HDFS, and how to develop applications and analyze Big Data stored in Apache Hadoop using Apache Pig and Apache Hive. Each topic includes hands-on exercises.

Facilities

Location

London, 78 Cannon Street, EC4N 6AG

Start date

Different dates available. Enrolment now open.

About this course

Learning objectives

Upon completion of this course, you will:

Learn about the big data ecosystem
Understand the benefits and ROI you can get from your existing data
Learn about Hadoop and how it is transforming the workplace
Learn about MapReduce and the Hadoop Distributed File System (HDFS)
Learn about using Hadoop to identify new business opportunities
Learn about using Hadoop to improve data management processes
Learn about using Hadoop to clarify results
Learn about using Hadoop to expand your data sources
Learn about scaling your current workflow to handle more users and lower your overall costs
Learn about the various technologies that make up the Hadoop ecosystem
Learn how to write a simple MapReduce job in Java or your favorite programming language
Learn how to use a simple scripting language (Apache Pig) to transform your data
Learn how to use a SQL-like declarative language (Apache Hive) to analyze large quantities of data
Learn how to connect your existing data warehouse to the Hadoop ecosystem
Learn how to move your data into the Hadoop ecosystem
Learn how to move the results of your data analysis to business intelligence tools such as Tableau
Learn how to automate your workflows using Oozie
Learn about polyglot persistence and identifying the right tool for the right job
Learn about future trends in big data and technologies to keep an eye on
Discover tips and tricks behind successful Hadoop deployments

Anybody who is involved with databases or data analysis, or who is wondering how to deal with mountains of data (anywhere from gigabytes of user/log data to petabytes), will benefit from this course.

This course is perfect for:

Business Analysts
Software Engineers
Project Managers
Data Analysts
Business Customers
Team Leaders
System Analysts

No prior knowledge of big data or Hadoop is required for this class. Some prior programming experience is a plus, but not necessary.


Reviews

This centre's achievements

2018

All courses are up to date

The average rating is higher than 3.7

More than 50 reviews in the last 12 months

This centre has featured on Emagister for 6 years

Subjects

  • Big Data
  • HIVE
  • Pig
  • Hadoop
  • Clusters
  • Data Mining
  • SQL
  • Sqoop
  • Oozie
  • HDFS
  • Apache

Teachers and trainers (1)

Bright Solutions

Trainer

Course programme

1. Introduction to Big Data

* Big Data – beyond the obvious trends
* Exponentially increasing data
* Big data sources
* Data warehousing, business intelligence, analytics, predictive statistics, data science

2. Survey of Big Data technologies

First generation systems
Second generation systems
Enterprise search
Visualizing and understanding data with processing
NoSQL databases
Apache Hadoop

3. Introduction to Hadoop

What is Hadoop? Who are the major vendors?
A dive into the Hadoop ecosystem
Benefits of using Hadoop
How can you use Hadoop within your infrastructure?

4. Introduction to MapReduce

What is MapReduce?
Why do you need MapReduce?
Using MapReduce with Java and Ruby (a minimal Java example is sketched below)
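
The following is a minimal word-count sketch using the Hadoop Java MapReduce API; the class name and the input/output paths are placeholders, not part of the course material.

    // A minimal Hadoop word-count sketch (Hadoop 2.x MapReduce API).
    // Class names and paths are illustrative placeholders.
    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {

        // Map phase: emit (word, 1) for every word in the input line
        public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
            private final static IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            public void map(Object key, Text value, Context context)
                    throws IOException, InterruptedException {
                for (String token : value.toString().split("\\s+")) {
                    if (!token.isEmpty()) {
                        word.set(token);
                        context.write(word, ONE);
                    }
                }
            }
        }

        // Reduce phase: sum the counts for each word
        public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
            public void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable v : values) {
                    sum += v.get();
                }
                context.write(key, new IntWritable(sum));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "word count");
            job.setJarByClass(WordCount.class);
            job.setMapperClass(TokenizerMapper.class);
            job.setCombinerClass(SumReducer.class);
            job.setReducerClass(SumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

The job would typically be packaged as a JAR and submitted with hadoop jar wordcount.jar WordCount /input /output.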

5. Introduction to YARN

What is YARN?
What are the advantages of using YARN over classic MapReduce?
Using YARN with Java and Ruby (see the command sketch below)
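
The commands below sketch how the word-count job from the previous section is submitted as a YARN application and inspected with the YARN command-line tools; the JAR name, paths and application ID are placeholders.

    # Submit the MapReduce job; YARN schedules it as an application
    yarn jar wordcount.jar WordCount /user/demo/input /user/demo/output

    # List applications known to the ResourceManager
    yarn application -list

    # Fetch aggregated logs for a finished application (the ID is a placeholder)
    yarn logs -applicationId application_1520000000000_0001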

6. Introduction to HDFS

What is HDFS?
Why do you need a distributed file system?
How is a distributed file system different from a traditional file system?
What is unique about HDFS compared to other file systems?
How does HDFS provide reliability?
Does it offer support for compression, checksums and data integrity? (A few common HDFS shell commands are sketched below.)
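
The commands below are a short sketch of everyday HDFS shell usage; the paths and file names are placeholders.

    # Create a directory in HDFS and copy a local file into it
    hdfs dfs -mkdir -p /user/demo
    hdfs dfs -put access.log /user/demo/access.log

    # List the directory and read the file back
    hdfs dfs -ls /user/demo
    hdfs dfs -cat /user/demo/access.log

    # Report block and replication information for the file
    hdfs fsck /user/demo/access.log -files -blocks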

7. Data Transformation

Why do you need to transform data?
What is Pig?
Use cases for Pig (a sample Pig script is sketched below)
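
The sketch below shows the kind of transformation covered in this module, written in Pig Latin; the input file and its field layout are assumptions made for illustration.

    -- Load a tab-separated log file (the schema is an assumption for illustration)
    logs = LOAD '/user/demo/access.log' AS (userid:chararray, url:chararray, bytes:long);

    -- Keep only large responses, then count requests per user
    big = FILTER logs BY bytes > 10000;
    by_user = GROUP big BY userid;
    counts = FOREACH by_user GENERATE group AS userid, COUNT(big) AS requests;

    -- Write the result back to HDFS
    STORE counts INTO '/user/demo/requests_per_user';
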
8. Structured Data Analysis

How do you handle structured data with Hadoop?
What is Hive/HCatalog?
Use cases for Hive/HCatalog (a sample Hive query is sketched below)
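
The HiveQL sketch below shows how structured data stored in HDFS can be queried with a SQL-like language; table and column names are illustrative only.

    -- Expose a directory of comma-separated files in HDFS as a table
    CREATE EXTERNAL TABLE IF NOT EXISTS page_views (
        user_id STRING,
        url     STRING,
        bytes   BIGINT
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    LOCATION '/user/demo/page_views';

    -- A familiar SQL-style aggregation, executed as Hadoop jobs
    SELECT user_id, COUNT(*) AS views, SUM(bytes) AS total_bytes
    FROM page_views
    GROUP BY user_id
    ORDER BY views DESC
    LIMIT 10;
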
9. Loading data into Hadoop

How do you move your existing data into Hadoop?
What is Sqoop? (a sample import is sketched below)
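
The command below sketches a typical Sqoop import; the connection string, credentials and table name are placeholders.

    # Import a relational table into HDFS
    sqoop import \
      --connect jdbc:mysql://dbserver:3306/sales \
      --username demo -P \
      --table orders \
      --target-dir /user/demo/orders \
      --num-mappers 4
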
10. Automating workflows in Hadoop

Benefits of automation
What is Oozie?
Automatically running workflows
Setting up workflow triggers (a minimal workflow definition is sketched below)
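
The sketch below shows the overall shape of an Oozie workflow.xml with a single Pig action; the workflow name, script and transitions are placeholders.

    <!-- workflow.xml: run one Pig script, then finish (names are placeholders) -->
    <workflow-app xmlns="uri:oozie:workflow:0.5" name="daily-transform">
        <start to="pig-transform"/>

        <action name="pig-transform">
            <pig>
                <job-tracker>${jobTracker}</job-tracker>
                <name-node>${nameNode}</name-node>
                <script>transform.pig</script>
            </pig>
            <ok to="end"/>
            <error to="fail"/>
        </action>

        <kill name="fail">
            <message>Pig step failed: ${wf:errorMessage(wf:lastErrorNode())}</message>
        </kill>
        <end name="end"/>
    </workflow-app>

The workflow would be submitted with oozie job -config job.properties -run; time-based triggers of the kind mentioned above are added with a separate coordinator definition.
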
11. Exploring opportunities in your own organization

Framing scenarios
Understanding how to ask questions
Tying possibilities to your own business drivers
Common opportunities
Real world examples

12. Hands-on Exercises

How to use MapReduce in Hadoop?
How to use YARN within Hadoop?
Overview of HDFS commands
Hands-on activities with Pig
Hands-on activities with Hive/HCatalog
Hands-on activities with Sqoop
Demonstration of Oozie
