DCC

Diploma in Cloud Computing

Duration: 4 Months

OBJECTIVE

This course provides a comprehensive understanding of cloud computing, covering key concepts, benefits, service models (IaaS, PaaS, SaaS), and real-world applications. Participants gain hands-on experience in data processing, storage, and analytics, preparing them to build practical cloud-based solutions.

COURSE COVERAGE

1. INTRODUCTION TO CLOUD COMPUTING

BASICS OF CLOUD COMPUTING

Cloud Computing Concepts: Service Models (IaaS, PaaS, SaaS) and Deployment Models (Public, Private, Hybrid Clouds) - Benefits and Challenges: Scalability, Cost Efficiency, Flexibility, Security, Data Management.

2. HADOOP ECOSYSTEM

HADOOP OVERVIEW

Introduction to Hadoop: History and Evolution, Components - Architecture: HDFS, YARN, MapReduce.

HDFS & MAPREDUCE

HDFS Architecture: Nodes, Block Storage - File Operations - MapReduce Programming Model: Mapper and Reducer, Job Configuration.
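The Mapper and Reducer roles above can be sketched in plain Python, with no Hadoop cluster required. This is a minimal simulation of the MapReduce model for word counting; the function names and the in-memory "shuffle" are illustrative, whereas real Hadoop distributes this work across nodes.

```python
from collections import defaultdict

def mapper(line):
    # Map phase: emit (word, 1) pairs, as a Hadoop Streaming mapper would.
    for word in line.lower().split():
        yield (word, 1)

def reducer(word, counts):
    # Reduce phase: sum all counts grouped under one key.
    return (word, sum(counts))

def map_reduce(lines):
    # Shuffle/sort stand-in: group intermediate pairs by key.
    groups = defaultdict(list)
    for line in lines:
        for key, value in mapper(line):
            groups[key].append(value)
    return dict(reducer(k, v) for k, v in groups.items())

result = map_reduce(["big data on hadoop", "hadoop runs mapreduce"])
print(result["hadoop"])  # total occurrences of "hadoop" across all lines
```

The same mapper and reducer, written as scripts reading stdin and writing stdout, are exactly what a Hadoop Streaming job configuration would plug into HDFS-scale input.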

DATA INGESTION & TOOLS

Flume: Setup and Usage - Sqoop: Data Import/Export - Apache Pig: Pig Latin Basics - Apache Oozie: Workflow Scheduling.
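Sqoop's core job, moving rows between a relational database and files on HDFS, can be made concrete with Python's standard library. The table and column names below are invented, and the CSV output stands in for the files Sqoop writes; real Sqoop performs the transfer as parallel MapReduce tasks.

```python
import csv
import io
import sqlite3

# A toy relational table standing in for the RDBMS side of a Sqoop import.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 20.0)])

# "Import": dump rows to CSV text, the default format Sqoop writes to HDFS.
buf = io.StringIO()
writer = csv.writer(buf)
for row in conn.execute("SELECT id, amount FROM orders ORDER BY id"):
    writer.writerow(row)

exported = buf.getvalue()
print(exported)
```

Export (HDFS back to the database) is the mirror image: parse the CSV lines and run parameterized INSERTs against the target table.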

3. APACHE SPARK

SPARK OVERVIEW

Introduction: Components, Architecture - RDDs, DataFrames, Datasets - Spark SQL - Spark Streaming - Spark MLlib.

SPARK PROGRAMMING

Development: Writing Applications in Python, Scala, Java - Data Processing - Performance Tuning.

4. APACHE HIVE

HIVE BASICS

Architecture and Hive Metastore - Hive Query Language (DDL/DML) - Advanced Queries (Joins, Aggregations) - UDFs - Partitions and Bucketing.
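HiveQL's DDL, DML, and aggregation queries read almost like standard SQL, so they can be tried without a Hive metastore using Python's built-in sqlite3. The table and column names are invented, and Hive-specific features such as PARTITIONED BY and bucketing have no sqlite equivalent; the HiveQL counterparts are noted in comments.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# DDL: in Hive this might be CREATE TABLE sales (...) PARTITIONED BY (region STRING).
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")

# DML: Hive would use LOAD DATA or INSERT; plain parameterized INSERTs here.
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("east", 10.0), ("east", 5.0), ("west", 7.5)])

# Aggregation with GROUP BY, written identically in HiveQL.
totals = dict(conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region"))
print(totals)  # per-region totals
```

In Hive, partitioning the table by region would let this query prune entire HDFS directories instead of scanning every row.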

5. OPEN-SOURCE TOOLS

NOSQL & CONTAINERIZATION

HBase Overview - Integration with Hadoop - Docker Basics: Container Management - Kubernetes Overview: Orchestration Basics.
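For the Docker basics topic, a minimal Dockerfile makes container management concrete. This is a sketch only: the base image tag and script name are placeholders, not part of the course materials.

```dockerfile
# Minimal image for a Python data-processing script (names are illustrative).
FROM python:3.11-slim
WORKDIR /app
COPY process.py .
CMD ["python", "process.py"]
```

Building it with `docker build` and running it with `docker run` covers the container lifecycle; Kubernetes then orchestrates many such containers across a cluster.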