Introduction

Analytics is the systematic discovery of meaningful patterns in data to support decision-making. Data rules the business world today, and companies that use it effectively hold the competitive edge. Business decision-making requires insights faster than ever, and as data grows more complex, the need for data-driven insights in a digestible form becomes ever more important.

WCDE

The WCDE is a globally recognized certification offered by Wiley, a 200-year-old, billion-dollar company. It helps establish you as a leader in the field, giving employers and customers tangible evidence of your skills and expertise, and serving as proof of job readiness and professional competence. The certification is more than a bullet point on your resume: it demonstrates that you are professionally, inquisitively, and intellectually capable of handling data with moderate analytics skills across the related tools and technologies. As a WCDE holder, you will be able to develop reliable, autonomous, scalable data pipelines that produce optimized data sets for a variety of workloads.

Objectives

  1. Understand the concepts of big data
  2. Understand the concept of storing and processing data with Hadoop 2
  3. Demystify the concepts of Hive, Pig, Spark, and Scala
  4. Learn Hadoop ecosystem tools such as Oozie, Flume, and ZooKeeper
  5. Understand Storm and NoSQL

Certification

WILEY International Certification – WCDE (Wiley Certified Data Engineer)

Who can attend

  • Software developers and other software professionals
  • Data Engineers
  • Data science enthusiasts
  • Analytics professionals with in-depth experience developing data engineering solutions and a high level of mastery

Eligibility

  • Mandatory training must be completed through an Authorized Training Partner (prerequisite for certification)
  • Basics of programming and object-oriented programming (OOP) concepts
  • Basics of a scripting language (such as Perl or Ruby)
  • Basic familiarity with Linux/Unix operating systems
  • Good understanding of the Java programming language and Core Java
  • Understanding of SQL statements

Course Modules
  • Introduction to Big Data
  • Storing and Processing Data in Hadoop 2
  • Hive and Pig
  • Spark and Scala
  • Advanced Hadoop Tools – Oozie, ZooKeeper, Sqoop, Flume
  • Storm and NoSQL
  • Apache Drill and Impala