How to Become a Data Engineer as an ETL Developer

Durga Gadiraju
Data Engineering on Cloud

--

Are you a traditional ETL Developer who wants to become a Data Engineer but is not sure how? Here is a two-part series where you will learn the details of transitioning from a traditional ETL Developer to a Data Engineer on the cloud using AWS, Python, SQL, Spark, and more.

As part of this two-part video series, we cover how to become a Data Engineer if you are an experienced ETL, PL/SQL, Data Warehouse, or Mainframes Developer. If you are an experienced Oracle PL/SQL, Informatica, Talend, Ab Initio, Microsoft SSIS/SSRS, or DataStage Developer, then transitioning to a Data Engineer role is all but inevitable.

In these sessions we answer most of your questions about why and how you should transition to Data Engineering, with examples based on our extensive experience.

Here is the link to the Data Engineering using AWS Analytics program: https://itversity.com/bundle/data-engineering-using-aws-analytics

For sales inquiries: support@itversity.com

As this is a very detailed topic, we will cover all the topics below across two 1.5-hour sessions. The links to the videos follow the agenda.

What is Data Engineering, and why should ETL, PL/SQL, Data Warehouse, and Mainframes Developers take it seriously?

  • Conventional Data Warehousing + Modern Analytics
  • Data Engineering on Cloud Platforms — AWS, GCP, Azure, Databricks, Snowflake, CDP, etc.

What are all the different systems Data Engineers deal with?

  • Variety of source or upstream systems — Purpose-Built Databases, Files, REST APIs
  • Data Lake
  • Downstream systems such as Data Warehouses or MPP databases, NoSQL databases, and external systems

What are the key skills, and to what level should ETL, PL/SQL, Data Warehouse, and Mainframes Developers know them? A minimal code sketch combining a few of these skills follows the list below.

  • REST APIs and JSON with demo
  • SQL with demo
  • Orchestration with example or demo
  • Python with demo
  • Key Integrations with demo
  • Cloud and Serverless with demo
  • Performance Tuning with demo
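
To give a concrete flavor of how a few of these skills (REST APIs, JSON, SQL, and Python) come together, here is a minimal, hypothetical sketch of a small ingestion step. It is not taken from the session demos; the endpoint URL, field names, and table name are placeholders.

```python
# Minimal sketch: pull JSON from a REST API and load it into a SQL table.
# The endpoint and field names are placeholders, not from any specific demo.
import sqlite3

import requests

API_URL = "https://api.example.com/v1/orders"  # placeholder endpoint


def extract_orders(url: str) -> list:
    """Call the REST API and return the parsed JSON payload (a list of dicts)."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return response.json()


def load_orders(orders: list, db_path: str = "orders.db") -> None:
    """Create the target table if needed and insert the records using SQL."""
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS orders (order_id TEXT, amount REAL, status TEXT)"
        )
        conn.executemany(
            "INSERT INTO orders (order_id, amount, status) VALUES (?, ?, ?)",
            [(o["order_id"], o["amount"], o["status"]) for o in orders],
        )


if __name__ == "__main__":
    load_orders(extract_orders(API_URL))
```

In practice a step like this would be scheduled by an orchestrator and would write to a proper warehouse rather than SQLite, but the shape of the work is the same.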

Details about our Guided Program on AWS (programs for other platforms will follow). Here are some of the highlights of the program, followed by a small code sketch of the kind of AWS work involved.

  • Python and SQL
  • AWS Essentials for Data Engineers, covering the Data Lake, Distributed Compute, Data Warehouse, and other purpose-built services
  • Mastering AWS Lambda Functions for Data Engineers — to build or enhance data pipelines
  • Mastering AWS Elastic MapReduce (EMR) for Data Engineers — to build data pipelines that process large-scale data using distributed computing
  • Mastering AWS Redshift as a Data Warehouse for Data Engineers — to build Data Marts or a Data Warehouse that serves enterprise reports and dashboards
  • Mastering AWS Athena and Glue Data Catalog — for ad-hoc analysis of data as well as to build data pipelines for large-scale data
  • Mastering Amazon Managed Streaming for Apache Kafka (MSK) — to build streaming data pipelines integrating with Spark and other purpose-built AWS Services
  • Performance Tuning Guide for Data Engineers on AWS — Data Ingestion, Data Lake, Data Processing, Loading Data
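
As a taste of what working with these services looks like in code, here is a minimal sketch that submits an ad-hoc Athena query against a Glue Data Catalog table using boto3. The database name, query, and S3 output location are hypothetical placeholders, and the snippet assumes AWS credentials are already configured.

```python
# Minimal sketch: run an ad-hoc Athena query against a Glue Data Catalog table
# using boto3. The database, query, and S3 output location are placeholders.
import time

import boto3

athena = boto3.client("athena")


def run_athena_query(sql: str, database: str, output_s3: str) -> str:
    """Submit a query, poll until it finishes, and return the query execution id."""
    query_id = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output_s3},
    )["QueryExecutionId"]

    while True:
        state = athena.get_query_execution(QueryExecutionId=query_id)[
            "QueryExecution"
        ]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)  # Athena runs queries asynchronously; poll for completion

    if state != "SUCCEEDED":
        raise RuntimeError(f"Query {query_id} finished with state {state}")
    return query_id


if __name__ == "__main__":
    query_id = run_athena_query(
        sql="SELECT order_status, count(*) AS cnt FROM orders GROUP BY order_status",
        database="retail_db",  # placeholder Glue database
        output_s3="s3://my-athena-results-bucket/queries/",  # placeholder bucket
    )
    results = athena.get_query_results(QueryExecutionId=query_id)
    for row in results["ResultSet"]["Rows"]:
        print([col.get("VarCharValue") for col in row["Data"]])
```

Athena executes queries asynchronously, which is why the sketch polls the execution state before fetching results.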

Other details related to the course

  • Cost and Timelines for the course
  • Delivery Mode (Hybrid) — Self-Paced with continuous support
  • Labs and Additional Costs
  • Refund Policy
  • Placement Assistance or Support
  • Alumni Club

If you are interested in understanding more about the path to becoming a Data Engineer from roles such as ETL Developer, PL/SQL Developer, Data Warehouse Developer, or Mainframes Developer, please go through the details by watching the videos from the webinars or meetups.

Here is the article for the Part 1 session of the two-part series on the Roadmap to Become a Data Engineer for ETL, PL/SQL, Data Warehouse, and Mainframes Developers.

Here is the article for the Part 2 session of the two-part series on the Roadmap to Become a Data Engineer for ETL, PL/SQL, Data Warehouse, and Mainframes Developers.
