Prophecy Data Transformation Copilot For Data Engineering


mayoufi


Last updated 5/2024
MP4 | Video: h264, 1280x720 | Audio: AAC, 44.1 KHz
Language: English (US) | Size: 1.49 GB | Duration: 5h 19m


Learn Databricks and Spark data engineering to deliver self-service data transformation and speed pipeline development

What you'll learn
Learn and design the data lakehouse paradigm for an e-commerce company
A hands-on lab environment is provided with this course
Implement and deploy a medallion architecture using Prophecy running on Databricks
Understand Apache Spark and its best practices with real-life use cases
Share and extend Pipeline components with data practitioners and analysts
Deploy Pipelines to production with CI/CD and best practices
Utilize version control and change management in data engineering
Deploy data quality checks and unit tests

Requirements
No programming experience needed. You will use a low-code UI to build a real-life data implementation

Description
This course is designed to help data engineers and analysts build and deploy a cloud data lakehouse architecture using Prophecy's Data Transformation Copilot. It was created to help you embark on your data engineering journey with Spark and Prophecy.

We will start by staging the ingested data from application platforms like Salesforce, operational databases with CDC transactional data, and machine-generated data like logs and metrics. We will then clean and normalize the ingested tables to prepare a complete, clean, and efficient data model. From that data model, we will build four projects, each a consumption application for a different real-world use case. With each project, you will learn something new:

A spreadsheet export for your finance department, where we will explore data modeling and transformation concepts. Since the finance department cares deeply about data quality, we will also learn how to set up unit and integration tests to maintain it.

An alerting system for your operational support team to ensure customer success, where we will learn orchestration best practices.

A sales data upload that can be ingested back into Salesforce, where we will explore advanced extensibility concepts that allow us to create and follow standardized practices.

A dashboard built directly on Databricks for your product team to monitor live usage, where we will learn a lot about observability and data quality.

The best part? All of the code we will be building is completely open-source and accessible, so you will be able to apply everything you learn here in your real projects. Our team of best-in-class data engineers and architects, with experience from companies like Salesforce, Databricks, and Instagram, will walk you through building out these use cases step by step.
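To give a feel for the kind of data-quality unit tests the course pairs with its pipelines, here is a minimal sketch in plain Python. The function and field names are illustrative assumptions, not Prophecy or Databricks APIs:

```python
# Illustrative data-quality check of the kind a pipeline's unit tests
# might encode before exporting records to the finance department.
# All names here are hypothetical, not from the course materials.

def check_orders(rows):
    """Validate cleaned order records; return a list of violations."""
    violations = []
    seen_ids = set()
    for i, row in enumerate(rows):
        # No duplicate order IDs in the export.
        if row.get("order_id") in seen_ids:
            violations.append(f"row {i}: duplicate order_id {row['order_id']}")
        seen_ids.add(row.get("order_id"))
        # Amounts must be present and non-negative.
        if row.get("amount") is None or row["amount"] < 0:
            violations.append(f"row {i}: amount must be non-negative")
        # Every record needs a currency code.
        if not row.get("currency"):
            violations.append(f"row {i}: missing currency")
    return violations


good = [{"order_id": 1, "amount": 19.99, "currency": "USD"}]
bad = [
    {"order_id": 1, "amount": -5, "currency": ""},
    {"order_id": 1, "amount": 3.0, "currency": "USD"},
]

assert check_orders(good) == []
assert len(check_orders(bad)) == 3  # negative amount, missing currency, duplicate id
```

In practice such checks would run against Spark DataFrames inside the pipeline's test suite rather than plain dicts, but the pattern of asserting invariants on cleaned data is the same.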
 
