
Workshops for Professionals:
Lakehouse DataOps with Databricks
From development to production
Goal
Moving from a proof of concept to production-ready pipelines can be a daunting task. To navigate this transition effectively, it is crucial to understand the key steps involved in productionizing your data platform.
In this masterclass, we will guide you through what it takes to build a scalable data platform and show you how to move your pipelines smoothly from development to production using the Lakehouse architecture on Databricks.
Details
This course will show you what it takes to build a scalable data platform in four parts:
1) Introduction to Databricks and the Lakehouse architecture, including Unity Catalog and Delta Lake.
2) The five main parts of a scalable data platform-as-a-service.
3) How to develop data products using Databricks dbx, Workflows & PySpark.
4) How to productionize your code & data pipelines using Git pipelines.
Target Audience
Data Engineers, DevOps Engineers, Data Scientists
Prerequisites: Basic Python, Basic Linux, Basic Git

Tailor-Made
We offer tailor-made workshops for your teams, to your specifications.
Contact us via +41792018191 or academy@d-one.ai
Robert Yousif
Robert holds a B.Sc. in Mechanical Engineering and an M.Sc. in Mechatronics from KTH Royal Institute of Technology in Stockholm. Before joining D ONE, he worked as a data & software engineer for KPMG in Stockholm. Robert has been with the team since 2021.