Azure Databricks Lakehouse: Ways to Load Data
By Data Storyteller IT community
Event Details
Registration link: https://datazen.top/1cdxs
In this presentation we will discuss the different ways to view and load data from external data sources, files, and Azure storage into the Azure Databricks Lakehouse, as well as how to monitor and troubleshoot the data loads. Among the options, we will cover Auto Loader, the COPY INTO command, the ADF copy functionality, and Optimized Spark. We will also discuss Unity Catalog and loading data via the UI.
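As a rough illustration of one of these options, below is a minimal Auto Loader sketch in PySpark, assuming a Databricks cluster (the cloudFiles source is Databricks-specific). It is not taken from the talk: the storage paths, schema and checkpoint locations, and the target table main.bronze.orders are hypothetical placeholders, and COPY INTO or the ADF copy activity would cover the same ingestion step with SQL or a pipeline instead.

# Minimal Auto Loader sketch (hypothetical paths and table names).
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Auto Loader incrementally discovers new files landing in the source path.
stream = (
    spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation",
            "abfss://landing@mystorage.dfs.core.windows.net/_schemas/orders")
    .load("abfss://landing@mystorage.dfs.core.windows.net/orders/")
)

# Write the discovered records into a Delta table; the checkpoint records which
# files were already ingested, so reruns only pick up new arrivals.
(
    stream.writeStream
    .option("checkpointLocation",
            "abfss://landing@mystorage.dfs.core.windows.net/_checkpoints/orders")
    .trigger(availableNow=True)
    .toTable("main.bronze.orders")
)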
Entry Fees
Free Registration
Event Frequency
One Time
Event Timings
(GMT+3:00) Kiev
06:00 PM - 07:00 PM (Jan 24) (General)
Organizer
Data Storyteller IT community
Epam Data Storyteller is a Data Platform initiative of the UA Data Analytics Engineering unit. This group unites IT professionals interested in Business Intelligence, Big Data, Clouds and Data Science. We are open to anyone who is interested in de...