
8/13/25, 1:50 AM Get started tutorials on Azure Databricks - Azure Databricks | Microsoft Learn

Get started tutorials on Azure Databricks


06/11/2025

The tutorials in this section introduce core features and guide you through the basics of working
with the Azure Databricks platform.

For information about online training resources, see Get free Databricks training.

If you do not have an Azure Databricks account, sign up for a free trial.

Tutorial: Query and visualize data
Use a Databricks notebook to query sample data stored in Unity Catalog using SQL, Python, Scala, and R, and then visualize the query results in the notebook.

Tutorial: Import and visualize CSV data from a notebook
Use a Databricks notebook to import data from a CSV file containing baby name data from https://health.data.ny.gov into your Unity Catalog volume using Python, Scala, and R. You also learn to modify a column name, visualize the data, and save to a table.

Tutorial: Create a table
Create a table and grant privileges in Databricks using the Unity Catalog data governance model.

Tutorial: Build an ETL pipeline using Lakeflow Declarative Pipelines
Create and deploy an ETL (extract, transform, and load) pipeline for data orchestration using Lakeflow Declarative Pipelines and Auto Loader.

Tutorial: Build an ETL pipeline using Apache Spark
Develop and deploy your first ETL (extract, transform, and load) pipeline for data orchestration with Apache Spark™.

Tutorial: Train and deploy an ML model
Build a machine learning classification model using the scikit-learn library on Databricks to predict whether a wine is considered "high-quality". This tutorial also illustrates the use of MLflow to track the model development process, and Hyperopt to automate hyperparameter tuning.

Tutorial: Query LLMs and prototype AI agents with no-code
Use the AI Playground to query large language models (LLMs) and compare results side-by-side, prototype a tool-calling AI agent, and export your agent to code.

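As a rough local illustration of the column-rename step described in the CSV-import tutorial, the sketch below reads CSV text, renames a column, and writes the result back out. This is not Databricks code: it uses only the Python standard library, and the sample rows are invented stand-ins for the health.data.ny.gov baby-name data.

```python
import csv
import io

# Hypothetical sample rows standing in for the baby-name CSV.
raw = io.StringIO(
    "Year,First Name,County,Sex,Count\n"
    "2021,OLIVIA,Albany,F,22\n"
    "2021,LIAM,Albany,M,27\n"
)

rows = list(csv.DictReader(raw))

# Rename the "First Name" column to "first_name", mirroring the
# column-rename step in the tutorial.
renamed = [
    {("first_name" if k == "First Name" else k): v for k, v in row.items()}
    for row in rows
]

# Write the modified data back out as CSV.
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=list(renamed[0].keys()))
writer.writeheader()
writer.writerows(renamed)
print(out.getvalue())
```

In the actual tutorial the same rename is done on a Spark DataFrame and the result is saved to a Unity Catalog table; the logic here only shows the shape of the transformation.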
Source: https://learn.microsoft.com/en-us/azure/databricks/getting-started/
Tutorial: Connect to Azure Data Lake Storage
Connect from Azure Databricks to Azure Data Lake Storage using OAuth 2.0 with a Microsoft Entra ID service principal.
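The OAuth connection in that tutorial typically comes down to a handful of ABFS driver properties in the Spark configuration. A hedged sketch of those settings is below; every bracketed value is a placeholder you would supply (storage account name, service principal application ID, client secret, and directory/tenant ID), and the secret would normally come from a secret scope rather than being written inline.

```
fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net OAuth
fs.azure.account.oauth.provider.type.<storage-account>.dfs.core.windows.net org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider
fs.azure.account.oauth2.client.id.<storage-account>.dfs.core.windows.net <application-id>
fs.azure.account.oauth2.client.secret.<storage-account>.dfs.core.windows.net <service-credential>
fs.azure.account.oauth2.client.endpoint.<storage-account>.dfs.core.windows.net https://login.microsoftonline.com/<directory-id>/oauth2/token
```

The same properties can equivalently be set per-session with `spark.conf.set(key, value)` in a notebook.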

Get help
If you have any questions about setting up Azure Databricks and need live help, please email [email protected].

If your organization does not have an Azure Databricks support subscription, or if you are not an authorized contact for your company's support subscription, you can get answers from the Databricks Community.
