
8/13/25, 1:50 AM Tutorial: Query and visualize data from a notebook - Azure Databricks | Microsoft Learn

Tutorial: Query and visualize data from a notebook

05/08/2025

This get started article walks you through using an Azure Databricks notebook to query sample
data stored in Unity Catalog with SQL, Python, Scala, and R, and then visualize the query
results in the notebook.

Requirements
To complete the tasks in this article, you must meet the following requirements:

Your workspace must have Unity Catalog enabled. For information on getting started with
Unity Catalog, see Get started with Unity Catalog.
You must have permission to use an existing compute resource or to create a new one. See Get
started tutorials on Azure Databricks, or ask your Databricks administrator.

Step 1: Create a new notebook


To create a notebook in your workspace, click New in the sidebar, and then click Notebook.
A blank notebook opens in the workspace.

To learn more about creating and managing notebooks, see Manage notebooks.

Step 2: Query a table


Query the samples.nyctaxi.trips table in Unity Catalog using the language of your choice.

1. Copy and paste the following code into the new empty notebook cell. This code displays the
results from querying the samples.nyctaxi.trips table in Unity Catalog.

SQL


SELECT * FROM samples.nyctaxi.trips

Python

display(spark.read.table("samples.nyctaxi.trips"))

Scala

display(spark.read.table("samples.nyctaxi.trips"))

R

library(SparkR)
display(sql("SELECT * FROM samples.nyctaxi.trips"))

2. Press Shift+Enter to run the cell and then move to the next cell.

The query results appear in the notebook.
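You can't run the exact query outside a Unity Catalog-enabled workspace, but the shape of the SQL is easy to sketch locally. The following minimal sketch uses Python's built-in sqlite3 as a stand-in for Unity Catalog; the table name is shortened to trips, the column names mirror samples.nyctaxi.trips, and the rows are invented sample data, not the real dataset:

```python
import sqlite3

# In-memory database standing in for Unity Catalog (illustration only).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE trips (trip_distance REAL, fare_amount REAL, pickup_zip TEXT)"
)

# Made-up sample rows; the real samples.nyctaxi.trips data differs.
conn.executemany(
    "INSERT INTO trips VALUES (?, ?, ?)",
    [(1.2, 7.5, "10003"), (3.4, 14.0, "10001"), (0.8, 5.0, "10003")],
)

# Same query shape as the tutorial's SELECT * FROM samples.nyctaxi.trips
rows = conn.execute("SELECT * FROM trips").fetchall()
print(rows)
```

In a Databricks notebook you would not build the table yourself; the SQL cell runs directly against the managed sample table, and display() renders the result grid.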

Step 3: Display the data


Display the average fare amount by trip distance, grouped by the pickup zip code.

1. Next to the Table tab, click + and then click Visualization.

The visualization editor displays.

2. In the Visualization Type drop-down, verify that Bar is selected.

3. Select fare_amount for the X column.

4. Select trip_distance for the Y column.



5. Select Average as the aggregation type.

6. Select pickup_zip as the Group by column.

7. Click Save.
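Conceptually, the bar chart's Average aggregation with Group by: pickup_zip is a group-by-then-average over the query results. A minimal plain-Python sketch of that computation, using a few invented (pickup_zip, trip_distance) pairs rather than the real table:

```python
from collections import defaultdict

# Hypothetical (pickup_zip, trip_distance) pairs; the real table has more columns.
trips = [("10001", 3.5), ("10003", 1.25), ("10003", 0.75), ("10001", 2.5)]

# Group by pickup_zip, then average each group -- what the chart's
# "Average" aggregation with "Group by: pickup_zip" computes.
groups = defaultdict(list)
for zip_code, distance in trips:
    groups[zip_code].append(distance)

avg_by_zip = {z: sum(d) / len(d) for z, d in groups.items()}
print(avg_by_zip)  # {'10001': 3.0, '10003': 1.0}
```

The visualization editor does this aggregation for you; the sketch is only meant to make the grouping semantics concrete.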

Next steps
To learn how to add data from a CSV file to Unity Catalog and visualize it, see Tutorial:
Import and visualize CSV data from a notebook.
To learn how to load data into Databricks using Apache Spark, see Tutorial: Load and
transform data using Apache Spark DataFrames.
To learn more about ingesting data into Databricks, see Standard connectors in Lakeflow
Connect.
To learn more about querying data with Databricks, see Query data.
To learn more about visualizations, see Visualizations in Databricks notebooks and SQL
editor.
To learn more about exploratory data analysis (EDA) techniques, see Tutorial: EDA techniques
using Databricks notebooks.

https://learn.microsoft.com/en-us/azure/databricks/getting-started/quick-start