New cookbook - Supply-Chain Copilot with OpenAI Agent SDK and Databricks MCP Servers #1935
base: main
Conversation
"- Classical forecasting models for fine-grained demand and lead-time predictions \n", | ||
"- Graph based route optimizations \n", | ||
"- Vector-indexed e-mail archives that enable semantic search across unstructured communications for delay signals\n", | ||
"- Generative-AI workflows that surface insights and mitigation options in plain language\n", |
nit: 'Generative AI'
Thanks, I've updated it now
"\n", | ||
"The easiest way is to add a profile to `~/.databrickscfg`. The snippet’s `WorkspaceClient(profile=...)` call will pick that up. It tells the SDK which of those stored credentials to load, so your code never needs to embed tokens. Another option would be to create environment variables such as `DATABRICKS_HOST` and `DATABRICKS_TOKEN`, but using `~/.databrickscfg` is recommended.\n", | ||
"\n", | ||
"`pip install openai databricks-sdk`" |
I would have this runnable in a separate cell:
! pip install openai databricks-sdk
Done, thanks
I think there are quite a few other dependencies I would add here too, as you get further along:
! pip install openai databricks-sdk databricks-cli databricks-mcp openai-agents
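For anyone following the authentication discussion above, here is a minimal sketch of the profile-based setup the notebook excerpt describes. The profile name `supply-chain-demo` and the host/token placeholders are made up for illustration; substitute your own workspace details.

```python
from databricks.sdk import WorkspaceClient

# Assumes a profile like the following exists in ~/.databrickscfg
# (the profile name "supply-chain-demo" is just an example):
#
#   [supply-chain-demo]
#   host  = https://<your-workspace>.cloud.databricks.com
#   token = <personal-access-token>

w = WorkspaceClient(profile="supply-chain-demo")

# Smoke test: confirm the stored credentials resolve to a workspace identity.
print(w.current_user.me().user_name)
```

Passing an explicit `profile=` keeps the code unambiguous when several workspaces are configured; without it, the SDK falls back to its default credential chain (environment variables such as `DATABRICKS_HOST`/`DATABRICKS_TOKEN`, then the `DEFAULT` profile).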
Summary
This walkthrough demonstrates how to build a supply‑chain copilot using the OpenAI Agent SDK and Databricks Managed MCP, highlighting Tracing, Guardrails, and seamless integration with enterprise data sources. It shows how an agent can blend structured inventory tables, forecasting models, graph‑based BOM logic, and vector‑indexed e‑mails behind a single conversational interface, then surface the results through a FastAPI backend and a React chat UI. Readers learn to authenticate against Databricks, expose Unity Catalog functions and Vector Search (Databricks vector index) as MCP tools, trace every call in the OpenAI dashboard, and enforce guardrails, all while keeping the underlying data governance intact.
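For reviewers who want a feel for the moving parts without opening the notebook, here is a rough sketch of the agent/MCP wiring the summary describes. It is not the notebook's exact code: the profile name, catalog/schema (`supply_chain.demo`), and the sample question are invented, and the managed-MCP URL pattern (`/api/2.0/mcp/functions/<catalog>/<schema>`) should be verified against the current Databricks documentation.

```python
import asyncio

from databricks.sdk import WorkspaceClient
from agents import Agent, Runner
from agents.mcp import MCPServerStreamableHttp

# Resolve workspace host and credentials from the local profile (see auth sketch above).
w = WorkspaceClient(profile="supply-chain-demo")
mcp_url = f"{w.config.host}/api/2.0/mcp/functions/supply_chain/demo"  # hypothetical catalog/schema


async def main() -> None:
    # Reuse the SDK's resolved credentials as auth headers for the managed MCP endpoint.
    headers = w.config.authenticate()

    async with MCPServerStreamableHttp(
        name="databricks-uc-functions",
        params={"url": mcp_url, "headers": headers},
    ) as mcp_server:
        agent = Agent(
            name="Supply-Chain Copilot",
            instructions="Answer supply-chain questions using the available tools.",
            mcp_servers=[mcp_server],
        )
        # With an OpenAI API key configured, each run is traced automatically
        # and shows up in the OpenAI dashboard.
        result = await Runner.run(agent, "Can we meet demand for product P42 next month?")
        print(result.final_output)


asyncio.run(main())  # in a notebook cell, use `await main()` instead
```

The same pattern should extend to the Vector Search MCP endpoint by pointing the server URL at the corresponding vector-search path for the catalog and schema.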
Motivation
We currently don’t have a cookbook that shows how to integrate with Databricks data sources, nor an example of connecting an Agent to Databricks through Databricks‑managed MCP servers. This guide remedies that by delivering an agent capable of tackling supply‑chain questions such as “Can we meet demand?” and “How much revenue is at risk if we can’t produce the forecasted amount of a certain product?”.
For new content
When contributing new content, read through our contribution guidelines, and mark the following action items as completed:
We will rate each of these areas on a scale from 1 to 4, and will only accept contributions that score 3 or higher on all areas. Refer to our contribution guidelines for more details.