New cookbook - Supply-Chain Copilot with OpenAI Agent SDK and Databricks MCP Servers #1935

Open · wants to merge 10 commits into main

Conversation

lara-openai

Summary

This walkthrough demonstrates how to build a supply‑chain copilot using the OpenAI Agent SDK and Databricks Managed MCP, highlighting Tracing, Guardrails, and integration with enterprise data sources. It shows how an agent can blend structured inventory tables, forecasting models, graph‑based BOM logic, and vector‑indexed e‑mails behind a single conversational interface, then surface the results through FastAPI and a React chat UI. Readers learn how to authenticate against Databricks, expose Unity Catalog functions and Vector Search (a Databricks vector index) as MCP tools, trace every call in the OpenAI dashboard, and enforce guardrails, all while keeping the underlying data governance intact.
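The wiring described above can be sketched with a small helper. Note the assumption here: the per-catalog/per-schema path layout `/api/2.0/mcp/<type>/<catalog>/<schema>` and the catalog/schema names are illustrative only and are not confirmed by this PR; check the cookbook itself for the exact endpoints.

```python
# Hypothetical helper sketching how the copilot might derive the managed MCP
# endpoint URLs it registers as tool servers. The path layout below is an
# assumption for illustration, not taken from this PR.

def managed_mcp_url(host: str, server_type: str, catalog: str, schema: str) -> str:
    """Build a Databricks managed MCP server URL.

    server_type: "functions" for Unity Catalog functions, or
                 "vector-search" for a Databricks vector index.
    """
    # Normalize the workspace host so a trailing slash does not double up.
    return f"{host.rstrip('/')}/api/2.0/mcp/{server_type}/{catalog}/{schema}"

# Example: the two tool servers the agent would connect to (placeholder names).
host = "https://example.cloud.databricks.com"
functions_url = managed_mcp_url(host, "functions", "supply_chain", "ops")
vector_url = managed_mcp_url(host, "vector-search", "supply_chain", "emails")
```

These URLs would then be handed to the Agent SDK's MCP client (e.g. a streamable-HTTP MCP server entry carrying a Bearer token obtained via the Databricks SDK); the exact class names vary by openai-agents version, so they are not reproduced here.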

Motivation

The Cookbook currently lacks an example of integrating with Databricks data sources, specifically of connecting an Agent to Databricks using Databricks‑managed MCP servers. This guide fills that gap by delivering an agent capable of tackling supply‑chain questions such as "Can we meet demand?" and "How much revenue is at risk if we can't produce the forecasted amount of a certain product?".


For new content

When contributing new content, read through our contribution guidelines, and mark the following action items as completed:

  • I have added a new entry in registry.yaml (and, optionally, in authors.yaml) so that my content renders on the cookbook website.
  • I have conducted a self-review of my content based on the contribution guidelines:
    • Relevance: This content is related to building with OpenAI technologies and is useful to others.
    • Uniqueness: I have searched for related examples in the OpenAI Cookbook, and verified that my content offers new insights or unique information compared to existing documentation.
    • Spelling and Grammar: I have checked for spelling or grammatical mistakes.
    • Clarity: I have done a final read-through and verified that my submission is well-organized and easy to understand.
    • Correctness: The information I include is correct and all of my code executes successfully.
    • Completeness: I have explained everything fully, including all necessary references and citations.

We will rate each of these areas on a scale from 1 to 4, and will only accept contributions that score 3 or higher on all areas. Refer to our contribution guidelines for more details.

@lara-openai lara-openai marked this pull request as ready for review July 7, 2025 09:18
"- Classical forecasting models for fine-grained demand and lead-time predictions \n",
"- Graph based route optimizations \n",
"- Vector-indexed e-mail archives that enable semantic search across unstructured communications for delay signals\n",
"- Generative-AI workflows that surface insights and mitigation options in plain language\n",
Contributor

nit: 'Generative AI'

Author

Thanks, I've updated it now

"\n",
"The easiest way is to add a profile to `~/.databrickscfg`. The snippet’s `WorkspaceClient(profile=...)` call will pick that up. It tells the SDK which of those stored credentials to load, so your code never needs to embed tokens. Another option would be to create environment variables such as `DATABRICKS_HOST` and `DATABRICKS_TOKEN`, but using `~/.databrickscfg` is recommended.\n",
"\n",
"`pip install openai databricks-sdk`"
Contributor

I would have this runnable in a separate cell:

! pip install openai databricks-sdk

Author

Done, thanks

Contributor

I think there are quite a lot of other dependencies I would add here too, as you get further along:

! pip install openai databricks-sdk databricks-cli databricks-mcp openai-agents
