Bump mlflow from 2.13.2 to 2.16.0 in /src #2

Open · wants to merge 1 commit into main

Conversation

@dependabot dependabot bot commented on behalf of github Nov 25, 2024

Bumps mlflow from 2.13.2 to 2.16.0.

Release notes

Sourced from mlflow's releases.

MLflow 2.16.0

We are excited to announce the release of MLflow 2.16.0. This release includes many major features and improvements!

Major features:

  • LlamaIndex Enhancements 🦙 - To provide additional flexibility in the LlamaIndex integration, we now support the models-from-code functionality for logging, extended engine-based logging, and broadened support for external vector stores.

  • LangGraph Support - We've expanded the LangChain integration to support the agent framework LangGraph. With tracing and support for logging using the models-from-code feature, creating and storing agent applications has never been easier!

  • AutoGen Tracing - Full automatic support for tracing multi-turn agent applications built with Microsoft's AutoGen framework is now available in MLflow. Enabling autologging via mlflow.autogen.autolog() will instrument your agents built with AutoGen.

  • Plugin support for AI Gateway - You can now define your own provider interfaces that will work with MLflow's AI Gateway (also known as the MLflow Deployments Server). Creating an installable provider definition will allow you to connect the Gateway server to any GenAI service of your choosing.

Features:

  • [UI] Add updated deployment usage examples to the MLflow artifact viewer (#13024, @​serena-ruan, @​daniellok-db)
  • [Models] Support logging LangGraph applications via the models-from-code feature (#12996, @​B-Step62)
  • [Models] Extend automatic authorization pass-through support for LangGraph agents (#13001, @aravind-segu)
  • [Models] Expand the support for LangChain application logging to include UCFunctionToolkit dependencies (#12966, @​aravind-segu)
  • [Models] Support saving LlamaIndex engine directly via the models-from-code feature (#12978, @​B-Step62)
  • [Models] Support models-from-code within the LlamaIndex flavor (#12944, @​B-Step62)
  • [Models] Remove the data structure conversion of input examples to ensure enhanced compatibility with inference signatures (#12782, @​serena-ruan)
  • [Models] Add the ability to retrieve the underlying model object from within pyfunc model wrappers (#12814, @​serena-ruan)
  • [Models] Add spark vector UDT type support for model signatures (#12758, @​WeichenXu123)
  • [Tracing] Add tracing support for AutoGen (#12913, @​B-Step62)
  • [Tracing] Reduce the latency overhead for tracing (#12885, @​B-Step62)
  • [Tracing] Add Async support for the trace decorator (#12877, @​MPKonst)
  • [Deployments] Introduce a plugin provider system to the AI Gateway (Deployments Server) (#12611, @​gabrielfu)
  • [Projects] Add support for parameter submission to MLflow Projects run in Databricks (#12854, @​WeichenXu123)
  • [Model Registry] Introduce support for Open Source Unity Catalog as a model registry service (#12888, @​artjen)

Bug fixes:

Documentation updates:

... (truncated)

Changelog

Sourced from mlflow's changelog.

2.16.0 (2024-08-30)

(Identical to the release notes above.)

Commits

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
You can disable automated security fix PRs for this repo from the Security Alerts page.

Summary by Sourcery

Bump mlflow from version 2.13.2 to 2.16.0, introducing major features such as LlamaIndex enhancements, LangGraph support, AutoGen tracing, and AI Gateway plugin support. The update also includes several bug fixes and documentation improvements.

New Features:

  • Introduce LlamaIndex enhancements, including models-from-code functionality, extended engine-based logging, and support for external vector stores.
  • Expand LangChain integration to support LangGraph with tracing and models-from-code logging.
  • Add full automatic support for tracing multi-turn agent applications built with Microsoft's AutoGen framework.
  • Introduce plugin support for AI Gateway, allowing custom provider interfaces for GenAI services.

Bug Fixes:

  • Reduce contents of the model-history tag to essential fields.
  • Fix behavior of defining the device for loading transformers models.
  • Fix evaluate behavior for LlamaIndex.
  • Replace pkg_resources with importlib.metadata due to deprecation.
  • Fix error handling for OpenAI autolog tracing.
  • Resolve deadlock condition when connecting to an SFTP artifact store.
  • Fix initialization of code_paths dependencies for LangChain models.
  • Fix type error for metrics value logging.
  • Properly catch NVML errors when collecting GPU metrics.
  • Improve Gateway schema support for OpenAI provider.
  • Fix deletion of artifacts during UC model registration from non-standard DBFS locations.

Documentation:

  • Add documentation guides for LangGraph support.
  • Add additional documentation for models-from-code feature.

Bumps [mlflow](https://github.com/mlflow/mlflow) from 2.13.2 to 2.16.0.
- [Release notes](https://github.com/mlflow/mlflow/releases)
- [Changelog](https://github.com/mlflow/mlflow/blob/master/CHANGELOG.md)
- [Commits](mlflow/mlflow@v2.13.2...v2.16.0)

---
updated-dependencies:
- dependency-name: mlflow
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <[email protected]>
@dependabot added the dependencies label (Pull requests that update a dependency file) on Nov 25, 2024
sourcery-ai bot commented Nov 25, 2024

Reviewer's Guide by Sourcery

This PR updates the MLflow dependency from version 2.13.2 to 2.16.0. The new version introduces several major features including enhanced LlamaIndex integration, LangGraph support, AutoGen tracing capabilities, and plugin support for AI Gateway. The update also includes various bug fixes and improvements across different MLflow components.

No diagrams generated as the changes look simple and do not need a visual representation.

File-Level Changes

Change: Major feature additions and enhancements to MLflow's LLM capabilities
Files: src/requirements.freeze.txt
Details:
  • Added support for LlamaIndex models-from-code functionality and extended engine-based logging
  • Introduced LangGraph support for the LangChain integration
  • Added AutoGen tracing support for multi-turn agent applications
  • Implemented a plugin support system for AI Gateway

Change: Improvements to model handling and tracking functionality
Files: src/requirements.freeze.txt
Details:
  • Added the ability to retrieve the underlying model object from pyfunc wrappers
  • Added support for the Spark vector UDT type in model signatures
  • Improved compatibility with inference signatures
  • Reduced tracing latency overhead

Change: Bug fixes and stability improvements
Files: src/requirements.freeze.txt
Details:
  • Fixed device-selection behavior for transformers models
  • Fixed deadlock issues with SFTP artifact store connections
  • Improved error handling for OpenAI autolog tracing
  • Fixed type errors in metrics value logging
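All three change groups map to the same file because the PR itself is, presumably, a one-line pin bump in src/requirements.freeze.txt. A sketch of the equivalent edit, using a throwaway copy of the file rather than the repository's real one:

```shell
# Create a throwaway freeze file with the old pin, then bump it the way
# this PR does. Not the repository's actual src/requirements.freeze.txt.
printf 'mlflow==2.13.2\n' > requirements.freeze.txt
sed 's/^mlflow==.*/mlflow==2.16.0/' requirements.freeze.txt > requirements.freeze.txt.new
mv requirements.freeze.txt.new requirements.freeze.txt
cat requirements.freeze.txt
```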

Tips and commands

Interacting with Sourcery

  • Trigger a new review: Comment @sourcery-ai review on the pull request.
  • Continue discussions: Reply directly to Sourcery's review comments.
  • Generate a GitHub issue from a review comment: Ask Sourcery to create an
    issue from a review comment by replying to it.
  • Generate a pull request title: Write @sourcery-ai anywhere in the pull
    request title to generate a title at any time.
  • Generate a pull request summary: Write @sourcery-ai summary anywhere in
    the pull request body to generate a PR summary at any time. You can also use
    this command to specify where the summary should be inserted.

Customizing Your Experience

Access your dashboard to:

  • Enable or disable review features such as the Sourcery-generated pull request
    summary, the reviewer's guide, and others.
  • Change the review language.
  • Add, remove or edit custom review instructions.
  • Adjust other review settings.

Getting Help

Copy link

@sourcery-ai (bot) left a comment:

We have skipped reviewing this pull request. It seems to have been created by a bot (hey, dependabot[bot]!). We assume it knows what it's doing!
