
Conversation

fpaupier
Contributor

@fpaupier fpaupier commented May 20, 2025

This PR introduces the possibility to define a custom logging configuration for the uvicorn server used to expose the OpenAI-compatible API.
Previously it was possible to modify the log format of the vLLM engine, but specifying a config file had no effect on the uvicorn API.
Hence you always saw something like the default uvicorn log format below when starting your app:

INFO:     Started server process [128887]
INFO:     Waiting for application startup.
INFO:     Application startup complete.

This is limiting for production systems that have auditing requirements and must emit logs in a specific format, whether to ensure compliance or simply to make it easier to parse logs from various systems.

This PR extends the custom logging mechanism described in https://docs.vllm.ai/en/latest/getting_started/examples/logging_configuration.html to also enable a custom uvicorn logging format.

FIX #18210


You can test it yourself with the logger config provided below, making sure to expose it to the vLLM process:

export VLLM_LOGGING_CONFIG_PATH=./logger-config.json
vllm serve ...

{
  "version": 1,
  "disable_existing_loggers": false,
  "formatters": {
    "plain": {
      "class": "logging.Formatter",
      "format": "%(asctime)s [%(levelname)s] %(filename)s:%(lineno)d - %(funcName)s() - %(message)s",
      "datefmt": "%d-%m-%Y %H:%M:%S"
    },
    "default": {
      "()": "uvicorn.logging.DefaultFormatter",
      "format": "%(asctime)s [%(levelname)s] %(filename)s:%(lineno)d - %(funcName)s() - %(message)s",
      "datefmt": "%d-%m-%Y %H:%M:%S"
    },
    "access": {
      "()": "uvicorn.logging.AccessFormatter",
      "format": "%(asctime)s [%(levelname)s] %(client_addr)s - \"%(request_line)s\" %(status_code)s",
      "datefmt": "%d-%m-%Y %H:%M:%S"
    }
  },
  "handlers": {
    "console": {
      "class": "logging.StreamHandler",
      "formatter": "plain",
      "level": "INFO",
      "stream": "ext://sys.stdout"
    },
    "default": {
      "class": "logging.StreamHandler",
      "formatter": "default",
      "stream": "ext://sys.stderr"
    },
    "access": {
      "class": "logging.StreamHandler",
      "formatter": "access",
      "stream": "ext://sys.stdout"
    }
  },
  "loggers": {
    "vllm": {
      "handlers": ["console"],
      "level": "INFO",
      "propagate": false
    },
    "vllm.entrypoints.api_server": {
      "handlers": ["console"],
      "level": "INFO",
      "propagate": false
    },
    "uvicorn": {
      "handlers": ["default"],
      "level": "INFO",
      "propagate": false
    },
    "uvicorn.error": {
      "level": "INFO"
    },
    "uvicorn.access": {
      "handlers": ["access"],
      "level": "INFO",
      "propagate": false
    }
  }
}
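Not part of the PR itself, but for context: the file above follows the standard library's `logging.config.dictConfig` schema. A minimal, self-contained sketch of loading and applying such a config (the uvicorn formatters are swapped for a plain `logging.Formatter` here so the snippet runs without uvicorn installed):

```python
# Sketch: apply a JSON logging config through the standard library's
# logging.config.dictConfig (logger-config.json above uses the same schema).
# uvicorn's formatter classes are replaced with logging.Formatter so this
# runs without uvicorn installed.
import json
import logging
import logging.config

CONFIG_JSON = """
{
  "version": 1,
  "disable_existing_loggers": false,
  "formatters": {
    "plain": {
      "class": "logging.Formatter",
      "format": "%(asctime)s [%(levelname)s] %(name)s - %(message)s",
      "datefmt": "%d-%m-%Y %H:%M:%S"
    }
  },
  "handlers": {
    "console": {
      "class": "logging.StreamHandler",
      "formatter": "plain",
      "level": "INFO",
      "stream": "ext://sys.stdout"
    }
  },
  "loggers": {
    "uvicorn": {"handlers": ["console"], "level": "INFO", "propagate": false}
  }
}
"""

# Parse the JSON and hand it to dictConfig; from here on, any record
# emitted on the "uvicorn" logger uses the custom format.
logging.config.dictConfig(json.loads(CONFIG_JSON))
logging.getLogger("uvicorn").info("Started server process")
```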

You will then see the following logs at service startup for the uvicorn server:

20-05-2025 10:11:23 [INFO] server.py:83 - _serve() - Started server process [129561]
20-05-2025 10:11:23 [INFO] on.py:48 - startup() - Waiting for application startup.
20-05-2025 10:11:23 [INFO] on.py:62 - startup() - Application startup complete.
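For context, uvicorn's `run()`/`Config` API accepts a `log_config` dict, so supporting a custom uvicorn format essentially amounts to parsing the JSON file pointed to by `VLLM_LOGGING_CONFIG_PATH` and forwarding it as a keyword argument. A hedged sketch of that idea (the helper name and structure are hypothetical, not the actual vLLM code):

```python
# Hedged sketch (names hypothetical, not the actual vLLM implementation):
# read VLLM_LOGGING_CONFIG_PATH and, if set, carry the parsed config as
# a `log_config` entry in the kwargs later passed to uvicorn.
import json
import os


def build_uvicorn_kwargs() -> dict:
    """Return uvicorn kwargs, including the parsed logging config as
    `log_config` when VLLM_LOGGING_CONFIG_PATH is set."""
    kwargs: dict = {}
    path = os.environ.get("VLLM_LOGGING_CONFIG_PATH")
    if path:
        with open(path) as f:
            kwargs["log_config"] = json.load(f)
    return kwargs
```

With something like this in place, `uvicorn.run(app, **build_uvicorn_kwargs(), ...)` would pick up the custom formatters instead of uvicorn's defaults.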


👋 Hi! Thank you for contributing to the vLLM project.

💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels.

Just a reminder: PRs do not trigger a full CI run by default. Instead, only the fastcheck CI runs, covering a small, essential subset of tests to quickly catch errors. You can run additional CI tests on top of those by going to your fastcheck build in the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to be added to the Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run CI, PR reviewers can either: Add ready label to the PR or enable auto-merge.

🚀

@fpaupier
Contributor Author

Hello @russellb, let me know if you need additional context on this PR or if you have remarks I should address to get it ready for merge.

@fpaupier
Contributor Author

Let me know @DarkLight1337 if I can add anything to ease the review, so this does not diverge from main and can be merged soon.

@fpaupier
Contributor Author

fpaupier commented Jun 2, 2025

Hi @DarkLight1337 - I re-ran the CI and resolved the conflict with main to get a green PR.
→ Could we proceed with a review or merge to limit divergence from main?
It's a small PR and very useful for our production systems; I'd be happy to address any concerns you have with it.

Comment on lines 1327 to 1341
Member


Move this to a separate function so we can inline the log_config argument into the existing code at L1336

@fpaupier fpaupier force-pushed the main branch 3 times, most recently from 12a0fc6 to 8f86dd9 on June 2, 2025 08:47
ssl_cert_reqs=args.ssl_cert_reqs,
**uvicorn_kwargs,
)

Member


Unnecessary change

Contributor Author


Fixed 👌

Member

@DarkLight1337 DarkLight1337 left a comment


LGTM, sorry for the delay!

@DarkLight1337 DarkLight1337 enabled auto-merge (squash) June 2, 2025 10:01
@github-actions github-actions bot added the ready ONLY add when PR is ready to merge/full CI is needed label Jun 2, 2025
@DarkLight1337 DarkLight1337 merged commit 20133cf into vllm-project:main Jun 2, 2025
67 of 69 checks passed
mmontuori pushed a commit to mmontuori/vllm that referenced this pull request Jun 5, 2025

Labels

frontend, ready (ONLY add when PR is ready to merge/full CI is needed)

Projects

None yet

Development

Successfully merging this pull request may close these issues.

[Usage]: Unable to customize FastAPI/Uvicorn log format for OpenAI compatible server

2 participants