
Conversation

MengqingCao (Contributor)

This PR adds a weight-name case for offset, making the weight loader compatible with quantization tools that name zero-points as offsets, e.g., modelslim.
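For illustration, here is a minimal sketch of the idea; the identifiers below are hypothetical and not vLLM's actual weight_loader API. A checkpoint tensor whose name ends in offset is treated the same as one named zero_point, so checkpoints produced by tools such as modelslim load without manual renaming.

```python
# Hypothetical sketch of the naming compatibility this PR enables.
# Identifiers below are illustrative; they are not vLLM's real API.

# Some quantization tools (e.g., modelslim) store zero-points under
# the suffix "offset" instead of "zero_point".
_PARAM_NAME_ALIASES = {
    "offset": "zero_point",  # modelslim-style naming
}

def canonical_param_name(checkpoint_name: str) -> str:
    """Map a checkpoint tensor name onto the loader's internal name."""
    base, _, suffix = checkpoint_name.rpartition(".")
    suffix = _PARAM_NAME_ALIASES.get(suffix, suffix)
    return f"{base}.{suffix}" if base else suffix

# Example: both spellings resolve to the same internal parameter.
assert canonical_param_name("layers.0.qkv_proj.offset") == \
       canonical_param_name("layers.0.qkv_proj.zero_point")
```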


👋 Hi! Thank you for contributing to the vLLM project.

💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels.

Just a reminder: PRs do not trigger a full CI run by default. Instead, only the fastcheck CI runs, covering a small and essential subset of tests to quickly catch errors. You can run additional CI tests on top of those by going to your fastcheck build in the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run full CI, PR reviewers can either add the ready label to the PR or enable auto-merge.

🚀

jeejeelee requested a review from mgoin March 26, 2025 03:03
mgoin (Member) left a comment

Are you going to contribute an integration for the modelslim format? This seems fine to land, but it is a bit strange to have this fix without actually having the format integrated upstream.

mgoin added the ready label (ONLY add when PR is ready to merge/full CI is needed) Mar 26, 2025
MengqingCao (Contributor, Author)

> Are you going to contribute an integration for the modelslim format? This seems fine to land, but it is a bit strange to have this fix without actually having the format integrated upstream.

Actually, this is a fix for the downstream project vllm-ascend; we are integrating modelslim into vllm-ascend through vllm-project/vllm-ascend#391. modelslim is a quantization tool for Ascend NPUs, so I think we should keep the upstream vLLM change as small as possible, adding only offset in weight_loader.

mgoin (Member) commented Mar 27, 2025

I see, thanks for the context. LGTM then!

mgoin enabled auto-merge (squash) March 27, 2025 02:06
MengqingCao (Contributor, Author)

> I see, thanks for the context. LGTM then!

Thanks! 👍

mgoin merged commit fb22be5 into vllm-project:main Mar 27, 2025
35 checks passed
MengqingCao deleted the offset branch March 27, 2025 06:21
Alex4210987 pushed a commit to LeiWang1999/vllm-bitblas that referenced this pull request Apr 5, 2025
lulmer pushed a commit to lulmer/vllm that referenced this pull request Apr 7, 2025
lk-chen pushed a commit to lk-chen/vllm that referenced this pull request Apr 29, 2025
shreyankg pushed a commit to shreyankg/vllm that referenced this pull request May 3, 2025
RichardoMrMu pushed a commit to RichardoMrMu/vllm that referenced this pull request May 12, 2025