Fix deprecated torch_dtype usage in transformers loading #2109
Conversation
👋 Hi! Thank you for contributing to llm-compressor. Please add the ready label when the PR is ready for review. Note: this is required to complete the testing suite; please only add the label once the PR is code complete and local testing has been performed.
Summary of Changes: Hello @jangel97, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed: this pull request modernizes the project's interaction with the Hugging Face transformers library.
Code Review
This pull request effectively addresses the deprecation warning related to torch_dtype by replacing it with dtype in transformers.from_pretrained() calls across the codebase. The setup.py file has been appropriately updated to reflect the new minimum transformers version (4.56.1), which is necessary for the dtype argument to be supported. The changes are clear, concise, and directly resolve the stated objective of aligning with current transformers APIs and eliminating the deprecation warning. No further issues were identified in the reviewed changes.
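For context, here is a minimal sketch of the kind of call site this PR updates; the model name and dtype value are illustrative, not taken from the diff:

```python
import torch
from transformers import AutoModelForCausalLM

# Before: emits a deprecation warning on recent transformers releases
model = AutoModelForCausalLM.from_pretrained(
    "Qwen/Qwen2.5-0.5B-Instruct", torch_dtype=torch.bfloat16
)

# After: the renamed argument, supported since transformers v4.56.1
model = AutoModelForCausalLM.from_pretrained(
    "Qwen/Qwen2.5-0.5B-Instruct", dtype=torch.bfloat16
)
```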
kylesayrs left a comment
Looks like a reasonable change.
Thanks for implementing the fix @jangel97! We also have tests and examples using torch_dtype, e.g. ./tests/e2e/e2e_utils.py. Can you please do a find-and-replace for all the files that use it?
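As a rough illustration of the requested cleanup, a small Python sketch of a repo-wide rename (the path glob, the keyword-only match, and the in-place rewrite are assumptions for illustration, not necessarily how the PR was produced):

```python
from pathlib import Path

# Rewrite the deprecated keyword argument across Python sources in the repo.
# Matching "torch_dtype=" (with the "=") avoids touching unrelated identifiers.
for path in Path(".").rglob("*.py"):
    text = path.read_text()
    if "torch_dtype=" in text:
        path.write_text(text.replace("torch_dtype=", "dtype="))
        print(f"updated {path}")
```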
Force-pushed from a089bfb to 6cb1ec0
@dhuangnm good point, I just did!
dhuangnm left a comment
Thanks, LGTM!
brian-dellabetta left a comment
Thanks for the contribution! This should be fine to merge after the release.
Force-pushed from 6cb1ec0 to 5c5a69a
@dhuangnm @brian-dellabetta I rebased on main and had to force-push the branch.
brian-dellabetta left a comment
Thanks @jangel97, the changes look good. Just calling out that we need to bump the transformers version to above where the deprecation was made.
Team, do we want to make this change now that the release has occurred?
Replace deprecated torch_dtype parameter with dtype in transformers from_pretrained calls. Bump minimum transformers version to 4.56.1, where the dtype parameter was introduced.

Signed-off-by: Jose Angel Morena <jmorenas@redhat.com>
Force-pushed from 5c5a69a to a957fe2
Hey @jangel97, do you mind fixing the quality issue so that we can land this?

```
pip install -e .[dev]
make style
make quality
```
Force-pushed from 2767601 to d91b3bd
@dsikka @brian-dellabetta, sorry, I was out. I just ran those commands locally, committed, and force-pushed; 19 files were reformatted. Please let me know if anything else is needed. Thanks for your help!
Hi @jangel97, the changes look good, but tests are failing for an unrelated reason when interacting with the Hugging Face Hub. I wonder if a recent PR related to the HF_TOKEN is causing community-user PRs to fail with 401 gated-repo errors. Will try to get this in in the new year, thanks for the contribution!
While running llmcompressor, I saw a deprecation warning stating that torch_dtype is deprecated in favor of dtype.

This PR replaces the deprecated torch_dtype parameter with dtype in transformers.from_pretrained() calls. Since support for the dtype argument was introduced in transformers v4.56.1, this change also bumps the minimum supported transformers version accordingly. Transformers ≥4.56.1 requires Python ≥3.9; however, llmcompressor already requires Python ≥3.10, so this change does not reduce Python compatibility for existing users.

The goal of this PR is to eliminate the deprecation warning and align with current transformers APIs.

Related discussion: vllm-project/vllm#26293
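The version bump lands in setup.py; a minimal sketch of what the resulting constraint looks like (an illustrative excerpt, with other fields and dependencies elided):

```python
# setup.py (illustrative excerpt, not the project's full setup.py)
from setuptools import setup

setup(
    name="llmcompressor",
    python_requires=">=3.10",  # unchanged; already stricter than transformers' >=3.9
    install_requires=[
        "transformers>=4.56.1",  # first release where from_pretrained accepts `dtype`
    ],
)
```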