
Conversation


@BenjaminBossan BenjaminBossan commented Dec 5, 2025

What does this PR do?

This PR fixes a bug that prevented non-LoRA PEFT adapters from being loaded into a transformers model. It was a simple logical error: a check was performed unconditionally even though it is only relevant for hotswapping. A test for this has been added.

Additionally, this PR adds a test for adding a non-LoRA adapter to a transformers model. That path was not broken but lacked test coverage.
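
For context, here is a minimal sketch of the two scenarios, assuming peft is installed; the base model ("gpt2"), the adapter type (IA3), and the save path are illustrative choices, not the PR's actual test code:

```python
from transformers import AutoModelForCausalLM
from peft import IA3Config, get_peft_model

# Create and save a non-LoRA adapter (IA3 here) with PEFT itself.
base = AutoModelForCausalLM.from_pretrained("gpt2")
peft_model = get_peft_model(base, IA3Config(task_type="CAUSAL_LM"))
peft_model.save_pretrained("/tmp/ia3-adapter")

# Loading it through the transformers PEFT integration used to raise, because
# the LoRA-only check intended for hotswapping ran unconditionally.
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.load_adapter("/tmp/ia3-adapter")

# Adding a non-LoRA adapter directly already worked; the PR adds a test for it.
model_2 = AutoModelForCausalLM.from_pretrained("gpt2")
model_2.add_adapter(IA3Config(task_type="CAUSAL_LM"))
```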

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a Github issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@BenjaminBossan
Member Author

I'm not sure who among the transformers devs would be best suited to review this; pinging @Cyrilvallez since you reviewed the original hotswap PR.

Personally, I'd consider this a regression introduced by #41297, so ideally it could be part of the v5 release :)

@Cyrilvallez Cyrilvallez left a comment


Looks like the check is already performed above, no? But I still think we should probably update the default value of hotswap in the function. See also my comment here: #41297 (comment)

 peft_config.inference_mode = not is_trainable

-if peft_config.peft_type != PeftType.LORA:
+if hotswap and (peft_config.peft_type != PeftType.LORA):
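
For clarity, a paraphrased sketch of what the gated check amounts to; the actual error message and surrounding code in the function may differ:

```python
peft_config.inference_mode = not is_trainable

# Hotswapping is a LoRA-only feature, so only reject non-LoRA adapters when
# hotswapping was actually requested; a plain load_adapter call should proceed.
if hotswap and peft_config.peft_type != PeftType.LORA:
    raise ValueError("Hotswapping is only supported for LoRA adapters.")  # paraphrased message
```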
@Cyrilvallez
Member

Actually, can't we fully remove this check? It's already checked above at L206.

@BenjaminBossan
Member Author

Indeed, great catch. I removed it and confirmed that the corresponding test still passes. Please review again. The failing test_tokenization_mistral_common.py tests seem to be unrelated.

@Cyrilvallez Cyrilvallez left a comment

Alright, LGTM then! Will merge as the failing test is unrelated

@Cyrilvallez Cyrilvallez merged commit d3ee06b into huggingface:main Dec 8, 2025
22 of 24 checks passed
@BenjaminBossan
Member Author

> Alright, LGTM then! Will merge as the failing test is unrelated

Fantastic, thanks. Anything I should do to ensure that this is included in v5?

@BenjaminBossan BenjaminBossan deleted the fix-peft-error-when-loading-non-lora branch December 8, 2025 14:03
@Cyrilvallez
Member

Nope, it will automatically be included in the next rc1 release!

leaderofARS pushed a commit to leaderofARS/transformers that referenced this pull request Dec 9, 2025
* FIX Error when trying to load non-LoRA PEFT

This PR fixes a bug that prevented non-LoRA PEFT adapters from being loaded
into a transformers model. A test for this has been added.

Additionally, this also tests that a non-LoRA adapter can be added to a
transformers model. This was not broken but still lacked test coverage.

* Reviewer feedback: Remove check completely
