Add support for overriding model architecture in Hugging Face conversion #3094
Open
Conversation
entrpn approved these changes on Feb 6, 2026
hengtaoguo approved these changes on Feb 6, 2026
hf_config_obj = HF_MODEL_CONFIGS[model_key]

# Validate architecture consistency (raising ValueError on mismatch) or override HF config if specified.
validate_or_update_architecture(hf_config_obj, config, override=FLAGS.override_model_architecture)
Collaborator
Great feature! This will update hf_config_obj in-place right?
hengtaoguo reviewed on Feb 6, 2026
}


def validate_or_update_architecture(hf_config, max_config, override: bool):
Collaborator
nit: I wonder if we should move this function to https://github.com/AI-Hypercomputer/maxtext/blob/main/src/MaxText/utils/ckpt_conversion/utils/utils.py. But either way works since this is only used in to_huggingface.
Description
This PR adds a new flag --override_model_architecture to the Hugging Face checkpoint conversion script (src/MaxText/utils/ckpt_conversion/to_huggingface.py).

Why is this change being made?
Previously, the conversion script enforced a strict validation check that required the running MaxText configuration (passed via CLI/YAML) to exactly match the hardcoded Hugging Face configuration for a given model_name. This prevented users from converting modified or experimental model architectures (e.g., a modified Llama 3.1 model) without manually editing the source code to add a new static model config entry.

The Solution:
This change introduces a boolean flag override_model_architecture:

- When the flag is left at its default (False), the script keeps the strict validation and raises a ValueError listing the differences.
- When the flag is set to True, the script overrides the relevant Hugging Face config fields (num_heads, hidden_size, num_layers, vocab_size) with the values from the MaxText config before saving the config.json.

This allows for greater flexibility when working with custom model variants while maintaining safety defaults.
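To make the two behaviors concrete, here is a minimal sketch of what a helper like validate_or_update_architecture could look like. It is not the PR's actual implementation: the mapping between the generic field names above (num_heads, hidden_size, num_layers, vocab_size) and the corresponding MaxText / Hugging Face config attribute names is assumed purely for illustration.

```python
# Minimal sketch, not the PR's implementation. The MaxText-key -> HF-attribute
# mapping below is an assumption for illustration only.
FIELDS_TO_CHECK = {
    "num_heads": "num_attention_heads",  # assumed HF attribute name
    "hidden_size": "hidden_size",
    "num_layers": "num_hidden_layers",   # assumed HF attribute name
    "vocab_size": "vocab_size",
}


def validate_or_update_architecture(hf_config, max_config, override: bool):
  """Compares architecture fields; raises on mismatch or updates hf_config in place."""
  mismatches = []
  for max_key, hf_key in FIELDS_TO_CHECK.items():
    max_value = getattr(max_config, max_key, None)
    hf_value = getattr(hf_config, hf_key, None)
    if max_value is not None and max_value != hf_value:
      mismatches.append(f"{hf_key}: HF={hf_value}, MaxText={max_value}")

  if not mismatches:
    return

  if override:
    # Mutate the HF config object in place so the overridden values are what
    # ends up in the saved config.json.
    for max_key, hf_key in FIELDS_TO_CHECK.items():
      max_value = getattr(max_config, max_key, None)
      if max_value is not None:
        setattr(hf_config, hf_key, max_value)
  else:
    raise ValueError(
        "Model architecture mismatch between MaxText config and HF config:\n  "
        + "\n  ".join(mismatches)
    )
```

In a sketch like this, the override path mutates the passed-in hf_config object in place, which is what the review question above ("this will update hf_config_obj in-place, right?") is getting at: presumably the caller then saves that same object as config.json.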
Tests
Tested the conversion script locally with a modified Llama 3.1 8B architecture (custom heads and head dimension).
Command used:
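The exact command is not shown here. As a purely hypothetical illustration (model name, paths, and the overridden architecture values are placeholders, and the argument style assumes MaxText's usual key=value config overrides), an invocation might look like:

```shell
# Hypothetical example only, not the command actually used for the test above.
python3 -m MaxText.utils.ckpt_conversion.to_huggingface \
  src/MaxText/configs/base.yml \
  model_name=llama3.1-8b \
  load_parameters_path=gs://<bucket>/<maxtext-checkpoint>/items \
  base_output_directory=<output-dir> \
  base_num_query_heads=<custom-heads> \
  head_dim=<custom-head-dim> \
  --override_model_architecture
```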