- pure_nnx: a flag to choose pure NNX logic when NNX and Linen models co-exist.
- init_state_fn: a function to initialize the model state for training; it is set to a different function for NNX and for Linen.
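The flag-based dispatch above can be sketched in plain Python. The helper names (`init_linen_state`, `init_nnx_state`, `make_init_state_fn`) are illustrative, not MaxText's actual functions:

```python
# Sketch of how a pure_nnx flag could select the state-init function
# when NNX and Linen models co-exist. Plain dicts stand in for real
# model state; all names here are hypothetical.

def init_linen_state(rng_seed):
    # Linen: params live in a separate pytree returned by model.init().
    return {"params": {"dense": {"kernel": [0.0] * 4}}, "kind": "linen"}

def init_nnx_state(rng_seed):
    # NNX: state is extracted from a stateful model object.
    return {"dense": {"kernel": [0.0] * 4}, "kind": "nnx"}

def make_init_state_fn(pure_nnx: bool):
    """Return the state-init function matching the configured API."""
    return init_nnx_state if pure_nnx else init_linen_state

init_state_fn = make_init_state_fn(pure_nnx=True)
state = init_state_fn(rng_seed=0)
print(state["kind"])  # → nnx
```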
- Add utils to manipulate NNX shardings using the abstract state of a model
- also add unit tests for the utils
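A minimal sketch of the idea behind such utils: walk an "abstract state" (shapes only, no real arrays) and attach a sharding spec per leaf. The real code would use `nnx.eval_shape` and jax `NamedSharding`; here plain dicts stand in, and the rule-matching scheme is an assumption:

```python
# Hypothetical sharding util: match each leaf's path against rules and
# attach the first matching sharding spec, replicating by default.

def tree_map_with_path(fn, tree, path=()):
    """Apply fn(path, leaf) to every non-dict leaf of a nested dict."""
    if isinstance(tree, dict):
        return {k: tree_map_with_path(fn, v, path + (k,)) for k, v in tree.items()}
    return fn(path, tree)

def attach_shardings(abstract_state, rules):
    """Pick a sharding spec for each leaf by matching its path against rules."""
    def pick(path, leaf_shape):
        for key, spec in rules.items():
            if key in path:
                return {"shape": leaf_shape, "sharding": spec}
        return {"shape": leaf_shape, "sharding": None}  # replicated by default
    return tree_map_with_path(pick, abstract_state)

abstract = {"decoder": {"kernel": (1024, 4096), "bias": (4096,)}}
sharded = attach_shardings(abstract, {"kernel": ("fsdp", "tensor")})
print(sharded["decoder"]["kernel"]["sharding"])  # → ('fsdp', 'tensor')
```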
- Extract mesh creation function to maxtext_utils.get_mesh_from_config()
- also add unit tests for this function
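The core arithmetic such a `get_mesh_from_config()` helper has to perform can be sketched without jax: fill in a single `-1` axis from the device count and check the product matches. Axis names and the `-1` convention are assumptions for illustration:

```python
# Hypothetical mesh-shape resolution: one axis may be -1 (inferred),
# and the product of axis sizes must equal the number of devices.

def resolve_mesh_shape(parallelism, num_devices):
    """parallelism: dict of axis name -> size, where one size may be -1."""
    sizes = dict(parallelism)
    unknown = [name for name, s in sizes.items() if s == -1]
    if len(unknown) > 1:
        raise ValueError("at most one mesh axis may be -1")
    known = 1
    for s in sizes.values():
        if s != -1:
            known *= s
    if unknown:
        if num_devices % known:
            raise ValueError("device count not divisible by fixed axes")
        sizes[unknown[0]] = num_devices // known
        known = num_devices
    if known != num_devices:
        raise ValueError("mesh shape does not match device count")
    return sizes

print(resolve_mesh_shape({"data": -1, "fsdp": 4}, num_devices=8))  # → {'data': 2, 'fsdp': 4}
```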
Note:
flax v0.12 emits DeprecationWarning in multiple places:
- DeprecationWarning: '.value' access is now deprecated. Use
  variable.get_value() or variable[...] (for Arrays).
- DeprecationWarning: 'VariableState' was removed; this is just
  an alias to 'Variable'. Please use 'Variable' directly instead.
But since the code needs to work with post-training, which currently
requires flax v0.11, we did not change the code for these warnings.
A TrainState for NNX, which includes the model and optimizer. Unit tests include checkpoint tests:
- restore a saved state
- convert a Linen TrainState to an NNX TrainState
- parameter-only restore (no opt_state)
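The shape of such a state object, including the parameter-only restore tested above, can be sketched with a plain dataclass. Field names are illustrative; MaxText's actual TrainState differs:

```python
# Hypothetical TrainState bundling model and optimizer state for NNX,
# with a params-only restore that drops optimizer state.
from dataclasses import dataclass, replace
from typing import Any

@dataclass(frozen=True)
class TrainState:
    step: int
    params: Any       # model parameter state
    opt_state: Any    # optimizer state; None after a params-only restore

def restore_params_only(saved: "TrainState") -> "TrainState":
    """Restore parameters but drop optimizer state (e.g. for inference)."""
    return replace(saved, opt_state=None)

saved = TrainState(step=100, params={"w": [1.0, 2.0]}, opt_state={"mu": [0.0, 0.0]})
restored = restore_params_only(saved)
print(restored.opt_state)  # → None
```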
Refactored model_creation_utils to provide a common create_nnx_abstract_model() function in src/maxtext/utils/model_creation_utils.py. Also added unit tests.
1. A new function get_abstract_state_nnx() is added to maxtext_utils.py. It will be called during training to create the NNX training state. Like the Linen version, it handles shard_optimizer_over_data, optimizer_memory_host_offload, and parameter_memory_host_offload. Unit tests are added for this NNX function.
2. Add NNX train_state handling in train_utils.py. DPO handling will be supported (or removed) later in train_utils.py.
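The host-offload handling mentioned above amounts to tagging each leaf's placement with a host memory kind when the flag is set (jax exposes this via `NamedSharding.with_memory_kind`; plain dicts stand in here, and the function name is hypothetical):

```python
# Sketch: when a *_memory_host_offload flag is enabled, tag every
# leaf's placement with pinned host memory; otherwise leave it alone.

def offload_to_host(state, enabled):
    """Tag every leaf's placement with pinned host memory when enabled."""
    if not enabled:
        return state
    def tag(leaf):
        return {**leaf, "memory_kind": "pinned_host"}
    return {name: tag(leaf) for name, leaf in state.items()}

opt_state = {"mu": {"shape": (4,)}, "nu": {"shape": (4,)}}
print(offload_to_host(opt_state, enabled=True)["mu"]["memory_kind"])  # → pinned_host
```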
Also added unit tests
Also added unit tests for the NNX model.
NNX: loss fn, train_step, eval_step, and train_loop
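A toy sketch of the loss-fn / train_step / train_loop split, with plain Python floats standing in for jax arrays and an analytic gradient in place of `jax.grad`. Only the control-flow structure mirrors the real code:

```python
# Minimal gradient-descent loop mirroring the loss_fn / train_step /
# train_loop structure. All numerics are a stand-in for jax/nnx.

def loss_fn(w, x, y):
    pred = w * x
    return (pred - y) ** 2

def train_step(w, x, y, lr=0.1):
    # d/dw (w*x - y)^2 = 2 * (w*x - y) * x
    grad = 2.0 * (w * x - y) * x
    return w - lr * grad, loss_fn(w, x, y)

def train_loop(w, data, steps):
    for _ in range(steps):
        for x, y in data:
            w, loss = train_step(w, x, y)
    return w

w = train_loop(0.0, [(1.0, 3.0)], steps=50)
print(round(w, 3))  # → 3.0
```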
- Convert nnx.State to a pure dict for checkpoint saving
- Restore the pure dict back to nnx.State after loading
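The round-trip above can be sketched with plain dicts, where a `{'value': ...}` wrapper stands in for nnx Variables (recent flax exposes the real versions as `State.to_pure_dict()` / `replace_by_pure_dict()`; the helpers below are illustrative):

```python
# Sketch of nnx.State <-> pure-dict conversion for checkpointing:
# strip value wrappers before saving, re-wrap on restore.

def to_pure_dict(state):
    """Strip {'value': x} wrappers so only raw arrays are saved."""
    if isinstance(state, dict) and set(state) == {"value"}:
        return state["value"]
    return {k: to_pure_dict(v) for k, v in state.items()}

def restore_from_pure_dict(pure, template):
    """Re-wrap raw leaves using the wrapper structure of `template`."""
    if isinstance(template, dict) and set(template) == {"value"}:
        return {"value": pure}
    return {k: restore_from_pure_dict(pure[k], template[k]) for k in template}

state = {"dense": {"kernel": {"value": [1.0, 2.0]}}}
pure = to_pure_dict(state)
print(pure)  # → {'dense': {'kernel': [1.0, 2.0]}}
```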
…store
Add a bidirectional Linen <-> NNX checkpoint converter tool that handles:
- Auto-detection of checkpoint format
- Conversion of params structure (double nesting vs flat)
- Stacking/unstacking per-layer parameters
- Value wrapper handling for NNX format
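Two of the converter's steps, format auto-detection and per-layer stacking, can be sketched as follows. The key pattern (`layers_0`, `layers_1`, …) and the heuristic are assumptions for illustration, not the tool's real layout:

```python
# Sketch: detect the checkpoint flavor from its key pattern, and stack
# flat per-layer entries into one list per parameter name.
import re

def detect_format(params):
    """Guess checkpoint format from its top-level key pattern."""
    if any(re.fullmatch(r"layers_\d+", k) for k in params):
        return "linen"
    return "nnx"

def stack_layers(params):
    """layers_0..layers_N dicts -> one 'layers' entry of per-key lists."""
    names = sorted((k for k in params if k.startswith("layers_")),
                   key=lambda k: int(k.split("_")[1]))
    stacked = {key: [params[n][key] for n in names] for key in params[names[0]]}
    return {"layers": stacked}

ckpt = {"layers_0": {"kernel": [1.0]}, "layers_1": {"kernel": [2.0]}}
print(detect_format(ckpt))                     # → linen
print(stack_layers(ckpt)["layers"]["kernel"])  # → [[1.0], [2.0]]
```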
Description
Contains all the commits needed to make the NNX-migrated MaxText work.
Tests
Current focus is on:
Example:
More features and models will be tested and compared with Linen.
Test track doc
Checklist
Before submitting this PR, please make sure (put X in square brackets):
gemini-review label.