
Conversation

@kunlunl
Contributor

@kunlunl kunlunl commented Nov 13, 2025

What does this PR do?

dev MR: #2086

Major changes:

Make Megatron-FSDP's all-gather (AG) pipeline support different data-parallel buffers, because MXFP8 quantizes along different directions in the forward and backward passes.
Decouple the FP8-related logic from the main workflow and provide a unified abstraction to 1) operate on the raw data storage of different recipes, and 2) create or discard the transpose cache for different recipes.
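
As a rough illustration of the second change, the recipe-specific handling could sit behind an interface like the one sketched below; the class and method names are illustrative, not the PR's actual API:

```python
from abc import ABC, abstractmethod

import torch


class FP8RecipeHandler(ABC):
    """Illustrative interface for recipe-specific FP8 parameter handling.

    The FSDP buffer code would only talk to this interface, while each
    recipe (e.g. per-tensor scaling, MXFP8, blockwise) decides how its
    raw storage and transpose cache are managed.
    """

    @abstractmethod
    def get_raw_data(self, param: torch.Tensor) -> torch.Tensor:
        """Return the underlying byte storage that the all-gather operates on."""

    @abstractmethod
    def set_raw_data(self, param: torch.Tensor, data: torch.Tensor) -> None:
        """Point the parameter at newly gathered raw storage."""

    @abstractmethod
    def create_transpose_cache(self, param: torch.Tensor) -> None:
        """Build the transposed/columnwise copy needed by the backward GEMM."""

    @abstractmethod
    def discard_transpose_cache(self, param: torch.Tensor) -> None:
        """Free the transposed copy once it is no longer needed."""
```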

⚠️ For major changes (either in lines of code or in their impact), please make sure to first share and discuss a design doc with the team.

Contribution process

```mermaid
flowchart LR
    A[Pre-checks] --> B[PR Tests]
    subgraph Code Review/Approval
        C1[Expert Review] --> C2[Final Review]
    end
    B --> C1
    C2 --> D[Merge]
```

Pre-checks

  • I want this PR in a versioned release and have added the appropriate Milestone (e.g., Core 0.8)
  • I have added relevant unit tests
  • I have added relevant functional tests
  • I have added proper typing to my code (see the Typing guidelines)
  • I have added relevant documentation
  • I have run the autoformatter.sh on my PR

Code review

The following process is enforced via the CODEOWNERS file for changes into megatron/core. For changes outside of megatron/core, it is up to the PR author whether or not to tag the Final Reviewer team.

For MRs into `main` branch

(Step 1): Add PR label Expert Review

(Step 2): Collect the expert reviewers reviews

  1. Attach the Expert Review label when your PR is ready for review.
  2. GitHub auto-assigns expert reviewers based on your changes. They will get notified and pick up your PR soon.

⚠️ Only proceed to the next step once all reviewers have approved, merge conflicts are resolved, and the CI is passing.
Final Review might get declined if these requirements are not fulfilled.

(Step 3): Final Review

  1. Add Final Review label
  2. GitHub auto-assigns final reviewers based on your changes. They will get notified and pick up your PR soon.

(Optional Step 4): Cherry-pick into release branch

If this PR also needs to be merged into core_r* release branches, after this PR has been merged, select Cherry-pick to open a new PR into the release branch.

For MRs into `dev` branch

The proposed review process for the `dev` branch is under active discussion.

MRs are mergeable after one approval by either [email protected] or [email protected].

Merging your PR

Any member of core-adlr and core-nemo will be able to merge your PR.

@kunlunl kunlunl requested a review from a team as a code owner November 13, 2025 07:54
@copy-pr-bot

copy-pr-bot bot commented Nov 13, 2025

This pull request requires additional validation before any workflows can run on NVIDIA's runners.

Pull request vetters can view their responsibilities here.

Contributors can view more details about this message here.

@Skylion007
Contributor

Skylion007 commented Nov 22, 2025

Is FP8 activations/grad support on Hopper FSDP with block wise support on the roadmap as well? O-o

@kunlunl
Contributor Author

kunlunl commented Nov 24, 2025

Is FP8 activations/grad support on Hopper FSDP with block wise support on the roadmap as well? O-o

@shjwudp

@shjwudp shjwudp changed the title FP8 params support for megatron-fsdp FP8 params support for megatron-fsdp (MXFP8/Blockwise) Dec 5, 2025
@cspades
Member

cspades commented Dec 16, 2025

See comment on Dev PR: #2086 (comment)

Can we add a few simple unit tests?

```python
    return TE_VERSION > PkgVersion(vers)


def is_float8tensor(tensor: torch.Tensor) -> bool:
```

Contributor suggested change:

```diff
-def is_float8tensor(tensor: torch.Tensor) -> bool:
+def is_float8tensor(tensor: torch.Tensor) -> TypeGuard[FP8_TENSOR_CLASS]:
```
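
For context on the suggestion above: a TypeGuard return type lets static type checkers narrow the argument to the FP8 tensor class inside guarded branches. A minimal sketch, assuming FP8_TENSOR_CLASS aliases Transformer Engine's Float8Tensor (the alias name and import paths are assumptions, not necessarily what this module defines):

```python
from typing import TypeGuard

import torch

try:
    # The import path differs across Transformer Engine versions.
    from transformer_engine.pytorch.tensor.float8_tensor import Float8Tensor
except ImportError:
    from transformer_engine.pytorch.float8_tensor import Float8Tensor

# Assumed alias mirroring the suggested return annotation above.
FP8_TENSOR_CLASS = Float8Tensor


def is_float8tensor(tensor: torch.Tensor) -> TypeGuard[FP8_TENSOR_CLASS]:
    """Return True if `tensor` is a Transformer Engine FP8 tensor.

    With TypeGuard, type checkers treat `tensor` as FP8_TENSOR_CLASS in any
    branch guarded by this check, so FP8-specific attributes can be used
    without explicit casts.
    """
    return isinstance(tensor, FP8_TENSOR_CLASS)
```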

@ericharper ericharper added Expert Review Apply this label to indicate that your PR is ready for expert review. module: megatron-fsdp and removed module: megatron-fsdp labels Dec 18, 2025
@shjwudp
Contributor

shjwudp commented Dec 18, 2025

Is FP8 activations/grad support on Hopper FSDP with block wise support on the roadmap as well? O-o

Hi @Skylion007 ,

Do you mean FP8 activation + param support on Hopper? That has already been merged into the dev branch (PR #2086), and we'll look into merging it into the main branch when we have time.

@cspades
Member

cspades commented Jan 1, 2026

/ok to test 42a6bdc

@kunlunl kunlunl force-pushed the kunlunl/megatron-fsdp-fp8-params_main branch from 42a6bdc to feb6753 Compare January 5, 2026 07:21
@shjwudp
Contributor

shjwudp commented Jan 5, 2026

/ok to test feb6753

```python
# to allocate as little memory as possible for this forward pass.
param_list = list(module.parameters(recurse=False))

if self.enable_fine_grained_param_gather_hook:
```

Contributor: Same here


```python
else:
    param_list = list(module.parameters(recurse=False))

if self.enable_fine_grained_param_gather_hook:
```

Contributor: same here


@Phlip79 Phlip79 added the dev2main: mbridge dev to main: this PR is needed in main for mbridge label Jan 5, 2026
shjwudp and others added 2 commits January 6, 2026 12:47
@shjwudp
Contributor

shjwudp commented Jan 6, 2026

/ok to test 31624f7


Member

@cspades cspades left a comment


Finally found time to prototype this backend implementation in fully_shard, and I'm generally happy with this PR. I'll submit a follow-up PR directly to main that exposes FP8 parameter support in fully_shard, along with a unit test for it.

@shjwudp @kunlunl I do have a comment beyond the scope of this PR though, pertaining to this fp8_model_init: https://github.com/NVIDIA/Megatron-LM/blob/dev/megatron/core/distributed/fsdp/src/megatron_fsdp/param_and_grad_buffer.py#L3738

We should move this to mcore_fsdp_adapter.py during MegatronFSDP.__init__ so that both Megatron-LM and native Torch can use the same initialization pattern:

```python
# Construct toy model.
with te.pytorch.quantized_model_init(
    enabled=True,
    recipe=fp8_recipe,
    # Needed for FP8 parameters with Megatron-FSDP.
    preserve_high_precision_init_val=True,
):
    toy_model = ToyTETransformer(
        model_dim=DIM_SIZE,
        num_heads=2,
        num_layers=NUM_LAYERS,
        output_dim=DIM_SIZE,
        device="meta",
    )

# Fully-shard the model.
# NOTE: We do NOT need the quantized_model_init context manager for Megatron-FSDP,
# because it has already set up the correct state during the root module FP8 init, I believe?
mfsdp_model = fully_shard_model(
    module=toy_model,
    fsdp_unit_modules=[te.pytorch.TransformerLayer, te.pytorch.Linear],
    zero_dp_strategy=3,
    init_model_with_meta_device=True,
)
```

This should not break Megatron-LM, right? (Testing...) I believe this also means we do not need an fp8_param_gather argument either!
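
A rough sketch of what that relocation could look like inside the adapter; the class skeleton, helper method, and fp8_recipe argument below are illustrative assumptions, not the actual Megatron-FSDP code:

```python
import transformer_engine.pytorch as te_pt


class MegatronFSDP:  # illustrative skeleton only
    def __init__(self, module, fp8_recipe=None, **kwargs):
        if fp8_recipe is not None:
            # Build FP8 parameter state once, inside the wrapper, so that
            # Megatron-LM and native PyTorch callers share the same
            # initialization pattern and no extra context manager (or
            # fp8_param_gather flag) is needed at the call site.
            with te_pt.quantized_model_init(
                enabled=True,
                recipe=fp8_recipe,
                preserve_high_precision_init_val=True,
            ):
                self._build_param_and_grad_buffers(module, **kwargs)
        else:
            self._build_param_and_grad_buffers(module, **kwargs)

    def _build_param_and_grad_buffers(self, module, **kwargs):
        # Placeholder for the existing buffer/sharding setup.
        ...
```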

@Phlip79 Phlip79 added the Final Review Apply this label to indicate that your PR is ready for final review. label Jan 7, 2026
@cspades
Member

cspades commented Jan 7, 2026

/ok to test cba67e3

@cspades
Member

cspades commented Jan 7, 2026

API compatibility check error is expected, just like in the DEV PR, with the same violations.

ko3n1g added a commit to ko3n1g/Megatron-LM that referenced this pull request Jan 8, 2026

Labels

  • complexity: high
  • dev2main: mbridge (dev to main: this PR is needed in main for mbridge)
  • Expert Review
  • Final Review
  • module: megatron-fsdp
