
[feat] Disable r3 for MTP #608

Open
guapisolo wants to merge 1 commit into radixark:feature/r3-upstream from guapisolo:feature/r3-mtp-fix

Conversation

@guapisolo
Collaborator

Disable r3 replay for mtp

Already tested on GLM-4.7-Flash.

This implementation is somewhat hard-coded, but it may be the best option under the current design. The main reason is that we cannot get the module name before the layer is fully registered, so the Replay class cannot record anything about the layer name. Also, the MoE routing function does not include module info, so we cannot skip MoE MTP replay at runtime.
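For reference, the temporary-disable pattern this PR uses could be factored into a small context manager. This is a hedged sketch: `replay_disabled` and `_Manager` are hypothetical stand-ins, not the project's actual `routing_replay_manager` API.

```python
from contextlib import contextmanager

@contextmanager
def replay_disabled(manager):
    """Temporarily disable a replay manager, restoring its prior state on exit."""
    prev = manager.enabled
    manager.enabled = False
    try:
        yield manager
    finally:
        manager.enabled = prev

class _Manager:
    """Hypothetical stand-in for the real routing_replay_manager."""
    enabled = True

mgr = _Manager()
with replay_disabled(mgr):
    assert mgr.enabled is False  # replay registration skipped inside the block
assert mgr.enabled is True       # prior state restored afterwards
```

Because restoration happens in `finally`, the flag is restored even if building the MTP block spec raises, which also addresses the exception-safety concern raised in review.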

@gemini-code-assist
Contributor

Summary of Changes

Hello @guapisolo, I'm Gemini Code Assist[1]! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request addresses an incompatibility between the rollout routing replay (r3) system and Multi-Token Prediction (MTP) layers in Megatron-LM. It introduces a mechanism to temporarily disable r3 during the setup of MTP modules, preventing erroneous replay registration and ensuring proper model initialization. The change is a targeted workaround due to current design limitations: module names are not available before full layer registration, and MoE routing functions carry no module info.

Highlights

  • MTP Replay Disablement: Implemented a temporary disablement of the rollout routing replay (r3) mechanism specifically for Multi-Token Prediction (MTP) layers. This ensures that r3 does not attempt to register or record MTP module information, which is not currently supported.
Changelog
  • miles/backends/megatron_utils/model_provider.py
    • Imported the logging module for emitting warnings.
    • Imported routing_replay_manager from miles.utils.replay_base.
    • Added conditional logic to temporarily disable routing_replay_manager when args.use_rollout_routing_replay is true, specifically during the creation of mtp_block_spec.
    • Included a logger.warning message to inform users when replay registration is skipped for MTP modules.
    • Re-enabled routing_replay_manager after mtp_block_spec creation.
Activity
  • No specific activity (comments, reviews, progress) has been recorded for this pull request yet.
Using Gemini Code Assist

The full guide for Gemini Code Assist can be found on our documentation page; here are some quick tips.

Invoking Gemini

You can request assistance from Gemini at any point by creating a comment using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands on the current page.

Feature              | Command             | Description
Code Review          | /gemini review      | Performs a code review for the current pull request in its current state.
Pull Request Summary | /gemini summary     | Provides a summary of the current pull request in its current state.
Comment              | @gemini-code-assist | Responds in comments when explicitly tagged, both in pull request comments and review comments.
Help                 | /gemini help        | Displays a list of available commands.

Customization

To customize the Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by adding files to a .gemini/ folder in the base of the repository. Detailed instructions can be found here.

Limitations & Feedback

Gemini Code Assist may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments. If you're interested in giving your feedback about your experience with Gemini Code Assist for GitHub and other Google products, sign up here.

You can also get AI-powered code generation, chat, and code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.

Footnotes

  1. Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double check it and use code with caution.


@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request disables rollout routing replay (r3) for Multi-Token Prediction (MTP) modules by temporarily setting routing_replay_manager.enabled to False. While this approach works, the current implementation is not exception-safe. My review includes a suggestion to use a try...finally block to ensure the replay manager's state is correctly restored even if an error occurs during MTP block creation, which will make the implementation more robust.

Comment on lines +194 to +203
    # hard code here to skip r3 registration for mtp layers
    if args.use_rollout_routing_replay:
        routing_replay_manager.enabled = False
        logger.warning(
            "Rollout routing replay is not applicable for MTP modules, so skipped replay registration"
        )
    mtp_block_spec = get_gpt_mtp_block_spec(config, transformer_layer_spec, **mtp_kwargs)
    kwargs["mtp_block_spec"] = mtp_block_spec
    if args.use_rollout_routing_replay:
        routing_replay_manager.enabled = True

Severity: high

The current approach to temporarily disable the routing_replay_manager is not exception-safe. If an exception is raised within get_gpt_mtp_block_spec, routing_replay_manager.enabled will remain False, potentially causing issues later in the execution. Using a try...finally block will guarantee that the manager's state is restored correctly.

Suggested change

    # hard code here to skip r3 registration for mtp layers
    if args.use_rollout_routing_replay:
        routing_replay_manager.enabled = False
        logger.warning(
            "Rollout routing replay is not applicable for MTP modules, so skipped replay registration"
        )
    try:
        mtp_block_spec = get_gpt_mtp_block_spec(config, transformer_layer_spec, **mtp_kwargs)
        kwargs["mtp_block_spec"] = mtp_block_spec
    finally:
        if args.use_rollout_routing_replay:
            routing_replay_manager.enabled = True
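The exception-safety point can be checked with a minimal stand-in. This is a hedged sketch: `Manager` and `failing_block_spec` are hypothetical substitutes for the real `routing_replay_manager` and `get_gpt_mtp_block_spec`.

```python
class Manager:
    """Hypothetical stand-in for the real routing_replay_manager."""
    enabled = True

def failing_block_spec():
    # Stand-in for get_gpt_mtp_block_spec raising during spec creation.
    raise RuntimeError("spec creation failed")

mgr = Manager()

mgr.enabled = False
try:
    try:
        failing_block_spec()
    finally:
        # Restored before the exception propagates, as in the suggestion.
        mgr.enabled = True
except RuntimeError:
    pass

assert mgr.enabled is True  # flag restored despite the exception
```

Without the inner finally, the flag would remain False after the exception, which is exactly the hazard the review flags.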
