Reduce peak vram in moe.py by avoiding cached router indices #464

Open
shedrachIkenna wants to merge 1 commit into meta-llama:main from shedrachIkenna:patch-1

Conversation

@shedrachIkenna

Currently, the expanded router indices in the moe.py forward pass are computed and assigned to a variable that is reused for both the gather and the scatter_add_ operations. Holding this index tensor in memory while the experts are executing increases peak memory pressure during the most memory-intensive part of the layer (the expert MLPs).

This PR switches to computing the expanded indices on the fly for both the gather and the scatter_add_ operations. The temporary index tensor can then be garbage collected while the experts are running, which reduces the peak memory usage of the forward pass with negligible compute overhead.
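A minimal sketch of the before/after pattern described above. The function and argument names (`moe_forward_sketch`, `hidden_states`, `router_indices`, `expert_fn`) are hypothetical placeholders, not the actual moe.py code; the point is that the expanded index tensor is built inline at each use site rather than cached in a variable across the expert computation.

```python
import torch

def moe_forward_sketch(hidden_states, router_indices, expert_fn):
    """Hypothetical MoE dispatch illustrating on-the-fly index expansion.

    hidden_states:  (num_tokens, hidden_dim) activations.
    router_indices: 1-D long tensor of token row indices chosen by the router.
    expert_fn:      stand-in for the expert MLPs.
    """
    hidden_dim = hidden_states.shape[-1]

    # Gather: expand the indices inline; no long-lived reference is kept,
    # so the temporary can be freed before the experts run.
    gathered = hidden_states.gather(
        0, router_indices.reshape(-1, 1).expand(-1, hidden_dim)
    )

    # The memory-heavy expert computation happens here.
    expert_out = expert_fn(gathered)

    # Scatter: recompute the expanded indices instead of reusing a cached copy.
    out = torch.zeros_like(hidden_states)
    out.scatter_add_(
        0, router_indices.reshape(-1, 1).expand(-1, hidden_dim), expert_out
    )
    return out
```

The trade-off is recomputing a cheap `reshape`/`expand` (both are views, so the cost is negligible) in exchange for not keeping the index tensor alive across the expert MLPs.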

Applied a memory-efficient forward pass by reshaping and expanding router_indices on the fly at both use sites instead of saving a copy in memory that was held but never needed in between
@meta-cla

meta-cla bot commented Feb 25, 2026

Hi @shedrachIkenna!

Thank you for your pull request and welcome to our community.

Action Required

In order to merge any pull request (code, docs, etc.), we require contributors to sign our Contributor License Agreement, and we don't seem to have one on file for you.

Process

In order for us to review and merge your suggested changes, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g. your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA.

Once the CLA is signed, our tooling will perform checks and validations. Afterwards, the pull request will be tagged with CLA signed. The tagging process may take up to 1 hour after signing. Please give it that time before contacting us about it.

If you have received this in error or have any questions, please contact us at cla@meta.com. Thanks!

@meta-cla

meta-cla bot commented Feb 26, 2026

Thank you for signing our Contributor License Agreement. We can now accept your code for this (and any) Meta Open Source project. Thanks!

meta-cla bot added the CLA Signed label (managed by the Meta Open Source bot) on Feb 26, 2026