
Fix: Handle empty masks in get_mask_boxes to prevent crash #328

Open

pranavchugh1 wants to merge 1 commit into Wan-Video:main from pranavchugh1:fix-empty-mask-crash

Conversation

@pranavchugh1

Problem

When running without flash_attn (mode="torch" or "vanilla"), attention fails with a dimension mismatch: pre_attn_layout was defined but never applied to the q, k, v tensors.
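For illustration, here is a minimal repro of the kind of failure this causes, under assumed shapes (the exact tensor sizes below are hypothetical, not from the repo). If upstream code hands torch.nn.functional.scaled_dot_product_attention tensors in (batch, seq, heads, head_dim) layout instead of the (batch, heads, seq, head_dim) layout it expects, a cross-attention call with unequal query/key lengths fails to broadcast:

```python
import torch
import torch.nn.functional as F

B, Lq, Lk, H, D = 1, 16, 77, 8, 64   # hypothetical cross-attention shapes
q = torch.randn(B, Lq, H, D)          # (batch, seq, heads, head_dim) layout
k = torch.randn(B, Lk, H, D)
v = torch.randn(B, Lk, H, D)

# SDPA treats dim 1 as heads and dim 2 as sequence, so the unequal
# "head" dims (Lq=16 vs Lk=77) cannot broadcast and raise a RuntimeError.
try:
    F.scaled_dot_product_attention(q, k, v)
except RuntimeError as e:
    print("dimension mismatch:", e)
```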

Solution

Applied the pre_attn_layout transformation to the q, k, v tensors in both the torch and vanilla modes.
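A minimal sketch of the fixed path, assuming pre_attn_layout/post_attn_layout are the usual transposes between (batch, seq, heads, head_dim) and the (batch, heads, seq, head_dim) layout SDPA expects; the actual code in the repo will differ in structure:

```python
import torch
import torch.nn.functional as F

def attention(q, k, v, mode="torch"):
    # q, k, v arrive as (batch, seq_len, heads, head_dim)
    if mode in ("torch", "vanilla"):
        # Assumed transforms: swap seq/heads for SDPA, swap back afterwards
        pre_attn_layout = lambda x: x.transpose(1, 2)
        post_attn_layout = lambda x: x.transpose(1, 2)

        # The fix: actually apply pre_attn_layout, which was previously
        # defined but never used on the tensors
        q, k, v = map(pre_attn_layout, (q, k, v))
        out = F.scaled_dot_product_attention(q, k, v)
        return post_attn_layout(out)
    raise NotImplementedError(f"mode={mode!r}")
```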

