[LTX-2] Fix flash attention shard_map for sequence lengths not divisible by context mesh axis #363

Open

Commit 9697900: Fix flash attention shard_map for sequence lengths not divisible by context mesh axis
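
The title points at a known constraint: `shard_map` splits an array axis evenly across a mesh axis, so a sequence length that is not a multiple of the context-parallel axis size cannot be sharded directly. Below is a minimal JAX sketch of one common workaround, padding the sequence before `shard_map` and slicing the padding off afterwards. It is illustrative only, not necessarily the fix taken in this PR; the function name `padded_context_attention`, the `"context"` axis name, the `(batch, seq, heads, head_dim)` layout, and the plain per-shard attention body are all assumptions.

```python
import jax
import jax.numpy as jnp
from jax.sharding import Mesh, PartitionSpec as P
from jax.experimental.shard_map import shard_map


def padded_context_attention(q, k, v, mesh, axis_name="context"):
    """Shard attention over the sequence axis, padding q/k/v so the
    sequence length divides evenly across the `axis_name` mesh axis."""
    seq_len = q.shape[1]                      # q, k, v: (batch, seq, heads, head_dim)
    n_shards = mesh.shape[axis_name]          # Mesh.shape maps axis name -> size
    pad = (-seq_len) % n_shards               # extra tokens needed for even sharding

    if pad:
        widths = [(0, 0)] * q.ndim
        widths[1] = (0, pad)
        q, k, v = (jnp.pad(x, widths) for x in (q, k, v))

    def attention_shard(q_blk, k_blk, v_blk):
        # Plain per-shard attention as a stand-in for the flash-attention kernel.
        scores = jnp.einsum("bqhd,bkhd->bhqk", q_blk, k_blk) / jnp.sqrt(q_blk.shape[-1])
        return jnp.einsum("bhqk,bkhd->bqhd", jax.nn.softmax(scores, axis=-1), v_blk)

    spec = P(None, axis_name, None, None)     # shard only the sequence dimension
    out = shard_map(attention_shard, mesh=mesh,
                    in_specs=(spec, spec, spec), out_specs=spec)(q, k, v)
    return out[:, :seq_len]                   # slice off the padded tail


if __name__ == "__main__":
    import numpy as np
    mesh = Mesh(np.asarray(jax.devices()), axis_names=("context",))
    q = k = v = jnp.ones((2, 1023, 8, 64))    # 1023 need not divide the device count
    print(padded_context_attention(q, k, v, mesh).shape)  # (2, 1023, 8, 64)
```

A production context-parallel flash-attention kernel would additionally mask the padded key positions and exchange k/v blocks across the context axis; the sketch only illustrates the divisibility handling.
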
Google CLA / cla/google succeeded Mar 25, 2026 in 7s

✅ All contributors are covered under a CLA with Google

See https://cla.developers.google.com/ for more info about Google's Contributor License Agreement (CLA).


Details

The following contributors were found for this pull request:

9697900 Author: @mbohlool <m***y@google.com>

(Only the first commit for a unique contributor is listed.)