This is the official implementation of the paper *Block Sparse Flash Attention*. It preserves the fidelity of attention patterns while eliminating approximately 50% of the FLOPs (the PV multiplication) ...
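As a rough illustration of the idea (a toy NumPy sketch, not the repo's fused kernel; the function name, `threshold` parameter, and block-mass criterion are assumptions for exposition), the PV multiplication can be skipped for key/value blocks that receive negligible attention mass, and those skipped PV matmuls account for roughly half of attention's FLOPs:

```python
import numpy as np

def attention_skip_pv(Q, K, V, block_size, threshold=1e-3):
    """Toy block-sparse attention sketch: compute the full QK^T softmax,
    then skip the PV multiply for any (query block, key/value block) pair
    whose total attention mass falls below `threshold`."""
    n, d = Q.shape
    nb = n // block_size
    scores = Q @ K.T / np.sqrt(d)
    p = np.exp(scores - scores.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    out = np.zeros_like(V)
    for i in range(nb):
        qs = slice(i * block_size, (i + 1) * block_size)
        for j in range(nb):
            ks = slice(j * block_size, (j + 1) * block_size)
            pb = p[qs, ks]
            if pb.sum() < threshold:  # negligible mass: skip this PV block
                continue
            out[qs] += pb @ V[ks]
    return out
```

With `threshold=0` the result matches dense attention exactly; raising the threshold trades a small amount of fidelity for fewer PV block multiplies.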