Supporting flash_attn_gpu on older GPUs

flash_attn's CUDA kernels require Ampere (compute capability 8.0) or newer; on an older GPU the forward call fails with:
File "E:\tool\trellis2.0\ComfyUI\ComfyUI\ComfyUI-Easy-Install\python_embeded\Lib\site-packages\flash_attn\flash_attn_interface.py", line 91, in _flash_attn_forward
out, softmax_lse, S_dmask, rng_state = flash_attn_gpu.fwd(
^^^^^^^^^^^^^^^^^^^
RuntimeError: FlashAttention only supports Ampere GPUs or newer.
Exception raised from mha_fwd at D:\a\flash-attention\flash-attention\csrc\flash_attn\flash_api.cpp:370 (most recent call first):
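
One way to keep such a workflow running on pre-Ampere cards is to gate flash_attn behind a compute-capability check and fall back to PyTorch's built-in `scaled_dot_product_attention`. The sketch below is an assumption, not code from flash_attn or ComfyUI; the `attention()` helper and its tensor-layout conventions are hypothetical.

```python
# Hypothetical fallback sketch (not from flash_attn or ComfyUI):
# use flash_attn only on Ampere+ GPUs, otherwise fall back to
# PyTorch's scaled_dot_product_attention, which runs on older cards.
import torch
import torch.nn.functional as F

def supports_flash_attn() -> bool:
    # flash_attn's kernels require compute capability >= 8.0 (Ampere)
    if not torch.cuda.is_available():
        return False
    major, _ = torch.cuda.get_device_capability()
    return major >= 8

def attention(q, k, v):
    # q, k, v: (batch, heads, seq_len, head_dim), fp16/bf16 CUDA tensors
    if supports_flash_attn():
        from flash_attn import flash_attn_func
        # flash_attn expects (batch, seq_len, heads, head_dim)
        out = flash_attn_func(
            q.transpose(1, 2), k.transpose(1, 2), v.transpose(1, 2)
        )
        return out.transpose(1, 2)
    # fallback path for Turing/Pascal-class GPUs
    return F.scaled_dot_product_attention(q, k, v)
```

`scaled_dot_product_attention` typically dispatches to a memory-efficient kernel where one is available, so the fallback stays usable on older GPUs, just without flash_attn's speed.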
Enterprise Multi-Agent Design in Practice (企业级多智能体设计实战), a Geekbang course:
https://time.geekbang.org/course/intro/101114301?tab=catalog