How to fix an error when using xformers

This is a note on an error I ran into while using xFormers and how I resolved it.

The error that occurred:

NotImplementedError: No operator found for `memory_efficient_attention_forward` with inputs:
     query       : shape=(2, 4096, 8, 40) (torch.float16)
     key         : shape=(2, 4096, 8, 40) (torch.float16)
     value       : shape=(2, 4096, 8, 40) (torch.float16)
     attn_bias   : <class 'NoneType'>
     p           : 0.0
`cutlassF` is not supported because:
    xFormers wasn't build with CUDA support
`flshattF` is not supported because:
    xFormers wasn't build with CUDA support
`tritonflashattF` is not supported because:
    xFormers wasn't build with CUDA support
    requires A100 GPU
`smallkF` is not supported because:
    xFormers wasn't build with CUDA support
    dtype=torch.float16 (supported: {torch.float32})
    max(query.shape[-1] != value.shape[-1]) > 32
    unsupported embed per head: 40
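The key part is the repeated line "xFormers wasn't build with CUDA support": the installed build exposes no usable CUDA attention kernels, so memory_efficient_attention cannot find an operator for these float16 inputs. Before reinstalling, it helps to confirm what is actually installed. The snippet below is only a minimal check; running python -m xformers.info also prints a per-operator availability report.

# Quick environment check: torch/xformers versions and CUDA availability.
import torch
import xformers

print("torch    :", torch.__version__)
print("CUDA     :", torch.cuda.is_available())
print("xformers :", xformers.__version__)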

It turned out that the installed xformers version did not match my environment, and uninstalling it and installing a different version resolved the error.

$ pip uninstall xformers
$ pip install xformers==0.0.16
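Note that xformers wheels are built against specific PyTorch versions, so pick the release that matches the torch installed in your environment. After reinstalling, a short smoke test with the same shapes and dtype as the failing call can confirm that a kernel is now found. This is a minimal sketch, assuming a CUDA-capable GPU and the (batch, seq_len, heads, head_dim) layout shown in the error above.

# Smoke test: same shapes/dtype as the failing call above.
import torch
import xformers.ops as xops

device = "cuda"  # assumes a CUDA GPU is available
q = torch.randn(2, 4096, 8, 40, dtype=torch.float16, device=device)
k = torch.randn(2, 4096, 8, 40, dtype=torch.float16, device=device)
v = torch.randn(2, 4096, 8, 40, dtype=torch.float16, device=device)

# Raises NotImplementedError again if no suitable operator is available.
out = xops.memory_efficient_attention(q, k, v)
print(out.shape)  # expected: torch.Size([2, 4096, 8, 40])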