
[python / transformers] Working around `pip install transformers` failing because of Rust

Posted at 2024-02-19

Overview

When I tried to `pip install transformers`, I hit the same problem as this person, and the following error occurred (; ・`д・´)

The error

mac terminal
$ pip install transformers==4.21.2

...(snip)

error: `cargo rustc --lib --message-format=json-render-diagnostics --manifest-path Cargo.toml --release -v --features pyo3/extension-module --crate-type cdylib -- -C 'link-args=-undefined dynamic_lookup -Wl,-install_name,@rpath/tokenizers.cpython-310-darwin.so'` failed with code 101
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for tokenizers
  Building wheel for pyperclip (setup.py) ... done
  Created wheel for pyperclip: filename=pyperclip-1.8.2-py3-none-any.whl size=11123 sha256=14dbdcd9ab428d312c569e32eed24ce86a26ce13da5d44bea6be4a0b19e61c0f
  Stored in directory: /Users/user_name/Library/Caches/pip/wheels/04/24/fe/140a94a7f1036003ede94579e6b4227fe96c840c6f4dcbe307
Successfully built grpcio grad-cam pyperclip
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects

The part to pay attention to is this rustc line:

error: `cargo rustc ...

Alternatively, tokenizers may fail with the following error, which has the same root cause.

#0 66.31 Building wheels for collected packages: grad-cam, tokenizers, psutil
#0 66.31   Building wheel for grad-cam (pyproject.toml): started
#0 66.44   Building wheel for grad-cam (pyproject.toml): finished with status 'done'
#0 66.44   Created wheel for grad-cam: filename=grad_cam-1.4.5-py3-none-any.whl size=37003 sha256=9e6af0f2da3ac76e2c6651fd9b10e26c23fa0fe4468876b11162168973c1beaa
#0 66.44   Stored in directory: /root/.cache/pip/wheels/0e/bd/d3/137eda21c45dae4fb5b587244cb747b76682a7506dee79fa9d
#0 66.44   Building wheel for tokenizers (pyproject.toml): started
#0 66.54   Building wheel for tokenizers (pyproject.toml): finished with status 'error'
#0 66.54   error: subprocess-exited-with-error
#0 66.54   
#0 66.54   × Building wheel for tokenizers (pyproject.toml) did not run successfully.
#0 66.54   │ exit code: 1
#0 66.54   ╰─> [51 lines of output]
#0 66.54       running bdist_wheel
#0 66.54       running build
#0 66.54       running build_py
#0 66.54       creating build
#0 66.54       creating build/lib.linux-aarch64-cpython-311
#0 66.54       creating build/lib.linux-aarch64-cpython-311/tokenizers
#0 66.54       copying py_src/tokenizers/__init__.py -> build/lib.linux-aarch64-cpython-311/tokenizers
#0 66.54       creating build/lib.linux-aarch64-cpython-311/tokenizers/models
#0 66.54       copying py_src/tokenizers/models/__init__.py -> build/lib.linux-aarch64-cpython-311/tokenizers/models
#0 66.54       creating build/lib.linux-aarch64-cpython-311/tokenizers/decoders
#0 66.54       copying py_src/tokenizers/decoders/__init__.py -> build/lib.linux-aarch64-cpython-311/tokenizers/decoders
#0 66.54       creating build/lib.linux-aarch64-cpython-311/tokenizers/normalizers
#0 66.54       copying py_src/tokenizers/normalizers/__init__.py -> build/lib.linux-aarch64-cpython-311/tokenizers/normalizers
#0 66.54       creating build/lib.linux-aarch64-cpython-311/tokenizers/pre_tokenizers
#0 66.54       copying py_src/tokenizers/pre_tokenizers/__init__.py -> build/lib.linux-aarch64-cpython-311/tokenizers/pre_tokenizers
#0 66.54       creating build/lib.linux-aarch64-cpython-311/tokenizers/processors
#0 66.54       copying py_src/tokenizers/processors/__init__.py -> build/lib.linux-aarch64-cpython-311/tokenizers/processors
#0 66.54       creating build/lib.linux-aarch64-cpython-311/tokenizers/trainers
#0 66.54       copying py_src/tokenizers/trainers/__init__.py -> build/lib.linux-aarch64-cpython-311/tokenizers/trainers
#0 66.54       creating build/lib.linux-aarch64-cpython-311/tokenizers/implementations
#0 66.54       copying py_src/tokenizers/implementations/__init__.py -> build/lib.linux-aarch64-cpython-311/tokenizers/implementations
#0 66.54       copying py_src/tokenizers/implementations/sentencepiece_bpe.py -> build/lib.linux-aarch64-cpython-311/tokenizers/implementations
#0 66.54       copying py_src/tokenizers/implementations/byte_level_bpe.py -> build/lib.linux-aarch64-cpython-311/tokenizers/implementations
#0 66.54       copying py_src/tokenizers/implementations/bert_wordpiece.py -> build/lib.linux-aarch64-cpython-311/tokenizers/implementations
#0 66.54       copying py_src/tokenizers/implementations/char_level_bpe.py -> build/lib.linux-aarch64-cpython-311/tokenizers/implementations
#0 66.54       copying py_src/tokenizers/implementations/base_tokenizer.py -> build/lib.linux-aarch64-cpython-311/tokenizers/implementations
#0 66.54       copying py_src/tokenizers/implementations/sentencepiece_unigram.py -> build/lib.linux-aarch64-cpython-311/tokenizers/implementations
#0 66.54       creating build/lib.linux-aarch64-cpython-311/tokenizers/tools
#0 66.54       copying py_src/tokenizers/tools/visualizer.py -> build/lib.linux-aarch64-cpython-311/tokenizers/tools
#0 66.54       copying py_src/tokenizers/tools/__init__.py -> build/lib.linux-aarch64-cpython-311/tokenizers/tools
#0 66.54       copying py_src/tokenizers/__init__.pyi -> build/lib.linux-aarch64-cpython-311/tokenizers
#0 66.54       copying py_src/tokenizers/models/__init__.pyi -> build/lib.linux-aarch64-cpython-311/tokenizers/models
#0 66.54       copying py_src/tokenizers/decoders/__init__.pyi -> build/lib.linux-aarch64-cpython-311/tokenizers/decoders
#0 66.54       copying py_src/tokenizers/normalizers/__init__.pyi -> build/lib.linux-aarch64-cpython-311/tokenizers/normalizers
#0 66.54       copying py_src/tokenizers/pre_tokenizers/__init__.pyi -> build/lib.linux-aarch64-cpython-311/tokenizers/pre_tokenizers
#0 66.54       copying py_src/tokenizers/processors/__init__.pyi -> build/lib.linux-aarch64-cpython-311/tokenizers/processors
#0 66.54       copying py_src/tokenizers/trainers/__init__.pyi -> build/lib.linux-aarch64-cpython-311/tokenizers/trainers
#0 66.54       copying py_src/tokenizers/tools/visualizer-styles.css -> build/lib.linux-aarch64-cpython-311/tokenizers/tools
#0 66.54       running build_ext
#0 66.54       running build_rust
#0 66.54       error: can't find Rust compiler
#0 66.54       
#0 66.54       If you are using an outdated pip version, it is possible a prebuilt wheel is available for this package but pip is not able to install from it. Installing from the wheel would avoid the need for a Rust compiler.
#0 66.54       
#0 66.54       To update pip, run:
#0 66.54       
#0 66.54           pip install --upgrade pip
#0 66.54       
#0 66.54       and then retry package installation.
#0 66.54       
#0 66.54       If you did intend to build this package from source, try installing a Rust compiler from your system package manager and ensure it is on the PATH during installation. Alternatively, rustup (available at https://rustup.rs) is the recommended way to download and update the Rust compiler toolchain.
#0 66.54       [end of output]
#0 66.54   
#0 66.54   note: This error originates from a subprocess, and is likely not a problem with pip.
#0 66.54   ERROR: Failed building wheel for tokenizers
#0 66.54   Building wheel for psutil (pyproject.toml): started
#0 67.57   Building wheel for psutil (pyproject.toml): finished with status 'done'
#0 67.57   Created wheel for psutil: filename=psutil-5.9.8-cp311-abi3-linux_aarch64.whl size=290017 sha256=0c10ef47d0fa17b698f7ea176fc7d909045617197174955e9ee8770280f13d49
#0 67.57   Stored in directory: /root/.cache/pip/wheels/a6/1e/65/fb0ad37886dca3f25a0aa8e50f4903c5bdbde4bb8a9b1e27de
#0 67.57 Successfully built grad-cam psutil
#0 67.57 Failed to build tokenizers
#0 67.57 ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects
------
failed to solve: executor failed running [/bin/sh -c pip install -Ur requirements.txt]: exit code: 1

With this kind of error, the line to look at is:

error: can't find Rust compiler

I noticed recently that M2 Macs and Rust just don't seem to get along (possibly M1 as well).1

I assumed it was the Mac's fault, but now it seems the Mac is unrelated: the error also occurred in a Docker image.

Environment

Item          Version etc.
OS            macOS (also reproduced on an Ubuntu Docker image)
CPU           Apple M2
python        3.10.11 (also occurred with 3.11)
transformers  4.21.2

Solution

Just as I was thinking "ugh, this is going to be a pain" 。゚・(´^ω^`)・゚。, it turned out that a very recent update to transformers made it installable again.

Specifying transformers==4.37.2 as the version number was enough to fix it.
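To sanity-check that the pinned version took effect, a quick version comparison like the one below can help. This is a minimal sketch of my own, not part of transformers; the helper names are made up, and 4.37.2 is simply the version that installed cleanly for me.

```python
# Minimal sketch: check whether the installed transformers is at least the
# version that installed cleanly (4.37.2). Helper names are illustrative.
from importlib import metadata


def parse_version(v: str) -> tuple:
    """Turn '4.37.2' into (4, 37, 2) for a simple numeric comparison."""
    return tuple(int(part) for part in v.split("."))


MIN_OK = "4.37.2"  # version that installed without a Rust toolchain for me


def transformers_is_new_enough(min_ok: str = MIN_OK) -> bool:
    """Return True if transformers is installed and >= min_ok."""
    try:
        installed = metadata.version("transformers")
    except metadata.PackageNotFoundError:
        return False
    return parse_version(installed) >= parse_version(min_ok)


if __name__ == "__main__":
    print(transformers_is_new_enough())
```

Note this naive tuple comparison only handles plain `X.Y.Z` strings; for anything fancier (release candidates etc.) `packaging.version` is the robust choice.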

However, if you absolutely must use an older transformers - say, 4.22.2 - you will have to look for another solution.

Related articles

As the issue comment below suggests, installing rust itself might also solve the problem. That said, just bumping the version is simpler, so that's what I went with! (*˘︶˘*)
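If you do go the Rust-install route (e.g. because you are pinned to an old transformers), the setup might look roughly like this. This is an untested sketch based on rustup's standard install flow, not steps from the issue thread itself:

```shell
# Hypothetical setup sketch: install a Rust toolchain via rustup so that
# tokenizers can build from source, then retry the pinned install.
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y

# Put cargo/rustc on PATH for the current shell session
source "$HOME/.cargo/env"

rustc --version   # confirm the compiler is now found

pip install transformers==4.21.2
```

In a Dockerfile you would also need `curl` and a C toolchain (e.g. `build-essential`) installed before the rustup step, and `ENV PATH="/root/.cargo/bin:${PATH}"` so later layers see the compiler.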

  1. I had just written an article about getting stuck on rust when installing sudachipy, too.
