
Confirming that Python 3.6.3 & TensorFlow 2.2 is a runtime environment where Hugging Face Transformers throws no errors


###The long journey to getting the huggingface Transformers sample code running

A record of all the error-squashing it took to get the following sample code to run.

https://pypi.org/project/transformers/ "transformers 4.0.1"

Transformers (sample code)
>>> from transformers import pipeline

# Allocate a pipeline for sentiment-analysis
>>> classifier = pipeline('sentiment-analysis')
>>> classifier('We are very happy to include pipeline into the transformers repository.')
[{'label': 'POSITIVE', 'score': 0.9978193640708923}]

##Fetching the source with git clone

Terminal
% pwd
/Users/ocean/Desktop
% mkdir huggingface
% cd huggingface 
%
%git clone https://github.com/huggingface/transformers.git
Cloning into 'transformers'...
remote: Enumerating objects: 75, done.
remote: Counting objects: 100% (75/75), done.
remote: Compressing objects: 100% (65/65), done.
remote: Total 55871 (delta 25), reused 41 (delta 4), pack-reused 55796
Receiving objects: 100% (55871/55871), 41.77 MiB | 11.59 MiB/s, done.
Resolving deltas: 100% (39135/39135), done.
%
%ls
transformers
%tree | wc -l                                             
    2682
%
%tree | head -50
.
└── transformers
    ├── CODE_OF_CONDUCT.md
    ├── CONTRIBUTING.md
    ├── LICENSE
    ├── MANIFEST.in
    ├── Makefile
    ├── README.md
    ├── docker
    │   ├── transformers-cpu
    │   │   └── Dockerfile
    │   ├── transformers-gpu
    │   │   └── Dockerfile
    │   ├── transformers-pytorch-cpu
    │   │   └── Dockerfile
    │   ├── transformers-pytorch-gpu
    │   │   └── Dockerfile
    │   ├── transformers-pytorch-tpu
    │   │   ├── Dockerfile
    │   │   ├── bert-base-cased.jsonnet
    │   │   ├── dataset.yaml
    │   │   └── docker-entrypoint.sh
    │   ├── transformers-tensorflow-cpu
    │   │   └── Dockerfile
    │   └── transformers-tensorflow-gpu
    │       └── Dockerfile
    ├── docs
    │   ├── Makefile
    │   ├── README.md
    │   └── source
    │       ├── _static
    │       │   ├── css
    │       │   │   ├── Calibre-Light.ttf
    │       │   │   ├── Calibre-Medium.otf
    │       │   │   ├── Calibre-Regular.otf
    │       │   │   ├── Calibre-Thin.otf
    │       │   │   ├── code-snippets.css
    │       │   │   └── huggingface.css
    │       │   └── js
    │       │       ├── custom.js
    │       │       └── huggingface_logo.svg
    │       ├── benchmarks.rst
    │       ├── bertology.rst
    │       ├── conf.py
    │       ├── contributing.md -> ../../CONTRIBUTING.md
    │       ├── converting_tensorflow_models.rst
    │       ├── custom_datasets.rst
    │       ├── examples.md -> ../../examples/README.md
    │       ├── favicon.ico
    │       ├── glossary.rst
%

##The "dyld: lazy symbol binding failed: Symbol not found" error:cry:

Python3
>>> from transformers import pipeline
>>> classifier = pipeline('sentiment-analysis')
>>> classifier('We are very happy to include pipeline into the transformers repository.')
dyld: lazy symbol binding failed: Symbol not found: _PySlice_Unpack
  Referenced from: /Users/ocean/.pyenv/versions/3.6.0/envs/TensorFlow/lib/python3.6/site-packages/torch/lib/libtorch_python.dylib
  Expected in: flat namespace

dyld: Symbol not found: _PySlice_Unpack
  Referenced from: /Users/ocean/.pyenv/versions/3.6.0/envs/TensorFlow/lib/python3.6/site-packages/torch/lib/libtorch_python.dylib
  Expected in: flat namespace

zsh: abort      python
%

#####( Reference website )

Easy Ramble: the "dyld: lazy symbol binding failed: Symbol not found: _iconv_open" error

Solved by setting the DYLD_LIBRARY_PATH environment variable.
Saved by Stack Overflow yet again. Apparently the DYLD_LIBRARY_PATH environment variable needs to contain the path to the dyld libraries. See the link at the end of that post.

$ export DYLD_LIBRARY_PATH=/usr/lib/:$DYLD_LIBRARY_PATH

That does it. Alternatively, set the following in ~/.zshrc (for bash, ~/.bashrc, ~/.bash_profile, etc.):

$ vi ~/.zshrc
export DYLD_LIBRARY_PATH=/usr/lib/:$DYLD_LIBRARY_PATH
$ source ~/.zshrc

Terminal
% echo "export DYLD_LIBRARY_PATH=/usr/lib/:$DYLD_LIBRARY_PATH" >> ~/.bashrc
% cat ~/.bashrc                                                            
export PATH=$PATH:/usr/local/Cellar/knp/4.19/bin
export PATH=$HOME/.local/bin:$PATH
export DYLD_LIBRARY_PATH=/usr/lib/:/usr/lib/:/usr/lib/:
% 
% source ~/.bashrc                                                         
%
% python
Python 3.6.0 (default, Dec  8 2020, 23:48:20) 
[GCC Apple LLVM 12.0.0 (clang-1200.0.32.27)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>>

###The error doesn't go away:disappointed_relieved:

Python3.6.0
>>> from transformers import pipeline
>>> classifier = pipeline('sentiment-analysis')
>>> classifier('We are very happy to include pipeline into the transformers repository.')
dyld: lazy symbol binding failed: Symbol not found: _PySlice_Unpack
  Referenced from: /Users/ocean/.pyenv/versions/3.6.0/envs/TensorFlow/lib/python3.6/site-packages/torch/lib/libtorch_python.dylib
  Expected in: flat namespace

dyld: Symbol not found: _PySlice_Unpack
  Referenced from: /Users/ocean/.pyenv/versions/3.6.0/envs/TensorFlow/lib/python3.6/site-packages/torch/lib/libtorch_python.dylib
  Expected in: flat namespace

zsh: abort      python
 

#####( Reference website )
( GitHub )pytorch/pytorch/issues/17237

Additional Info
Installed using pip3 install torch

@sarahwie
Author
sarahwie commented on 19 Feb 2019
Resolved by upgrading to >= Python 3.6.1

soumith commented on 19 Feb 2019 •
this is a known issue with python 3.6.0, if my memory serves me right. see https://bugzilla.redhat.com/show_bug.cgi?id=1435135

Upgrade to python 3.6.2 or greater.
(the issue has nothing to do with pytorch and is an abi incompatibility issue across 3.6.0 / 3.6.1)
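The root cause above can be turned into a fail-fast check. The sketch below (the helper name `check_python_for_torch` is illustrative, not part of PyTorch) rejects interpreters older than 3.6.2, the threshold the issue thread recommends because of the `_PySlice_Unpack` ABI incompatibility across 3.6.0/3.6.1:

```python
import sys

# Sketch: PyTorch wheels reference the C-API symbol _PySlice_Unpack, which
# has ABI problems on CPython 3.6.0/3.6.1, so anything below 3.6.2 is
# rejected before the interpreter can abort with a dyld error.
def check_python_for_torch(version_info=sys.version_info):
    """Return True if this interpreter avoids the _PySlice_Unpack issue."""
    if tuple(version_info[:3]) < (3, 6, 2):
        raise RuntimeError(
            "Python %d.%d.%d hits the _PySlice_Unpack ABI issue; "
            "upgrade to 3.6.2 or newer" % tuple(version_info[:3])
        )
    return True
```

Calling this at the top of a script gives a readable error instead of the `zsh: abort` crash shown above.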

#####( Reference website )
https://github.com/huggingface/transformers/issues/7333

Upgrading to TF 2.2 works fine, but I think this should be made more clear in the docs.

###Installing TensorFlow 2.0

Terminal
% python --version
Python 3.6.1
% pip install tensorflow==2.0
Collecting tensorflow==2.0
  Downloading https://files.pythonhosted.org/packages/c8/a1/2ab46a175c916b0149ccb9edc06202bce6365455779fa251c1f59a4c7806/tensorflow-2.0.0-cp36-cp36m-macosx_10_11_x86_64.whl (102.7MB)
    100% |████████████████████████████████| 102.7MB 13kB/s 

( ...omitted... )

 You are using pip version 9.0.1, however version 20.3.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
%
% python3 -c 'import tensorflow as tf; print(tf.__version__)'  
2.0.0

####Error

% python
Python 3.6.1 (default, Dec 10 2020, 22:31:19) 
[GCC Apple LLVM 12.0.0 (clang-1200.0.32.27)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> from transformers import pipeline
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/ocean/.pyenv/versions/3.6.1/lib/python3.6/site-packages/transformers/__init__.py", line 34, in <module>
    from .data import (
  File "/Users/ocean/.pyenv/versions/3.6.1/lib/python3.6/site-packages/transformers/data/__init__.py", line 6, in <module>
    from .processors import (
  File "/Users/ocean/.pyenv/versions/3.6.1/lib/python3.6/site-packages/transformers/data/processors/__init__.py", line 6, in <module>
    from .squad import SquadExample, SquadFeatures, SquadV1Processor, SquadV2Processor, squad_convert_examples_to_features
  File "/Users/ocean/.pyenv/versions/3.6.1/lib/python3.6/site-packages/transformers/data/processors/squad.py", line 10, in <module>
    from ...models.bert.tokenization_bert import whitespace_tokenize
  File "/Users/ocean/.pyenv/versions/3.6.1/lib/python3.6/site-packages/transformers/models/bert/__init__.py", line 31, in <module>
    from .modeling_tf_bert import (
  File "/Users/ocean/.pyenv/versions/3.6.1/lib/python3.6/site-packages/transformers/models/bert/modeling_tf_bert.py", line 24, in <module>
    from ...activations_tf import get_tf_activation
  File "/Users/ocean/.pyenv/versions/3.6.1/lib/python3.6/site-packages/transformers/activations_tf.py", line 54, in <module>
    "swish": tf.keras.activations.swish,
AttributeError: module 'tensorflow_core.keras.activations' has no attribute 'swish'
>>> 
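Per the GitHub issue quoted below, `tf.keras.activations.swish` only exists from TF 2.2 onward, so a plain version comparison predicts this `AttributeError`. A minimal sketch (the helper name `tf_supports_swish` is illustrative and belongs to neither library):

```python
# Sketch: transformers 4.x imports tf.keras.activations.swish at module
# load time; per the linked issue, TF versions before 2.2 lack it.
def tf_supports_swish(tf_version: str) -> bool:
    # Compare (major, minor) numerically, not as strings.
    major, minor = (int(x) for x in tf_version.split(".")[:2])
    return (major, minor) >= (2, 2)

print(tf_supports_swish("2.0.0"))  # the version installed above → False
print(tf_supports_swish("2.2.0"))  # the version that works → True
```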

#####( Quoted again )
( GitHub )pytorch/pytorch/issues/17237

Additional Info
Installed using pip3 install torch

@sarahwie
Upgrade to python 3.6.2 or greater.
(the issue has nothing to do with pytorch and is an abi incompatibility issue across 3.6.0 / 3.6.1)

###Installing Python 3.6.3

Terminal
% pyenv install 3.6.3 
Terminal
% pyenv versions
  system
  3.6.0
  3.6.0/envs/TensorFlow
* 3.6.1 (set by /Users/ocean/Desktop/.python-version)
  3.6.3
  3.9.0
  TensorFlow
% pyenv local 3.6.3
% python --version
Python 3.6.3

#####( Quoted again )
https://github.com/huggingface/transformers/issues/7333

Upgrading to TF 2.2 works fine, but I think this should be made more clear in the docs.

###Installing TensorFlow 2.2

% pip install tensorflow==2.2
Collecting tensorflow==2.2
  Downloading tensorflow-2.2.0-cp36-cp36m-macosx_10_11_x86_64.whl (175.3 MB)
     |████████████████████████████████| 175.3 MB 132 kB/s 

( ...omitted... )

Successfully installed absl-py-0.11.0 astunparse-1.6.3 cachetools-4.1.1 certifi-2020.12.5 chardet-3.0.4 gast-0.3.3 google-auth-1.23.0 google-auth-oauthlib-0.4.2 google-pasta-0.2.0 grpcio-1.34.0 h5py-2.10.0 idna-2.10 importlib-metadata-3.1.1 keras-preprocessing-1.1.2 markdown-3.3.3 numpy-1.19.4 oauthlib-3.1.0 opt-einsum-3.3.0 protobuf-3.14.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 requests-2.25.0 requests-oauthlib-1.3.0 rsa-4.6 scipy-1.4.1 setuptools-51.0.0 six-1.15.0 tensorboard-2.2.2 tensorboard-plugin-wit-1.7.0 tensorflow-2.2.0 tensorflow-estimator-2.2.0 termcolor-1.1.0 urllib3-1.26.2 werkzeug-1.0.1 wheel-0.36.1 wrapt-1.12.1 zipp-3.4.0
% 
% python3 -c 'import tensorflow as tf; print(tf.__version__)'  # for Python 3
2.2.0
% 

#####Hadn't installed the transformers module into Python 3.6.3 yet...:weary:

% python
Python 3.6.3 (default, Dec 10 2020, 22:43:16) 
[GCC Apple LLVM 12.0.0 (clang-1200.0.32.27)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> 
>>> from transformers import pipeline
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ModuleNotFoundError: No module named 'transformers'
>>> quit()
Terminal
% pip install transformers
( ...omitted... )
Successfully installed click-7.1.2 dataclasses-0.8 filelock-3.0.12 joblib-0.17.0 packaging-20.7 pyparsing-2.4.7 regex-2020.11.13 sacremoses-0.0.43 tokenizers-0.9.4 tqdm-4.54.1 transformers-4.0.1
% 
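To make this environment reproducible later, the working combination can be pinned in a requirements file (a sketch; the versions are taken from the successful install logs above):

```text
# requirements.txt — the combination confirmed to work in this article
tensorflow==2.2.0
transformers==4.0.1
```

Then `pip install -r requirements.txt` under Python 3.6.3 rebuilds the same setup.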

#It finally runs!!!:sun_with_face:

>>> from transformers import pipeline
>>> classifier = pipeline('sentiment-analysis')
Downloading: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 268M/268M [00:17<00:00, 15.7MB/s]
2020-12-10 22:51:08.930073: W tensorflow/python/util/util.cc:329] Sets are not currently considered sequences, but this may change in the future, so consider avoiding using them.

( ...omitted... )

You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
>>> 
>>> classifier('We are very happy to include pipeline into the transformers repository.')
[{'label': 'POSITIVE', 'score': 0.9978193640708923}]
>>> 
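The pipeline's return value is a list of `{'label', 'score'}` dicts, as the output above shows. A small sketch of consuming that result (the helper name `top_label` is illustrative):

```python
# Pick the highest-scoring label from a sentiment-analysis pipeline result,
# which arrives as a list of {'label': str, 'score': float} dicts.
def top_label(results):
    best = max(results, key=lambda r: r["score"])
    return best["label"], round(best["score"], 4)

print(top_label([{"label": "POSITIVE", "score": 0.9978193640708923}]))
# → ('POSITIVE', 0.9978)
```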

##Another sample also runs successfully:sunny:

Another sample code from the official Transformers documentation on PyPI

Sample code
>>> from transformers import pipeline

# Allocate a pipeline for question-answering
>>> question_answerer = pipeline('question-answering')
>>> question_answerer({
...     'question': 'What is the name of the repository ?',
...     'context': 'Pipeline have been included in the huggingface/transformers repository'
... })
{'score': 0.5135612454720828, 'start': 35, 'end': 59, 'answer': 'huggingface/transformers'}
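The `start` and `end` fields in that output are character offsets into the context string, so the answer can be recovered by slicing (values taken from the sample output above):

```python
# The question-answering pipeline reports character offsets into the
# context; the 'answer' field equals context[start:end].
context = 'Pipeline have been included in the huggingface/transformers repository'
start, end = 35, 59

answer = context[start:end]
print(answer)  # → huggingface/transformers
```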

####Ran successfully

```python:Python3.6.3
>>> from transformers import pipeline
>>> question_answerer = pipeline('question-answering')
Downloading: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 473/473 [00:00<00:00, 302kB/s]
Downloading: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 261M/261M [00:16<00:00, 15.7MB/s]

( ...omitted... )

You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.

>>> question_answerer({
...     'question': 'What is the name of the repository ?',
...     'context': 'Pipeline have been included in the huggingface/transformers repository'
... })
{'score': 0.5135963559150696, 'start': 35, 'end': 59, 'answer': 'huggingface/transformers'}
```
