
Cause and Fix for the NoneType is not callable Exception in Llama-Cpp

Conclusion

Add a step at the end of your processing that explicitly closes the model:

model.client.close()

Situation

When running an LLM through Llama-Cpp, the processing itself succeeds and finishes normally, but the following error is printed at exit:

Exception ignored in: <function Llama.__del__ at 0x107670b80>
Traceback (most recent call last):
  File "/Users/kengo/Project/Dev/llm-practice/.venv/lib/python3.13/site-packages/llama_cpp/llama.py", line 2209, in __del__
  File "/Users/kengo/Project/Dev/llm-practice/.venv/lib/python3.13/site-packages/llama_cpp/llama.py", line 2206, in close
  File "/opt/homebrew/Cellar/python@3.13/3.13.3/Frameworks/Python.framework/Versions/3.13/lib/python3.13/contextlib.py", line 627, in close
  File "/opt/homebrew/Cellar/python@3.13/3.13.3/Frameworks/Python.framework/Versions/3.13/lib/python3.13/contextlib.py", line 619, in __exit__
  File "/opt/homebrew/Cellar/python@3.13/3.13.3/Frameworks/Python.framework/Versions/3.13/lib/python3.13/contextlib.py", line 604, in __exit__
  File "/opt/homebrew/Cellar/python@3.13/3.13.3/Frameworks/Python.framework/Versions/3.13/lib/python3.13/contextlib.py", line 364, in __exit__
  File "/Users/kengo/Project/Dev/llm-practice/.venv/lib/python3.13/site-packages/llama_cpp/_internals.py", line 83, in close
  File "/opt/homebrew/Cellar/python@3.13/3.13.3/Frameworks/Python.framework/Versions/3.13/lib/python3.13/contextlib.py", line 627, in close
  File "/opt/homebrew/Cellar/python@3.13/3.13.3/Frameworks/Python.framework/Versions/3.13/lib/python3.13/contextlib.py", line 619, in __exit__
  File "/opt/homebrew/Cellar/python@3.13/3.13.3/Frameworks/Python.framework/Versions/3.13/lib/python3.13/contextlib.py", line 604, in __exit__
  File "/opt/homebrew/Cellar/python@3.13/3.13.3/Frameworks/Python.framework/Versions/3.13/lib/python3.13/contextlib.py", line 482, in _exit_wrapper
  File "/Users/kengo/Project/Dev/llm-practice/.venv/lib/python3.13/site-packages/llama_cpp/_internals.py", line 72, in free_model
TypeError: 'NoneType' object is not callable

This error is raised from the Llama object's finalizer (__del__): at interpreter shutdown it tries to free the model's resources, but the cleanup call fails. The model was created as follows:

from langchain_community.llms import LlamaCpp  # assumed import path; LlamaCpp here is the LangChain wrapper

model = LlamaCpp(
    model_path=model_path,
    temperature=0.5,
    # ... other parameters
)

# ... some processing ...

Cause and Solution

A fix for this was posted in a llama-cpp-python issue:

When the Python interpreter shuts down, the llama_cpp module is destroyed before the model object, so by the time the object's __del__ method runs it can no longer access the functions in llama_cpp, and the cleanup ends up calling None.

To avoid this, the model must be closed explicitly before the Python interpreter shuts down.
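
As a minimal sketch (assuming the LangChain LlamaCpp wrapper shown above, whose underlying llama_cpp.Llama instance is exposed as the client attribute, and a placeholder model path), the close call can be placed in a try/finally block so the native resources are freed even if the processing raises:

from langchain_community.llms import LlamaCpp

model_path = "models/your-model.gguf"  # placeholder path

model = LlamaCpp(
    model_path=model_path,
    temperature=0.5,
    # ... other parameters
)

try:
    # Some processing, e.g. a single invocation of the model.
    print(model.invoke("Hello"))
finally:
    # Free the underlying llama_cpp.Llama resources while the llama_cpp
    # module is still alive, so Llama.__del__ has nothing left to do
    # at interpreter shutdown.
    model.client.close()

Because the model is closed while llama_cpp is still intact, the finalizer that runs at interpreter shutdown no longer ends up calling a name that has already been set to None.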
