
Using the Llama CLI

Posted at 2024-09-18

Overview

Set up the tools needed to use the Llama 3.1 models.

[Reference] Link to Meta Llama-3 8B

https://huggingface.co/meta-llama/Meta-Llama-3-8B

Trying it out

0. Preparation

Where to install

You can install llama-cli anywhere you like.
Here I created a directory called llama-cli and installed into it.

$ mkdir -p llama-cli

Set up a venv

$ cd llama-cli
$ python3 -m venv venv
$ . ./venv/bin/activate
If you get an error

Install the following:

$ pip install setuptools
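Before running any pip installs, you can confirm the venv is actually active with a quick standard-library check: inside an active venv, `sys.prefix` points into the venv directory and differs from `sys.base_prefix`.

```python
import sys

# Inside an active venv, sys.prefix points at the venv directory,
# while sys.base_prefix still points at the base interpreter.
in_venv = sys.prefix != sys.base_prefix
print("venv active:", in_venv)
```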

1. Install llama-cli

$ pip install llama-toolchain

2. Verify that llama-cli works

Output as of 2024-09-19:

$ llama model list --show-all
+---------------------------------------+---------------------------------------------+----------------+
| Model Descriptor                      | HuggingFace Repo                            | Context Length |
+---------------------------------------+---------------------------------------------+----------------+
| Llama-2-7b                            | meta-llama/Llama-2-7b                       | 4K             |
+---------------------------------------+---------------------------------------------+----------------+
| Llama-2-13b                           | meta-llama/Llama-2-13b                      | 4K             |
+---------------------------------------+---------------------------------------------+----------------+
| Llama-2-70b                           | meta-llama/Llama-2-70b                      | 4K             |
+---------------------------------------+---------------------------------------------+----------------+
| Llama-2-7b-chat                       | meta-llama/Llama-2-7b-chat                  | 4K             |
+---------------------------------------+---------------------------------------------+----------------+
| Llama-2-13b-chat                      | meta-llama/Llama-2-13b-chat                 | 4K             |
+---------------------------------------+---------------------------------------------+----------------+
| Llama-2-70b-chat                      | meta-llama/Llama-2-70b-chat                 | 4K             |
+---------------------------------------+---------------------------------------------+----------------+
| Llama-3-8B                            | meta-llama/Meta-Llama-3-8B                  | 8K             |
+---------------------------------------+---------------------------------------------+----------------+
| Llama-3-70B                           | meta-llama/Meta-Llama-3-70B                 | 8K             |
+---------------------------------------+---------------------------------------------+----------------+
| Llama-3-8B-Instruct                   | meta-llama/Meta-Llama-3-8B-Instruct         | 8K             |
+---------------------------------------+---------------------------------------------+----------------+
| Llama-3-70B-Instruct                  | meta-llama/Meta-Llama-3-70B-Instruct        | 8K             |
+---------------------------------------+---------------------------------------------+----------------+
| Meta-Llama3.1-8B                      | meta-llama/Meta-Llama-3.1-8B                | 128K           |
+---------------------------------------+---------------------------------------------+----------------+
| Meta-Llama3.1-70B                     | meta-llama/Meta-Llama-3.1-70B               | 128K           |
+---------------------------------------+---------------------------------------------+----------------+
| Meta-Llama3.1-405B:bf16-mp8           | meta-llama/Meta-Llama-3.1-405B              | 128K           |
+---------------------------------------+---------------------------------------------+----------------+
| Meta-Llama3.1-405B                    | meta-llama/Meta-Llama-3.1-405B-FP8          | 128K           |
+---------------------------------------+---------------------------------------------+----------------+
| Meta-Llama3.1-405B:bf16-mp16          | meta-llama/Meta-Llama-3.1-405B              | 128K           |
+---------------------------------------+---------------------------------------------+----------------+
| Meta-Llama3.1-8B-Instruct             | meta-llama/Meta-Llama-3.1-8B-Instruct       | 128K           |
+---------------------------------------+---------------------------------------------+----------------+
| Meta-Llama3.1-70B-Instruct            | meta-llama/Meta-Llama-3.1-70B-Instruct      | 128K           |
+---------------------------------------+---------------------------------------------+----------------+
| Meta-Llama3.1-405B-Instruct:bf16-mp8  | meta-llama/Meta-Llama-3.1-405B-Instruct     | 128K           |
+---------------------------------------+---------------------------------------------+----------------+
| Meta-Llama3.1-405B-Instruct           | meta-llama/Meta-Llama-3.1-405B-Instruct-FP8 | 128K           |
+---------------------------------------+---------------------------------------------+----------------+
| Meta-Llama3.1-405B-Instruct:bf16-mp16 | meta-llama/Meta-Llama-3.1-405B-Instruct     | 128K           |
+---------------------------------------+---------------------------------------------+----------------+
| Llama-Guard-3-8B                      | meta-llama/Llama-Guard-3-8B                 | 128K           |
+---------------------------------------+---------------------------------------------+----------------+
| Llama-Guard-3-8B:int8-mp1             | meta-llama/Llama-Guard-3-8B-INT8            | 128K           |
+---------------------------------------+---------------------------------------------+----------------+
| Prompt-Guard-86M                      | meta-llama/Prompt-Guard-86M                 | 128K           |
+---------------------------------------+---------------------------------------------+----------------+
| Llama-Guard-2-8B                      | meta-llama/Meta-Llama-Guard-2-8B            | 4K             |
+---------------------------------------+---------------------------------------------+----------------+
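The listing above is a plain ASCII table. If you want to work with it programmatically (I am not aware of a JSON output flag on `llama model list`, so this just parses the printed table; the column names are taken from the output above):

```python
def parse_model_table(text):
    """Parse the ASCII table printed by `llama model list` into dicts."""
    rows = []
    for line in text.splitlines():
        if not line.startswith("|"):  # skip the +---+ separator rows
            continue
        cells = [c.strip() for c in line.strip("|").split("|")]
        if cells[0] == "Model Descriptor":  # skip the header row
            continue
        rows.append({"descriptor": cells[0], "repo": cells[1], "context": cells[2]})
    return rows


sample = """\
+------------+----------------------------+----------------+
| Model Descriptor | HuggingFace Repo     | Context Length |
+------------+----------------------------+----------------+
| Llama-3-8B | meta-llama/Meta-Llama-3-8B | 8K             |
+------------+----------------------------+----------------+"""
print(parse_model_table(sample))
# → [{'descriptor': 'Llama-3-8B', 'repo': 'meta-llama/Meta-Llama-3-8B', 'context': '8K'}]
```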

3. Download Meta-Llama3.1-8B

$ llama download --source meta --model-id Meta-Llama3.1-8B

Please provide the signed URL you received via email (e.g., https://llama3-1.llamameta.net/*?Policy...):  <--- enter the signed URL from the email here
...
Downloading `checklist.chk`...
Downloading `/Users/nag/.llama/checkpoints/Meta-Llama3.1-8B/checklist.chk`....{'Range': 'bytes=0-150'}
Progress: |██████████████████████████████████████████████████| 100.00% (0/0 MB) Speed: 0.00 MiB/s
Finished downloading `/Users/nag/.llama/checkpoints/Meta-Llama3.1-8B/checklist.chk`....
Downloading `tokenizer.model`...
Downloading `/Users/nag/.llama/checkpoints/Meta-Llama3.1-8B/tokenizer.model`....{'Range': 'bytes=0-2183982'}
Progress: |██████████████████████████████████████████████████| 100.00% (2/2 MB) Speed: 3.94 MiB/s
Finished downloading `/Users/nag/.llama/checkpoints/Meta-Llama3.1-8B/tokenizer.model`....
Downloading `params.json`...
Downloading `/Users/nag/.llama/checkpoints/Meta-Llama3.1-8B/params.json`....{'Range': 'bytes=0-199'}
Progress: |██████████████████████████████████████████████████| 100.00% (0/0 MB) Speed: 0.00 MiB/s
Finished downloading `/Users/nag/.llama/checkpoints/Meta-Llama3.1-8B/params.json`....
Downloading `consolidated.00.pth`...
Downloading `/Users/nag/.llama/checkpoints/Meta-Llama3.1-8B/consolidated.00.pth`....{'Range': 'bytes=0-16060621688'}
Progress: |██████████████████████████------------------------| 52.52% (8044/15316 MB) Speed: 26.16 MiB/s

The last file is 15 GB, so the download took quite a while.
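`checklist.chk`, downloaded first, holds checksums of the other files (Meta's original download script verifies them with `md5sum -c checklist.chk`). A sketch of the same check in Python, assuming the usual `<md5>  <filename>` line format:

```python
import hashlib
from pathlib import Path


def verify_checkpoints(ckpt_dir):
    """Compare each file listed in checklist.chk against its MD5 hash."""
    ckpt_dir = Path(ckpt_dir)
    results = {}
    for line in (ckpt_dir / "checklist.chk").read_text().splitlines():
        expected_md5, name = line.split()
        digest = hashlib.md5()
        with open(ckpt_dir / name, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MiB chunks
                digest.update(chunk)
        results[name] = digest.hexdigest() == expected_md5
    return results
```

For a 15 GB `.pth` file this still has to read the whole file, but it avoids depending on `md5sum` being available (it isn't by default on macOS).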

4. Where are the files saved?

They are saved under the home directory:

$ ls -la ~/.llama/checkpoints/Meta-Llama3.1-8B/

total 31372792
drwxr-xr-x  10 nag  staff   320B  9 18 22:31 .
drwxr-xr-x   3 nag  staff    96B  9 18 22:27 ..
drwxr-xr-x   3 nag  staff    96B  9 18 22:27 .cache
-rw-r--r--   1 nag  staff   7.4K  9 18 22:27 LICENSE
-rw-r--r--   1 nag  staff    40K  9 18 22:27 README.md
-rw-r--r--   1 nag  staff   150B  9 18 22:31 checklist.chk
-rw-r--r--   1 nag  staff    15G  9 18 22:41 consolidated.00.pth
drwxr-xr-x   2 nag  staff    64B  9 18 22:27 original
-rw-r--r--   1 nag  staff   199B  9 18 22:31 params.json
-rw-r--r--   1 nag  staff   2.1M  9 18 22:31 tokenizer.model
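To see how much disk a checkpoint directory occupies without eyeballing the `ls` output, a small helper (a sketch; the path is the one shown above):

```python
from pathlib import Path


def dir_size_bytes(path):
    """Total size in bytes of all regular files under path (roughly `du -s`)."""
    return sum(p.stat().st_size for p in Path(path).rglob("*") if p.is_file())


# e.g. dir_size_bytes(Path.home() / ".llama" / "checkpoints" / "Meta-Llama3.1-8B")
```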

Notes

If you see the following error

It means setuptools is not installed.

$ llama

Traceback (most recent call last):
  File "/Users/nag/Desktop/LLM-Zoo/venv/bin/llama", line 5, in <module>
    from llama_toolchain.cli.llama import main
  File "/Users/nag/Desktop/LLM-Zoo/venv/lib/python3.12/site-packages/llama_toolchain/cli/llama.py", line 11, in <module>
    from .stack import StackParser
  File "/Users/nag/Desktop/LLM-Zoo/venv/lib/python3.12/site-packages/llama_toolchain/cli/stack/__init__.py", line 7, in <module>
    from .stack import StackParser  # noqa
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/nag/Desktop/LLM-Zoo/venv/lib/python3.12/site-packages/llama_toolchain/cli/stack/stack.py", line 12, in <module>
    from .configure import StackConfigure
  File "/Users/nag/Desktop/LLM-Zoo/venv/lib/python3.12/site-packages/llama_toolchain/cli/stack/configure.py", line 11, in <module>
    import pkg_resources
ModuleNotFoundError: No module named 'pkg_resources'
...
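The traceback bottoms out at `import pkg_resources`, which ships with setuptools. You can check whether the current environment has it before running the CLI (a standard-library sketch; the fix is the `pip install setuptools` shown earlier):

```python
import importlib.util

# pkg_resources is provided by setuptools; find_spec returns None if it is absent
has_pkg_resources = importlib.util.find_spec("pkg_resources") is not None
print("setuptools installed:", has_pkg_resources)
```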