Trying to install gpt-oss:20b with Ollama on a low-spec Ubuntu box, and failing (well, obviously)

Posted at 2025-08-23

This is a record of setting up Ollama on an Ubuntu 24.04 environment built on Vultr and failing to install gpt-oss:20b.


Log in to the Ubuntu VM on Vultr.

$ ssh root@203.0.113.1 -i ~/.ssh/path-to-vultr-private-key

Change to the working directory (here, /tmp).

root@myfirstinstance:~# cd /tmp

Run the command from Ollama's Linux installation instructions.
Since no GPU is available in this environment, the warning "Ollama will run in CPU-only mode." is shown.

root@myfirstinstance:/tmp# curl -fsSL https://ollama.com/install.sh | sh
>>> Installing ollama to /usr/local
>>> Downloading Linux amd64 bundle
######################################################################## 100.0%
>>> Creating ollama user...
>>> Adding ollama user to render group...
>>> Adding ollama user to video group...
>>> Adding current user to ollama group...
>>> Creating ollama systemd service...
>>> Enabling and starting ollama service...
Created symlink /etc/systemd/system/default.target.wants/ollama.service → /etc/systemd/system/ollama.service.
>>> The Ollama API is now available at 127.0.0.1:11434.
>>> Install complete. Run "ollama" from the command line.
WARNING: No NVIDIA/AMD GPU detected. Ollama will run in CPU-only mode.

Print the Ollama version to confirm the installation succeeded.

root@myfirstinstance:/tmp# ollama --version
ollama version is 0.11.3
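
The version can also be queried over the HTTP API the installer mentioned. This check is not in the original session, but GET /api/version is a documented Ollama endpoint and should return the same version as JSON:

root@myfirstinstance:/tmp# curl -s http://127.0.0.1:11434/api/version
{"version":"0.11.3"}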

Confirm that the Ollama service is running.

root@myfirstinstance:/tmp# systemctl status ollama.service
● ollama.service - Ollama Service
     Loaded: loaded (/etc/systemd/system/ollama.service; enabled; preset: enabled)
     Active: active (running) since Fri 2025-08-08 00:23:29 UTC; 9min ago
   Main PID: 1654 (ollama)
      Tasks: 6 (limit: 1057)
     Memory: 9.9M (peak: 10.2M)
        CPU: 42ms
     CGroup: /system.slice/ollama.service
             └─1654 /usr/local/bin/ollama serve

Aug 08 00:23:29 myfirstinstance ollama[1654]: Your new public key is:
Aug 08 00:23:29 myfirstinstance ollama[1654]: ssh-ed25519 XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
Aug 08 00:23:29 myfirstinstance ollama[1654]: time=2025-08-08T00:23:29.127Z level=INFO source=routes.go:1297 msg="server config" env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_CONTEXT_LENGTH:4096 OLLAMA_DEBUG:INFO OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://127.0.0.1:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/usr/share/ollama/.ollama/models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NEW_ENGINE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://* vscode-file://*] OLLAMA_SCHED_SPREAD:false ROCR_VISIBLE_DEVICES: http_proxy: https_proxy: no_proxy:]"
Aug 08 00:23:29 myfirstinstance ollama[1654]: time=2025-08-08T00:23:29.128Z level=INFO source=images.go:477 msg="total blobs: 0"
Aug 08 00:23:29 myfirstinstance ollama[1654]: time=2025-08-08T00:23:29.128Z level=INFO source=images.go:484 msg="total unused blobs removed: 0"
Aug 08 00:23:29 myfirstinstance ollama[1654]: time=2025-08-08T00:23:29.128Z level=INFO source=routes.go:1350 msg="Listening on 127.0.0.1:11434 (version 0.11.3)"
Aug 08 00:23:29 myfirstinstance ollama[1654]: time=2025-08-08T00:23:29.130Z level=INFO source=gpu.go:217 msg="looking for compatible GPUs"
Aug 08 00:23:29 myfirstinstance ollama[1654]: time=2025-08-08T00:23:29.139Z level=INFO source=gpu.go:377 msg="no compatible GPUs were discovered"
Aug 08 00:23:29 myfirstinstance ollama[1654]: time=2025-08-08T00:23:29.139Z level=INFO source=types.go:130 msg="inference compute" id=0 library=cpu variant="" compute="" driver=0.0 name="" total="955.8 MiB" available="479.6 MiB"
Aug 08 00:27:23 myfirstinstance ollama[1654]: [GIN] 2025/08/08 - 00:27:23 | 200 |     187.291µs |       127.0.0.1 | GET      "/api/version"
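
For anything the status snippet truncates, the full service log can be read or followed with standard systemd tooling (not part of the original session):

root@myfirstinstance:/tmp# journalctl -u ollama.service --no-pager
root@myfirstinstance:/tmp# journalctl -u ollama.service -f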

Trying to install gpt-oss:20b fails: the pull completes, but loading the model errors out with "model requires more system memory (12.0 GiB)".

root@myfirstinstance:/tmp# ollama run gpt-oss:20b --verbose
pulling manifest 
pulling b112e727c6f1: 100% ▕████████████████████████████████████████████████▏  13 GB                         
pulling 51468a0fd901: 100% ▕████████████████████████████████████████████████▏ 7.4 KB                         
pulling f60356777647: 100% ▕████████████████████████████████████████████████▏  11 KB                         
pulling d8ba2f9a17b3: 100% ▕████████████████████████████████████████████████▏   18 B                         
pulling 8d6fddaf04b2: 100% ▕████████████████████████████████████████████████▏  489 B                         
verifying sha256 digest 
writing manifest 
success 
Error: 500 Internal Server Error: model requires more system memory (12.0 GiB) than is available (2.9 GiB)

Well, obviously. The VM has less than 1 GiB of RAM:

root@myfirstinstance:~# grep -E '(MemTotal|MemFree|MemAvailable)' /proc/meminfo 
MemTotal:         978752 kB
MemFree:          473460 kB
MemAvailable:     639740 kB
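
Two follow-ups, neither from the original session. First, the pull itself succeeded, so roughly 13 GB of model blobs are still sitting under /usr/share/ollama/.ollama/models (the OLLAMA_MODELS path from the server config above); they can be listed and removed to reclaim the disk space:

root@myfirstinstance:~# ollama list
root@myfirstinstance:~# ollama rm gpt-oss:20b

Second, the 2.9 GiB the error calls "available" already exceeds the ~1 GiB of physical RAM, which suggests Ollama is counting swap. In principle, then, adding enough swap would let the model load, assuming the instance even has the disk to spare, but CPU-only inference paging through swap on an instance this small would be unusably slow:

root@myfirstinstance:~# fallocate -l 16G /swapfile
root@myfirstinstance:~# chmod 600 /swapfile
root@myfirstinstance:~# mkswap /swapfile
root@myfirstinstance:~# swapon /swapfile
root@myfirstinstance:~# free -h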