R言語Advent Calendar 2024

Day 13

Using LLMs with the tidyverse/elmer package

Posted at 2024-12-12

Introduction

At Rの研究集会2024 (the 2024 R workshop), I gave a talk titled 「RでローカルLLM」 about using the rollama package to run ollama models from R. After the talk, someone told me there is also a package called elmer. Searching for "elmer" turned up very little, so I looked into it and wrote this article.
To give away the conclusion: it did not behave as I expected, but that may well be my own mistake, and I would be grateful for corrections if so.
Update: it turns out I was indeed using it incorrectly; see below.

The tidyverse/elmer package

The package is published on GitHub as tidyverse/elmer.
Seeing that the repository lives under the tidyverse organization, I could not help but get my hopes up.

Installation

Following the documentation, I installed it as follows.

install.packages("pak")
pak::pak("tidyverse/elmer")

It appeared to install without any problems.

> library(elmer)
> sessionInfo()
R version 4.4.0 (2024-04-24)
Platform: x86_64-pc-linux-gnu
Running under: Ubuntu 24.04.1 LTS

Matrix products: default
BLAS:   /usr/lib/x86_64-linux-gnu/blas/libblas.so.3.12.0 
LAPACK: /usr/lib/x86_64-linux-gnu/lapack/liblapack.so.3.12.0

locale:
 [1] LC_CTYPE=ja_JP.UTF-8       LC_NUMERIC=C              
 [3] LC_TIME=ja_JP.UTF-8        LC_COLLATE=ja_JP.UTF-8    
 [5] LC_MONETARY=ja_JP.UTF-8    LC_MESSAGES=ja_JP.UTF-8   
 [7] LC_PAPER=ja_JP.UTF-8       LC_NAME=C                 
 [9] LC_ADDRESS=C               LC_TELEPHONE=C            
[11] LC_MEASUREMENT=ja_JP.UTF-8 LC_IDENTIFICATION=C       

time zone: Asia/Tokyo
tzcode source: system (glibc)

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base     

other attached packages:
[1] elmer_0.0.0.9000

loaded via a namespace (and not attached):
 [1] compiler_4.4.0  magrittr_2.0.3  R6_2.5.1        cli_3.6.3      
 [5] glue_1.8.0      httr2_1.0.7     rappdirs_0.3.3  coro_1.1.0     
 [9] S7_0.2.0        lifecycle_1.0.4 rlang_1.1.4   

Trying it out ... but something seems off ...

Looking at the online help, the package appears to support not only ollama but also many commercial LLMs, including OpenAI's GPT models.

> library(help=elmer)
(snip)
chat_azure              Chat with a model hosted on Azure OpenAI
chat_bedrock            Chat with an AWS bedrock model
chat_claude             Chat with an Anthropic Claude model
chat_cortex             Create a chatbot that speaks to the Snowflake
                        Cortex Analyst
chat_databricks         Chat with a model hosted on Databricks
chat_gemini             Chat with a Google Gemini model
chat_github             Chat with a model hosted on the GitHub model
                        marketplace
chat_groq               Chat with a model hosted on Groq
chat_ollama             Chat with a local Ollama model
chat_openai             Chat with an OpenAI model
chat_perplexity         Chat with a model hosted on perplexity.ai
(rest omitted)

For comparison with rollama, I tried chat_ollama.

Usage:

     chat_ollama(
       system_prompt = NULL,
       turns = NULL,
       base_url = "http://localhost:11434",
       model,
       seed = NULL,
       api_args = list(),
       echo = NULL
     )

From this I understood that, as long as ollama is running on localhost, it should work once I supply a model and a prompt. As for which model to specify,

> chat_ollama
function (system_prompt = NULL, turns = NULL, base_url = "http://localhost:11434", 
    model, seed = NULL, api_args = list(), echo = NULL) 
{
    if (!has_ollama(base_url)) {
        cli::cli_abort("Can't find locally running ollama.")
    }
    if (missing(model)) {
        models <- ollama_models(base_url)
        cli::cli_abort(c("Must specify {.arg model}.", i = "Locally installed models: {.str {models}}."))
    }
    chat_openai(system_prompt = system_prompt, turns = turns, 
        base_url = file.path(base_url, "v1"), api_key = "ollama", 
        model = model, seed = seed, api_args = api_args, echo = echo)
}
<bytecode: 0x5ef0f3d00de8>
<environment: namespace:elmer>

the definition shows that the locally installed models can be listed with ollama_models(). Below is the result in my environment.

> elmer:::ollama_models()
 [1] "marco-o1"                     "nemotron-mini"               
 [3] "meditron"                     "llama3.2-vision"             
 [5] "x/llama3.2-vision"            "qwen2.5-coder"               
 [7] "llama3.2"                     "7shi/tanuki-dpo-v1.0:8b-q6_K"
 [9] "command-r"                    "tanuki"                      
[11] "longwriter-llama3.1"          "gemma2"                      
[13] "gemma2-2b-ezo"                "gemma2:2b"                   
[15] "llama-3.1-ezo"                "llamma3.1-jp"                
[17] "llama3.1"                     "llava"                       
[19] "medllama2"                    "llava-llama3"                
[21] "llama3"                       "phi3"                        
[23] "mistral"                      "openchat" 

Incidentally, tanuki-dpo-v1.0:8b-q6_K is an LLM developed this year in the GENIAC Matsuo Lab (松尾研) LLM development project, which I somewhat forcibly converted to run on ollama. The fact that it shows up in the list confirms that communication with ollama is working.
Next, I specified the openchat model and asked it to introduce itself (the prompt 自己紹介 means "introduce yourself").

> chat_ollama("自己紹介", model="openchat")
<Chat turns=1 tokens=0/0>
── system ──────────────────────────────────────────────────────────────────────
自己紹介

Huh? lol
The query to ollama did not work, at least in my environment. Looking at the output, 自己紹介 was taken as the system prompt and no tokens were exchanged (tokens=0/0): as noted at the top, this turned out to be a usage mistake on my part, since the first argument of chat_ollama() is system_prompt, and the call only constructs a Chat object rather than sending a message.

Doing the following instead did produce output.

> chat <- chat_ollama(
    model = "llama3.2-vision",
    system_prompt = "You are a friendly but terse assistant."
  )
> live_console(chat)
╔══════════════════════════════════════════════════════╗
║ Entering chat console. Use """ for multi-line input. ║
║ Press Ctrl+C to quit.                                ║
╚══════════════════════════════════════════════════════╝
>>> Who were the original creators of R?
R (programming language) was created by Ross Ihaka in 1991 at the University of Auckland in New Zealand. The development was 
supported and later continued by Robert Gentleman, also at the university.
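
The interactive console is not the only way to send a message. Since chat_ollama() returns a Chat object, the same question can be asked programmatically through its $chat() method (used again further below); the following is a minimal sketch under that assumption, with llama3.2-vision simply being a model I happen to have pulled locally.

# Minimal sketch: create a Chat object and send one message directly
# instead of entering live_console(). Assumes ollama is running on
# localhost:11434 and the model has already been pulled.
chat <- chat_ollama(
  model = "llama3.2-vision",
  system_prompt = "You are a friendly but terse assistant."
)
chat$chat("Who were the original creators of R?")  # streams the reply to the console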

I then tried the example from the elmer documentation on GitHub. Queries to gpt-4o-mini appear to work without problems.

> chat <- chat_openai(
    model = "gpt-4o-mini",
    system_prompt = "You are a friendly but terse assistant."
  )
> live_console(chat)
╔══════════════════════════════════════════════════════╗
║ Entering chat console. Use """ for multi-line input. ║
║ Press Ctrl+C to quit.                                ║
╚══════════════════════════════════════════════════════╝
>>> Who were the original creators of R?
R was originally created by Ross Ihaka and Robert Gentleman at the University 
of Auckland in New Zealand. It was developed in the mid-1990s as an open-source
alternative to the S programming language.
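
Note that chat_openai() needs an OpenAI API key. As far as I can tell it is read from the OPENAI_API_KEY environment variable by default (it can presumably also be passed via the api_key argument), so something like the following is required before creating the chat; the key value is of course a placeholder.

# Assumption: the key is picked up from the OPENAI_API_KEY environment variable.
# "sk-..." is a placeholder; use your own key (or set it in ~/.Renviron).
Sys.setenv(OPENAI_API_KEY = "sk-...")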

Let's try another example.

>>> chat$chat(
  content_image_url("https://www.r-project.org/Rlogo.png"),
  "Can you explain this logo?"
)
It seems like you might be trying to use a command or function. Could you 
clarify what you need assistance with?

It looks like you're trying to display an image using a function that retrieves
content from a URL. If you're working with R, you might want to use the 
`magick` package or similar tools. However, to properly assist you, I need to 
know if you're looking to display the image in R or another context. Please 
provide a bit more detail!

The R logo is a stylized letter "R" in a light blue color, often presented with
a circular design that symbolizes the open-source nature of the R programming 
language. The logo reflects R's focus on statistics, data analysis, and data 
visualization. The design emphasizes clarity and simplicity, aligning with the 
language's aim to provide accessible tools for data manipulation and graphics. 
Additionally, the curved shape suggests a sense of flexibility and 
responsiveness in handling data. Overall, the logo represents a modern and 
approachable computing environment for statisticians and data scientists.

It looks like you entered a closing parenthesis. If you have a specific 
question or need help with something else, feel free to ask!
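
For reference, the package also appears to provide content_image_file() for local images; the sketch below assumes it works analogously to content_image_url(), and Rlogo.png is a hypothetical local copy of the logo rather than a file used in this article.

# Hedged sketch: pass a local image file instead of a URL.
# content_image_file() and the file path are assumptions here.
chat$chat(
  content_image_file("Rlogo.png"),
  "Can you explain this logo?"
)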

With OpenAI's GPT-4o-mini it seems to work without problems.

Closing thoughts

It is a little disappointing that it did not work with Ollama at first, but I expect this package will find use in many situations going forward.
This was my first Qiita article.

Enjoy!!
