Elasticsearch: Personal Notes

Posted at 2016-01-26

This is my first time using Elasticsearch, so these are notes I took while trying it out.

Terminology

First, the terms. Mapped onto MySQL concepts, it looks roughly like this:

MySQL      Elasticsearch
database   index
table      type
record     document

Mapping: the equivalent of a table definition in a relational database.

Creating a mapping

$ curl -XPUT 'localhost:9200/<INDEX_NAME>' -d '
{
    "mappings" : {
      "<TYPE_NAME>" : {
        "properties" : {
          "author" : {
            "type" : "string"
          },
          "contents" : {
            "type" : "string",
            "analyzer": "japanese‎"
          },
          "enabled" : {
            "type" : "boolean"
          },
          "pub_date" : {
            "type" : "date",
            "format" : "dateOptionalTime"
          },
          "read_ratio" : {
            "type" : "double"
          },
          "reads" : {
            "type" : "long"
          },
          "subtitle" : {
            "type" : "string",
            "analyzer": "japanese‎"
          },
          "title" : {
            "type" : "string",
            "analyzer": "japanese‎"
          },
          "views" : {
            "type" : "long"
          }
        }
      }
    }
}'
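
To double-check the result, the mapping that was just created can be fetched back. A quick sanity check (same <INDEX_NAME> placeholder as above):

$ curl -XGET 'localhost:9200/<INDEX_NAME>/_mapping?pretty'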

Creating the index from a predefined JSON file (mapping.json):

$ curl -XPOST 192.168.33.10:9200/<INDEX_NAME> -d @mapping.json

Contents of mapping.json. The tokenizer is set to ngram.

{
  "settings": {
    "analysis": {
      "analyzer": {
        "ngram_analyzer": {
          "tokenizer": "ngram_tokenizer"
        }
      },
      "tokenizer": {
        "ngram_tokenizer": {
          "type": "nGram",
          "min_gram": "2",
          "max_gram": "3",
          "token_chars": [
            "letter",
            "digit"
          ]
        }
      }
    }
  },
  "mappings": {
    "items": {
      "properties": {
        "item_seq": {
          "type": "string"
        },
        "item_name": {
          "type": "string",
          "analyzer": "ngram_analyzer"
        },
        "group_seq": {
          "type": "string"
        }
      }
    }
  }
}
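
To see how ngram_analyzer actually tokenizes a string, the _analyze API can be run against the created index. A minimal check using the 2.x-style query parameters; the sample text is arbitrary:

$ curl -XGET 'localhost:9200/<INDEX_NAME>/_analyze?analyzer=ngram_analyzer&text=Elasticsearch21&pretty'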

Listing indices

$ curl -XGET 'localhost:9200/_aliases?pretty'
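
The _cat API gives a more readable listing of the same information:

$ curl -XGET 'localhost:9200/_cat/indices?v'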

Deleting an index

$ curl -X DELETE 'localhost:9200/<INDEX_NAME>'

Listing installed plugins

$ curl -X GET 'http://192.168.33.10:9200/_nodes/plugins?pretty'
{
  "cluster_name" : "elasticsearch",
  "nodes" : {
    "42fIIy3lQaGDGbnWL2Cydg" : {
      "name" : "Gwen Stacy",
      "transport_address" : "192.168.33.10:9300",
      "host" : "192.168.33.10",
      "ip" : "192.168.33.10",
      "version" : "2.1.1",
      "build" : "40e2c53",
      "http_address" : "192.168.33.10:9200",
      "plugins" : [ {
        "name" : "analysis-kuromoji",
        "version" : "2.1.1",
        "description" : "The Japanese (kuromoji) Analysis plugin integrates Lucene kuromoji analysis module into elasticsearch.",
        "jvm" : true,
        "classname" : "org.elasticsearch.plugin.analysis.kuromoji.AnalysisKuromojiPlugin",
        "isolated" : true,
        "site" : false
      } ]
    }
  }
}
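
For reference, on Elasticsearch 2.x the kuromoji plugin shown above can be installed with the bundled plugin command (run from the Elasticsearch home directory, then restart the node):

$ bin/plugin install analysis-kuromoji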

Search

curl -XGET 'http://192.168.33.10:9200/<INDEX_NAME>/<TYPE_NAME>/_search?pretty=true' -d '
{
  "query" : {
    "simple_query_string" : {
      "query": "ほげ",
      "fields": ["_all"],
      "default_operator": "and"
    }
  }
}
'
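
For the search to return anything, the index needs documents first. A minimal single-document registration matching the mapping above (the ID and field values are made up):

$ curl -XPUT 'localhost:9200/<INDEX_NAME>/<TYPE_NAME>/1' -d '
{
  "title" : "ほげなタイトル",
  "subtitle" : "ほげ",
  "author" : "hoge",
  "contents" : "ほげほげな内容",
  "pub_date" : "2016-01-26",
  "reads" : 1,
  "views" : 10,
  "read_ratio" : 0.1,
  "enabled" : true
}'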

Bulk loading (PHP)

How to bulk load from PHP.
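
Whatever client is used, bulk loading ultimately goes through the _bulk endpoint, so for reference here is a raw REST sketch. Each action line is followed by its document on the next line, the file must end with a newline, and --data-binary keeps the newlines intact (the sample data is made up):

$ cat bulk.json
{ "index" : { "_index" : "<INDEX_NAME>", "_type" : "<TYPE_NAME>", "_id" : "1" } }
{ "title" : "ほげ1", "author" : "hoge" }
{ "index" : { "_index" : "<INDEX_NAME>", "_type" : "<TYPE_NAME>", "_id" : "2" } }
{ "title" : "ほげ2", "author" : "fuga" }

$ curl -XPOST 'localhost:9200/_bulk' --data-binary @bulk.json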

AWS authentication (PHP)

When using IAM, I couldn't figure out how to combine it with the elasticsearch library.

Then I found this:

Signing an Amazon Elasticsearch Service Search Request — AWS SDK for PHP documentation

AWS authentication (Python)

AWS ElasticSearch service · Issue #280 · elastic/elasticsearch-py

It looks like the issue was discussed here and then implemented.

Python Elasticsearch Client — Elasticsearch 2.2.0 documentation

The documentation says it can be used like this:

from elasticsearch import Elasticsearch, RequestsHttpConnection
from requests_aws4auth import AWS4Auth

# Amazon ES endpoint and SigV4 credentials (placeholders)
host = 'YOURHOST.us-east-1.es.amazonaws.com'
awsauth = AWS4Auth(YOUR_ACCESS_KEY, YOUR_SECRET_KEY, REGION, 'es')

# Go through the requests-based connection class so every request is signed with AWS4Auth
es = Elasticsearch(
    hosts=[{'host': host, 'port': 443}],
    http_auth=awsauth,
    use_ssl=True,
    verify_certs=True,
    connection_class=RequestsHttpConnection
)
print(es.info())
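
Both packages need to be installed beforehand; the PyPI names match the imports above:

$ pip install elasticsearch requests-aws4auth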

AWS authentication (extra)

There is also a proxy:
https://github.com/coreos/aws-auth-proxy
