Config notes for loading Wowza logs into Elasticsearch with Embulk

Posted at 2017-02-16

The log4j settings below assume Wowza's default logging configuration.

log4j.properties
# Access appender
log4j.appender.serverAccess.layout.Fields=date,time,tz,x-event,x-category,x-severity,x-status,x-ctx,x-comment,x-vhost,x-app,x-appinst,x-duration,s-ip,s-port,s-uri,c-ip,c-proto,c-referrer,c-user-agent,c-client-id,cs-bytes,sc-bytes,x-stream-id,x-spos,cs-stream-bytes,sc-stream-bytes,x-sname,x-sname-query,x-file-name,x-file-ext,x-file-size,x-file-length,x-suri,x-suri-stem,x-suri-query,cs-uri-stem,cs-uri-query

# Error appender
log4j.appender.serverError.layout.Fields=x-severity,x-category,x-event,date,time,c-client-id,c-ip,c-port,cs-bytes,sc-bytes,x-duration,x-sname,x-stream-id,x-spos,sc-stream-bytes,cs-stream-bytes,x-file-size,x-file-length,x-ctx,x-comment
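Since the access log is tab-separated with its fields in the order listed above, a single log line can be sketch-parsed in plain Ruby. The sample values here are made up for illustration, and only the first few fields are shown:

```ruby
require "csv"

# First few field names from the access-log layout above (truncated for brevity)
fields = %w[date time tz x-event x-category x-severity]

# A made-up, tab-separated sample line in that layout
line = "2017-02-16\t12:34:56\tJST\tconnect\tsession\tINFO"

# Wowza logs are tab-delimited, which is why the Embulk csv parser
# in the configs below sets delimiter: "\t"
row    = CSV.parse_line(line, col_sep: "\t")
record = fields.zip(row).to_h

puts record["x-event"]  # => connect
```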

Example configs

The Embulk configs for loading the logs are shown below.
The key point is how the datetime field is built: Wowza writes the date and the time to separate fields, so they need to be combined into one.

The embulk-filter-column and embulk-filter-ruby_proc plugins are used for this.
Install them as follows:

 embulk gem install embulk-filter-column embulk-filter-ruby_proc
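The date/time combination that the ruby_proc filter performs can be checked in plain Ruby. This mirrors the proc used in the configs below, with a made-up record and the same hard-coded JST offset (+09:00):

```ruby
require "time"

# A made-up record with separate date and time fields, as Wowza writes them
record = { "date" => "2017-02-16", "time" => "12:34:56" }

# Same logic as the ruby_proc filter: join the two fields into an
# ISO 8601 string (JST offset assumed) and parse it into a Time object
datetime = Time.iso8601(record["date"] + "T" + record["time"] + "+09:00")

puts datetime.iso8601  # => 2017-02-16T12:34:56+09:00
```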

Access log

embulk-config.yml
in:
  type: file
  path_prefix: /path/to/wowza-access-log.txt
  parser:
    charset: UTF-8
    newline: LF
    type: csv
    delimiter: "\t"
    quote: '"'
    escape: '"'
    trim_if_not_quoted: false
    skip_header_lines: 0
    allow_extra_columns: false
    allow_optional_columns: false
    columns:
    - {name: date        , type: string}
    - {name: time        , type: string}
    - {name: tz          , type: string}
    - {name: x-event     , type: string}
    - {name: x-category  , type: string}
    - {name: x-severity  , type: string}
    - {name: x-status    , type: long}
    - {name: x-ctx       , type: string}
    - {name: x-comment   , type: string}
    - {name: x-vhost     , type: string}
    - {name: x-app       , type: string}
    - {name: x-appinst   , type: string}
    - {name: x-duration  , type: string}
    - {name: s-ip        , type: string}
    - {name: s-port      , type: string}
    - {name: s-uri       , type: string}
    - {name: c-ip        , type: string}
    - {name: c-proto     , type: string}
    - {name: c-referrer  , type: string}
    - {name: c-user-agent, type: string}
    - {name: c-client-id , type: string}
    - {name: cs-bytes    , type: string}
    - {name: sc-bytes    , type: string}
    - {name: x-stream-id , type: string}
    - {name: x-spos          , type: string}
    - {name: cs-stream-bytes , type: string}
    - {name: sc-stream-bytes , type: string}
    - {name: x-sname         , type: string}
    - {name: x-sname-query   , type: string}
    - {name: x-file-name     , type: string}
    - {name: x-file-ext      , type: string}
    - {name: x-file-size     , type: string}
    - {name: x-file-length   , type: string}
    - {name: x-suri          , type: string}
    - {name: x-suri-stem     , type: string}
    - {name: x-suri-query    , type: string}
    - {name: cs-uri-stem     , type: string}
    - {name: cs-uri-query    , type: string}

filters:
  - type: column
    add_columns:
    - {name : datetime, type: timestamp, default: "1970-01-01", format: "%Y-%m-%d"}
  - type: ruby_proc
    before:
      - proc: |
          -> do
            require "time"
          end
    rows:
      - proc: |
          ->(record) do
            record.tap do |r|
              r["datetime"] = Time.iso8601(r["date"] + "T" + r["time"] + "+09:00")
            end
          end

  - type: column
    drop_columns:
    - {name : date}
    - {name : time}

out:
  type: elasticsearch_ruby
  mode: normal
  nodes:
  - {host: localhost, port: 9200}
  index: logstash-wowza
  index_type: wow

Error log

embulk-config-error.yml
in:
  type: file
  path_prefix: /path/to/wowza-error-log.txt
  parser:
    charset: UTF-8
    newline: LF
    type: csv
    delimiter: "\t"
    quote: '"'
    escape: '"'
    trim_if_not_quoted: false
    skip_header_lines: 0
    allow_extra_columns: false
    allow_optional_columns: false
    columns:
    - {name:  x-severity      , type: string}
    - {name:  x-category      , type: string}
    - {name:  x-event         , type: string}
    - {name:  date            , type: string}
    - {name:  time            , type: string}
    - {name:  c-client-id     , type: string}
    - {name:  c-ip            , type: string}
    - {name:  c-port          , type: string}
    - {name:  cs-bytes        , type: string}
    - {name:  sc-bytes        , type: string}
    - {name:  x-duration      , type: string}
    - {name:  x-sname         , type: string}
    - {name:  x-stream-id     , type: string}
    - {name:  x-spos          , type: string}
    - {name:  sc-stream-bytes , type: string}
    - {name:  cs-stream-bytes , type: string}
    - {name:  x-file-size     , type: string}
    - {name:  x-file-length   , type: string}
    - {name:  x-ctx           , type: string}
    - {name:  x-comment       , type: string}

filters:
  - type: column
    add_columns:
    - {name : datetime, type: timestamp, default: "1970-01-01", format: "%Y-%m-%d"}
  - type: ruby_proc
    before:
      - proc: |
          -> do
            require "time"
          end
    rows:
      - proc: |
          ->(record) do
            record.tap do |r|
              r["datetime"] = Time.iso8601(r["date"] + "T" + r["time"] + "+09:00")
            end
          end

  - type: column
    drop_columns:
    - {name : date}
    - {name : time}

out:
  type: elasticsearch_ruby
  mode: normal
  nodes:
  - {host: localhost, port: 9200}
  index: logstash-wowza-error-test
  index_type: wow