The combination of Fluentd, Elasticsearch, and Kibana seems to be popular these days, so I tried building it in a Docker environment.
Prerequisites
- Logs are written as line-delimited JSON (a sample is shown below)
- Fluentd, Elasticsearch, and Kibana all run on the same server
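For reference, this is the kind of log the setup assumes: one JSON object per line, field names being nothing more than placeholders here.
{"hoge":"hoge"}
{"hoge":"fuga"}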
Directory layout
.
├── docker-compose.yml
├── elasticsearch
│   └── Dockerfile
├── fluentd
│   ├── Dockerfile
│   ├── fluent.conf
│   └── plugins
└── log
    └── hoge_log  # the log to be collected
Docker
docker-compose.yml
elasticsearch:
  build: elasticsearch
  ports:
    - 9200:9200
fluentd:
  build: fluentd
  ports:
    - 24284:24284
  volumes:
    - ./log:/var/log/hoge
  links:
    - elasticsearch
kibana:
  image: kibana
  ports:
    - 9204:5601
  environment:
    - ELASTICSEARCH_URL=http://elasticsearch:9200
  links:
    - elasticsearch
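Indentation matters in YAML, so it can be worth validating the file before building. Assuming a docker-compose version that supports it, the config subcommand prints the resolved configuration (or an error if the file is malformed):
$ docker-compose config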
Fluentd
FROM fluent/fluentd:latest-onbuild
USER fluent
WORKDIR /home/fluent
ENV PATH /home/fluent/.gem/ruby/2.3.0/bin:$PATH
RUN gem install fluent-plugin-secure-forward # secure log forwarding
RUN gem install fluent-plugin-elasticsearch  # Elasticsearch output
EXPOSE 24284
CMD fluentd -c /fluentd/etc/fluent.conf -p /fluentd/plugins -vv
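To confirm that the two plugins actually made it into the image, one option is to list the gems from a one-off container (depending on the base image's entrypoint you may need to override it):
$ docker-compose run --rm fluentd gem list | grep fluent-plugin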
fluent.conf
<source>
  type tail
  path /var/log/hoge/hoge_log
  tag json.hoge
  pos_file /var/log/hoge/hoge_log.pos
  format json
</source>

<match json.**>
  type copy
  <store>
    type elasticsearch
    host elasticsearch
    port 9200
    logstash_format true
  </store>
  <store>
    type stdout
  </store>
</match>
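With this configuration, each line appended to /var/log/hoge/hoge_log is parsed as JSON, tagged json.hoge, and copied both to Elasticsearch (into daily logstash-YYYY.MM.DD indices, because logstash_format is true) and to stdout. Roughly like this, though the exact stdout format depends on the Fluentd version:
# line appended to /var/log/hoge/hoge_log
{"hoge":"hoge"}
# printed by the stdout <store> (visible via `docker-compose logs fluentd`)
2016-01-01 00:00:00 +0900 json.hoge: {"hoge":"hoge"}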
Elasticsearch
FROM elasticsearch
RUN bin/plugin install mobz/elasticsearch-head
EXPOSE 9200
CMD ["bin/elasticsearch", "-Des.insecure.allow.root=true"] # root=trueが無いと初回起動で失敗する
Startup
$ docker-compose up --build
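Once the build finishes, you can confirm in another terminal that all three containers are up:
$ docker-compose ps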
Elasticsearch page
http://[IP]:9200/_plugin/head/
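If you prefer the command line, the node and its indices can also be checked with curl against the standard Elasticsearch REST API:
$ curl http://[IP]:9200/
$ curl http://[IP]:9200/_cat/indices?v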
Kibana page
http://[IP]:9204/
Append a test log entry like this:
$ echo '{"hoge":"hoge"}' >> log/hoge_log
If the entry shows up in Elasticsearch, everything is working.
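One way to check from the command line: with logstash_format true and the default index prefix, the record should land in a logstash-YYYY.MM.DD index, so a simple search should return it:
$ curl 'http://[IP]:9200/logstash-*/_search?q=hoge:hoge&pretty'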
GitHub
The project from this post is published here:
https://github.com/naruminn/fluentd-elasticsearch-kibana