How to build an ELK log system based on Docker

Background requirements:

As the business grows, the number of servers increases and the volume of access logs, application logs, and error logs grows with it. As a result, operations staff can no longer manage logs effectively, and developers have to log on to servers to check logs when troubleshooting, which is inconvenient for everyone; whenever someone needs data, operations has to go onto the servers and analyze the logs by hand.

ELK Introduction:


ELK is an abbreviation for three open source tools: Elasticsearch, Logstash, and Kibana. With the addition of the Beats tools, the original ELK Stack has since been renamed the Elastic Stack. Beats is a lightweight log collection and shipping agent that uses few resources; it is well suited to collecting logs on individual servers and forwarding them to Logstash, and it is the approach the official documentation recommends.


Building steps:

After the description above, you should have a rough idea that building this platform requires at least three of the four components (Filebeat is optional); the data flow is sketched after the list below.

  • Kibana is used for display
  • Elasticsearch is used for retrieval
  • Logstash is used for filtering
  • Filebeat is used to collect logs
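
Putting these together, the data flows through the stack like this:

a.log files -> Filebeat (collects and ships) -> Logstash (filters and parses) -> Elasticsearch (stores and indexes) -> Kibana (displays)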

This article assumes that you already have a Docker environment and have basic experience using Docker.

Pull the images (keeping all components on the same version avoids many pitfalls):

docker pull kibana:6.8.2
docker pull elasticsearch:6.8.2
docker pull mobz/elasticsearch-head:5 # a front-end plugin for ES
docker pull logstash:6.8.2
docker pull docker.elastic.co/beats/filebeat:6.8.2 # if this is very slow, try setting up a proxy or changing the Docker registry mirror

To give Docker a proxy on Linux, you can run a proxy client such as ClashX with your subscription URL and then point Docker at the proxy. With Docker Desktop you can change the proxy settings directly in the GUI; on a headless host you configure it with a configuration file.
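
On a headless Linux host, one common way to do this is a systemd drop-in for the Docker daemon; the sketch below assumes a local proxy listening on 127.0.0.1:7890 (a placeholder address; substitute whatever your proxy client exposes):

# /etc/systemd/system/docker.service.d/http-proxy.conf
[Service]
Environment="HTTP_PROXY=http://127.0.0.1:7890"
Environment="HTTPS_PROXY=http://127.0.0.1:7890"

# Reload and restart the daemon so the proxy takes effect
sudo systemctl daemon-reload
sudo systemctl restart docker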


Also give Docker more memory to avoid lags.

Build ES:

docker run -d -p 9200:9200 -p 9300:9300 --name elasticsearch -e "discovery.type=single-node" elasticsearch:6.8.2

Access localhost:9200 in a browser (curl also works). If you see a JSON response containing the cluster name and the version number, Elasticsearch has started successfully.

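For example, from the command line (output abbreviated; the exact values will differ on your machine):

curl http://localhost:9200
# Expected: a JSON body containing "cluster_name", "version": { "number": "6.8.2", ... }
# and the tagline "You Know, for Search"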

If an error occurs, check the container log with docker logs elasticsearch.

Create ES-HEAD:

docker run -d -p 9100:9100 docker.io/mobz/elasticsearch-head:5

Visit localhost:9100 and you should see the elasticsearch-head interface.


In es-head you may not be able to connect to ES, because ES does not have cross-origin requests (CORS) enabled. Enter the ES container, find elasticsearch.yml under the config directory, and add the following to that file:

http.cors.enabled: true
http.cors.allow-origin: "*"

Then restart ES, and es-head will be able to connect to it.
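
A minimal sketch of that edit, assuming the official image's config path /usr/share/elasticsearch/config (the container may not ship with a text editor, so the file is pulled out and pushed back with docker cp):

docker cp elasticsearch:/usr/share/elasticsearch/config/elasticsearch.yml .
# Append the two CORS settings shown above
echo 'http.cors.enabled: true' >> elasticsearch.yml
echo 'http.cors.allow-origin: "*"' >> elasticsearch.yml
docker cp elasticsearch.yml elasticsearch:/usr/share/elasticsearch/config/elasticsearch.yml
docker restart elasticsearch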

Open Kibana:

docker run -d -p 5601:5601 --link elasticsearch -e ELASTICSEARCH_URL=http://elasticsearch:9200 kibana:6.8.2

The URL here is http://elasticsearch:9200; do not change it. The --link option adds the elasticsearch container's address to the Kibana container's hosts file, so the ES service can be reached directly by that name.
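
As an aside, --link is a legacy Docker feature; a user-defined network gives the same name-based resolution if you prefer it (a sketch using a hypothetical network named elk):

docker network create elk
docker run -d --name elasticsearch --network elk -p 9200:9200 -p 9300:9300 -e "discovery.type=single-node" elasticsearch:6.8.2
docker run -d --name kibana --network elk -p 5601:5601 -e ELASTICSEARCH_URL=http://elasticsearch:9200 kibana:6.8.2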
After the container starts successfully, you should see Kibana's system index (.kibana) appear in es-head.


Visiting localhost:5601 should bring up the Kibana interface.


At this point Kibana is running and ES is running as well. The next step is to collect the logs.

Build filebeat and logstash:

First of all, these two need some configuration files, which we will keep in the same folder:

mkdir elktest # in the home directory, i.e. the ~/elktest path
cd elktest
touch filebeat.yml
touch logstash.conf

Add the configuration files:

vim filebeat.yml
filebeat.prospectors:
- paths:
    - /home/elk/logs/user/a.log
  multiline:
      pattern: ^\d{4}
      negate: true
      match: after
  fields:
    doc_type: user
- paths:
    - /home/elk/logs/service/a.log
  multiline:
      pattern: ^\d{4}
      negate: true
      match: after
  fields:
    doc_type: service
output.logstash: # output address
  hosts: ["logstash:5044"]

vim logstash.conf
input {
  beats {
    port => "5044"
  }
}
filter {
  json {
    source => "message"
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => [ "elasticsearch:9200" ]
    index => "%{[fields][doc_type]}-%{+YYYY.MM.dd}"
  }
}

You should still be in the elktest directory. Since we don't have a real log source, this tutorial creates the logs by hand; first create a log folder:

mkdir logdir

Create some logs yourself; write whatever you like, but don't change the file or folder names, because they are already mapped in the configuration files. When you are done, your directory structure should look like the sketch below.

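One way to create them, as a sketch (the log lines are hypothetical; the leading timestamp matches the multiline pattern ^\d{4} in filebeat.yml, and since logstash.conf parses the message as JSON, plain-text lines like these will simply be indexed with a _jsonparsefailure tag):

mkdir -p logdir/user logdir/service
echo '2024-01-01 10:00:00 INFO user module started' >> logdir/user/a.log
echo '2024-01-01 10:00:01 INFO service module started' >> logdir/service/a.log

# Resulting layout:
# ~/elktest
# ├── filebeat.yml
# ├── logstash.conf
# └── logdir
#     ├── user
#     │   └── a.log
#     └── service
#         └── a.log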

Create the containers:

docker run -it --name logstash --link elasticsearch -d -p 5044:5044 -v ~/elktest/logstash.conf:/usr/share/logstash/pipeline/logstash.conf logstash:6.8.2

docker run --name filebeat --link logstash -d -v ~/elktest/filebeat.yml:/usr/share/filebeat/filebeat.yml -v ~/elktest/logdir/user/:/home/elk/logs/user/ -v ~/elktest/logdir/service/:/home/elk/logs/service/ docker.elastic.co/beats/filebeat:6.8.2
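
To verify, you can tail the container logs; because logstash.conf includes stdout { codec => rubydebug }, parsed events are printed to the Logstash container's standard output:

docker logs -f logstash # parsed events should scroll by once Filebeat ships the files
docker logs filebeat # look for harvester startup messages or connection errors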

At this point, es-head should show new indices named after the doc_type values and the date, i.e. user-YYYY.MM.dd and service-YYYY.MM.dd.

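If you prefer the command line, the same check can be made against ES directly (the index names follow the index pattern defined in logstash.conf):

curl 'http://localhost:9200/_cat/indices?v'
# Expect indices named like user-YYYY.MM.dd and service-YYYY.MM.dd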

If that is not the case, check whether the containers are running and whether there are errors in their logs. You can also ping elasticsearch from inside the logstash container and ping logstash from inside the filebeat container, and double-check the mounted paths.
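
For example (a sketch; if ping is not installed in an image, getent hosts at least confirms that the name resolves):

docker exec logstash ping -c 1 elasticsearch
docker exec filebeat ping -c 1 logstash
docker exec filebeat getent hosts logstash # fallback if ping is unavailable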

Display the data in Kibana: create an index pattern (for example user-* and service-*):


After the index pattern is created, open it in Discover and add the fields you want to view. The log content is in the message field (which is exactly the content of a.log):
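
For example, to narrow Discover down to a single log source, filter on the field defined in filebeat.yml (Lucene query syntax; user is one of the doc_type values configured above):

fields.doc_type: user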

insert image description here

That's it: our ELK stack is up and running. If you made it this far, treat yourself to a chicken drumstick at dinner!

This concludes the detailed walkthrough of building an ELK log system with Docker. For more on building ELK log systems with Docker, see the related articles on 123WORDPRESS.COM.

