
How to Deploy the ELK Stack with Docker Compose

Anastasios Antoniadis


Discover the seamless way to deploy the ELK (Elasticsearch, Logstash, Kibana) Stack using Docker Compose with our detailed guide. Streamline your log management and analysis, enhancing monitoring and insights into your applications.


The ELK Stack (Elasticsearch, Logstash, and Kibana) offers a powerful platform for indexing, searching, analyzing, and visualizing data in real time. Elasticsearch, the heart of the stack, is a distributed search and analytics engine. Logstash ingests data from multiple sources simultaneously, enriching and transforming it before feeding it into Elasticsearch. Kibana provides the visualization layer, enabling users to build dashboards that make the data easy to interpret. Docker Compose simplifies deploying the ELK Stack by letting you define and run all three services as a single multi-container application. This guide walks you through the process step by step.

Prerequisites

Before starting, ensure you have the following:

  • Docker installed on your system.
  • Docker Compose installed on your system.
  • Basic understanding of Docker concepts and the YAML syntax used in Docker Compose files.
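
You can verify that Docker and Docker Compose are both available from a terminal:

docker --version
docker compose version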

Step 1: Create a Docker Compose File

Create a directory dedicated to your ELK Stack setup. This directory will contain your Docker Compose file (docker-compose.yml) and any additional configuration files or directories you might need.

mkdir elk-docker && cd elk-docker

Create the docker-compose.yml file:

touch docker-compose.yml

Open this file in a text editor and insert the following configuration:

version: '3.8'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.12.2
    container_name: elasticsearch
    environment:
      - discovery.type=single-node
      - xpack.security.enabled=false
    volumes:
      - elasticsearch-data:/usr/share/elasticsearch/data
    ports:
      - "9200:9200"
    networks:
      - elk

  logstash:
    image: docker.elastic.co/logstash/logstash:8.12.2
    container_name: logstash
    volumes:
      - ./logstash/config/logstash.yml:/usr/share/logstash/config/logstash.yml
      - ./logstash/pipeline:/usr/share/logstash/pipeline
    ports:
      - "5000:5000"
    networks:
      - elk
    depends_on:
      - elasticsearch

  kibana:
    image: docker.elastic.co/kibana/kibana:8.12.2
    container_name: kibana
    ports:
      - "5601:5601"
    networks:
      - elk
    depends_on:
      - elasticsearch

volumes:
  elasticsearch-data:

networks:
  elk:

Configuration Explained:

  • Elasticsearch Service: Runs Elasticsearch as a single node and maps a named volume for data persistence. Security is disabled (xpack.security.enabled=false) so the stack works over plain HTTP without credentials; Elasticsearch 8.x enables security by default, which would otherwise block the verification steps below. This setting is suitable for local testing, not production.
  • Logstash Service: Sets up Logstash with a custom configuration file and pipeline directory (created below), waits for Elasticsearch to start, and exposes port 5000 for the pipeline's TCP input.
  • Kibana Service: Deploys Kibana, linking it to Elasticsearch, and exposes the default Kibana port.
  • Volumes: Declares named volumes for data persistence.
  • Networks: Defines a custom network (elk) for inter-service communication.
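
Note that the Compose file mounts ./logstash/config/logstash.yml and ./logstash/pipeline from the host, so create both before launching the stack. What follows is a minimal sketch: the file name logstash.conf is an arbitrary choice (Logstash loads every file in the pipeline directory), and the pipeline simply forwards lines received on TCP port 5000 into Elasticsearch.

mkdir -p logstash/config logstash/pipeline

Add the following to logstash/config/logstash.yml to bind the Logstash API inside the container:

api.http.host: "0.0.0.0"

Then create logstash/pipeline/logstash.conf:

input {
  tcp {
    port => 5000
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
  }
}

Because security is disabled in this setup, the output needs no credentials; in a secured cluster you would also set the user and password options on the elasticsearch output.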

Step 2: Launch the ELK Stack

Navigate to the directory containing your docker-compose.yml file and start the ELK Stack by running:

docker compose up -d

This command will download the necessary Docker images (if not already present) and start the Elasticsearch, Logstash, and Kibana containers in detached mode.
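
Elasticsearch can take a minute or two to initialize on first start. You can check the status of the containers and follow the Elasticsearch logs while it comes up:

docker compose ps

docker compose logs -f elasticsearch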

Step 3: Verify the Services are Running

  • Elasticsearch: Confirm that Elasticsearch is operational by navigating to http://localhost:9200 (or see the command-line checks below). You should see a JSON response with the cluster name and version information.
  • Logstash: Test Logstash by sending data to the input port defined in your pipeline configuration. For the TCP input on port 5000 created in Step 1, you can use telnet localhost 5000 and type a line of test data.
  • Kibana: Access the Kibana UI by navigating to http://localhost:5601. If everything is set up correctly, you should see the Kibana home page, where you can create visualizations and dashboards based on your Elasticsearch data.
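
The same checks can be run from a terminal. This is a quick sketch that assumes the pipeline file from Step 1 and a netcat (nc) binary; nc flags vary between variants, so adjust -w as needed:

curl http://localhost:9200

echo 'hello elk' | nc -w 1 localhost 5000

curl 'http://localhost:9200/logstash-*/_search?q=hello&pretty'

The last command searches the logstash-* indices created by the pipeline's elasticsearch output; allow a second or two for the event to be indexed before querying.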

Conclusion

Deploying the ELK Stack with Docker Compose offers a straightforward method for setting up a comprehensive logging, analysis, and visualization platform. Following this guide, you can quickly get the ELK Stack running, ready to process and visualize your data in real time. Docker Compose simplifies the management of the ELK services, making it easy to maintain, backup, and scale your setup. Whether you’re monitoring application logs, analyzing system performance, or gaining insights into business metrics, the ELK Stack provides a powerful solution for managing and making sense of large volumes of data.
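
When you are finished, stop the stack with docker compose down; adding the -v flag also removes the elasticsearch-data volume and the data indexed in it:

docker compose down

docker compose down -v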
