
How to Use Filebeat & Logstash with Docker Compose

Anastasios Antoniadis


Effective monitoring and analysis of logs are essential for ensuring optimal application performance and health in software development and operations. Docker, a widely used containerization platform, simplifies application deployment, scaling, and management. Combined with components of the Elastic Stack (formerly known as the ELK Stack), specifically Filebeat and Logstash, it becomes a powerful tool for managing logs. This article walks you through a hands-on example of integrating Filebeat with Logstash in a Docker Compose environment.

Introduction to Filebeat, Logstash, and Docker Compose

  • Filebeat is a lightweight log shipper that forwards and centralizes log data. It monitors log files or locations that you specify, collects log events, and forwards them to Elasticsearch or Logstash for indexing.
  • Logstash is a server-side data processing pipeline that ingests data from various sources simultaneously, transforms it, and then sends it to a “stash” like Elasticsearch.
  • Docker Compose is a tool for defining and running multi-container Docker applications. You describe all of your application’s services, networks, and volumes in a single YAML file and can then start everything with one command.

Scenario

For this example, we’ll set up a simple scenario where Filebeat monitors a log file and forwards the log entries to Logstash, which then processes and sends the data to a console output (stdout). This example focuses on integrating Filebeat and Logstash within a Docker environment, not the entire Elastic Stack.
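
By the end, the project directory will look like this (the sample log file name is just a placeholder; any file matching *.log in the logs directory works):

.
├── docker-compose.yml
├── filebeat/
│   └── filebeat.yml
├── logstash/
│   └── pipeline/
│       └── logstash.conf
└── logs/
    └── sample.log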

Prerequisites

Ensure you have Docker and Docker Compose installed on your system. This example assumes you have basic knowledge of Docker and Docker Compose.
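
You can quickly confirm both are available from a terminal:

docker --version
docker compose version

If your installation ships the older standalone binary, the second command is docker-compose --version instead, and the corresponding docker-compose form of the commands below applies.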

Step 1: Create a Docker Compose File

Create a file named docker-compose.yml and add the following content:

version: '3.7'
services:
  logstash:
    image: docker.elastic.co/logstash/logstash:7.10.0
    volumes:
      - ./logstash/pipeline:/usr/share/logstash/pipeline
    ports:
      - "5044:5044"
  filebeat:
    image: docker.elastic.co/beats/filebeat:7.10.0
    volumes:
      - ./filebeat/filebeat.yml:/usr/share/filebeat/filebeat.yml
      - ./logs:/usr/share/filebeat/logs
    depends_on:
      - logstash
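
A practical note: Filebeat checks the ownership and permissions of its configuration file at startup, and a filebeat.yml bind-mounted from the host is often not owned by root inside the container, which can make the container exit with a permissions error. If that happens, mounting the file read-only and relaxing the check with Filebeat's --strict.perms=false option is a common workaround; a possible variation of the filebeat service would look like this:

  filebeat:
    image: docker.elastic.co/beats/filebeat:7.10.0
    command: filebeat -e --strict.perms=false
    volumes:
      - ./filebeat/filebeat.yml:/usr/share/filebeat/filebeat.yml:ro
      - ./logs:/usr/share/filebeat/logs
    depends_on:
      - logstash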

Step 2: Configure Logstash

Create a directory named logstash and then a subdirectory named pipeline. Inside the pipeline directory, create a file named logstash.conf with the following content:

input {
  beats {
    port => 5044
  }
}
output {
  stdout { codec => rubydebug }
}

This configuration sets up Logstash to listen for incoming connections from Filebeat on port 5044 and outputs the log data to the console.

Step 3: Configure Filebeat

Create a directory named filebeat and, inside it, a file named filebeat.yml with the following content:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /usr/share/filebeat/logs/*.log

output.logstash:
  hosts: ["logstash:5044"]

Next, create a logs directory where your log files will be stored. You can add a sample log file here to test the setup.
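
For example, the following creates the directory and a small test file (the file name sample.log is arbitrary; Filebeat picks up anything matching *.log in that directory):

mkdir -p logs
echo "hello from filebeat" >> logs/sample.log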

Step 4: Running the Containers

With the configuration files in place, run the following command in the directory containing your docker-compose.yml file:

docker compose up

This command starts both the Filebeat and Logstash services. Filebeat monitors the log files matching the configured path and forwards new entries to Logstash, which then prints the processed events to the console.
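
If you started the stack in the foreground, the processed events appear directly in that terminal. You can also append a new line from another terminal and follow the Logstash output on its own:

echo "another test entry" >> logs/sample.log
docker compose logs -f logstash

Each appended line should show up as a new event formatted by the rubydebug codec.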

Conclusion

This example demonstrates a basic integration of Filebeat and Logstash in a Docker environment. It shows how Docker Compose can simplify the deployment of multi-container applications and how Filebeat and Logstash can be configured to work together for log monitoring and processing. This setup can be expanded by integrating other components of the Elastic Stack, such as Elasticsearch and Kibana, for storing, searching, and visualizing log data.
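
As a rough sketch of that next step, the stdout output in logstash.conf would typically be replaced or complemented by an elasticsearch output pointing at an Elasticsearch service you add to the Compose file; the host name and index name shown here are assumptions:

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    index => "filebeat-logs-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}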
