
How to Deploy and Use Logstash with Docker Compose

Anastasios Antoniadis



Logstash, part of the Elastic Stack, is a powerful open-source tool for real-time data processing and forwarding. It can dynamically ingest data from various sources, transform it, and send it to your desired stash. Docker Compose, in turn, is a tool for defining and running multi-container Docker applications. Combining the two, you can easily set up and manage Logstash in a containerized environment, keeping your data processing workflows scalable and easily reproducible. This article walks through setting up Logstash with Docker Compose, including a practical example. (A separate guide covers deploying the full Elastic Stack, with Elasticsearch, Kibana, and Logstash, using Docker Compose.)

Understanding Logstash and Docker Compose

Before we jump into the practical example, it’s essential to understand the basics of Logstash and Docker Compose:

  • Logstash: It is designed to ingest large volumes of logs or events from different sources, process them, and then forward them to a destination such as Elasticsearch. Logstash supports many input, filter, and output plugins, making it extremely versatile for data processing.
  • Docker Compose: A tool for defining and running multi-container Docker applications. With a single command, you can configure, start, and stop all the services defined in a docker-compose.yml file (see the commands after this list).
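
For example, with a docker-compose.yml in the current directory, the whole stack is managed like this:

docker-compose up -d    # create and start all services in the background
docker-compose down     # stop and remove the containers and networks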

Prerequisites

Before you begin, make sure Docker and Docker Compose are installed on your system. The Docker Compose file will define the Logstash service and any other services you might need, such as Elasticsearch for data storage and Kibana for data visualization.
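
You can confirm both are available from a terminal:

docker --version
docker-compose --version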

Step-by-Step Guide with Example

1. Create a Docker Compose File

First, create a docker-compose.yml file in your project directory. This file will define your Docker environment’s services, networks, and volumes. Here’s an example that sets up Logstash and Elasticsearch:

version: '3.7'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.10.0
    environment:
      - discovery.type=single-node   # run as a single-node cluster
    ports:
      - "9200:9200"   # HTTP API
      - "9300:9300"   # inter-node transport

  logstash:
    image: docker.elastic.co/logstash/logstash:7.10.0
    volumes:
      - ./logstash/config/logstash.yml:/usr/share/logstash/config/logstash.yml
      - ./logstash/pipeline:/usr/share/logstash/pipeline
    ports:
      - "5000:5000"   # TCP input defined in the pipeline below
    depends_on:
      - elasticsearch

This configuration starts Elasticsearch and Logstash containers. The depends_on directive ensures the Logstash container starts after the Elasticsearch container. Note that depends_on only controls start order; it does not wait for Elasticsearch to be ready to accept connections. In practice, Logstash's Elasticsearch output retries failed connections, so the pipeline usually recovers once Elasticsearch finishes booting.
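
If Logstash must not start until Elasticsearch is actually accepting connections, one option is a healthcheck combined with a depends_on condition. The sketch below assumes Docker Compose v2 (the Compose specification) and that curl is available inside the Elasticsearch image:

services:
  elasticsearch:
    # ... as above, plus:
    healthcheck:
      test: ["CMD-SHELL", "curl -fs http://localhost:9200 || exit 1"]
      interval: 10s
      timeout: 5s
      retries: 12

  logstash:
    # ... as above, but with:
    depends_on:
      elasticsearch:
        condition: service_healthy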

2. Configure Logstash

Next, create a Logstash configuration. Logstash configurations are split into two parts: settings in logstash.yml and pipeline configurations.

  • logstash.yml: Contains Logstash settings. You might not need to customize it for basic setups (a minimal example follows this list).
  • Pipeline configurations: Define your data processing pipeline’s input, filter, and output stages.
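
A minimal logstash.yml might look like this (both settings are standard Logstash options and match the volume mounts defined above):

# Bind the Logstash HTTP API to all interfaces
http.host: "0.0.0.0"
# Load pipeline files from the directory mounted in docker-compose.yml
path.config: /usr/share/logstash/pipeline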

Next, create a logstash directory in your project, then add a config and a pipeline directory inside it.
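
On Linux or macOS, one command creates both:

mkdir -p logstash/config logstash/pipeline

For the purpose of this example, create a pipeline/logstash.conf file with the following content: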

input {
  tcp {
    port => 5000        # must match the port published in docker-compose.yml
    codec => json_lines # decode one JSON object per line
  }
}

filter {
  # Add your filters here
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]   # the Compose service name resolves on Docker's network
    index => "logstash-%{+YYYY.MM.dd}"       # daily index named after the event date
  }
}

This configuration sets up Logstash to receive data over TCP on port 5000, expecting newline-delimited JSON (one object per line, as the json_lines codec implies). The output is directed to Elasticsearch, using the service name from docker-compose.yml as the hostname.
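
The filter block is left empty above. As an illustrative sketch of what could go there (the field name and value are arbitrary examples), a mutate filter can enrich every event before it is indexed:

filter {
  mutate {
    # Attach a static field to each event
    add_field => { "environment" => "docker" }
  }
}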

3. Running Your Docker Compose Setup

With your docker-compose.yml and Logstash configuration in place, start your services using the following command:

docker-compose up -d

This command launches your containers in detached mode. You can now send data to Logstash on TCP port 5000, and it will be processed according to your pipeline configuration and forwarded to Elasticsearch.
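
You can confirm that both containers are running and follow Logstash's startup output with:

docker-compose ps
docker-compose logs -f logstash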

4. Verifying Your Setup

To ensure everything is working as expected, send a JSON-formatted log to Logstash using a tool like nc (Netcat):

echo '{"message":"Hello, Logstash!"}' | nc localhost 5000

Check the Elasticsearch indices to verify that your log has been indexed:

curl -X GET "localhost:9200/_cat/indices?v"

You should see an index named logstash-YYYY.MM.dd containing your log entry.
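
To inspect the indexed document itself, query the index directly (the logstash-* wildcard matches the daily index created above):

curl -X GET "localhost:9200/logstash-*/_search?pretty"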

Conclusion

Using Docker Compose to run Logstash provides a straightforward and efficient way to set up your log processing pipeline in a containerized environment. This setup not only simplifies the development and testing of your Logstash configurations but also ensures that your data processing infrastructure is easily scalable and reproducible.
