Logstash 9.x: Centralized Pipeline Management

Published on 2025-08-02 01:00:00


Introduction

We will configure Logstash pipelines centrally through Elasticsearch and Kibana.

We assume you already have Elasticsearch and Kibana installed and configured as per this tutorial.

This guide is based on the following documentation:

Logstash Centralized Pipeline Management

Configuring Centralized Pipelines

Requirements

In the video, we used a single instance of Ubuntu 24.04 running on a VM with 8 GB of memory. The VM runs in a local private network. We will install Elasticsearch, Kibana, and Logstash on this server.

Steps

Step 1 - Create Logstash User

In Kibana, go to Stack Management > Security > Users > Create user.

Then create a new user called logstash_admin_user and give it the roles logstash_admin and logstash_system.
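If you prefer the command line, the same user can also be created through the Elasticsearch security API. This is a minimal sketch that assumes the elastic superuser account and the host and password used later in this guide; adjust them to your environment:

# Create logstash_admin_user via the security API (prompts for the elastic password)
curl --cacert /etc/elasticsearch/certs/http_ca.crt -u elastic \
  -X POST "https://192.168.88.7:9200/_security/user/logstash_admin_user" \
  -H 'Content-Type: application/json' \
  -d '{"password": "ABCD1234", "roles": ["logstash_admin", "logstash_system"]}'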

Step 2 - Create Pipelines

In Kibana, go to Stack Management > Ingest > Logstash Pipelines > Create Pipeline.

Name the pipeline demo1 and enter this pipeline definition:

input {
  file {
    path => "/var/lib/logstash/data/customers.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    skip_header => "true"
    columns => ["email", "first_name", "last_name", "city", "county", "state", "zip", "web"]
  }
  mutate {
    convert => { "zip" => "integer" }
  }
}
output {
  file {
    path => "/var/lib/logstash/testing/output.txt"
  }
}

Then create a second pipeline called demo2:

input {
  file {
    path => "/var/lib/logstash/data/tasks.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    skip_header => "true"
    columns => ["task_name", "task_description"]
  }
}
output {
  file {
    path => "/var/lib/logstash/testing/output.txt"
  }
}

We will create the test content next.

Step 3 - Setup Logstash

Logstash needs the Elastic APT repository and its dependencies, so run these commands if you have not done so already (the same repository is also used by Elasticsearch and Kibana):

wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo gpg --dearmor -o /usr/share/keyrings/elasticsearch-keyring.gpg
echo "deb [signed-by=/usr/share/keyrings/elasticsearch-keyring.gpg] https://artifacts.elastic.co/packages/9.x/apt stable main" | sudo tee /etc/apt/sources.list.d/elastic-9.x.list
apt-get update; apt-get install -y apt-transport-https;

Now install Logstash:

apt-get install -y logstash;
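To confirm the package installed correctly, you can print the version (assuming the default package install location):

/usr/share/logstash/bin/logstash --version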

Get a copy of the Elasticsearch CA certificate:

mkdir /etc/logstash/certs
cp /etc/elasticsearch/certs/http_ca.crt /etc/logstash/certs
chown -R logstash:logstash /etc/logstash
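As a quick sanity check, you can inspect the copied CA certificate with openssl to confirm it is readable and not expired:

openssl x509 -in /etc/logstash/certs/http_ca.crt -noout -subject -enddate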

Add these lines to /etc/logstash/logstash.yml (or edit the existing ones):

xpack.management.enabled: true
xpack.management.pipeline.id: ["demo1", "demo2"]
xpack.management.elasticsearch.username: logstash_admin_user
xpack.management.elasticsearch.password: ABCD1234
xpack.management.elasticsearch.hosts: ["https://192.168.88.7:9200"]
xpack.management.elasticsearch.ssl.certificate_authority: "/etc/logstash/certs/http_ca.crt"
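Before starting Logstash, it is worth verifying that the management credentials and the CA certificate actually work against Elasticsearch. This simple check uses the same values as the configuration above:

# Should return the cluster info JSON if the user, password, and CA are correct
curl --cacert /etc/logstash/certs/http_ca.crt -u logstash_admin_user:ABCD1234 https://192.168.88.7:9200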

Step 4 - Make Test Data

mkdir -p /var/lib/logstash/testing
mkdir -p /var/lib/logstash/data

cat > /var/lib/logstash/data/customers.csv <<EOL
email,first_name,last_name,city,county,state,zip,web
carol.davis@example.net,Carol,Davis,Seattle,King,WA,98101,www.caroldavisexample.net
faizal@helloworldexample.com,Faizal,Gupta,Kingston,King,WA,93211,www.helloworldexample.com
EOL

cat > /var/lib/logstash/data/tasks.csv <<EOL
task_name,task_description
do homework,children love homework
wash dishes,dad loves to wash dishes
EOL

chown -R logstash:logstash /var/lib/logstash

Step 5 - Start Logstash

systemctl start logstash.service
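If you also want Logstash to start automatically at boot, enable the unit as well:

systemctl enable logstash.service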

If things are working, you should see a new file /var/lib/logstash/testing/output.txt with the output results of both demo1 and demo2.
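If the file does not appear, the Logstash logs are the first place to look. These commands assume the default systemd unit and log location from the package install:

# Follow the service output
journalctl -u logstash.service -f

# Or follow the Logstash log file directly
tail -f /var/log/logstash/logstash-plain.log

# Inspect the generated output
cat /var/lib/logstash/testing/output.txt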