Logging using ELK for MuleSoft

Author: Mohammad Mazhar Ansari

What is ELK?

  • ELK is the acronym for three open source projects: Elasticsearch, Logstash, and Kibana
  • Elasticsearch is a search and analytics engine
    • It is an open-source, full-text search and analysis engine based on the Apache Lucene search engine
  • Logstash is a server-side data processing pipeline that ingests data from multiple sources simultaneously, transforms it, and then sends it to a “stash” like Elasticsearch
  • Kibana lets users visualize Elasticsearch data with charts and graphs

Why do we need a system like ELK?

  • Log aggregation and efficient searching
  • Generic search across all collected logs

There are three main reasons we need ELK:

  • It’s Interoperable
  • It’s Open Source
  • It’s Managed

How to download ELK?

ELK and Filebeat can be downloaded from the locations below:

  • Elasticsearch
    • https://www.elastic.co/downloads/elasticsearch
  • Kibana
    • https://www.elastic.co/downloads/kibana
  • Logstash
    • https://www.elastic.co/downloads/logstash
  • Filebeat
    • https://www.elastic.co/downloads/beats/filebeat

ELK General Architecture:

In general, the ELK architecture works as follows:

  • Filebeat polls the log file and ships the data to Logstash
  • Logstash filters and processes the data, then sends it to Elasticsearch
  • Elasticsearch indexes the data and stores it in a persistent store
  • Kibana pulls data on demand and builds graphs, charts, and reports
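
In short, the data flows like this:

    Mule application logs → Filebeat → Logstash → Elasticsearch → Kibana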

If an enterprise has more than one server, this is how a typical ELK stack looks:

As Logstash is heavy on resources, we can run a lightweight Filebeat on each server, which pushes the data to Logstash.
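
Sketched as text, with one Filebeat per server feeding a central Logstash:

    Server 1: app logs → Filebeat ─┐
    Server 2: app logs → Filebeat ─┼→ Logstash → Elasticsearch → Kibana
    Server 3: app logs → Filebeat ─┘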

Let's integrate ELK with MuleSoft:

  • Install ELK and Filebeat on your local system
  • Start Elasticsearch
  • Go to the browser and open http://localhost:9200; if Elasticsearch is running fine, you will get output like the sample below
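
A typical response looks roughly like the following (the node name and cluster UUID are placeholders that will differ on your machine; version 7.6.1 is assumed, matching the Filebeat index that appears later):

    {
      "name" : "MY-LAPTOP",
      "cluster_name" : "elasticsearch",
      "cluster_uuid" : "aBcDeFgHiJkLmNoPqRsTuV",
      "version" : {
        "number" : "7.6.1"
      },
      "tagline" : "You Know, for Search"
    }
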
  • Start Kibana
  • Open Kibana in browser (http://localhost:5601)
  • Create a Logstash configuration file as shown below
  • The port setting in the beats input block specifies the port on which Logstash will listen
  • The hosts setting in the elasticsearch output block specifies the Elasticsearch server to which Logstash forwards the data
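
A minimal sketch of logstash-beat.conf, assuming the conventional Beats port 5044 and a local Elasticsearch; the index setting is what produces the filebeat-* index seen later in Kibana:

    input {
      beats {
        # Port on which Logstash listens for data from Filebeat
        port => 5044
      }
    }

    output {
      elasticsearch {
        # Elasticsearch server to which Logstash forwards the data
        hosts => ["localhost:9200"]
        # Names the index filebeat-<version>-<date>, e.g. filebeat-7.6.1-2020.03.30
        index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
      }
    }
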
  • Run Logstash with the configuration created earlier:
    • logstash.bat -f logstash-beat.conf
  • Create a Filebeat configuration file as shown below
  • The paths setting specifies the log file to poll
  • You can add more log files under paths to poll them with the same Filebeat
  • multiline.pattern specifies the pattern that identifies the start of each log entry
  • multiline.negate and multiline.match are required so that a single log entry can span more than one line
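
A minimal sketch of filebeat.yml; the Mule log path is hypothetical and the multiline pattern assumes log entries start with a log level (adjust both to your own runtime and log4j2 layout):

    filebeat.inputs:
    - type: log
      enabled: true
      paths:
        # Log file to poll; add more entries here to ship additional logs
        - C:\MuleRuntime\logs\mule-demo-app.log
      # A line starting with a log level marks the beginning of a new entry
      multiline.pattern: '^(INFO|WARN|ERROR|DEBUG|TRACE)'
      # Lines NOT matching the pattern (e.g. stack traces) are appended
      # to the entry that precedes them
      multiline.negate: true
      multiline.match: after

    output.logstash:
      # Must match the Beats port in logstash-beat.conf
      hosts: ["localhost:5044"]
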
  • Run Filebeat with the configuration created earlier:
    • filebeat.exe -c filebeat.yml
  • Now go to Kibana (http://localhost:5601) -> Management -> Index Patterns
  • Click on Create Index Pattern
  • You can see that a new index, filebeat-7.6.1-2020.03.30, has been created. This index comes from the elasticsearch output in the Logstash configuration file. Select it and click Next step
  • In the Time Filter field dropdown, select @timestamp and click Create index pattern
  • Start the Mule application whose log file you configured in the Filebeat configuration (the paths setting)
  • Run a few test cases so the Mule application generates logs
  • Go to Kibana (http://localhost:5601) -> Discover
  • Select the index pattern created in the previous step
  • In the search bar, you can write any suitable expression to find specific text in the logs; a couple of sample queries are shown below
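
For example, assuming the default Filebeat/ECS field names, Kibana Query Language searches like these would match log entries:

    message : "ERROR"
    message : "ERROR" and log.file.path : *mule-demo-app*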

Reference Material:

  • Installing the Elastic Stack on Windows (https://logz.io/blog/elastic-stack-windows/)
  • The Complete Guide to the ELK Stack (https://logz.io/learn/complete-guide-elk-stack/#installing-elk)
  • File Beat + ELK(Elastic, Logstash and Kibana) Stack to index logs to Elasticsearch (https://www.javainuse.com/elasticsearch/filebeat-elk)
