Logging to Elastic Cloud from MuleSoft Using the HTTP Appender

Author: Mohammad Mazhar Ansari

What is ELK?

  • ELK is an acronym for three open source projects: Elasticsearch, Logstash, and Kibana
  • Elasticsearch is an open source, full-text search and analytics engine based on the Apache Lucene search engine
  • Logstash is a server-side data processing pipeline that ingests data from multiple sources simultaneously, transforms it, and then sends it to a “stash” like Elasticsearch
  • Kibana lets users visualize data with charts and graphs in Elasticsearch

Why do we need a system like ELK?

  • Log aggregation and efficient searching
  • Generic Search

There are three main reasons we need ELK

  • It’s Interoperable
  • It’s Open Source
  • It’s Managed

What is Elastic Cloud?

Elastic Cloud is the hosted ELK offering, running on AWS. It provides a few benefits:

  • On-demand computing
  • Pay only for what you use
  • Failover and fault tolerance
  • Common coding
  • Ease of implementation

Register with Elastic.co:

  • Go to https://elastic.co/
  • Click on Free Trial if you don’t already have an account
  • When you log in for the first time, it asks you to create a default ELK deployment, which installs Elasticsearch and Kibana
  • It also creates a username and password for Elasticsearch; note them down, as you will need them in later steps

Enable Custom Logging for an On-Premises MuleSoft Runtime:

  • Go to the project
  • Open src/main/resources/log4j2.xml
  • Add the below XML tag under Configuration/Appenders
<Http name="ELK-Cloud" url="https://bf1718d968f54023b4b4e75badd4fee2.us-east-1.aws.found.io:9243/applogs/_doc"> 
	<Property name="Authorization" value="Basic ZWxhc3RpYzoxREd6b0RTMWdTNTdNYlJMbjJ5S2UyU1k=" />
	<Property name="Content-Type" value="application/json" />
	<PatternLayout pattern='%msg' />
</Http>
  • Add the below XML tag under Configuration/Loggers/AsyncRoot
<AppenderRef ref="ELK-Cloud" />
  • Run the application
  • Check the console log

Note: For this exercise I am using the JSON Logger for logging. Make sure you use the PatternLayout as shown above.
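The Authorization property on the appender is a standard HTTP Basic auth token: the base64 encoding of the username and password that Elastic Cloud generated for you, joined by a colon. A minimal Python sketch (with hypothetical credentials) shows how that value is derived:

```python
import base64

def basic_auth_header(username: str, password: str) -> str:
    """Build the value for the log4j2 Authorization property.

    The Http appender sends this as a standard Basic auth header:
    "Basic " + base64("username:password").
    """
    token = base64.b64encode(f"{username}:{password}".encode("utf-8")).decode("ascii")
    return f"Basic {token}"

# Hypothetical credentials -- substitute the ones Elastic Cloud generated.
print(basic_auth_header("elastic", "s3cret"))
```

Decoding the token from your own appender config should give back exactly `username:password`, which is a quick way to verify it.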

  • Now go to Kibana (https://96c0390b651146119124148b6605cc3c.us-east-1.aws.found.io:9243/) -> Management -> Index Patterns
  • Click on Create Index Pattern
  • You can see a new index, applogs, has been created. It comes from the index name in the HTTP appender URL (/applogs/_doc). Select it and click on Next Step
  • Click on the dropdown, select @timestamp, and click on Create Index Pattern
  • Start the Mule application for which you configured the HTTP appender
  • Run a few test cases so the application generates logs
  • Go to Kibana (https://96c0390b651146119124148b6605cc3c.us-east-1.aws.found.io:9243/) -> Discover
  • Select the index pattern created in the previous step
  • In the search bar, you can write any suitable expression to search for specific text in the logs
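Behind Discover's search bar, Kibana issues Elasticsearch query DSL requests against the selected index pattern. As an illustrative sketch (assuming your JSON log events contain a message field, which is hypothetical here), the equivalent query body looks like:

```python
import json

# Sketch of the query DSL behind a Discover search for the text
# "order accepted" in a hypothetical "message" field, newest first.
query = {
    "query": {"match": {"message": "order accepted"}},
    "sort": [{"@timestamp": {"order": "desc"}}],
}
print(json.dumps(query, indent=2))
```

This body can be POSTed to the index's `_search` endpoint with the same Authorization header as the appender, which is handy for scripted checks outside Kibana.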

Enable Custom Logging For CloudHub Application: 

Before enabling custom logging for a CloudHub application, you need to disable the default CloudHub logs. This option is not available by default; you need to raise a ticket with MuleSoft support to have it enabled.

Once you have disabled CloudHub logs, MuleSoft is not responsible for the following:

  • MuleSoft is not responsible for lost logging data due to misconfiguration of your own log4j appender.
  • MuleSoft is also not responsible for misconfigurations that result in performance degradation, running out of disk space, or other side effects.
  • When you disable the default CloudHub application logs, then only the system logs are available. For application worker logs, please check your own application’s logging system. Downloading logs is not an option in this scenario.
  • Only asynchronous log appenders can be used; synchronous appenders must be avoided, as they can lock threads while waiting for responses.

You need to create log4j2.xml at src/main/resources.

You need to add an Http appender in log4j2.xml. Provide the URL of your Elastic Cloud Elasticsearch endpoint, along with the Authorization token, Content-Type, PatternLayout, etc.

<Http name="ELK-Cloud" url="https://bf1718d968f54023b4b4e75badd4fee2.us-east-1.aws.found.io:9243/applogs/_doc"> 
	<Property name="Authorization" value="Basic ZWxhc3RpYzoxREd6b0RTMWdTNTdNYlJMbjJ5S2UyU1k=" />
	<Property name="Content-Type" value="application/json" />
	<PatternLayout pattern='%msg' />
</Http>
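Under the hood, the Http appender POSTs each formatted log event as a JSON document to the `_doc` endpoint of the applogs index. A rough standard-library Python equivalent of what one log event becomes on the wire (hypothetical host and token, for illustration only):

```python
import json
import urllib.request

# Hypothetical Elastic Cloud endpoint and credentials -- substitute your own.
ES_URL = "https://example-cluster.us-east-1.aws.found.io:9243/applogs/_doc"
AUTH = "Basic ZWxhc3RpYzpzM2NyZXQ="  # base64("elastic:s3cret"), illustrative only

def build_log_request(event: dict) -> urllib.request.Request:
    """Build the POST that the log4j2 Http appender effectively performs:
    one JSON document per log event, indexed into `applogs`."""
    body = json.dumps(event).encode("utf-8")
    return urllib.request.Request(
        ES_URL,
        data=body,
        headers={"Authorization": AUTH, "Content-Type": "application/json"},
        method="POST",
    )

# urllib.request.urlopen(req) would actually send it; here we only build it.
req = build_log_request({"priority": "INFO", "message": "order accepted"})
print(req.get_method(), req.full_url)
```

Sending one such request by hand is a quick way to confirm your endpoint, index name, and credentials before wiring them into log4j2.xml.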

You can also add more appenders, such as Log4J2CloudhubLogAppender, to your log4j2.xml to keep logging enabled on the CloudHub log console of your application.

<Log4J2CloudhubLogAppender name="CLOUDHUB"
			addressProvider="com.mulesoft.ch.logging.DefaultAggregatorAddressProvider"
			applicationContext="com.mulesoft.ch.logging.DefaultApplicationContext"
			appendRetryIntervalMs="${sys:logging.appendRetryInterval}"
			appendMaxAttempts="${sys:logging.appendMaxAttempts}"
			batchSendIntervalMs="${sys:logging.batchSendInterval}"
			batchMaxRecords="${sys:logging.batchMaxRecords}" memBufferMaxSize="${sys:logging.memBufferMaxSize}"
			journalMaxWriteBatchSize="${sys:logging.journalMaxBatchSize}"
			journalMaxFileSize="${sys:logging.journalMaxFileSize}"
			clientMaxPacketSize="${sys:logging.clientMaxPacketSize}"
			clientConnectTimeoutMs="${sys:logging.clientConnectTimeout}"
			clientSocketTimeoutMs="${sys:logging.clientSocketTimeout}"
			serverAddressPollIntervalMs="${sys:logging.serverAddressPollInterval}"
			serverHeartbeatSendIntervalMs="${sys:logging.serverHeartbeatSendIntervalMs}"
			statisticsPrintIntervalMs="${sys:logging.statisticsPrintIntervalMs}">
			<PatternLayout pattern="[%d{MM-dd HH:mm:ss}] %-5p %c{1} [%t]: %m%n" />
</Log4J2CloudhubLogAppender>

Below is the full log4j2.xml, which enables custom logging on CloudHub along with the HTTP appender for Elastic Cloud.

<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="INFO" name="cloudhub"
	packages="com.mulesoft.ch.logging.appender,org.apache.logging.log4j">
	<Appenders>
		<Log4J2CloudhubLogAppender name="CLOUDHUB"
			addressProvider="com.mulesoft.ch.logging.DefaultAggregatorAddressProvider"
			applicationContext="com.mulesoft.ch.logging.DefaultApplicationContext"
			appendRetryIntervalMs="${sys:logging.appendRetryInterval}"
			appendMaxAttempts="${sys:logging.appendMaxAttempts}"
			batchSendIntervalMs="${sys:logging.batchSendInterval}"
			batchMaxRecords="${sys:logging.batchMaxRecords}" memBufferMaxSize="${sys:logging.memBufferMaxSize}"
			journalMaxWriteBatchSize="${sys:logging.journalMaxBatchSize}"
			journalMaxFileSize="${sys:logging.journalMaxFileSize}"
			clientMaxPacketSize="${sys:logging.clientMaxPacketSize}"
			clientConnectTimeoutMs="${sys:logging.clientConnectTimeout}"
			clientSocketTimeoutMs="${sys:logging.clientSocketTimeout}"
			serverAddressPollIntervalMs="${sys:logging.serverAddressPollInterval}"
			serverHeartbeatSendIntervalMs="${sys:logging.serverHeartbeatSendIntervalMs}"
			statisticsPrintIntervalMs="${sys:logging.statisticsPrintIntervalMs}">
			<PatternLayout pattern="[%d{MM-dd HH:mm:ss}] %-5p %c{1} [%t]: %m%n" />
		</Log4J2CloudhubLogAppender>
		<Http name="ELK-Cloud" url="https://bf1718d968f54023b4b4e75badd4fee2.us-east-1.aws.found.io:9243/applogs/_doc"> 
			<Property name="Authorization" value="Basic ZWxhc3RpYzoxREd6b0RTMWdTNTdNYlJMbjJ5S2UyU1k=" />
			<Property name="Content-Type" value="application/json" />
			<PatternLayout pattern='%msg' />
		</Http>
	</Appenders>
	<Loggers>
		<AsyncLogger name="org.mule.runtime.core.internal.processor.LoggerMessageProcessor" level="INFO" />
		<AsyncLogger name="com.mulesoft.agent" level="INFO" />
		<AsyncRoot level="INFO">
			<AppenderRef ref="CLOUDHUB" />
			<AppenderRef ref="ELK-Cloud" />
		</AsyncRoot>
		<AsyncLogger name="com.gigaspaces" level="ERROR" />
		<AsyncLogger name="com.j_spaces" level="ERROR" />
		<AsyncLogger name="com.sun.jini" level="ERROR" />
		<AsyncLogger name="net.jini" level="ERROR" />
		<AsyncLogger name="org.apache" level="WARN" />
		<AsyncLogger name="org.apache.cxf" level="WARN" />
		<AsyncLogger name="org.springframework.beans.factory" level="WARN" />
		<AsyncLogger name="org.mule" level="INFO" />
		<AsyncLogger name="com.mulesoft" level="INFO" />
		<AsyncLogger name="org.jetel" level="WARN" />
		<AsyncLogger name="Tracking" level="WARN" />
		<AsyncLogger name="org.mule.extensions.jms" level="INFO" />
		<AsyncLogger name="org.mule.service.http.impl.service.HttpMessageLogger" level="INFO" />
		<AsyncLogger name="org.mule.extension.salesforce" level="INFO" />
		<AsyncLogger name="org.mule.extension.ftp" level="INFO" />
		<AsyncLogger name="org.mule.extension.sftp" level="INFO" />
		<AsyncLogger name="com.mulesoft.extension.ftps" level="INFO" />
		<AsyncLogger name="org.mule.modules.sap" level="INFO" />
		<AsyncLogger name="com.mulesoft.extension.mq" level="INFO" />
		<AsyncLogger name="com.mulesoft.mq" level="INFO" />
		<AsyncLogger name="org.mule.extension.db" level="INFO" />
		<AsyncLogger name="httpclient.wire" level="DEBUG" />
		<AsyncLogger name="org.mule.transport.email" level="DEBUG" />
	</Loggers>
</Configuration>

This is how you can enable Elastic Cloud logging using the HTTP appender for MuleSoft applications.
