
Custom Connector for enabling logging from Mule ESB

Use the Nodinite Custom Connector instead of a Logger and/or Business Events shape to enable logging from Mule ESB, on-premise and/or in the cloud!

Supported versions:

  • 3.9
  • 3.8
  • 3.7
  • 3.6

For Mule runtime 4 and later, you will need to implement the custom connector on your own.

The Nodinite Custom connector is intended to be used as a template to help get you started quickly.

```mermaid
graph LR
  subgraph "Mule ESB"
    subgraph "Flow 1"
      roFlow1(fal:fa-sitemap INT001: Invoices)
    end
    subgraph "Flow 2"
      roFlow2(fal:fa-sitemap INT002: Exchange rates)
    end
    subgraph "Logging sub flow"
      roFlow1 --> roLogSink
      roFlow2 --> roLogSink
      roLogSink("fal:fa-bolt Nodinite Custom Connector") --> roId1["fal:fa-list Queue<br>fal:fa-folder Folder<br>..."]
    end
  end
  subgraph "Nodinite Server"
    roLogAPI(fal:fa-cloud-download LogAPI)
    roPS(fal:fa-truck-pickup Pickup Service) --> roLogAPI
    roId1 -. Async .-x roPS
  end
```

Example illustration of using the Nodinite logging custom connector with Mule ESB flows


To enable logging from your Mule ESB flows, simply add the Nodinite Custom Connector (prefixed 'Log ...' in the examples). Use the Nodinite Custom Connector instead of the Logger or Business Events shapes.

| Feature | Pros | Cons | Comment |
|---|---|---|---|
| Logger | Built-in | - Limited options and hardcodes your solution<br>- Hard to replace<br>- Requires tons of code/configuration<br>- Hard to get equal quality of logging with many people over time<br>- Log files (on-premise) waste disk space and must be deleted or consumed somehow<br>- Long delays before data is visible using the APIs in the cloud (we have seen examples > 12 hours)<br>- Configuration must be changed when moving from on-premise to the cloud | Requires a configured log appender |
| Business Events | Built-in | All cons as above, and in addition:<br>- Requires Enterprise edition<br>- Limited to provided key/values (no payload) | Expensive and very limited options |
| Nodinite Custom Connector | - Body always gets logged<br>- Consistent way to provide key/values (Context properties)<br>- Easy to replace with another logging solution when that day comes<br>- Faster time for logged events to be visible for the business<br>- All properties and logic for logging reside within the custom connector and the one sub flow for logging<br>- Consistent quality of logging is easier to achieve<br>- Easy and equal use on-premise and in the cloud; no changes to code within your Mule ESB flows when migrating to the cloud or the other way | Requires adding the Log shape (custom connector) to all your flows (to enable logging there is no choice) | Simply download our template and start using it instead of the Logger and Business Events shapes |

How do I get started with logging from Mule ESB using the Nodinite custom connector?

First, make sure you have read the Asynchronous Logging user guide and have the Pickup Service installed.

Step 1: Download the Nodinite Custom Connector

From the Nodinite portal, first download the ZIP file with the Nodinite custom connector.

Step 2: Add Custom Connector to Anypoint Studio

Simply add the ZIP file to your Anypoint studio. The end result should look something like the following image:

Anypoint Studio 6: example of the Nodinite Custom Connector when added to Anypoint Studio

Anypoint Studio 7: import the JAR package using the Import command

Step 3: Add Custom Connector to Flow

From within your Mule ESB flows, add the Nodinite logging custom connector where applicable (before and after transformations, in exception handling, in conditional flows, in sub flows, ...).

Example flow

Step 4: Code for Target destination

Logging is implemented as a generic reusable sub flow.
Sub flow logging
Example of logging sub flow

The logging sub flow creates the JSON Log Event and posts it to some destination like one of the following:

| Destination | Pros | Cons | Comment |
|---|---|---|---|
| Log API | Available with Nodinite | - Not highly available<br>- Connectivity must exist (what happens when you move to the cloud?)<br>- Nodinite and the run-time platform get updated from time to time and are then unavailable<br>- Synchronous and therefore not recommended<br>- Does not scale as well as queues and databases | Easy to get started with during POCs and initial testing. Avoid in production environments |
| ActiveMQ | - Free (Open Source)<br>- Scales well<br>- Supports fail-over clustering | - Requires additional coding and configuration | Recommended |
| AnypointMQ | - Scales well<br>- Highly available | - Requires an Enterprise subscription and is more costly | Currently not supported by the Pickup Service; contact our support if this is your required option |
| PostgreSQL | - Scales well | - Requires additional coding and configuration | |

You must decide which destination to use. With this approach, changing the destination later should only require a change in exactly one location: the logging sub flow.
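As a sketch of keeping the destination choice in one place, the posting step can be isolated behind a single function. This is an illustration only, not part of the connector: the host is hypothetical, and the path mirrors the `logEvent/logEvent` HTTP request used by the example flow at the end of this page.

```python
import json
import urllib.request

# Hypothetical Log API endpoint; the path mirrors the example flow's
# HTTP request shape, the host is an assumption for this sketch.
LOG_API_URL = "http://demo.nodinite.com/LogAPI/api/logEvent/logEvent"


def build_log_event_request(event: dict) -> urllib.request.Request:
    """Serialize a JSON Log Event and prepare an HTTP POST for the Log API.

    Swapping the destination (e.g. to an ActiveMQ queue) should only
    require changing this one function, mirroring the advice above.
    """
    body = json.dumps(event).encode("utf-8")
    return urllib.request.Request(
        LOG_API_URL,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

The actual send (`urllib.request.urlopen(req)`) is left out so the sketch stays side-effect free; in a real flow the destination-specific transport code would live here and nowhere else.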

Step 5: Configure Pickup Service

The Pickup Service fetches your JSON Log Events from the destination you coded for in the previous step.

Step 6: Verify and fine-tune

Your last step now is to verify and fine-tune logging according to your business needs.

Tuning and best practices

Make sure to start easy and implement the use of the Custom Connector with a few Mule ESB flows before adding it everywhere.

The following event fields are mandatory; the rest of the fields are optional (set the value to null, or do not provide the field at all). By providing additional details about the Log Event, you give end-users a better experience with Nodinite.

| Data Type | Field | Example Value | Comment |
|---|---|---|---|
| string | LogAgentValueId | 42 | Who (which Log Agent) sent the data |
| string | EndPointName | "INT101: Receive Hello World Log Events" | Name of Endpoint transport |
| string | EndPointUri | "C:\DropArea\in" | URI for Endpoint transport |
| number | EndPointDirection | 0 | Direction for Endpoint transport |
| number | EndPointTypeId | 60 | Type of Endpoint transport |
| string | OriginalMessageTypeName | "https://nodinite.com/Customers/1.0#Batch" | Message Type Name |
| string | LogDateTime | "2018-05-03T13:37:00.123Z" | Client log datetime (UTC format) |

Review the Json Log Event user guide for full details.

Exception handling

Make sure to add the custom connector to your exception handling and set the Log Status Code according to your business case. Log Status Codes are very flexible and give you the option to present user-friendly texts to the business according to your own logic.
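A sketch of such business logic, following the convention used in the example flow at the end of this page (0 for success, -1 for an error; the function and the mapping are illustrative assumptions, not Nodinite definitions):

```python
from typing import Optional, Tuple


def log_status_for_exception(exc: Optional[Exception]) -> Tuple[int, str]:
    """Map a flow outcome to a Log Status Code and a user-friendly Log Text.

    0 = success and -1 = error, as in the example flow's Groovy scripts;
    your own business logic may define additional codes and texts.
    """
    if exc is None:
        return 0, "Processing completed"
    if isinstance(exc, ValueError):
        return -1, "Message content is invalid - please correct and resubmit"
    return -1, "Unexpected error - contact the integration team"
```

The returned text is what the business sees in Log Views, so phrase it for end-users rather than developers.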

Correlation

Review the Mule ESB Correlation user guide.

Message Types

Providing the Message Type is mandatory.

Many of the Nodinite logging features depend on well-known Message Types. For example, when extracting values for Search Fields, Nodinite uses Search Field Expressions, and these are bound to named Message Types.

Make sure to provide unique names for Message Types to get the best logging experience with Nodinite.

Context Options

Nodinite provides a plethora of logging options. Review the Context Options user guide for more information.

Repair and resubmit

Using the Context Options you can add properties to the logged JSON Log Events that Nodinite can use when repairing and resubmitting messages. Make sure to build your Mule ESB flows with a way to receive and deal with resubmitted messages.
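As a sketch, the context properties logged in the example flow's "After processing" step (receivedFileName, outFullName, outDirectory, outFileName) are exactly the kind of values a resubmit flow can use to locate and reprocess a message. The helper below is hypothetical; only the key names are taken from the sample flow:

```python
def resubmit_context(order_id: str, out_dir: str) -> dict:
    """Build context properties mirroring the sample flow's output step.

    The key names come from the example flow at the end of this page;
    use whatever names your own resubmit endpoint understands.
    """
    file_name = f"{order_id}.json"
    return {
        "orderId": order_id,
        "outDirectory": out_dir,
        "outFileName": file_name,
        # Full path the resubmit flow can read the message back from
        "outFullName": out_dir.rstrip("\\") + "\\" + file_name,
    }
```

Logging these values at write time means the repair/resubmit step never has to guess where a message ended up.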

Log Views

Make sure to involve your business and create role-based self-service Log Views. Doing so results in fewer incidents and a business that gains a better understanding of system integration solutions.

Next Step

How to Add or manage Search Fields
How to Add or manage Log Views


Example flow configuration using the downloadable Nodinite custom connector

<?xml version="1.0" encoding="UTF-8"?>

<mule xmlns:nodinite="http://www.mulesoft.org/schema/mule/nodinite" xmlns:http="http://www.mulesoft.org/schema/mule/http" xmlns:scripting="http://www.mulesoft.org/schema/mule/scripting" xmlns:file="http://www.mulesoft.org/schema/mule/file" xmlns:json="http://www.mulesoft.org/schema/mule/json" xmlns:batch="http://www.mulesoft.org/schema/mule/batch" xmlns:tracking="http://www.mulesoft.org/schema/mule/ee/tracking" xmlns="http://www.mulesoft.org/schema/mule/core" xmlns:doc="http://www.mulesoft.org/schema/mule/documentation"
	xmlns:spring="http://www.springframework.org/schema/beans" 
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-current.xsd
http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd
http://www.mulesoft.org/schema/mule/file http://www.mulesoft.org/schema/mule/file/current/mule-file.xsd
http://www.mulesoft.org/schema/mule/ee/tracking http://www.mulesoft.org/schema/mule/ee/tracking/current/mule-tracking-ee.xsd
http://www.mulesoft.org/schema/mule/json http://www.mulesoft.org/schema/mule/json/current/mule-json.xsd
http://www.mulesoft.org/schema/mule/batch http://www.mulesoft.org/schema/mule/batch/current/mule-batch.xsd
http://www.mulesoft.org/schema/mule/scripting http://www.mulesoft.org/schema/mule/scripting/current/mule-scripting.xsd
http://www.mulesoft.org/schema/mule/http http://www.mulesoft.org/schema/mule/http/current/mule-http.xsd
http://www.mulesoft.org/schema/mule/nodinite http://www.mulesoft.org/schema/mule/nodinite/current/mule-nodinite.xsd">
    <spring:beans>
        <spring:bean class="int001.filemove.transform.NodiniteLoggerService" name="CreateLogEventBean"/>
    </spring:beans>
    <http:request-config name="Nodinite_HTTP_Request_Configuration" host="demo.nodinite.com" port="80" basePath="/LogAPI/api/" doc:name="HTTP Request Configuration"/>
    <nodinite:config name="Nodinite__Configuration"  doc:name="Nodinite: Configuration"/>
    <flow name="int001.filemove.transformFlow">
        <file:inbound-endpoint path="C:\Temp\mule\int001.filemove.transform\in\" responseTimeout="10000" doc:name="File"/>
        <byte-array-to-string-transformer doc:name="Byte Array to String"/>
        <scripting:component doc:name="Groovy - set up logging variables">
            <scripting:script engine="Groovy"><![CDATA[sessionVars['correlationId'] = message.id;

message.correlationId = sessionVars['correlationId'];

sessionVars['nodiniteMessageTypeName'] = "MuleOrderBatch#1.0";
sessionVars['nodiniteLogText'] = "File Received";
sessionVars['nodiniteLogStatus'] = 0;
sessionVars['nodiniteEndPointDirection'] = 0;

sessionVars['nodiniteContextValues'] = [receivedFileName: flowVars['originalFilename']];

return payload;]]></scripting:script>
        </scripting:component>
        <logger message="Logging Correlation Id: #[sessionVars.correlationId]" level="INFO" doc:name="Logger"/>
        <flow-ref name="int001.filemove.nodiniteloggingflow" doc:name="Log"/>
        <choice doc:name="Choice">
            <when expression="flowVars.originalFilename.endsWith('.json')">
                <scripting:component doc:name="Update logging vars before looping">
                    <scripting:script engine="Groovy"><![CDATA[message.correlationId = sessionVars['correlationId'];

sessionVars['nodiniteMessageTypeName'] = "MuleOrderBatch#1.0";
sessionVars['nodiniteLogText'] = "Starting to debatch file";
sessionVars['nodiniteLogStatus'] = 0;
sessionVars['nodiniteEndPointDirection'] = 0;

sessionVars['nodiniteContextValues'] = [receivedFileName: flowVars['originalFilename']];

return payload;]]></scripting:script>
                </scripting:component>
                <flow-ref name="int001.filemove.nodiniteloggingflow" doc:name="Log Before Processing"/>
                <json:object-to-json-transformer mimeType="application/json" doc:name="Object to JSON"/>
                <json:json-to-object-transformer returnClass="java.util.List" doc:name="JSON to Object"/>
                <foreach collection="#[payload]" doc:name="For Each">
                    <flow-ref name="int0int001.filemove.processSingleOrder" doc:name="processOrder"/>
                </foreach>
            </when>
            <otherwise>
                <scripting:component doc:name="Update logging variables">
                    <scripting:script engine="Groovy"><![CDATA[sessionVars['nodiniteMessageTypeName'] = "Unknown";
sessionVars['nodiniteLogText'] = "Message extension is wrong - should be JSON.";
sessionVars['nodiniteLogStatus'] = -1;
sessionVars['nodiniteEndPointDirection'] = 0;

sessionVars['nodiniteContextValues'] = [receivedFileName: flowVars['originalFilename']];

return payload;]]></scripting:script>
                </scripting:component>
                <flow-ref name="int001.filemove.nodiniteloggingflow" doc:name="Log error message"/>
                <file:outbound-endpoint path="C:\Temp\mule\int001.filemove.transform\invalid\" outputPattern="#[flowVars.originalFilename]" responseTimeout="10000" doc:name="MoveToInvalidFolder"/>
            </otherwise>
        </choice>
    </flow>
    <sub-flow name="int0int001.filemove.processSingleOrder">
        <set-variable variableName="currentOrderId" value="#[payload.OrderId]" doc:name="Variable"/>
        <json:object-to-json-transformer doc:name="Object to JSON"/>
        <scripting:component doc:name="Before processing">
            <scripting:script engine="Groovy"><![CDATA[message.correlationId = sessionVars['correlationId'];

sessionVars['nodiniteMessageTypeName'] = "MuleOrder#1.0";
sessionVars['nodiniteLogText'] = "Starting to process order";
sessionVars['nodiniteLogStatus'] = 0;
sessionVars['nodiniteEndPointDirection'] = 0;

sessionVars['nodiniteContextValues'] = [receivedFileName: flowVars['originalFilename'], orderId: flowVars.currentOrderId.toString(), isGDPRData: "true"];

return payload;]]></scripting:script>
        </scripting:component>
        <flow-ref name="int001.filemove.nodiniteloggingflow" doc:name="Log single order"/>
        <file:outbound-endpoint path="C:\Temp\mule\int001.filemove.transform\out" outputPattern="#[flowVars.currentOrderId].json" responseTimeout="10000" doc:name="File"/>
        <scripting:component doc:name="After processing">
            <scripting:script engine="Groovy"><![CDATA[message.correlationId = sessionVars['correlationId'];

sessionVars['nodiniteMessageTypeName'] = "MuleOrder#1.0";
sessionVars['nodiniteLogText'] = "Done processing order";
sessionVars['nodiniteLogStatus'] = 0;
sessionVars['nodiniteEndPointDirection'] = 1;

sessionVars['nodiniteContextValues'] = [receivedFileName: flowVars['originalFilename'], orderId: flowVars['currentOrderId'].toString(), outFullName: "C:\\Temp\\mule\\int001.filemove.transform\\out\\" + flowVars['currentOrderId'].toString() + ".json", outDirectory: "C:\\Temp\\mule\\int001.filemove.transform\\out\\", outFileName: flowVars['currentOrderId'].toString() + ".json"];

return payload;]]></scripting:script>
        </scripting:component>
        <flow-ref name="int001.filemove.nodiniteloggingflow" doc:name="Log done single order"/>
    </sub-flow>
    <sub-flow name="int001.filemove.nodiniteloggingflow">
        <logger message="#[sessionVars.nodiniteLogText]" level="INFO" doc:name="Logger"/>
        <set-variable variableName="payloadBeforeNodiniteLogEvent" value="#[payload]" doc:name="Variable"/>
        <nodinite:create-log-event config-ref="Nodinite__Configuration" endPointName="int001.filemove.transformFlow" endpointUri="mule.developer" endPointDirection="#[sessionVars.nodiniteEndPointDirection]" originalMessageTypeName="#[sessionVars.nodiniteMessageTypeName]" logStatus="#[sessionVars.nodiniteLogStatus]" logText="#[sessionVars.nodiniteLogText]" payload="#[payload]" doc:name="Nodinite" processMachineName="roma.dev" processModuleName="int001.filemove.nodiniteloggingflow" processModuleType="Mule Flow" processName="int001.filemove.transformFlow">
            <nodinite:context-properties ref="#[sessionVars.nodiniteContextValues]"/>
        </nodinite:create-log-event>
        <http:request config-ref="Nodinite_HTTP_Request_Configuration" path="logEvent/logEvent" method="POST" doc:name="HTTP">
            <http:request-builder>
                <http:header headerName="Content-Type" value="application/json"/>
            </http:request-builder>
        </http:request>
        <set-payload value="#[flowVars.payloadBeforeNodiniteLogEvent]" doc:name="Set Payload"/>
    </sub-flow>
</mule>