This tutorial explains how to integrate the IBM API Connect Developer Portal with Splunk using Logstash, a lightweight data collection and shipping pipeline. The solution can be enhanced to include multiple data inputs and outputs for local and remote locations; however, this document only demonstrates shipping events from the Developer Portal to one endpoint. Multiple inputs and endpoints are configured in exactly the same way.
Overview of the required steps:
• Install Logstash on Developer Portal Server
• Configure a Splunk UDP syslog input collector
• Configure Logstash input and output
• Run Logstash and ingest Dev Portal events into Splunk
Summary of components needed to execute this solution:
• API Connect Developer Portal (running Debian Linux) with Java 8 installed.
• Logstash 5.5 (This is the data processing pipeline that allows you to pull and push data to/from a wide variety of sources)
• A running Splunk UDP Collector that is configured to connect to Splunk on a specific port, accessible from the IBM APIC Developer Portal Console.
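For reference, the Splunk side of this integration is a plain UDP data input. One way to define it is an inputs.conf stanza on the Splunk server. Note that the port number 5514 below is only an example, not taken from this document; use whichever port you open for the collector:

```
# Example Splunk inputs.conf stanza (the port 5514 is an assumption;
# match it to the port that the Logstash udp output sends to).
[udp://5514]
sourcetype = syslog
connection_host = ip
```

The same input can also be created from the Splunk UI under Settings > Data inputs > UDP.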
This document is based on a solution using a local deployment of API Connect version 5.0.7.2 and a local deployment of Splunk.
1) Installing Logstash
Execute the following commands to install Logstash on the Developer Portal:
    $ wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
    $ sudo apt-get install apt-transport-https
    $ echo "deb https://artifacts.elastic.co/packages/5.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-5.x.list
    $ sudo apt-get update && sudo apt-get install logstash

Note: When running Logstash, you may get an error involving the ‘path.settings’ in logstash.yml, which looks like this:

“WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults. Could not find log4j2 configuration at path //usr/share/logstash/config/ Using default config which logs to console”
The workaround for this is as follows:
     $ sudo mkdir /usr/share/logstash/config
     $ cd /etc/logstash
     $ sudo cp jvm.options logstash.yml startup.options /usr/share/logstash/config


2) Logstash Configuration
Create a new Logstash configuration file called ‘portal_logging.conf’ using:
$ cd /etc/logstash/conf.d/
$ nano portal_logging.conf


Enter the following contents:

# Start of Logstash code
input {
  file {
    type => "syslog"
    path => [ "/var/log/syslog" ]
  }
}

output {
  stdout { codec => rubydebug }
  udp {
    host => "<SPLUNK_HOST>"
    port => <SPLUNK_PORT>
  }
}
# End of Logstash code

Replace <SPLUNK_HOST> and <SPLUNK_PORT> with the address and port of your Splunk UDP collector.
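As noted in the introduction, additional endpoints are configured in exactly the same way. As a sketch, a second udp block can simply be added alongside the first inside the output section (the hostnames and ports here are hypothetical):

```
output {
  stdout { codec => rubydebug }
  # Primary Splunk collector (hypothetical address)
  udp {
    host => "splunk-a.example.com"
    port => 5514
  }
  # A second endpoint, configured identically (hypothetical address)
  udp {
    host => "splunk-b.example.com"
    port => 5515
  }
}
```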

This Logstash code implements the following:
input: locates, reads and tracks the data in the file at the path ‘/var/log/syslog’.
output: prints each event from the input to standard output (i.e. in the command line), and also transmits the syslog data over UDP to the Splunk server at the configured host and port.

Once finished, save the file and run it with the command below.

3) Running the Logstash operation

This command runs the logstash config file ‘portal_logging.conf’.
      $ /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/portal_logging.conf --path.settings /usr/share/logstash/config/

The output should show each syslog event pretty-printed in rubydebug format, with fields such as message, @timestamp, host and path.

If we navigate to our Splunk UI, we should also see the normalised events from the Developer Portal.
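If events do not appear in Splunk, it can help to first confirm that Logstash is actually emitting UDP datagrams. A minimal Python sketch of a throwaway listener is shown below; run it on the host and port configured in the udp output block. The port 5514 is an assumption, not a value from this document:

```python
import socket

def recv_one_datagram(host="127.0.0.1", port=5514, timeout=5.0):
    """Bind a UDP socket and return the first datagram as text.

    Debugging aid: point the Logstash udp output at this host/port
    to confirm events are being emitted before involving Splunk.
    The default port 5514 is only an example.
    """
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(timeout)
        sock.bind((host, port))
        data, _addr = sock.recvfrom(65535)
        return data.decode("utf-8", errors="replace")

if __name__ == "__main__":
    print(recv_one_datagram())
```

If this prints a syslog line while Logstash is running, the pipeline itself is healthy and the problem lies in the Splunk input configuration or the network path.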
