You can monitor your IBM App Connect logs through the IBM Log Analysis with LogDNA service, a centralized logging platform that aggregates message logs from applications and services in IBM Cloud and lets you monitor the log events from these different sources through a single view.

IBM Log Analysis with LogDNA offers log management capabilities that enable you to:

  • Configure multiple log sources that you want to collect data from, and create custom dashboards to view these logs.
  • Monitor logs in real time to troubleshoot issues and obtain insights into application and cloud environment data.
  • Use natural language queries to search and filter through your application and service logs to quickly locate specific issues.
  • Send alert notifications for significant or critical application and service events to selected channels.
  • Export a set of log entries to a local file for reference or analysis, or archive log entries at selected intervals to IBM Cloud Object Storage for long-term storage.

For more information about IBM Log Analysis with LogDNA, see https://www.ibm.com/cloud/log-analysis.

Configuring IBM Log Analysis with LogDNA to receive logs from App Connect

To monitor and manage App Connect logs in IBM Log Analysis with LogDNA, you need an instance of the IBM Log Analysis with LogDNA service that is provisioned under the same account and in the same region as your App Connect instance in IBM Cloud. You must then configure this IBM Log Analysis with LogDNA instance to receive platform services logs. When App Connect emits messages about successful and failed trigger events, actions, and toolbox operations in your running flows, this information will be automatically collected and sent to the IBM Log Analysis with LogDNA instance. Messages that are issued by any IBM App Connect Enterprise integration servers that are running in the App Connect instance are also forwarded to the same IBM Log Analysis with LogDNA instance.

To configure your App Connect logs to display in IBM Log Analysis with LogDNA:

  1. Create flows in your App Connect instance. To add custom messages to your logs, you can optionally insert and configure log nodes in these flows. If required, also import BAR files for any integration servers that you want to run.
  2. If you haven’t already provisioned an IBM Log Analysis with LogDNA instance in the same IBM Cloud account and region as App Connect, create an instance with the required retention period for your logs. For more information, see Provisioning an instance in the IBM Log Analysis with LogDNA documentation.
  3. From the Observability dashboard in IBM Cloud, configure the IBM Log Analysis with LogDNA instance to receive service logs from App Connect (and any other provisioned services in your IBM Cloud account). This instance will be assigned a Platform services logs flag. For more information, see Configuring IBM Cloud service logs.
    • Example

      Steps to configure platform services logs from the Observability dashboard in IBM Cloud

      After you save, notice that the instance configured for service logs is assigned a Platform services logs flag.

      Configured IBM Log Analysis with LogDNA instance with the Platform services logs flag

    Note:

    • If you’ve already configured an IBM Log Analysis with LogDNA instance to receive logs for another service in the same account and region as App Connect, you can skip this step because only one IBM Log Analysis with LogDNA instance in a region can be configured to receive logs from enabled services in that IBM Cloud location.
    • App Connect logs will automatically be collected and forwarded to your IBM Log Analysis with LogDNA instance, so you do not need to manually configure a LogDNA agent to do so.
  4. Start your flows and integration servers in App Connect.

    You can now monitor the log entries in the IBM Log Analysis with LogDNA instance that’s configured for service logs. If you enable debug logging on a flow, debug information about the execution of the flow, including payload data, will also be forwarded to the IBM Log Analysis with LogDNA instance.

    Tip: You can also view App Connect logs by using the built-in log viewer, as described in Viewing App Connect logs in the log viewer.

Viewing and managing logs in your IBM Log Analysis with LogDNA instance

To view and manage your App Connect log entries, you must directly access the IBM Log Analysis with LogDNA web UI in IBM Cloud.

  1. Log in to IBM Cloud.
  2. Open the IBM Cloud navigation menu IBM Cloud menu icon and click Observability.
  3. Click Logging in the left pane to view the list of IBM Log Analysis with LogDNA instances that are available to you.
  4. Locate the instance with the Platform services logs flag and then click View LogDNA. (You might be prompted to log in again with your IBMid.)

    Accessing the LogDNA web UI

    In the IBM Log Analysis with LogDNA web UI, the log entries for App Connect (and any other configured services) are displayed in a predefined format with fields that show:

    • The timestamp of the log event (displayed in local time and ascending timestamp order by default).
    • A tag identifying the source of the log event. For App Connect, this value is appconnect.
    • A reference identifying the logging bucket (including the region) for the source application.
      For your App Connect instance, this should be a value starting with crn:v1:bluemix:public:appconnect:; for example, crn:v1:bluemix:public:appconnect:us-south:a/2abfae70f450c431f09a3325e8f7a07c:::.
    • The log level of the message; typically INFO, ERROR, or DEBUG.
    • A message describing the log event.

      If you hover to the left of any timestamp, you’ll see an arrow with a toggle action that lets you show or hide additional metadata about that log entry. For App Connect, this includes metadata such as:

      • An action that can help you resolve an error (if applicable)
      • Additional detail related to the message
      • An auto-generated flowId (which changes whenever the flow is restarted)
      • The instanceId that’s assigned to your App Connect service instance
      • The transactionId of the transaction (or HTTP call) that’s associated with the log event

      Within the metadata section, you might also see the following links, which are dependent on your plan:

      • View in context: Lets you view this log entry in the context of other entries from the same source, app, or both
      • Copy to clipboard: Lets you copy the metadata
      • Share this line: Generates a link to this entry for sharing

    App Connect user logs in the IBM Log Analysis with LogDNA web UI

    The log entries are streamed in real time. If you’d like to switch off streaming so that you can examine a particular log entry and expand its metadata, you can click the LIVE link to pause scrolling. Click the link again when you want to resume streaming.

    LIVE link to toggle scrolling of LogDNA entries

    The default EVERYTHING view shows all the log entries in your IBM Log Analysis with LogDNA instance, but you can customize this view to show different results by applying search and filter queries, and then save those queries in custom views, as described later. You can access all your views by clicking the Views icon LogDNA Views icon; the EVERYTHING view and all saved custom views are listed in the Views pane and can be selected from there.

    Some common features that you can use to monitor and manipulate your log data are described in the sections that follow. IBM Log Analysis with LogDNA offers different pricing plans, and the plan that you choose for your instance will determine which of these features are available to you. Your plan also determines the retention period of your logs and the number of users who can manage the data. For example, the Lite plan offers event streaming that lets you view the data as it is logged, but does not retain the data for searches, and it also excludes features such as Alerts, Archiving, Graphing, and Usage. For further details, see Pricing plans in the IBM Cloud documentation.

Changing the look of your display and format of your log entries

To change the look and format of your logs:

  • Click Settings LogDNA Settings icon > USER PREFERENCES to change the following settings:
    • Customize the colors, theme, and text size used in the web UI.
    • Switch between local time and UTC, or change the format of the log event timestamps.
    • Change the order in which the default fields are displayed in each log entry, or add or remove fields (from a supplied set) to change the content displayed.
  • Add metadata information to the log lines in your current view by using custom line templates. You’ll find this function useful if you’d like to enrich the log lines with additional data.
    1. From your current view, click the Toggle Viewer Tools button LogDNA Toggle Viewer Tools button to open the viewer tools panel. This panel contains a LINE TEMPLATE field, which you can use to add extra data to your log lines.

      LogDNA viewer tools panel

    2. Complete the LINE TEMPLATE field by following these conventions (a conceptual rendering sketch follows this list):
      • Use {{line}} or $@ to reference the default content of a log line. For example, a line such as this:

        Default content of a sample log line

      • Use {{fieldname}} to reference the value of a metadata field. For example, to add the instanceId field value in the following image to the log line, reference the field as {{instanceId}}.

        Expanded metadata fields for a log entry

        If your metadata fields are enclosed within a Meta Object grouping, you must instead use {{_meta.fieldname}} to reference the value of a metadata field; for example, {{_meta.hostname}}.

        Expanded metadata fields within a Meta Object grouping

      • You can reference one or more metadata fields in the log line, and can also add static text to the log line, as a suffix. For example:
        • To append the instanceId and flowId field values to the log line, type either of these values and press Enter:
          {{line}} {{instanceId}} {{flowId}}
          $@ {{instanceId}} {{flowId}}
          • Example

            Original view with the Toggle Viewer Tools button:

            Toggle Viewer Tools button

            LINE TEMPLATE value and results:

            Example of line template value and the results

        • To append flow-id as static text, followed by the flowId field value, type this value and press Enter:
          $@ flow-id {{flowId}}
          • Example

            Original view with the Toggle Viewer Tools button:

            Toggle Viewer Tools button

            LINE TEMPLATE value and results:

            Example of line template value and the results
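
Conceptually, the LINE TEMPLATE conventions above amount to simple placeholder substitution against a log line and its metadata. The following Python sketch only illustrates that behaviour; the render_line_template helper and the sample metadata values are hypothetical, and the real rendering is done for you in the LogDNA web UI.

  import re

  def render_line_template(template, line, metadata):
      # {{line}} or $@ stand for the default content of the log line.
      rendered = template.replace("$@", line).replace("{{line}}", line)

      # {{fieldname}} and {{_meta.fieldname}} are looked up in the entry's metadata.
      def lookup(match):
          value = metadata
          for key in match.group(1).split("."):
              value = value.get(key, "") if isinstance(value, dict) else ""
          return str(value)

      return re.sub(r"\{\{([^}]+)\}\}", lookup, rendered)

  # Hypothetical values, shaped after the metadata fields described earlier.
  metadata = {"instanceId": "<instance-id>", "flowId": "<flow-id>",
              "_meta": {"hostname": "<hostname>"}}
  print(render_line_template("$@ flow-id {{flowId}}", "<default log line>", metadata))

The final line would print the default log line followed by the static text flow-id and the flow ID value, which mirrors the second example above.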

Filtering your log entries (and saving as a view)

To filter the log entries in the EVERYTHING view or in one of your saved or unsaved custom views:

  1. Click Views LogDNA Views icon to select the view if required, and then apply filter conditions for the log sources, apps, and log levels to be displayed. You can click All Sources, All Apps, or All Levels to expose a drop-down filter panel, select the required check boxes, and then click away from the panel to apply that filter. The check boxes that you see will depend on which sources are configured to forward logs to IBM Log Analysis with LogDNA.

    For example, to filter the EVERYTHING view to show only those entries originating from App Connect, and all related apps and log levels, you can make the following selections: EVERYTHING (default), All Sources > appconnect, All Apps (default), and All Levels (default). Notice that the page refreshes to show the filtered results as an Unsaved View.

    Sample filter query in the LogDNA web UI

    Or, to view only those App Connect log entries with a log level of ERROR, you could apply the following filters. (And you could further fine-tune your results if required by defining search queries, as described in a later step.) A sketch of the equivalent filter logic is shown after these steps.

    Sample filter query in the LogDNA web UI for App Connect errors

  2. Optionally choose to save your filter query as a custom view by clicking Unsaved View and then specifying a name (and a category and alert if required). Saved views are listed in the Views pane to the left of the log entries. In the following example, we’ve saved the view as VIEWappconnect.

    Sample saved view for a filter query

    For more information, see Filter logs in the IBM Cloud documentation. Also see How to Filter Logs and Create Views & Alerts in the LogDNA documentation.
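
If you later work with log data outside the web UI (for example, entries that you have loaded from an export), the same source and level filtering can be reproduced with ordinary list filtering. The sketch below is an illustration only: the "source" and "level" key names are assumptions, and in the web UI LogDNA applies these filters for you.

  def filter_entries(entries, source="appconnect", level="ERROR"):
      # Keep only the entries that match the selected source tag and log level.
      # The "source" and "level" keys are hypothetical; adjust them to the key
      # names that your parsed or exported entries actually use.
      return [
          entry for entry in entries
          if entry.get("source") == source
          and str(entry.get("level", "")).upper() == level
      ]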

Defining and managing alerts

You can define and manage alerts to send notifications about your logged events (based on count and frequency) to a target channel. You can create a predefined alert or preset, which can be attached to any number of custom views, or you can create an alert that is specific to a view. When you create an alert, you can configure multiple conditions and notification channels.

  • To create an alert preset:
    1. Click Settings LogDNA Settings icon > ALERTS. (From here, you can add presets, and edit or delete view-specific alerts.)
    2. Click Add Preset and specify a unique name for the preset.
    3. Configure the alert by selecting a notification channel for the alert (for example, Email or Slack). Then define presence or absence alerting, choose a target to send the alert to (for example, email recipients), and select the timezone.
      • Presence alert definition: When x or more events are logged within y seconds, minutes, or hours, trigger an alert at the end of the specified time period, or as soon as x events are logged, or both. For example, if your current view is filtered to show App Connect log entries with an ERROR log level, you might want to trigger an alert when 30 (or more) events are logged within a 30-second time period.
      • Absence alert definition: When fewer than x events are logged within y seconds, minutes, or hours, trigger an alert at the end of the specified time period. For example, if your current view shows the log entries for a “data copy” flow, you might want to trigger an alert if fewer than 5 events are logged within a 1-minute time period. (A conceptual sketch of this windowed counting appears at the end of this section.)
    4. Use the + tab to configure other alert conditions for the same or for different notification channels.
    5. Save the alert preset.
  • To create an alert that is specific to a view:
    1. Either create the alert while saving an unsaved view, or create the alert directly within a saved view.
      • Example: Creating a view-specific alert while saving a view

        Sequence for creating a view-specific alert while saving a view

      • Example: Creating a view-specific alert in a saved view

        Sequence for creating a view-specific alert in a saved view

    2. Configure the alert, as described in the preceding steps for an alert preset.
    3. Save the alert to your view.
  • To attach an alert preset to a saved or unsaved custom view that you are currently viewing:
    1. Expand the drop-down menu for the view and click Attach an alert (for a saved view) or Save as new view / alert (for an unsaved view).
    2. For a saved view, select the required preset from the Alert drop-down list. For an unsaved view, click Attach an alert and select the required preset from the Alert drop-down list.
    3. Save the view.

A view that contains an attached alert is annotated with a bell icon in the Views pane. (For additional details about alerts, see Working with alerts in the IBM Cloud documentation and Create Views & Alerts in the LogDNA documentation.)

Bell icon depicting that a custom view has attached alerts
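
The presence and absence definitions above both reduce to counting matching events inside a rolling time window. The following Python sketch is a minimal illustration of that idea under the thresholds used in the earlier examples; it is not how LogDNA itself evaluates alerts.

  from collections import deque

  class WindowedAlert:
      # Minimal sketch: count events inside a rolling time window.
      # presence: trigger when `threshold` or more events arrive within the window.
      # absence:  trigger when fewer than `threshold` events arrive within the window.

      def __init__(self, threshold, window_seconds):
          self.threshold = threshold
          self.window_seconds = window_seconds
          self.timestamps = deque()

      def record_event(self, now):
          self.timestamps.append(now)
          self._trim(now)

      def _trim(self, now):
          # Drop events that fall outside the rolling window.
          while self.timestamps and now - self.timestamps[0] > self.window_seconds:
              self.timestamps.popleft()

      def presence_triggered(self, now):
          self._trim(now)
          return len(self.timestamps) >= self.threshold

      def absence_triggered(self, now):
          self._trim(now)
          return len(self.timestamps) < self.threshold

  # Example thresholds from the definitions above:
  errors = WindowedAlert(threshold=30, window_seconds=30)    # presence: 30+ events in 30 seconds
  data_copy = WindowedAlert(threshold=5, window_seconds=60)  # absence: fewer than 5 events in 1 minute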

Exporting or archiving your log data

You can export the log entries in your current view to a local file. You can also archive logs from your IBM Log Analysis with LogDNA instance to a bucket in an IBM Cloud Object Storage (COS) instance.

  • To export the log entries in your current view:
    1. Ensure that the required search and filter queries are applied. Then click to expand the drop-down menu for the saved or unsaved view, and click Export lines.

      Selecting the option to export log entries in the current LogDNA view

    2. Specify a time range for the data to be exported, and request the export. The data results are exported as a compressed JSONL file, and an email with a link to this file is sent to the email address of your IBMid account. (A sketch for reading the exported file is shown after this list.)
    3. To download the file, click the link in the email. (Be sure to download the file before the link expires, as indicated in the email.)

      For more information, see Exporting logs to local file in the IBM Cloud documentation.

  • To archive your logs, see Archiving logs to IBM Cloud Object Storage in the IBM Cloud documentation.
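
Because the export arrives as a compressed JSONL file (one JSON object per line), you can inspect it with standard-library Python after downloading it. In the sketch below, the file name is a placeholder, and the assumption that the download is gzip-compressed is just that, an assumption; if your file is plain JSONL, use open() instead of gzip.open(). The keys in each object depend on your entries.

  import gzip
  import json

  export_path = "exported-logs.jsonl.gz"   # placeholder: the file you downloaded

  entries = []
  with gzip.open(export_path, mode="rt", encoding="utf-8") as export_file:
      for raw_line in export_file:
          raw_line = raw_line.strip()
          if raw_line:                     # skip blank lines, if any
              entries.append(json.loads(raw_line))

  print(f"Loaded {len(entries)} log entries")
  if entries:
      # Inspect the first entry to see which keys your export actually contains.
      print(sorted(entries[0].keys()))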

Monitoring and managing your data usage

To monitor and manage the data usage for your configured services and applications, click Settings LogDNA Settings icon > USAGE.

Creating graphical representations of your log data

You can create visualizations that let you monitor and report on your log data in chart form. To do so, click Boards LogDNA Boards icon to add a board, and then create one or more related graphs within the board.

For each graph, you can choose to report on all your log entries (with an optional filter), or report on a single field (and an optional extra) from a predefined set of fields. The first graph in the board initially charts a day’s worth of data, counting back from the current time. You can change this time range, but be aware that it’s shared by all graphs in the board. You can hover over the data points on the graph to see information for that point in time, and you can click a data point and then click Show Logs to skip to that point in the logs. You can also click and drag over an area to zoom in for a closer analysis of that interval. For finer tuning, you can apply a filter to a specific graph, or collectively to all graphs in the board.

Sequence of actions for adding a board and graph to LogDNA

You can also add a breakdown to analyze the data distribution of a selected field’s values. In the following example, the breakdown is defined as a pie chart that shows the distribution of log entries per log level (info, error, and debug).

Sequence of actions for defining a breakdown
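
You can reproduce the same per-level distribution offline from exported entries with a simple count, as in this sketch. The "level" key name is a hypothetical placeholder for whichever field holds the log level in your data.

  from collections import Counter

  def level_breakdown(entries):
      # Count entries per log level, e.g. Counter({'info': ..., 'error': ..., 'debug': ...}).
      return Counter(str(entry.get("level", "unknown")).lower() for entry in entries)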

You can also plot a single query on a graph.

For more information, see Graphing Logs in the LogDNA documentation.
