IBM App Connect provides a built-in log viewer that you can use to manage and monitor messages that are emitted when your event-driven and API flows are executed. The log viewer also displays messages that are issued by your IBM App Connect Enterprise integration servers that are running in the App Connect on IBM Cloud instance.

Accessing the log viewer for all your running flows

To view and manage event messages that are logged for all your running flows:

  1. Open the App Connect menu and then click Manage > Logs to open the log viewer.

    Entries for each event that was logged for your running flows are displayed in individual rows under the following columns:

    • Event time (UTC)/Event time (local): A timestamp depicting when an event occurred in UTC (the default) or in your browser’s local time.
    • Message: A message describing the log event.
    • Transaction ID: An auto-generated ID that uniquely identifies the transaction (or HTTP call) associated with the event.
    • Flow ID: An ID that uniquely identifies the flow for which the event was logged. This ID changes whenever the flow is restarted.
    • Log level: The logging level of the message, with an assigned value of info, error, or debug. Information and error messages are displayed by default. To view debug messages, you’ll need to enable debugging, as described in Debugging your flows in IBM App Connect on IBM Cloud.

    By default, up to 500 of the most recent events that were logged in the last 24 hours are displayed on the page.

    Log entries in the log viewer

    Each log entry has an event metadata section, which you can expand to see additional details about that event. The arrow to the left of the Event time column can be used to expand or collapse this section.

    Viewing expanded metadata for a log entry

  2. To change the sort order of the log entries, click the Event time (UTC) column header. Entries are displayed in descending timestamp order by default, but you can switch to an ascending sort order.
  3. To change the default timestamp from UTC to local time, click the Change log settings icon. Then click Browser local time.
  4. To apply a predefined time filter to see only those events that were logged during a particular period, expand the time picker drop-down list and select a time period. The default is Last 24 hours, but you can specify periods that range from the last 15 minutes to the last 30 days.

    Time picker drop-down list

  5. To search for a specific string within the log entries in the current view, click the Search logs on this page icon and then enter (or copy and paste) the text string that you want to search for. As you type, the view refreshes to show any rows with a matching (case insensitive) value in the Message, Transaction ID, Flow ID, or Log level column, and within the event metadata sections. You can delete the search query from the search bar to revert to your previous view.
  6. To apply a filter query to refine your view of the log entries:
    1. Click the Build a query to filter logs icon.
    2. Add a condition by selecting a log column header, an operator, and a term or timestamp to filter by. Then use Add condition to define additional conditions. For example, if you want to view all the log messages that were emitted before a specific time and for a specific flow, you can specify conditions for the Event time and Flow ID values as shown in the following example.

      Build a query example

      Tip: If you’d like to view log entries that were generated before the 500 entries in your current view, you can set an Event time condition to get more specific results. For example, you can use the before operator to retrieve the previous set of 500 (or fewer) entries that were generated before the earliest timestamp in your current view.

      Or you can use both the before and after operators to retrieve the log entries for a specific day or within a particular time period. For example, to retrieve the log entries for 9 July 2019, you can build a query with two Event time conditions: one that uses the after operator with a timestamp at the start of 9 July 2019, and one that uses the before operator with a timestamp at the start of 10 July 2019.

      Build a query for a specific date

    3. Click Apply to view all matching log entries. Notice that the Build a query to filter logs icon is annotated to indicate that filters are applied to the current view and to show the number of filter conditions. If you specified a condition for the event time, the time picker drop-down list is disabled and displays the text “Absolute time set” to indicate that a time filter has already been applied. (Notice also that your browser URL reflects the sort order as well as the query that applies to your filtered view.)

      Build a query icon with an annotation

      The filter query is retained until you remove it from your current view. To do so, click the Build a query to filter logs icon again, click Delete all conditions and then click Apply.

  7. To refresh your view to show the latest events, click the Refresh icon. Any filter or search that is currently applied is retained when you refresh in this way (but not if you reload the browser page).

    Click Refresh to update your view of the logged events

  8. To export the data in the log entries to a file for reference or analysis, click the Export displayed logs icon.

    The data is saved in JSON format to a file named logs.json in the default download location for your browser. This data relates only to what is currently displayed in your view. (For a sketch of how you might process the exported file, see the example after these steps.)

  9. To close the log viewer, click Dashboard or any of the other tabs in the banner.
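
If you want to work with the exported data outside the log viewer, you can process logs.json with a short script. The following is a minimal sketch in Python; it assumes that logs.json contains a JSON array of entry objects and that the field names used below (logLevel, transactionId, message) mirror the log viewer columns. Inspect an entry in your own export and adjust the keys if they differ.

  import json
  from pathlib import Path

  # Minimal sketch: filter an exported logs.json down to error entries.
  # Assumption: the file is a JSON array of objects, and the key names
  # below (logLevel, transactionId, message) follow the log viewer
  # columns. Check your own export and rename the keys if necessary.

  def load_entries(path="logs.json"):
      """Load the exported log entries from the downloaded file."""
      return json.loads(Path(path).read_text(encoding="utf-8"))

  def errors_only(entries):
      """Return only the entries whose log level is 'error' (case insensitive)."""
      return [e for e in entries if str(e.get("logLevel", "")).lower() == "error"]

  if __name__ == "__main__":
      for entry in errors_only(load_entries()):
          # Print a compact one-line summary for each error entry.
          print(entry.get("transactionId", "-"), entry.get("message", ""))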

Accessing the log viewer for a failed flow

If a flow fails because of an error in the event, in an action, or in the flow logic, you can view related error messages from the flow’s tile in the App Connect dashboard, and you can also open the built-in log viewer to view these messages in a wider context.

While the flow is still running, you can click the warning icon on the flow’s tile, or click View errors in the tile’s options menu [⋮] to open the “Error messages” panel. Then, click the Actions menu for an entry and click View logs to go to the log viewer.

Clicking View logs in the Error messages panel to access the log viewer

When you access the log viewer in this way, you’ll see a filtered view of log entries that pertain to the errors for that flow only, based on the transaction ID.

Filtered view of log entries that pertain to the errors for a flow

If you’d like to see log entries for all your running flows, you can remove the Transaction ID filter query by clicking the Build a query to filter logs icon and then clicking Delete all conditions.

Accessing the log viewer for a batch process

As with flow errors, you can also open a filtered view of your batch processing logs within the log viewer. This filtered view shows log messages from a single batch process, and can be accessed as follows:

  1. From the App Connect dashboard, locate the tile for your batch processing flow. Then open the tile’s options menu [⋮] and click View batches to view a summary of completed and running batches.
  2. To view related logs in the log viewer, open the options menu [⋮] in the Actions column and click View logs.

For more information, see How to use batch processing in IBM App Connect.

Extending your logging capability with the IBM Log Analysis with LogDNA service

You can extend the logging capability of your App Connect instance by provisioning and configuring an IBM Log Analysis with LogDNA service instance to aggregate logs from your applications and services in IBM Cloud. IBM Log Analysis with LogDNA offers benefits such as centralized logging, advanced filtering and searching, alerting, graphing, and archiving.

After the service instance is configured, App Connect logs from the logging service are forwarded to your IBM Log Analysis with LogDNA instance.

For information about configuring IBM Log Analysis with LogDNA to receive logs from App Connect, see Monitoring and managing App Connect logs in IBM Log Analysis with LogDNA.
