Import your IBM Cloud Internet Services logs into LogDNA


Import your IBM Cloud Internet Services logs to your LogDNA service instance to display them in a single platform.


IBM Cloud Internet Services Enterprise-level plans offer a Logpush feature, which delivers at least one log package (a .gz file) to an IBM Cloud Object Storage bucket every five minutes. IBM Log Analysis with LogDNA can ingest logs from many sources (such as Kubernetes clusters and virtual machines) and display them in a single platform. To import the Logpush logs into LogDNA, you set up a serverless function that checks IBM Cloud Object Storage every three minutes and forwards any new logs to LogDNA. It uses an Action and a Trigger (driven by a cron schedule) to run the job automatically.


Architecture flow diagram

  1. IBM Cloud Internet Services sends log packages to the IBM Cloud Object Storage bucket.
  2. The script downloads the log package from the bucket and reads it.
  3. The script sends the logs to LogDNA.
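Steps 2 and 3 of the flow above can be sketched in Python. This is a minimal illustration, not the pattern's actual source code: it assumes the Logpush package contains newline-delimited JSON (one log record per line), and the helper names (`parse_log_package`, `to_logdna_payload`) and the `app` label are hypothetical. In the deployed Action, you would fetch the .gz bytes from the bucket with the IBM COS SDK and POST the resulting payload to your LogDNA instance's ingest endpoint with your ingestion key.

```python
import gzip
import io
import json


def parse_log_package(gz_bytes):
    """Decompress a .gz log package and parse each NDJSON line into a dict.

    Assumes the package holds one JSON log record per line, which is the
    usual shape of Logpush output.
    """
    records = []
    with gzip.open(io.BytesIO(gz_bytes), mode="rt", encoding="utf-8") as fh:
        for line in fh:
            line = line.strip()
            if line:
                records.append(json.loads(line))
    return records


def to_logdna_payload(records):
    """Shape parsed records into a body for the LogDNA ingest API.

    The ingest API accepts a JSON object with a "lines" array; each entry
    carries the log text plus metadata such as an "app" name. The "cis-logpush"
    label here is an illustrative choice, not a required value.
    """
    return {
        "lines": [
            {"line": json.dumps(record), "app": "cis-logpush"}
            for record in records
        ]
    }
```

The two helpers are kept free of network calls so the decompress-and-reshape logic can be tested locally before wiring in the COS download and the HTTP POST.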


Find the detailed steps for this pattern in the README.md file. The steps show you how to:

  1. Provision IBM Cloud Object Storage and IBM Log Analysis with LogDNA.
  2. Add services credentials to the source code.
  3. Deploy the source code as an Action and a Trigger on the IBM Cloud Functions serverless platform.
  4. Set up the Logpush feature on IBM Cloud Internet Services.
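Step 3 above, deploying the Action and wiring a cron-driven Trigger to it, looks roughly like the following with the IBM Cloud CLI. Treat this as a sketch: the names (`cis-log-import`, `every-3-minutes`, `import-rule`), the runtime kind, and `main.py` are placeholders for your own deployment, and the full steps are in the README.md file.

```
# Create the Action from the function's source code
ibmcloud fn action create cis-log-import main.py --kind python:3.9

# Create a Trigger that fires every 3 minutes via the built-in alarms feed
ibmcloud fn trigger create every-3-minutes \
  --feed /whisk.system/alarms/alarm \
  --param cron "*/3 * * * *"

# Connect the Trigger to the Action with a Rule
ibmcloud fn rule create import-rule every-3-minutes cis-log-import
```

The `/whisk.system/alarms/alarm` feed is what turns the cron expression into periodic Trigger firings; the Rule is what causes each firing to invoke the Action.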