Import your IBM Cloud Internet Services logs into your LogDNA service instance to view them in a single platform.
IBM Cloud Internet Services Enterprise-level plans offer a Logpush feature, which sends at least one log package (a
.gz file) to an IBM Cloud Object Storage bucket every five minutes. IBM Log Analysis with LogDNA can receive logs from many sources (such as Kubernetes clusters and virtual machines) and display them in a single platform. To import the Internet Services logs into LogDNA, you set up a serverless function that checks IBM Cloud Object Storage for new log packages and sends the logs to LogDNA every three minutes. The function uses an Action and a Trigger (with a cron schedule) to run the job automatically.
- IBM Cloud Internet Services sends log packages to the IBM Cloud Object Storage bucket.
- The script downloads each log package from the bucket and reads it.
- The script sends the logs to LogDNA.
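The core of the script can be sketched as follows. This is a minimal illustration, not the pattern's actual source code: the field names in the sample events and the payload shape are assumptions based on Logpush delivering newline-delimited JSON inside a .gz package and on the LogDNA ingestion API accepting a `lines` array; the real script also handles bucket listing, authentication, and the HTTP call to LogDNA.

```python
import gzip
import json

def parse_log_package(gz_bytes):
    """Decompress a .gz log package and parse each NDJSON line into a dict."""
    text = gzip.decompress(gz_bytes).decode("utf-8")
    return [json.loads(line) for line in text.splitlines() if line.strip()]

def build_logdna_payload(events, app="internet-services"):
    """Shape parsed events into the body of a LogDNA log-ingestion request."""
    return {
        "lines": [
            {"app": app, "line": json.dumps(event)}
            for event in events
        ]
    }

# Simulate a small log package as Logpush might deliver it (field names are
# illustrative only).
package = gzip.compress(
    b'{"ClientIP":"192.0.2.1","ClientRequestHost":"example.com"}\n'
    b'{"ClientIP":"192.0.2.2","ClientRequestHost":"example.com"}\n'
)
events = parse_log_package(package)
payload = build_logdna_payload(events)
print(len(payload["lines"]))  # one payload line per log event
```

In the real function, `payload` would be POSTed to the LogDNA ingestion endpoint with the service's ingestion key.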
Find the detailed steps for this pattern in the README.md file. The steps show you how to:
- Provision IBM Cloud Object Storage and IBM Log Analysis with LogDNA.
- Add service credentials to the source code.
- Deploy the source code as an Action and a Trigger on the IBM Cloud Functions serverless platform.
- Set up the Logpush feature on IBM Cloud Internet Services.
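The Action and Trigger deployment described above could look roughly like this with the `ibmcloud` CLI. The action name, trigger name, file name, and runtime kind are placeholders, not the values from the pattern's README.md; the alarms feed with a cron parameter is the standard way to schedule a Trigger on IBM Cloud Functions.

```shell
# Deploy the import script as an Action (name, file, and runtime are placeholders).
ibmcloud fn action create cis-log-import import_logs.py --kind python:3.9

# Create a Trigger that fires every 3 minutes via the built-in alarms feed.
ibmcloud fn trigger create cis-log-import-every-3m \
  --feed /whisk.system/alarms/alarm \
  --param cron "*/3 * * * *"

# Connect the Trigger to the Action with a Rule.
ibmcloud fn rule create cis-log-import-rule cis-log-import-every-3m cis-log-import
```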