Overview

Skill Level: Any Skill Level

This recipe helps you create, configure, compile, and execute a DataStage job that writes data from DB2 to Google Cloud Storage.

Ingredients

1. IBM InfoSphere Information Server DataStage 11.7.1 or later

2. DB2 database

3. Google Cloud Account

Step-by-step

  1. Description

    Google Cloud Storage is an online file storage web service for storing and accessing data on Google Cloud Platform infrastructure. The service combines the performance and scalability of Google’s cloud with advanced security and sharing capabilities.

    Information Server provides a native Google Cloud Storage Connector that reads data from and writes data to files on Google Cloud Storage, integrating them into the ETL job design.

    We demonstrate a sample use case that performs a write operation on Google Cloud Storage using the Google Cloud Storage Connector. The DataStage job uses a DB2 Connector as the source stage and a Google Cloud Storage Connector as the target, so data from DB2 is written to a file on Google Cloud Storage, moving data from an on-premises environment to the cloud.

    [Screenshot: DataStage job design with a DB2 Connector source and a Google Cloud Storage Connector target]

    This recipe shows how to configure the Google Cloud Storage Connector properties to write data from DB2 to Google Cloud Storage.

  2. Configure DB2 connector as source

    1. Provide the Database, Username, and Password details for DB2 in the connection properties of the DB2 Connector:

    [Screenshot: DB2 Connector connection properties]

    2. Select the Generate SQL option to auto-generate the SELECT statement.

    3. Provide the name of the DB2 table that contains the data to be read.

    [Screenshot: DB2 Connector usage properties with Generate SQL and Table name]
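The statement produced by the Generate SQL option is a plain SELECT over the columns defined on the stage. A minimal sketch of what such a generated statement looks like (the schema, table, and column names here are hypothetical examples, not taken from the job):

```python
# Hypothetical illustration of the SELECT that the Generate SQL option
# builds from the stage's column definitions and Table name property.
columns = ["EMPNO", "NAME", "SALARY"]   # columns defined on the stage (example)
table = "MYSCHEMA.EMPLOYEE"             # Table name property (example)

select_stmt = "SELECT {} FROM {}".format(", ".join(columns), table)
print(select_stmt)  # SELECT EMPNO, NAME, SALARY FROM MYSCHEMA.EMPLOYEE
```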

  3. Configure Google Cloud Storage Connector Properties to write to Google Cloud Storage

    1. Download the Google service account credentials JSON file and copy it to any location on the Engine tier. See:

    https://cloud.google.com/iam/docs/creating-managing-service-account-keys
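The downloaded key file is a JSON document containing standard service-account fields such as "type", "project_id", "private_key", and "client_email". A quick sanity check of the file's structure before pointing the connector at it (the sample key below is a stand-in for a real downloaded file):

```python
import json

# Sample stand-in for a downloaded Google service-account key file;
# the field names are the standard service-account JSON fields.
sample_key = {
    "type": "service_account",
    "project_id": "my-project",
    "private_key_id": "abc123",
    "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
    "client_email": "datastage@my-project.iam.gserviceaccount.com",
}

def looks_like_service_account_key(key: dict) -> bool:
    """Check the fields the connector's Credentials file is expected to contain."""
    required = {"type", "project_id", "private_key", "client_email"}
    return key.get("type") == "service_account" and required <= key.keys()

print(looks_like_service_account_key(sample_key))  # True
```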

    2. Provide the fully qualified path to this JSON file in the Credentials file field of the connection properties:

    [Screenshot: Google Cloud Storage Connector connection properties with Credentials file]

    3. Select the Write Mode as “Write” and provide the name of the bucket to which the file has to be written. If the bucket does not already exist in Google Cloud Storage, it can be created during the job run by setting the Create Bucket option to “Yes”.

    4. In the File Name property, provide the name of the file to which the DB2 data will be written.

    5. Choose the File format as Delimited. Six file formats are currently supported: Delimited, CSV, Parquet, Avro, JSON, and Excel; select whichever format the use case requires.

    6. Once the file format is selected, optional formatting properties such as the delimiter and quotation mark can be set as required.

    [Screenshot: Google Cloud Storage Connector usage and file format properties]
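The effect of the delimiter and quotation-mark settings can be illustrated with Python's csv module (an assumption for illustration: the connector's Delimited output behaves comparably, quoting a field only when it contains the delimiter; the rows below are sample data, not taken from the job):

```python
import csv
import io

# Sample rows to show how delimiter and quotation-mark settings shape
# a Delimited file; the second field contains the delimiter character.
rows = [["EMPNO", "NAME"], ["10", "O'Brien, Pat"]]

buf = io.StringIO()
writer = csv.writer(buf, delimiter=",", quotechar='"',
                    quoting=csv.QUOTE_MINIMAL, lineterminator="\n")
writer.writerows(rows)

print(buf.getvalue())
# EMPNO,NAME
# 10,"O'Brien, Pat"
```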

    7. On the Input tab, define the column names and types of the data that needs to be written to Google Cloud Storage:

    [Screenshot: Input tab column definitions]

    8. Compile and run the job. The data from the DB2 table is written to the file on Google Cloud Storage.

    When DataStage is configured to run on multiple nodes, multiple files are created, with the node number appended to the file name as <filename>.0, <filename>.1, and so on.
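The per-node naming described above can be sketched as follows (a hypothetical helper for illustration; the file name is an example):

```python
# Sketch of the per-node output file names: on a multi-node configuration,
# each node writes its own partition with the node number appended.
def node_file_names(base_name: str, node_count: int) -> list:
    return ["{}.{}".format(base_name, n) for n in range(node_count)]

print(node_file_names("employees.csv", 2))
# ['employees.csv.0', 'employees.csv.1']
```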

    [Screenshot: output file in the Google Cloud Storage bucket]

  4. References

    https://cloud.google.com/iam/docs/creating-managing-service-account-keys

    https://cloud.google.com/storage/
