Overview

Skill Level: Any Skill Level

This recipe provides an overview of migrating large CSV data files to IBM Db2 on Cloud using a dockerized IBM Lift CLI tool. This approach ensures that no data is stored locally and that data is migrated securely from one database to another.

Ingredients

1. Download the Lift CLI tool from https://www.lift-cli.cloud.ibm.com/

2. Download and install Docker.

Step-by-step

  1. Introduction to Lift CLI Commands

    1. Lift CLI Extract

    Extract data from the source table to a CSV file using the “lift extract” command:

    lift extract --properties-file <path/file>

    Properties File:

    --source-schema=<Database schema in Db2 Warehouse on Cloud>

    --source-table=<Database Table in Db2 Warehouse on Cloud>

    --source-database=<Database in Db2 Warehouse on Cloud>

    --source-host=<Database host of your Db2 Warehouse on Cloud system>

    --source-user=<User in Db2 Warehouse on Cloud>

    --source-password=<password>

    --source-database-port=<Database port of your Db2 Warehouse on Cloud system>

    --source-database-type=db2

    --file=<path/CSV file to store extracted data>
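    As a concrete illustration, such a properties file can be generated with a heredoc. Every value below (host, user, schema, table, file name) is a made-up placeholder, not something prescribed by this recipe:

```shell
#!/bin/bash
# Sketch: write an example extract properties file.
# All values are hypothetical placeholders -- substitute your own.
mkdir -p properties csv

cat > properties/extract.txt <<'EOF'
--source-schema=MYSCHEMA
--source-table=SALES
--source-database=BLUDB
--source-host=db2whoc.example.com
--source-user=bluadmin
--source-password=changeme
--source-database-port=50000
--source-database-type=db2
--file=csv/sales.csv
EOF

# The file is then passed to the command, e.g.:
#   lift extract --properties-file properties/extract.txt
```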

     

    2. Lift CLI Put

    Upload the extracted data CSV file to the cloud landing zone using the “lift put” command:

    lift put --properties-file <path/file>

    Properties File:

    --file=<path/CSV file to copy to Db2 Warehouse on Cloud>

    --target-user=<User in Db2 Warehouse on Cloud>

    --target-password=<password>

    --target-host=<Database host of your Db2 Warehouse on Cloud system>

    3. Lift CLI Load

    Load the data stored in CSV files to Db2 Warehouse on Cloud using the “lift load” command:

    lift load --properties-file <file>

    Properties File:

    --filename=<CSV file to load data>

    --target-schema=<Database schema in Db2 Warehouse on Cloud>

    --target-table=<Database Table in Db2 Warehouse on Cloud>

    --file-origin=extract-db2

    --target-user=<User in Db2 Warehouse on Cloud>

    --target-password=<password>

    --target-host=<Database host of your Db2 Warehouse on Cloud system>

    4. Lift CLI Remove

    Remove the data CSV files from the cloud landing zone using the “lift rm” command:

    lift rm --properties-file <file>

    Properties File:

    --filename=<File1> <File2> <File3>

    --target-user=<User in Db2 Warehouse on Cloud>

    --target-password=<password>

    --target-host=<Database host of your Db2 Warehouse on Cloud system>
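    Taken together, the four commands form an extract → put → load → rm pipeline. The sketch below stands in for the real lift binary with a shell function, purely to show the call order and fail-fast behavior; the properties paths are hypothetical:

```shell
#!/bin/bash
set -e  # stop at the first failing step, so a failed extract never reaches load

: > pipeline.log  # start with an empty log

# Stand-in for the real lift binary: record each call so the order is visible.
lift() { echo "lift $*" | tee -a pipeline.log; }

lift extract --properties-file properties/extract.txt
lift put     --properties-file properties/put.txt
lift load    --properties-file properties/load.txt
lift rm      --properties-file properties/rm.txt
```

With the real binary, set -e gives the same fail-fast behavior as chaining the commands with &&.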

  2. Download Lift CLI

    1)    Download the Lift CLI for Linux using the link https://www.lift-cli.cloud.ibm.com/

    2)    Save lift-cli-linux.zip to the local file system.

    3)    Copy and paste the token to a text editor; it is required at the time of installation.

    (Screenshot: Lift CLI download page)

  3. Preparing DBBackup Folder

    1. Log in to the Linux system/VM as root.

    2. Ensure Docker is installed; if not, follow https://docs.docker.com/engine/install/ubuntu/ to install it.

    3. Create a folder named DBBackup.

            3.1 Create four folders (csv, src, properties, lift-cli) within the DBBackup folder.

    4. Unzip lift-cli-linux.zip on the local file system.

    5. Move the lift-cli-install folder into the DBBackup folder (do not copy the META-INF folder).

    6. After the copy, the contents of the DBBackup folder will be as follows:

    (Screenshot: DBBackup folder structure)

    The csv folder is used to store extracted data as CSV files from the source table. The properties folder stores the properties files used by the extract, put, load and rm commands. The lift-cli folder is the base folder of the Lift CLI installation.
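    The folder layout described above can also be created in one step; a minimal sketch, assuming DBBackup is created in the current directory:

```shell
#!/bin/bash
# Create the DBBackup tree: csv for extracted data, src for the shell scripts,
# properties for the option files, lift-cli for the Lift CLI installation.
mkdir -p DBBackup/{csv,src,properties,lift-cli}
ls DBBackup
```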

    7. Create the following shell scripts within the src folder.

      7.1 main.sh

    #!/bin/bash

    echo Start install lift cli
    /opt/ibm/backupapp/lift-cli-install/install /opt/ibm/backupapp/lift-cli <Token>
    echo End install lift cli

    echo Start Extracting the Table Data to csv
    /opt/ibm/backupapp/src/extract.sh
    echo End Extracting the Table Data to csv

    echo Start load Table Data csv file from source to target
    /opt/ibm/backupapp/src/put.sh
    echo End load Table Data csv file from source to target

    echo Start view CSV file in target system
    /opt/ibm/backupapp/src/ls.sh
    echo End view CSV file in target system

    echo Start load Table Data csv file to Database
    /opt/ibm/backupapp/src/load.sh
    echo End load Table Data csv file to Database

    echo Start remove CSV file in target system
    /opt/ibm/backupapp/src/rm.sh
    echo End remove CSV file in target system

    echo Terminate our shell script
    exit 0

     7.2  extract.sh

    #!/bin/bash

    echo Start Extracting the Table Data to csv file
    /opt/ibm/backupapp/lift-cli/bin/lift extract --properties-file /opt/ibm/backupapp/properties/extract.txt
    echo End Extracting the Table Data to csv file

     7.3 put.sh

    #!/bin/bash

    echo Start putting csv file
    /opt/ibm/backupapp/lift-cli/bin/lift put --properties-file /opt/ibm/backupapp/properties/put.txt
    echo End putting csv file

     7.4  ls.sh

    #!/bin/bash
    echo Start Show Files
    /opt/ibm/backupapp/lift-cli/bin/lift ls --properties-file /opt/ibm/backupapp/properties/ls.txt
    echo End Show Files

     7.5 load.sh

    #!/bin/bash

    echo Start load csv file to table
    /opt/ibm/backupapp/lift-cli/bin/lift load --properties-file /opt/ibm/backupapp/properties/load.txt
    echo End load csv file to table

     7.6 rm.sh

    #!/bin/bash
    echo Start Remove Files
    /opt/ibm/backupapp/lift-cli/bin/lift rm --properties-file /opt/ibm/backupapp/properties/rm.txt
    echo End Remove Files
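    Note that the Dockerfile in the next section makes only main.sh executable, while main.sh invokes the other five scripts directly, so all of them need the execute bit before the image build. A minimal sketch (the touch line merely stands in for the scripts created above, so the sketch is self-contained):

```shell
#!/bin/bash
# Stand-ins for the six scripts created in step 7 (hypothetical empty files).
mkdir -p src
touch src/main.sh src/extract.sh src/put.sh src/ls.sh src/load.sh src/rm.sh

# Set the execute bit on every script before running docker build.
chmod +x src/*.sh
```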

    8. Create extract.txt, put.txt, ls.txt, load.txt and rm.txt in the properties folder. The following is the template for extract.txt; the other files take the options shown in the corresponding command sections above.

    --source-user=<srcUser>
    --source-password=<srcPass>
    --source-host=<srcDbHost>
    --source-database=<srcDbName>
    --source-database-port=<portnumber>
    --source-schema=<srcSchema>
    --source-table=<srcTable>
    --source-database-type=db2
    --file=/opt/ibm/backupapp/csv/<filename>
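    The put, ls, load and rm files all share the same target-* connection options; a hedged sketch (host, user and password below are placeholders) that stamps the common part out in a loop and then appends the command-specific options:

```shell
#!/bin/bash
# Sketch: generate the four target-side property files from one credential set.
# All values are hypothetical placeholders.
mkdir -p properties
for f in put ls load rm; do
  cat > "properties/$f.txt" <<'EOF'
--target-user=bluadmin
--target-password=changeme
--target-host=db2whoc.example.com
EOF
done

# Command-specific options are then appended per file, e.g. for put:
echo "--file=/opt/ibm/backupapp/csv/sales.csv" >> properties/put.txt
```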

  4. Preparing Docker File

    1. Create a Dockerfile in the DBBackup folder.

    2. Add the following content to the file.

    FROM centos:centos7

    # Install dependencies

     

    # Set the working directory

    WORKDIR /opt/ibm/backupapp

     

    # Copy the current directory contents into the container at location /opt/ibm/backupapp

    COPY . /opt/ibm/backupapp

     

    # Making starting script executable

    RUN chmod +x src/main.sh

     

    # Entrypoint for the application. It will start the application .

    ENTRYPOINT ["/opt/ibm/backupapp/src/main.sh"]

    A Dockerfile is a text document (without a file extension) that contains the instructions to set up an environment for a Docker container. You can build a Docker image using a Dockerfile. The command docker build . builds a Docker image using the Dockerfile in the directory where the command is executed.

    Most Dockerfiles start from a ‘parent’ image. The parent image is added through the FROM keyword, and your image builds upon it: FROM centos:centos7.

    You then set the working directory in the container with WORKDIR. WORKDIR /opt/ibm/backupapp sets the current directory to /opt/ibm/backupapp when the container starts running. Any RUN, CMD, ADD, COPY, or ENTRYPOINT command will be executed in the specified working directory. If the directory named by WORKDIR does not exist, Docker creates it automatically.

    To transfer files into the container from the source machine, we use the COPY . /opt/ibm/backupapp command. The COPY command copies all the files from our current directory (the one that contains the Dockerfile) into /opt/ibm/backupapp. Our container will now contain the OS and the files from our local directory, with the working directory set to /opt/ibm/backupapp.

    The RUN command executes when we build the image; any additional dependencies or packages are usually installed using RUN.

    ENTRYPOINT sets the image’s main command, allowing the image to be run as though it were that command.

  5. Building & Executing Docker Image

    Make sure file permissions (read, write and execute) are set on all the child folders of DBBackup.

    1. Change directory to DBBackup and execute the following command to build the Docker image:

    docker build . -t dbbackv1

    2. Verify Docker images

    Execute the following command to verify the Docker image:

    docker images

    3. Execute Docker

    Execute the following command to migrate the data:

    docker run dbbackv1

    4. Check the logs

    Execute the following command to get the CONTAINER ID:

    docker ps -a

    Execute the following command to get the logs:

    docker logs <CONTAINER ID>
