Welcome to the SnowConvert for Teradata CodeLab. The main purpose of this lab is to learn how to perform a Teradata database migration using the SnowConvert for Teradata tool and the SnowConvert Studio platform.

What you'll learn

In this code lab you will learn:

Prerequisites

Note

For any questions about licenses, technical issues, or anything else, please feel free to contact us at support@mobilize.net.

You can find the SnowConvert for Teradata installer here. Please check the system requirements and complete the form, then click the Download button to start the download.

alt-text-here

To install the SnowConvert tool, please follow Step 2: Execute the Installer, detailed here.

License Information

When you run the tool for the first time, you will be asked for a license. Please check the steps to follow here.

When the application starts, you will see the Start Screen. Click the Let's begin button.

alt-text-here

Now, you are ready to use the SnowConvert tool. But first, you will probably need to extract the DDL scripts from Teradata. You will see this in more detail in the next section.

Teradata Export Scripts

The first step for a migration is getting the code that you need to migrate. There are many ways to extract the code from your database. However, we recommend using the extraction scripts provided by Mobilize.Net.

All the source code for these scripts is open source and is available on GitHub.

Installing the scripts

1 - Go to this repository, click the Code drop-down, and use the Download ZIP option to download the code.

download_scripts

Decompress the ZIP file. The code for Teradata should be under the Teradata folder.

directory

Extracting DDLs files

The following are the steps to execute the DDL code generation. They should be executed in a bash shell on a Linux environment with access to the bteq/tpt utilities.

2 - Modify create_ddls.sh in the bin folder. Using a text editor, modify the following parameters:

change_parameters

It is recommended to use the user 'DBC' in the connection string, but a user with sysadmin privileges should also work. Please run the scripts in a production-like environment with up-to-date statistics.

By default, the script is set up to exclude system-related databases and include all others. You can modify these parameters to get the desired scope, including the operator that is used. Parameter values must not contain spaces and should be all UPPERCASE.

Do not remove the parentheses around the entire statement; they are needed for compound logic. Do not use the LIKE ANY clause in either parameter, as it can cause unexpected issues.

Example values:

(UPPER(T1.DATABASENAME) NOT IN ('ALL', 'TESTDB'));

((UPPER(T1.DATABASENAME) NOT IN ('ALL', 'TESTDB')) AND UPPER(T1.DATABASENAME) NOT LIKE ('TD_%'))

DDL Files

3 - After modifying the parameters, run the create_ddls.sh file from the command line within the bin directory to execute the extract. The following files will be created in the output folder:

These files will contain the definitions of the objects specified by the file name.

Report Files

System Statistics

These files provide information about key system statistics and objects that can have a specific impact on conversion and migration activities.

Usage Report Files

These files provide information relevant to the sizing and usage of the Teradata system. They will not be created unless you uncomment the section for Creating Usage Reports.

Data Profiling Files

These files collect information about certain column types where knowledge of the underlying data is required to understand certain aspects of the migration.

Invalid Objects Log

This file lists any views that failed validation tests.

TPT Script Files

These files contain auto-generated scripts which can later be used in the data migration process.

4 - After a successful run, remove the logon information from the top line of each of the files in the scripts folder, as well as from the create_ddls.sh file. Compress the entire Teradata Source Extract folder and you will be ready to run the conversion.
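The logon cleanup step above can be sketched as a small Python helper (a minimal sketch, not part of the official scripts; the folder layout and the assumption that the logon information occupies exactly the first line of each file should be checked against your extract):

```python
import os

def strip_logon_line(path):
    """Remove the first line (assumed to hold logon information) from a file."""
    with open(path, "r", encoding="utf-8") as f:
        lines = f.readlines()
    with open(path, "w", encoding="utf-8") as f:
        f.writelines(lines[1:])

def clean_extract(scripts_dir):
    # Strip the logon line from every file in the scripts folder.
    for name in os.listdir(scripts_dir):
        full = os.path.join(scripts_dir, name)
        if os.path.isfile(full):
            strip_logon_line(full)
```

Remember to also clean create_ddls.sh itself before compressing the extract.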

To run SnowConvert for Teradata, follow these steps:

  1. Open the Mobilize.Net SnowConvert app.
  2. On the Start screen, click Let's begin.
  3. Choose the input folder (containing all the DDL scripts from the previous section) and the output folder. By default, the output folder will be named as the input name + "_out". alt-text-here
  4. Before starting the conversion process, you can configure the conversion settings. Click the Settings button to see the configuration options. For more information, review the Conversion Settings section below.
  5. Click Start Conversion to start the migration process.
  6. During the conversion process, you can view the progress on the screen. When the process finishes, click Next. alt-text-here
  7. Finally, you can view the Conversion Summary. On this screen, you can access the Reports, Logs, and Output scripts; they are shortcuts to locations inside the output folder. alt-text-here

Conversion Settings

Let's review the conversion settings in more detail. Conversion settings are grouped as follows:

General:

Here you can configure some general result settings. One of the most important is the encoding of the input files. alt-text-here

DB object names:

Here you can configure a custom Schema or Database name that will be applied to the converted objects. alt-text-here

Format conversion:

Here you can configure some default Teradata formats. This information is used to correctly convert numbers, times, dates, etc. alt-text-here

Output folders

Let's review the output folders in more detail:

Logs

This folder contains a file with the complete record of the conversion process. This file is very useful when you run into a conversion problem; send it to the support team when reporting an issue.

Output

This folder contains the migrated scripts for Snowflake. It is structured in the same way as the input folder, for example by schemas, tables, views, etc.

Reports:

This folder contains the reports of the migration results. These reports contain valuable information for analyzing the results of the migration, and they are commonly used to estimate the migration effort, among other things. The most important is the Assessment Report, which presents a complete summary of the migration results, indicating the app version, the conversion rates, and a summary of the main problems encountered.

Handling large DDL files

It is a very good practice to split the DDL script files into smaller ones, especially if the files are too large. Having one file per database object is recommended. Here are some benefits of splitting files:

There is a Python script to split the extracted Teradata DDL files into one file per database object. This script also creates a folder for each schema. Let's learn how to use it.

 python Split.py "C:\\Teradata Source Extract DDL\object_extracts\DDL" "C:\\Teradata Source Extract DDL\object_extracts\DDL modified"
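The splitter's behavior can be approximated with the sketch below. This is not the actual Split.py; it is a simplified illustration that assumes each object definition starts with a CREATE or REPLACE statement and that the object name is qualified as database.object:

```python
import os
import re

# Matches lines that begin a new object definition, e.g.
# "CREATE TABLE MyDb.MyTable ..." or "REPLACE VIEW MyDb.MyView ...".
OBJECT_START = re.compile(
    r"^\s*(?:CREATE|REPLACE)\s+\w+\s+([\w$#]+)\.([\w$#]+)", re.IGNORECASE)

def split_ddl(ddl_text, out_dir):
    """Write one file per object, grouped into one folder per schema."""
    current_path = None
    buffer = []

    def flush():
        if current_path and buffer:
            with open(current_path, "w", encoding="utf-8") as f:
                f.writelines(buffer)

    for line in ddl_text.splitlines(keepends=True):
        m = OBJECT_START.match(line)
        if m:
            flush()  # write the previous object before starting a new one
            schema, obj = m.group(1), m.group(2)
            schema_dir = os.path.join(out_dir, schema)
            os.makedirs(schema_dir, exist_ok=True)
            current_path = os.path.join(schema_dir, obj + ".sql")
            buffer = []
        buffer.append(line)
    flush()  # write the last object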

Encoding Issues

There are many standards for character encoding, and each standard assigns different numbers to the same character. This can be an issue when using the SnowConvert app with files written in a different character encoding.

This problem can be handled if you know the encoding used in the Teradata files you are trying to convert.

After setting the input folder and the output folder, and before starting the conversion, push the Settings button, which opens the Conversion settings.

alt-text-here

Once you have opened the Conversion settings, under Format conversion you can choose the file encoding to set the encoding of the input files.

alt-text-here

Finally, save the changes and proceed with the conversion.
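Alternatively, if you know the source encoding, you can re-encode the files to UTF-8 yourself before running the conversion. A minimal sketch (the latin-1 source encoding here is just an example; substitute whatever encoding your files actually use):

```python
def reencode(path, src_encoding, dest_encoding="utf-8"):
    """Rewrite a file from a known source encoding to a target encoding."""
    with open(path, "r", encoding=src_encoding) as f:
        text = f.read()
    with open(path, "w", encoding=dest_encoding) as f:
        f.write(text)

# Example: convert a latin-1 file in place to UTF-8.
# reencode("script.sql", "latin-1")
```

Note that this only works if you know the original encoding; decoding with the wrong one will silently corrupt non-ASCII characters.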

Note:

For any queries, suggestions or comments, please do not hesitate to contact us at support@mobilize.net

In this section, you will get acquainted with SnowConvert Studio. SnowConvert Studio is a powerful platform that provides a GitLab-based repository, a centralized project container, and a pre-configured dev environment that includes the SnowConvert tool.

Before you start working in SnowConvert Studio, a GitHub account is needed. Go to GitHub and click Sign Up.

Git Hub main page

Follow the instructions, signing up with the email that will be used in SnowConvert Studio.

Please make sure you have received and accepted the invitation to join SnowConvert Studio:

Invitation

If necessary, sign up in Microsoft portal, follow the instructions and grant permissions to Mobilize.NET.

Access SnowConvert Studio and sign in using Azure AD and the subscribed email.

Azure

If you have no projects created, you should see the following:

SCS

Click Create a Project –> Create Blank Project, type in your project name, and click Create Project.

Blank Project

A pre-configured repository will be created. To convert your DDL scripts, select Gitpod from the dropdown menu:

Select_gitpod

Click on it; if necessary, click Enable Gitpod and then click Gitpod once again:

Enable_Gitpod

Click Continue with GitHub:

Gitpod

Sign in with your GitHub credentials and click on Authorize gitpod_io:

Aut_gitpod

If necessary, authorize with snowconvertstudio.net:

Aut_scAut_sc1

Congratulations, and welcome to Gitpod, a pre-configured dev environment for data engineering, Teradata code conversion to Snowflake, scripting, and much more:

Gitpod_dev_env

To convert your DDL files, please drag or copy them to the Source –> src folder within your Gitpod environment:

adding_files

Once your files are there, right-click on the src folder and then click Migrate Teradata. It will ask you for an active conversion license (for further details, please contact support@mobilize.net):

migrate

Once you enter the license, you will see the conversion process in the terminal. It might take a few minutes, and upon finishing, a new folder called TargetMigratedCode will be created. It contains the same output folders and files mentioned in Section 6.

migrated

With that, we complete the lab, having covered the process of extracting and converting Teradata files using the SnowConvert tool and SnowConvert Studio.

Here you can find valuable references for this CodeLab: