Welcome to the Azure Functions CodeLab for Teradata. The main objective of this lab is to learn how to create, publish, and run an Azure Function that executes a Teradata BTEQ script migrated to Python with the SnowConvert for Teradata tool.

What you'll learn

In this code lab you will learn:

- How to convert a Teradata BTEQ file to a Python script using the SnowConvert for Teradata tool.
- How to set the environment variables used by the snowconvert-helpers module to connect to Snowflake.
- How to run the converted Python script locally.
- How to create, configure, and publish an Azure Function that runs the converted script.

BTEQ File

Let's assume that we have a Teradata database called "AdventureWorksDW" and a table called "DimCustomer". Use the following BTEQ file to select the top 10 customers and export the result set to a customers.csv file. You can find the BTEQ file in the resources/bteq_samples folder of this lab.

Replace the LOGON information with your own credentials.

export_customers.bteq

.LOGON localhost/dbc, ***;
.EXPORT FILE = ./customers.csv
select top 10 * from AdventureWorksDW.DimCustomer;
.EXPORT RESET
.LOGOFF
.QUIT
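For reference, on a system with the Teradata utilities installed, a BTEQ script like this is typically executed by piping it into the bteq command (shown here only for context; you will not need the Teradata tools for the rest of this lab):

bteq < export_customers.bteq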

Convert the BTEQ File

Let's convert the BTEQ file using the SnowConvert for Teradata tool. The tool converts the BTEQ file into a Python script.

Follow these steps to convert the BTEQ file:

  1. Run the SnowConvert for Teradata tool.
  2. Click Let's begin.
  3. Select the input folder where "export_customers.bteq" is located.
  4. Select the output folder.
  5. Click Start conversion.

After a successful conversion, check the Python script created in the output folder.

export_customers_BTEQ.py

#*** Generated code is based on the SnowConvert Python Helpers version 2.0.0 ***
 
import os
import sys
import snowconvert.helpers

from snowconvert.helpers import Export
from snowconvert.helpers import exec
con = None

def main():
   snowconvert.helpers.configure_log()
   con = snowconvert.helpers.log_on()
   #** MSC-ISSUE - MSCEWI1030 - NO TRANSFORMATION RULE FOUND FOR THIS BTEQ NODE 
   #.LOGON localhost/dbc, *** 
   
   Export.report("./customers.csv", ",")
   exec("""SELECT top 10 *
FROM AdventureWorksDW.PUBLIC.DimCustomer""")
   Export.reset()
   #Severity Low: Warning the following code is ignored by default:
   #.LOGOFF
  
   snowconvert.helpers.quit_application()

if __name__ == "__main__":
   main()

Some important things to check in the converted file are:

  1. The converted script uses the snowconvert.helpers module. This module has helper functions equivalent to the functions used in a BTEQ file. To install this module you can run: pip install snowconvert-helpers.
  2. The LOGON statement was replaced by the snowconvert.helpers.log_on() Python function. This function uses environment variables to log in to the Snowflake database. In the next section, we will learn how to set these environment variables.
  3. The LOGOFF statement was not converted. The logoff functionality is integrated into the snowconvert.helpers.quit_application() Python function.

Setting the Environment Variables

Before running the script, we must set the following environment variables. These variables will be used by the snowconvert.helpers.log_on() helper function to connect to the Snowflake database:

- SNOW_ACCOUNT
- SNOW_USER
- SNOW_PASSWORD
- SNOW_WAREHOUSE
- SNOW_ROLE
- SNOW_DATABASE

Below are three ways to configure these environment variables when working in a terminal session.

Option 1: Using export

You can use the export command:

export SNOW_ACCOUNT=ACCOUNT 
export SNOW_USER=USER 
export SNOW_PASSWORD=*** 
export SNOW_WAREHOUSE=WH 
export SNOW_ROLE=ROLE 
export SNOW_DATABASE=DATABASE

Option 2: Use .env file

You can create a .env file that contains the environment variables:

SNOW_ACCOUNT=ACCOUNT 
SNOW_USER=USER 
SNOW_PASSWORD=*** 
SNOW_WAREHOUSE=WH 
SNOW_ROLE=ROLE 
SNOW_DATABASE=DATABASE

Then use the following statement to load the environment variables in the terminal session:

export $(cat .env | xargs)

Option 3: Use inline declarations

Another way to set the environment variables is to prepend the variable declarations to the python command itself:

SNOW_ACCOUNT=ACCOUNT SNOW_USER=USER SNOW_PASSWORD=*** SNOW_WAREHOUSE=WH SNOW_ROLE=ROLE SNOW_DATABASE=DATABASE python export_customers_BTEQ.py
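Whichever option you choose, you can verify that the variables are visible to the Python process with a small check script (a quick sanity check, not part of the lab files):

import os

# Print which of the SNOW_* variables used by snowconvert.helpers are set.
for name in ("SNOW_ACCOUNT", "SNOW_USER", "SNOW_PASSWORD",
             "SNOW_WAREHOUSE", "SNOW_ROLE", "SNOW_DATABASE"):
    print(name, "is set" if os.environ.get(name) else "is MISSING")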

Running the Python script

Follow these steps to run the Python script:

Step 1. Open a terminal

Step 2. Run the export_customers_BTEQ.py script as follows. In this case, we are using option #3 for setting the environment variables.

SNOW_ACCOUNT=ACCOUNT SNOW_USER=USER SNOW_PASSWORD=*** SNOW_WAREHOUSE=WH SNOW_ROLE=ROLE SNOW_DATABASE=DATABASE python export_customers_BTEQ.py

Once you have run the script, you will see messages in the terminal that report its progress.


Step 3. After successful execution, you can find the customers.csv file generated by the script in the bteq_samples folder.
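Optionally, you can confirm the export with a few lines of Python (a quick sanity check, not part of the lab files):

import csv

# Print the first three rows of the exported file.
with open("customers.csv", newline="") as f:
    for i, row in enumerate(csv.reader(f)):
        print(row)
        if i == 2:
            break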

Once the Python script is tested successfully, we can use it to create an Azure function in the next section.

Install the Azure Functions Core Tools

Install the Microsoft package repository GPG key to validate package integrity:

curl https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > microsoft.gpg
sudo mv microsoft.gpg /etc/apt/trusted.gpg.d/microsoft.gpg

Set up the APT source list before doing an APT update.

sudo sh -c 'echo "deb [arch=amd64] https://packages.microsoft.com/repos/microsoft-ubuntu-$(lsb_release -cs)-prod $(lsb_release -cs) main" > /etc/apt/sources.list.d/dotnetdev.list'

Start the APT source update:

sudo apt-get update

Install the Core Tools package:

sudo apt-get install azure-functions-core-tools-3
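You can verify the installation by checking the version:

func --version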

Install the Azure CLI

curl -sL https://aka.ms/InstallAzureCLIDeb | sudo bash

You can find more details about configuring the local environment here.

Create Function app on Azure

Follow the steps described here to create a function app.

Create a Functions project folder for the Azure Function

Run the following command to go to the folder where we will create the Functions project:

cd tools/site/codelabs/

Create and navigate into the folder that we will use as the Functions project:

mkdir AzureFunctions
cd AzureFunctions

Create the Azure Function

Create a new Functions project by running:

func init

Then select Python as worker runtime.

To create a new function, run

func new

Select the HTTP trigger template, since that is the type of function we will cover in this tutorial, and follow the prompts to name your function. Use the name ExportCustomers, as shown in the project layout below.
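At this point your project folder should look roughly like this (the layout produced by func init and func new for the Python worker; file names may vary slightly between Core Tools versions):

AzureFunctions/
├── host.json
├── local.settings.json
├── requirements.txt
└── ExportCustomers/
    ├── __init__.py
    └── function.json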

Copy the Export File

Now we need to copy the Python file that runs the extraction process. Go to the path /tools/site/codelabs/assets/azure_functions_lab_for_teradata/resources/ and copy export_customers_BTEQ.py to the new function folder: /tools/site/codelabs/AzureFunctions/ExportCustomers

Once you have copied the Python file, we need to make some changes to the __init__.py code. Replace its contents with the following code:

import logging
import sys

import azure.functions as func

from .export_customers_BTEQ import main as ExportCustomers


def main(req: func.HttpRequest, outputblob: func.Out[bytes]) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')

    try:
        ExportCustomers()
        # Upload the generated CSV to the "outputblob" blob output binding,
        # which we will declare in function.json later in this lab. This must
        # happen before returning the HTTP response.
        data = open("./customers.csv", "rb").read()
        outputblob.set(data)
        return func.HttpResponse("Hello, extraction executed successfully.")
    except Exception:
        return func.HttpResponse(f"Unexpected error: {sys.exc_info()}")

Then do the same with the export_customers_BTEQ.py file, replacing its contents with the following code:

import os
import sys
import snowconvert.helpers

from snowconvert.helpers import Export
from snowconvert.helpers import exec

con = None

def main():
   snowconvert.helpers.configure_log()

   # Read the Snowflake connection settings from the environment. On Azure,
   # these values come from the Function App's application settings.
   SNOW_ACCOUNT = os.environ["SNOW_ACCOUNT"]
   SNOW_USER = os.environ["SNOW_USER"]
   SNOW_PASSWORD = os.environ["SNOW_PASSWORD"]
   SNOW_WAREHOUSE = os.environ["SNOW_WAREHOUSE"]
   SNOW_ROLE = os.environ["SNOW_ROLE"]
   SNOW_DATABASE = os.environ["SNOW_DATABASE"]

   con = snowconvert.helpers.log_on(SNOW_USER, SNOW_PASSWORD, SNOW_ACCOUNT, SNOW_DATABASE, SNOW_WAREHOUSE, SNOW_ROLE)

   #** MSC-ISSUE - MSCEWI1030 - NO TRANSFORMATION RULE FOUND FOR THIS BTEQ NODE
   #.LOGON localhost/dbc, dbc **

   Export.report("./customers.csv", ",")
   exec("""SELECT top 10 *
FROM ADVENTUREWORKS_MOBILIZE.ADVENTUREWORKSDW.DimCustomer""")
   #Export.reset()
   #Severity Low: Warning the following code is ignored by default:
   #.LOGOFF

   # snowconvert.helpers.quit_application() would terminate the process, so
   # we close the connection explicitly instead.
   con.close()

# The entry-point guard is removed: main() is now invoked from __init__.py.
#if __name__ == "__main__":
#   main()

Changes to the function files

local.settings.json

In the local.settings.json file, set the AzureWebJobsStorage attribute to your Azure Storage connection string.
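A minimal sketch of local.settings.json (the connection string values are placeholders; for local runs you can also add the SNOW_* variables to the Values section, while on Azure they must be configured as application settings on the Function App):

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=<ACCOUNT>;AccountKey=<KEY>;EndpointSuffix=core.windows.net",
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "SNOW_ACCOUNT": "ACCOUNT",
    "SNOW_USER": "USER",
    "SNOW_PASSWORD": "***",
    "SNOW_WAREHOUSE": "WH",
    "SNOW_ROLE": "ROLE",
    "SNOW_DATABASE": "DATABASE"
  }
}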

function.json

In this file, add the following entry to the bindings array:

,
{
   "name": "outputblob",
   "type": "blob",
   "dataType": "binary",
   "path": "<CONTAINER_NAME>/customer.csv",
   "connection": "AzureWebJobsStorage",
   "direction": "out"
}

CONTAINER_NAME is the name of the container you created earlier in your storage account.
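For reference, after the change the whole file should look roughly like this (the two HTTP bindings are the defaults generated by the HTTP trigger template):

{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": ["get", "post"]
    },
    {
      "name": "$return",
      "type": "http",
      "direction": "out"
    },
    {
      "name": "outputblob",
      "type": "blob",
      "dataType": "binary",
      "path": "<CONTAINER_NAME>/customers.csv",
      "connection": "AzureWebJobsStorage",
      "direction": "out"
    }
  ]
}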

requirements.txt

In this file, add the following line:

snowconvert-helpers
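The resulting file should look roughly like this (azure-functions is added by func init):

azure-functions
snowconvert-helpers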

The function is now ready to extract the CSV file from Snowflake. Go back to your command prompt (still in the Functions project folder) and make sure you're logged in to your Azure account by running

az login

and set your subscription with

az account set --subscription "SUBSCRIPTION_ID"

In this lab, we publish with a remote build because the snowconvert-helpers dependency must be installed when the function is deployed. Run:

func azure functionapp publish APP_NAME --build remote

(APP_NAME is the name of the Function App you created on Azure earlier.)

Don't be surprised if it takes a while. We are almost done!

To make sure everything works as expected, invoke the function URL shown at the end of the terminal output.
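For example, you can invoke the function from Python (the URL below is a placeholder; the ?code= function key is required because the HTTP trigger template uses the function auth level by default):

import urllib.request

# Hypothetical URL: replace APP_NAME and FUNCTION_KEY with your own values.
url = "https://APP_NAME.azurewebsites.net/api/ExportCustomers?code=FUNCTION_KEY"
print(urllib.request.urlopen(url).read().decode())

After a successful call, the customers.csv blob should appear in the container configured in function.json.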

Here you can find valuable references for this CodeLab: