Sunday 17 December 2017

Deploying to Azure Integration Services Preview (ADF V2)

Case
I just created an Integration Services Catalog in Azure Data Factory V2, but how do I deploy SSIS packages to this new catalog in Azure?
Azure Integration Services

Solution
The Integration Services catalog in Azure doesn't support Windows Authentication like the on-premises version does. Therefore the PowerShell deployment script and the deployment in SSDT won't work without some changes.
For this example I will focus on SQL Server Authentication, for which you can use the same user that you used to configure the catalog in Azure or create a new SQL user.

Solution 1: SSDT
First make sure you have the latest version of SSDT 2015 (17.4 or higher) or SSDT 2017 (15.5.0 or higher). If you already have SSDT 2017, make sure you first remove any installed Visual Studio extensions for SSRS or SSAS projects before installing SSDT. For downloads and more details see this page.

If you use an older version that doesn't have the option for SQL Server Authentication, you will get an error: Failed to connect to server bitools2.database.windows.net. (Microsoft.SqlServer.ConnectionInfo)
Windows logins are not supported in this version of SQL Server. (Microsoft SQL Server, Error: 40607)

Windows logins are not supported in this version of SQL Server.

After updating SSDT you will see a new Integration Services Deployment Wizard, which supports three new authentication methods. After filling in the Server name (which is the URL of your Azure SQL server that hosts your SSISDB), choose the SQL Server Authentication method. Fill in the username and password and click on the Connect button. After that you can use the Browse button to browse your Integration Services catalog in Azure.
New Integration Services Deployment Wizard

Solution 2: PowerShell
If you regularly deploy SSIS projects you probably use a PowerShell script. The 'old' deployment script uses Windows Authentication to create a 'System.Data.SqlClient.SqlConnection' connection to the master database on the server that hosts the SSISDB. See this snippet:
# PowerShell code snippet for on-premises deployment
#################################################
############ CONNECT TO SSIS SERVER #############
#################################################
# First create a connection to SQL Server
$SqlConnectionstring = "Data Source=$($SsisServer);Initial Catalog=master;Integrated Security=SSPI;"
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection $SqlConnectionstring

We need to change two things. First we need to change the connection string to SQL Server Authentication by adding a username and password and removing 'Integrated Security=SSPI;'. Secondly we need to change the database in the Initial Catalog part: this should be SSISDB instead of master.
# PowerShell code snippet for Azure deployment
#################################################
##################### SERVER ####################
#################################################
Write-Host "Connecting to Azure SQL DB server $($SsisServer)"
# Create a connectionstring for the Azure DB Server
# Make sure you use SSISDB as the Initial Catalog!
$SqlConnectionstring = "Data Source=$($SsisServer);User ID=$($SSISDBUsername);Password=$($SSISDBPassword);Initial Catalog=SSISDB;"


Selecting the SSISDB database also applies when you want to use SSMS to connect to your catalog in Azure. Below is the complete script for deploying ISPAC files. The solution consists of two files. The first is the file with all the parameters, which changes per project. The last line executes the second script.
# PowerShell code snippet for Azure deployment
#PowerShell: finance.ps1
#################################################################################################
# Change source, destination and environment properties
#################################################################################################
 
# Source
$IspacFilePath = "d:\projects\Finance\bin\Development\Finance.ispac"
 
# Destination
$SsisServer ="bitools2.database.windows.net"
$SSISDBUsername = "Joost"
$SSISDBPassword = "5ecrtet"
$FolderName = "Finance"
$ProjectName = ""
 
# Environment
$EnvironmentName = "Generic"
$EnvironmentFolderName = "Environments"
 
#################################################################################################
# Execute generic deployment script
. "$PSScriptRoot\generalAzureDeployment.ps1" $IspacFilePath $SsisServer $SSISDBUsername $SSISDBPassword $FolderName $ProjectName $EnvironmentName $EnvironmentFolderName

The second file is the generic script, which will be the same for each project. If you want to make any changes to the script, you now only need to maintain one generic PowerShell script instead of dozens of copies for each project.

# PowerShell code snippet for Azure deployment
#PowerShell: generalAzureDeployment.ps1
################################
########## PARAMETERS ##########
################################ 
[CmdletBinding()]
Param(
    # IsPacFilePath is required
    [Parameter(Mandatory=$True,Position=1)]
    [string]$IspacFilePath,
     
    # SsisServer is required 
    [Parameter(Mandatory=$True,Position=2)]
    [string]$SsisServer,

    # SSISDB Username is required 
    [Parameter(Mandatory=$True,Position=3)]
    [string]$SSISDBUsername,

    # SSISDB Password is required 
    [Parameter(Mandatory=$True,Position=4)]
    [string]$SSISDBPassword,
     
    # FolderName is required
    [Parameter(Mandatory=$True,Position=5)]
    [string]$FolderName,
     
    # ProjectName is not required
    # If empty filename is used
    [Parameter(Mandatory=$False,Position=6)]
    [string]$ProjectName,
     
    # EnvironmentName is not required
    # If empty no environment is referenced
    [Parameter(Mandatory=$False,Position=7)]
    [string]$EnvironmentName,
     
    # EnvironmentFolderName is not required
    # If empty the FolderName param is used
    [Parameter(Mandatory=$False,Position=8)]
    [string]$EnvironmentFolderName
)
 
# Replace empty projectname with filename
if (-not $ProjectName)
{
  $ProjectName = [system.io.path]::GetFileNameWithoutExtension($IspacFilePath)
}
# Replace empty Environment folder with project folder
if (-not $EnvironmentFolderName)
{
  $EnvironmentFolderName = $FolderName
}

# Mask the password to show something on
# screen, but not the actual password
# This is for testing purposes only.
$SSISDBPasswordMask = $SSISDBPassword -replace '.', '*'

clear
Write-Host "========================================================================================================================================================"
Write-Host "==                                                         Used parameters                                                                            =="
Write-Host "========================================================================================================================================================"
Write-Host "Ispac File Path        : " $IspacFilePath
Write-Host "SSIS Server            : " $SsisServer
Write-Host "SQL Username           : " $SSISDBUsername
Write-Host "SQL Password           : " $SSISDBPasswordMask 
Write-Host "Project Folder Path    : " $FolderName
Write-Host "Project Name           : " $ProjectName
Write-Host "Environment Name       : " $EnvironmentName
Write-Host "Environment Folder Path: " $EnvironmentFolderName
Write-Host "========================================================================================================================================================"
Write-Host ""
 
# Stop the script if an error occurs
$ErrorActionPreference = "Stop"

#################################################
##################### ISPAC #####################
#################################################
# Check if ispac file exists
if (-Not (Test-Path $IspacFilePath))
{
    Throw  [System.IO.FileNotFoundException] "Ispac file $IspacFilePath doesn't exist!"
}
else
{
    $IspacFileName = split-path $IspacFilePath -leaf
    Write-Host "Ispac file" $IspacFileName "found"
}



#################################################
############### ADD SSIS ASSEMBLY ###############
#################################################
# Add SSIS assembly so you can do SSIS stuff in PowerShell
# The number 14.0.0.0 refers to SQL Server 2017
# 13.0.0.0 to SQL Server 2016, 12.0.0.0 to SQL
# Server 2014 and 11.0.0.0 to SQL Server 2012
$SsisNamespace = "Microsoft.SqlServer.Management.IntegrationServices"
Add-Type -AssemblyName "$($SsisNamespace), Version=14.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91" 



#################################################
##################### SERVER ####################
#################################################
Write-Host "Connecting to Azure SQL DB server $($SsisServer)"
 
# Create a connectionstring for the Azure DB Server
# Make sure you use SSISDB as the Initial Catalog!
$SqlConnectionstring = "Data Source=$($SsisServer);User ID=$($SSISDBUsername);Password=$($SSISDBPassword);Initial Catalog=SSISDB;"

# Create a connection object
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection $SqlConnectionstring

# Check if the connection works
Try
{
    $SqlConnection.Open();
    Write-Host "Connected to Azure SQL DB server $($SsisServer)"
}
Catch [System.Data.SqlClient.SqlException]
{
    Throw  [System.Exception] "Failed to connect to Azure SQL DB server $($SsisServer), exception: $($_)"
}

# Create the Integration Services object
$IntegrationServices = New-Object $SsisNamespace".IntegrationServices" $SqlConnection
 
# Check if SSISDB connection succeeded
if (-not $IntegrationServices)
{
  Throw  [System.Exception] "Failed to connect to SSISDB on $($SsisServer)"
}
else
{
   Write-Host "Connected to SSISDB on $($SsisServer)"
}



#################################################
#################### CATALOG ####################
#################################################
# Create object for SSISDB Catalog
$Catalog = $IntegrationServices.Catalogs["SSISDB"]
 
# Check if the SSISDB Catalog exists
if (-not $Catalog)
{
    # Catalog doesn't exist. The user should create it manually.
    # It is possible to create it, but that shouldn't be part of
    # deployment of packages.
    # Also make sure the catalog is SSISDB and not master or any
    # other database.
    Throw  [System.Exception] "SSISDB catalog doesn't exist. Create it manually!"
}
else
{
    Write-Host "Catalog SSISDB found"
}

 
 
#################################################
#################### FOLDER #####################
#################################################
# Create object to the (new) folder
$Folder = $Catalog.Folders[$FolderName]
 
# Check if folder already exists
if (-not $Folder)
{
    # Folder doesn't exist, so create the new folder.
    Write-Host "Creating new folder" $FolderName
    $Folder = New-Object $SsisNamespace".CatalogFolder" ($Catalog, $FolderName, $FolderName)
    $Folder.Create()
}
else
{
    Write-Host "Folder" $FolderName "found"
}


 
#################################################
#################### PROJECT ####################
#################################################
# Deploying project to folder
if($Folder.Projects.Contains($ProjectName)) {
    Write-Host "Deploying" $ProjectName "to" $FolderName "(REPLACE)"
}
else
{
    Write-Host "Deploying" $ProjectName "to" $FolderName "(NEW)"
}
# Reading ispac file as binary
[byte[]] $IspacFile = [System.IO.File]::ReadAllBytes($IspacFilePath) 
$Folder.DeployProject($ProjectName, $IspacFile) | Out-Null 
$Project = $Folder.Projects[$ProjectName]
if (-not $Project)
{
    # Something went wrong with the deployment
    # Don't continue with the rest of the script
    return ""
}



#################################################
################## ENVIRONMENT ##################
#################################################
# Check if environment name is filled
if (-not $EnvironmentName)
{
    # Kill connection to SSIS
    $IntegrationServices = $null
 
    # Stop the deployment script
    Return "Ready deploying $IspacFileName without adding environment references"
}
 
# Create object to the (new) folder
$EnvironmentFolder = $Catalog.Folders[$EnvironmentFolderName]
 
# Check if environment folder exists
if (-not $EnvironmentFolder)
{
  Throw  [System.Exception] "Environment folder $EnvironmentFolderName doesn't exist"
}
 
# Check if environment exists
if(-not $EnvironmentFolder.Environments.Contains($EnvironmentName))
{
  Throw  [System.Exception] "Environment $EnvironmentName doesn't exist in $EnvironmentFolderName "
}
else
{
    # Create object for the environment
    $Environment = $Catalog.Folders[$EnvironmentFolderName].Environments[$EnvironmentName]
 
    if ($Project.References.Contains($EnvironmentName, $EnvironmentFolderName))
    {
        Write-Host "Reference to" $EnvironmentName "found"
    }
    else
    {
        Write-Host "Adding reference to" $EnvironmentName
        $Project.References.Add($EnvironmentName, $EnvironmentFolderName)
        $Project.Alter() 
    }
}


 
#################################################
############## PROJECT PARAMETERS ###############
#################################################
$ParameterCount = 0
# Loop through all project parameters
foreach ($Parameter in $Project.Parameters)
{
    # Get parameter name and check if it exists in the environment
    $ParameterName = $Parameter.Name
    if ($ParameterName.StartsWith("CM.","CurrentCultureIgnoreCase"))
    {
        # Ignoring connection managers
    }
    elseif ($ParameterName.StartsWith("INTERN_","CurrentCultureIgnoreCase"))
    {
        # Internal parameters are ignored (where name starts with INTERN_)
        Write-Host "Ignoring Project parameter" $ParameterName " (internal use only)"
    }
    elseif ($Environment.Variables.Contains($Parameter.Name))
    {
        $ParameterCount = $ParameterCount + 1
        Write-Host "Project parameter" $ParameterName "connected to environment"
        $Project.Parameters[$Parameter.Name].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, $Parameter.Name)
        $Project.Alter()
    }
    else
    {
        # Variable with the name of the project parameter is not found in the environment
    # Throw an exception or remove next line to ignore parameter
        Throw  [System.Exception]  "Project parameter $ParameterName doesn't exist in environment"
    }
}
Write-Host "Number of project parameters mapped:" $ParameterCount



#################################################
############## PACKAGE PARAMETERS ###############
#################################################
$ParameterCount = 0
# Loop through all packages
foreach ($Package in $Project.Packages)
{
    # Loop through all package parameters
    foreach ($Parameter in $Package.Parameters)
    {
        # Get parameter name and check if it exists in the environment
        $PackageName = $Package.Name
        $ParameterName = $Parameter.Name
        if ($ParameterName.StartsWith("CM.","CurrentCultureIgnoreCase"))
        {
            # Ignoring connection managers
        }
        elseif ($ParameterName.StartsWith("INTERN_","CurrentCultureIgnoreCase"))
        {
            # Internal parameters are ignored (where name starts with INTERN_)
            Write-Host "Ignoring Package parameter" $ParameterName " (internal use only)"
        }
        elseif ($Environment.Variables.Contains($Parameter.Name))
        {
            $ParameterCount = $ParameterCount + 1
            Write-Host "Package parameter" $ParameterName "from package" $PackageName "connected to environment"
            $Package.Parameters[$Parameter.Name].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, $Parameter.Name)
            $Package.Alter()
        }
        else
        {
            # Variable with the name of the package parameter is not found in the environment
            # Throw an exception or remove next line to ignore parameter
            Throw  [System.Exception]  "Package parameter $ParameterName from package $PackageName doesn't exist in environment"
        }
    }
}
Write-Host "Number of package parameters mapped:" $ParameterCount



#################################################
##################### READY #####################
#################################################
# Kill connection to SSIS
$IntegrationServices = $null
 
 
Return "Ready deploying $IspacFileName "
The script

Thursday 14 December 2017

Azure Integration Services Preview (ADF V2)

Case
I just created an Azure Data Factory V2 to start with SSIS in the cloud, but I cannot find the SSIS options in ADF. How do I configure SSIS in Azure?
Azure Data Factory V2

Solution
At the moment SSIS in ADF V2 is still in preview. The user interface for SSIS is not yet available in the Azure portal (however, it is visible in this video, but probably only for Microsoft).

For now you are stuck with a little PowerShell scripting. Before you start, make sure you have an ADF V2 and an Azure SQL Database server available, preferably in the same Azure region. ADF V2 is only available in East US, East US 2 and West Europe. Since I'm from the Netherlands I selected West Europe.

PowerShell ISE
If you have never used PowerShell for Azure before, you first need to start Windows PowerShell ISE as Administrator to install the Azure module. Execute the following command:
# PowerShell code 
Install-Module AzureRM

And if you have worked with Azure PowerShell before, you probably have to update the Azure module by adding the -Force parameter at the end:
# PowerShell code 
Install-Module AzureRM -Force

Parameters
This PowerShell script first starts with a 'parameter' section to provide all details necessary for the script to run.

First start with the name of your subscription. If you are not sure which one it is then you can look it up in the overview page of your Azure Data Factory under Subscription name.
Subscription name in ADF overview page

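If you prefer PowerShell over the portal, you can also list the subscriptions of your account. This is just an optional check and assumes you have already logged in with Login-AzureRmAccount (see the script section below).
# PowerShell code (optional check)
# Log in and list all subscriptions of your account
Login-AzureRmAccount
Get-AzureRmSubscription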
#################################################
################## PARAMETERS ###################
################################################# 
$SubscriptionName = "mySubscriptionName"

To store the SSISDB we need to provide the database server URL and the server admin user and its password. The URL can be found on the SQL database overview page and the user can be found on the SQL server overview page.
#Provide login details for the existing database server
$CatalogServerEndpoint = "myDBServer.database.windows.net"
$DBUser = "Joost"
$DBPassword = "5ecret!"

Next we need to provide the details about our newly created Azure Data Factory V2 environment. You can find all the required information on the overview page of ADF. The location is either WestEurope or EastUs or EastUs2.
Azure Data Factory V2 overview page

# Provide details about your existing ADF V2
$DataFactoryName = "bitools"
$ResourceGroupName = "Joost_van_Rossum"
$Location = "WestEurope"

The last part is to configure the Integration Runtime. The Catalog Pricing Tier is the database size of your SSISDB database. It should be Basic or S1, S2, S3... S12, etc. The node size should be Standard_A4_v2, Standard_A8_v2, Standard_D1_v2, Standard_D2_v2, Standard_D3_v2 or Standard_D4_v2. After the preview phase more sizes will be available. Prices can be found here.
# Provide details for the new Integration Runtime
$IntegrationRuntimeName = "SSISJoostIR"
$IntegrationRuntimeDescription = "My First Azure Integration Catalog"
$CatalogPricingTier = "Basic" # S0, S1, S2, S3
$IntegrationRuntimeNodeSize = "Standard_A4_v2"
$IntegrationRuntimeNodeCount = 2
$IntegrationRuntimeParallelExecutions = 2


The script
And now the script itself. It starts with a setting to stop after an error (not the default behaviour) and a login to Azure. When executing Login-AzureRmAccount it will show a login popup. Log in with your Azure account.
Login to Azure

#################################################
################## THE SCRIPT ###################
#################################################
$ErrorActionPreference = "Stop"

# Login to Azure (a pop-up will appear)
Login-AzureRmAccount 

After the login we need to select the right subscription. The Out-Null prevents all properties of your subscription from being written to the screen.
# Select the right subscription
Select-AzureRmSubscription -SubscriptionName $SubscriptionName | Out-Null 

This creates a new credential object with the user id and password. We need it for the next command.
# Create database credential with user id and password
$SecureDBPassword = ConvertTo-SecureString $DBPassword -AsPlainText -Force
$ServerCreds = New-Object System.Management.Automation.PSCredential($DBUser, $SecureDBPassword)

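If you prefer not to keep the database password in plain text in the script, a small alternative sketch is to prompt for it with Get-Credential. This replaces the two lines above and is just a suggestion, not part of the original setup.
# Alternative: prompt for the database credential instead of hardcoding the password
$ServerCreds = Get-Credential -UserName $DBUser -Message "Enter the password for $DBUser"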
This command will create the Integration Runtime environment in Azure Data Factory.
# Create the Integration Runtime
Write-Host "Creating your integration runtime."
Set-AzureRmDataFactoryV2IntegrationRuntime  -ResourceGroupName $ResourceGroupName `
                                            -DataFactoryName $DataFactoryName `
                                            -Name $IntegrationRuntimeName `
                                            -Type Managed `
                                            -CatalogServerEndpoint $CatalogServerEndpoint `
                                            -CatalogAdminCredential $ServerCreds `
                                            -CatalogPricingTier $CatalogPricingTier `
                                            -Description $IntegrationRuntimeDescription `
                                            -Location $Location `
                                            -NodeSize $IntegrationRuntimeNodeSize `
                                            -NodeCount $IntegrationRuntimeNodeCount `
                                            -MaxParallelExecutionsPerNode $IntegrationRuntimeParallelExecutions 

After creating the Integration Runtime environment we need to start it. Only then can you use it. Starting the environment takes 20 to 30 minutes! There is also a Stop-AzureRmDataFactoryV2IntegrationRuntime cmdlet, which takes the same parameters and completes in 2 to 3 minutes.
# Start the Integration Runtime (takes 20 to 30 minutes)
Write-Warning "Starting your integration runtime. This command takes 20 to 30 minutes to complete."
Start-AzureRmDataFactoryV2IntegrationRuntime -ResourceGroupName $ResourceGroupName `
                                             -DataFactoryName $DataFactoryName `
                                             -Name $IntegrationRuntimeName `
                                             -Force

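If you want to avoid paying for an idle environment, you can stop the Integration Runtime again afterwards. Below is a minimal sketch using the same parameter variables as above; it is optional and not part of the deployment itself.
# Stop the Integration Runtime (takes 2 to 3 minutes)
Write-Warning "Stopping your integration runtime."
Stop-AzureRmDataFactoryV2IntegrationRuntime -ResourceGroupName $ResourceGroupName `
                                            -DataFactoryName $DataFactoryName `
                                            -Name $IntegrationRuntimeName `
                                            -Force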

Connecting with SSMS
Now we can use SQL Server Management Studio (SSMS) to connect to our newly created SSISDB in Azure. A little different compared to on-premises: you need to click on the Options button and select the SSISDB first. Otherwise you won't see the Integration Services Catalog.
Connecting to the SSISDB in Azure

Spot the differences

In the next post I will show the deployment to the Integration Services Catalog in Azure. And now the complete script for copy and paste purposes.
#################################################
################## PARAMETERS ###################
################################################# 
$SubscriptionName = "mySubscriptionName"

# Provide login details for the existing database server
$CatalogServerEndpoint = "myDBServer.database.windows.net"
$DBUser = "Joost"
$DBPassword = "5ecret!"

# Provide details about your existing ADF V2
$DataFactoryName = "bitools"
$ResourceGroupName = "Joost_van_Rossum"
$Location = "WestEurope" # or EastUs/EastUs2

# Provide details for the new Integration Runtime
$IntegrationRuntimeName = "SSISJoostIR"
$IntegrationRuntimeDescription = "My First Azure Integration Catalog"
$CatalogPricingTier = "Basic" # S0, S1, S2, S3
$IntegrationRuntimeNodeSize = "Standard_A4_v2"
$IntegrationRuntimeNodeCount = 2
$IntegrationRuntimeParallelExecutions = 2

# In public preview, only Standard_A4_v2, Standard_A8_v2, Standard_D1_v2,
# Standard_D2_v2, Standard_D3_v2, Standard_D4_v2 are supported.

#################################################
################## THE SCRIPT ###################
#################################################
$ErrorActionPreference = "Stop"

# Login to Azure (a pop-up will appear)
Login-AzureRmAccount 

# Select the right subscription
Select-AzureRmSubscription -SubscriptionName $SubscriptionName | Out-Null 

# Create database credential with user id and password
$SecureDBPassword = ConvertTo-SecureString $DBPassword -AsPlainText -Force
$ServerCreds = New-Object System.Management.Automation.PSCredential($DBUser, $SecureDBPassword)

# Create the Integration Runtime
Write-Host "Creating your integration runtime."
Set-AzureRmDataFactoryV2IntegrationRuntime  -ResourceGroupName $ResourceGroupName `
                                            -DataFactoryName $DataFactoryName `
                                            -Name $IntegrationRuntimeName `
                                            -Type Managed `
                                            -CatalogServerEndpoint $CatalogServerEndpoint `
                                            -CatalogAdminCredential $ServerCreds `
                                            -CatalogPricingTier $CatalogPricingTier `
                                            -Description $IntegrationRuntimeDescription `
                                            -Location $Location `
                                            -NodeSize $IntegrationRuntimeNodeSize `
                                            -NodeCount $IntegrationRuntimeNodeCount `
                                            -MaxParallelExecutionsPerNode $IntegrationRuntimeParallelExecutions 


# Start the Integration Runtime (takes 20 to 30 minutes)
Write-Warning "Starting your integration runtime. This command takes 20 to 30 minutes to complete."
Start-AzureRmDataFactoryV2IntegrationRuntime -ResourceGroupName $ResourceGroupName `
                                             -DataFactoryName $DataFactoryName `
                                             -Name $IntegrationRuntimeName `
                                             -Force

Write-Host "Done"


Note: if you get the error message "There is no active worker agent" while executing an SSIS package, you probably need to start the Integration Runtime (see Start-AzureRmDataFactoryV2IntegrationRuntime).
There is no active worker agent.
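A quick way to avoid this is to check the status of the Integration Runtime before executing packages. The sketch below is illustrative: it assumes the status object exposes a State property and that the parameter variables from the script above are still in scope.
# PowerShell code snippet (illustrative): check the Integration Runtime state
$IrStatus = Get-AzureRmDataFactoryV2IntegrationRuntime -ResourceGroupName $ResourceGroupName `
                                                       -DataFactoryName $DataFactoryName `
                                                       -Name $IntegrationRuntimeName `
                                                       -Status
if ($IrStatus.State -ne "Started")
{
    # Not running yet: start it (takes 20 to 30 minutes)
    Write-Warning "Integration runtime is $($IrStatus.State); starting it now."
    Start-AzureRmDataFactoryV2IntegrationRuntime -ResourceGroupName $ResourceGroupName `
                                                 -DataFactoryName $DataFactoryName `
                                                 -Name $IntegrationRuntimeName `
                                                 -Force
}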

Sunday 29 October 2017

Calculating Hash values in SSIS

Case
I want to calculate a hash value for a couple of columns in SSIS. In T-SQL you can use HASHBYTES, but that doesn't work for other sources like flat files, and for SQL Server 2012/2014 the input is limited to only 8000 bytes. Is there an alternative for HASHBYTES?
Calculating a hash value over multiple columns

Solution
There are several alternatives for the T-SQL HASHBYTES function. First of all there are various custom components available for SSIS, like the SSIS Multiple Hash on CodePlex, but if you don't want to (or cannot) use custom components, you can accomplish the same result with a little .NET scripting. If you really want to stick to T-SQL, then you can also first stage your files in a table and calculate the hash with T-SQL afterwards. This blog will show you the scripting solution.

But first, why do you need a hash? When you want to keep track of history with a Persistent Stage, Data Vault or Data Warehouse, you want to know whether the record from the stage layer is different from the one you have in your historical data layer. You could check each column one by one, but when you have a whole bunch of columns that could be a lot of work and a bit slow.

A hash in ETL is used to generate a single, corresponding (but shorter) value for a whole bunch of columns. It is stored in the stage table as a new column. If one character changes in one of those columns then the hash value will also be different. When comparing the two records (one from the stage layer and one from the historical layer) you now only have to compare the hash value. If it did not change you know you don't have to process the record in your historical layer. Since you only want to calculate the hash once (in the stage package) you will also store it in the historical layer.

Now it is time to explain the scripting solution.

1) Starting point
The starting point of this example is a Data Flow Task with a Flat File source component.
Flat File Source

2) Script Component - Input Columns
Add a new Script Component (transformation) to the Data Flow Task. Give it a suitable name and connect it to your flow. Then edit it and select all columns you want to hash on the Input Columns pane. Since we are not changing the existing column you can keep the default Usage Type 'ReadOnly'.
Script Component Input Columns

Which columns do you want to hash? The three most common options:
  1. If you do not know the Primary Key: select all columns to calculate the hash.
  2. If you do know the Primary Key: select all columns except the Primary Key to calculate the hash.
  3. If the Primary Key consists of multiple columns you could even calculate a separate hash for the key columns only.
3) Script Component - Output Column
We need to store the calculated hash in a new column. Go to the Inputs and Outputs pane and add a new column in Output 0. The data type is string and the size depends on which hash algorithm you want to use. For this example we use the MD5 algorithm, which returns a 128-bit hash. When you convert that to an ASCII string it becomes a 32-character string (that only contains hexadecimal digits).
Script Component Inputs and Outputs

4) Script Component - The script preparation
Now we are almost ready to add the actual script. Go to the Script pane and select your scripting language. This example will be in C#. Then hit the Edit Script... button to start the VSTA environment. This is a new instance of Visual Studio and will take a few moments to start.
Edit Script...

Optional:
I always start by removing all unnecessary methods and comments to keep the code clean. For this example we do not need the PreExecute and PostExecute methods and I do not want to keep the default help comments.
Clean up before start

5) Script Component - The code
First we need to add two extra usings to shorten the code. Unfold the Namespaces region at the top and add the following usings:
using System.Security.Cryptography;
using System.Text;

Then locate the method called Input0_ProcessInputRow and add a new GetMd5Hash method below this existing method (below its closing }). The new method is copied from this MSDN page. I only changed the encoding to Unicode (see Note 1):
static string GetMd5Hash(MD5 md5Hash, string input)
{
    // Convert the input string to a byte array and compute the hash.
    byte[] data = md5Hash.ComputeHash(Encoding.Unicode.GetBytes(input));

    // Create a new Stringbuilder to collect the bytes
    // and create a string.
    StringBuilder sBuilder = new StringBuilder();

    // Loop through each byte of the hashed data
    // and format each one as a hexadecimal string.
    for (int i = 0; i < data.Length; i++)
    {
        sBuilder.Append(data[i].ToString("x2"));
    }

    // Return the hexadecimal string.
    return sBuilder.ToString();
}


And at last change the code of the existing method Input0_ProcessInputRow to:
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    string Separator = "|";
    string RowData = "";

    using (MD5 md5Hash = MD5.Create())
    {
        Row.Hash = GetMd5Hash(md5Hash, RowData = (Row.Title_IsNull ? "" : Row.Title.ToString()) + Separator + (Row.FirstName_IsNull ? "" : Row.FirstName.ToString()) + Separator + (Row.MiddleName_IsNull ? "" : Row.MiddleName.ToString()) + Separator + (Row.LastName_IsNull ? "" : Row.LastName.ToString()));
    }
}


The code above first concatenates all columns with a separator between them (see Note 2) and checks whether the value isn't NULL, because we cannot add NULL to a string (see Note 3). You will see that it repeats this piece of code for each column before calling the hash method:
(Row.Title_IsNull ? "" : Row.Title.ToString()) + Separator
For the first record in our example it will hash the following text: Mr.|Syed|E|Abbas
And for the third row, which contains a null value, it will hash this text: Ms.|Kim||Abercrombie

6) Testing the code
After closing the VSTA editor and clicking OK in the Script Component to close it, add a dummy Derived Column behind it and add a Data Viewer to see the result.
Similar result to T-SQL HASHBYTES

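If you want to double-check the value shown in the Data Viewer outside SSIS, a small PowerShell sketch like the one below should return the same 32-character hash for the first example record, assuming the same Unicode encoding and '|' separator used in the script above.
# PowerShell code snippet (illustrative cross-check)
$Md5 = [System.Security.Cryptography.MD5]::Create()
# Hash the concatenated columns of the first record (Unicode, '|' separator)
$Bytes = $Md5.ComputeHash([System.Text.Encoding]::Unicode.GetBytes("Mr.|Syed|E|Abbas"))
# Format every byte as a two-digit hexadecimal string, just like the C# script
($Bytes | ForEach-Object { $_.ToString("x2") }) -join ""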
Note 1:
When you want the exact same result as with T-SQL HASHBYTES then you have to make sure you use the same encoding, otherwise you get a different hash. In the method GetMd5Hash, on the first line of code, you see Encoding.Unicode.GetBytes(. There are more options besides Unicode, for example ASCII, UTF7, UTF8, UTF32, etc. However, as long as you don't have to compare hashes generated by two different methods (T-SQL and .NET), it doesn't matter. In this Stack Overflow post you will find more examples.

Note 2:
The column separator is added to prevent unwanted matches. If you have these two records with two columns:
Column1 | Column2
123     | 456
12      | 3456
Without the separator these two will both get concatenated to 123456 and therefore generate the same hash. With the separator you will have two different values to hash: 123|456 and 12|3456. Choose your separator wisely. The number 3 would not be a wise choice in this case.

Note 3:
In the code you see that the columns are checked for null values, because you cannot add null to a string. The null values are replaced with an empty string. However, this shows a bit of an imperfection of this method, because a string with a null value isn't the same as an empty string. To overcome this you could use a replacement string that is unlikely to occur in your data. For Numeric and Date data types you could just add an empty string, something like:
(Row.MyNumberColumn_IsNull ? "" : Row.MyNumberColumn.ToString()) + Separator
(Row.MyDateColumn_IsNull ? "" : Row.MyDateColumn.ToString()) + Separator


Note 4:
MD5 only uses 128 bits and there are better, safer (but also a bit slower) methods to calculate hashes:
SHA and SHA1 - 160 bits
SHA2_256 - 256 bits
SHA2_512 - 512 bits

Safer? As long as you don't use it to hash passwords you are still OK with MD5.
Better? In rare cases two different strings could return the same MD5 hash, but you have a higher chance of winning the galaxy lottery.

Rather use SHA2_512? Just use this code instead:
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    string Separator = "|";
    string RowData = "";

    using (SHA512 shaHash = new SHA512Managed())
    {
        Row.hash2 = GetShaHash(shaHash, RowData = (Row.Title_IsNull ? "" : Row.Title.ToString()) + Separator + (Row.FirstName_IsNull ? "" : Row.FirstName.ToString()) + Separator + (Row.MiddleName_IsNull ? "" : Row.MiddleName.ToString()) + Separator + (Row.LastName_IsNull ? "" : Row.LastName.ToString()));
    }
}

static string GetShaHash(SHA512 shaHash, string input)
{
    // Convert the input string to a byte array and compute the hash.
    byte[] data = shaHash.ComputeHash(Encoding.Unicode.GetBytes(input));

    // Create a new Stringbuilder to collect the bytes
    // and create a string.
    StringBuilder sBuilder = new StringBuilder();

    // Loop through each byte of the hashed data
    // and format each one as a hexadecimal string.
    for (int i = 0; i < data.Length; i++)
    {
        sBuilder.Append(data[i].ToString("x2"));
    }

    // Return the hexadecimal string.
    return sBuilder.ToString();
}

Too many columns => too much coding?
In my book Extending SSIS with .NET Scripting you will find a Script Component example that just loops through all columns to generate the hash. No money to buy it? I used this code as the base for that script.
Another alternative is to generate the Script Component and its code with BIML. Here is an example of a Script Component in BIML; getting the hash to work is a bit of a challenge, but doable.

Tuesday 29 August 2017

Azure Data Lake Store in SSIS

Case
Microsoft just released a new Azure Feature Pack for SSIS with ADLS connectivity. What's new?
Azure Data Lake Store File System Task

Solution
It contains four new items and a new connection manager:
- Azure Data Lake Store File System Task
- Foreach ADLS File Enumerator
- Azure Data Lake Store Source
- Azure Data Lake Store Destination

Azure Data Lake Store File System Task
This task only allows you to upload files to or download files from the Azure Data Lake Store. This is similar to the Azure Blob Upload / Download Task. In the near future new operations will be added. A delete file or delete folder operation would be a handy addition.
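Until such operations are added, a possible workaround is deleting files with PowerShell. The sketch below is only an illustration: it assumes the AzureRM.DataLakeStore module is installed and you are logged in with Login-AzureRmAccount, and the account and path names are hypothetical.
# PowerShell code snippet (illustrative workaround, not part of the task)
# Delete a file from the Data Lake Store with the AzureRM.DataLakeStore module
Login-AzureRmAccount
Remove-AzureRmDataLakeStoreItem -AccountName "myadlsaccount" `
                                -Paths "/myFolder/myFile.csv" `
                                -Force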

1) Start
First download and install the new Azure Feature Pack. Then check the Azure Folder in the SSIS Toolbox and drag the Azure Data Lake Store File System Task to the surface. Give it a suitable name.
Azure Data Lake Store File System Task

2) Operation and source
Edit the new task and select an Operation. For this example I will use the CopyToADLS operation. Then we first need to specify where the files are located on the local machine. This is a hardcoded path but can be overwritten with an expression. The FileNamePattern is a wildcard with ? or *. I use *.csv to upload all csv files in that folder. SearchRecursively allows you to find files in subfolders.
Specify local source

3) Destination - Connection manager
Next we need to create a new ADLS connection manager or select an existing one.
ADLS Connection Manager

As Host you can use the URL property from the ADLS Overview page. Go to the Azure portal and copy that URL.
URL = ADLS Host

For this example I will use the easier authentication option: Azure AD User Identity. It uses your email address and password from Azure. The Azure AD Service Identity will be handled in a later post.
ADLS Connection Manager

When hitting OK or Test Connection it will open an Azure login page, where you need to log in and confirm that SSIS can connect to that ADLS.

4) Destination - ADLS folder
Next we need to specify a folder name or path. You can either specify the name of an existing folder or a new folder name that will be created when executed. To find which existing folders you have, you can use the Data Explorer page in ADLS.
Data Explorer

Specify Folder

The FileExpiry option lets you specify the date that will be used to expire the files in ADLS. You can leave it empty to never expire.

5) The result
Now run the task/package to see the result. Use Data Explorer in ADLS to see the actual result.
Data Explorer

Foreach ADLS File Enumerator
The Foreach ADLS File Enumerator is a new enumerator for the Foreach Loop Container. It allows you to loop through an ADLS folder and return the paths of the files. It is very similar to the Azure Blob Enumerator. You can use this enumerator with the Azure Data Lake Store Source in the Data Flow Task.

1) Select Enumerator
When you select the ADLS File Enumerator, you need to specify the Connection Manager (see step 3 of the task above), the remote folder (use the Data Explorer to find an existing folder), and then the wildcard and the Search recursive option.
Collection

2) Variable Mappings
In the Variable Mappings pane you need to map the first item of the collection (zero based) to an SSIS string variable.
Variable Mappings

3) The Result
To show the content of the variable during execution, I added a simple Script Task and a little C# code: MessageBox.Show(Dts.Variables["User::filepath"].Value.ToString());
MessageBox.Show

Azure Data Lake Store Source
This allows you to use files from the Azure Data Lake Store as a source in SSIS. Again very similar to the Azure Blob Source.

1) Edit Source
Drag the Azure Data Lake Store Source to the surface and give it a suitable name. Then edit the source and specify the connection manager, File Path and format. You cannot specify the data type or size. In this first test everything became (DT_WSTR,100).
Azure Data Lake Store Source

2) The Result
To test the result (with a very small file) I added a dummy Derived Column and a Data Viewer.
Azure Data Lake Store Source

Azure Data Lake Store Destination
This allows you to stream your Data Flow Task data to Azure Data Lake Store. Again very similar to the Azure Blob Destination.

1) Edit Destination
Add an Azure Data Lake Store Destination after your source or transformation and give it a suitable name. You can specify the connection manager, file path and the file format option.
ADLS destination - test3.csv

2) The Result
To test it, run the package and open the Data Explorer in ADLS to see the result.
Data Explorer

Conclusion
A much-needed ADLS extension for the Azure Feature Pack, but nothing spectacular compared to the Blob Storage items in this feature pack. Hopefully the Azure Data Lake Store File System Task will soon be extended with new actions, and perhaps they could also introduce an Azure Blob Storage File System Task.

Wednesday 23 August 2017

Where is the new SSIS project type?

Case
I just downloaded the new SSDT for Visual Studio 2017 (15.3.0 preview) and although I selected Integration Services during installation it isn't showing in the New Project window.
SSDT for Visual Studio 2017 (15.3.0 preview)

Solution
You probably already had Visual Studio 2017 installed and added the Analysis Services and Reporting Services project types via Extensions and Updates.


1) Remove projects
Go to Extensions and Updates and remove both the Analysis Services and Reporting Services projects.
Remove SSAS and SSRS projects

After that, close Visual Studio and let the VSIX installer uninstall both projects.
VSIX (un)installer

2) Uninstall SSDT and Install SSDT
Then first uninstall SSDT completely and install SSDT again (Repair didn't work).
Uninstall and install SSDT

3) Create new SSIS Project
Now open Visual Studio and create a new SSIS project.
New SSIS Project

Wednesday 26 July 2017

Active Directory as Source

Case
I want to use the users from my Active Directory in my Data Warehouse for an Employee Dimension. How can I accomplish that in SSIS?

Solution
With a little .NET scripting in a Script Component you can create an Active Directory source in your Data Flow Task. Before you start, you need to figure out which fields are available in your Active Directory. If you don't know the field names, you could set a breakpoint on the foreach loop and add a watch on the result variable. Then you can browse through the properties of that variable to find all available fields.
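As an alternative to the debugger approach, you could also list the available field names up front with a few lines of PowerShell. This is just a sketch: the LDAP filter mirrors the example used further below and the output is simply the property names of the first user found.
# PowerShell code snippet (illustrative): list available Active Directory fields
$Searcher = [adsisearcher]"(&(objectClass=user)(department=*)(givenname=j*))"
$Result = $Searcher.FindOne()
# Show all field names available for this user
$Result.Properties.PropertyNames | Sort-Object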

1) Script Component Source
Create a Data Flow Task, add a Script Component and choose "Source" as the type. After that, give your source a suitable name like "SRC_SCR - Active Directory Users".
Script Component - Source

2) Output columns
Next edit the Script Component and go to the Inputs and Outputs pane. This is where we need to specify all the new output columns and data types. For my Active Directory example I have added 5 string (DT_STR) columns: Fullname, Firstname, Surename, Department and Manager. You have to figure out the required length. You could start with the default 50 or change it to a safer 255.
Add new output columns

3) Edit script
Go back to the first page, choose C# as your scripting language and then hit the Edit Script button to open the VSTA environment.
Edit script

4) Reference
To tell our script about Active Directory, we first need to add a new reference to the .NET assembly System.DirectoryServices. In the Solution Explorer right-click References and choose Add Reference... Then scroll down, select System.DirectoryServices and click OK to confirm.
Add reference

5) The Script - Namespaces
Under namespaces add a new using for our newly referenced assembly: using System.DirectoryServices; This allows you to shorten the code.
Add using

6) The Script - CreateNewOutputRows
Now replace your CreateNewOutputRows method with the one below and add the GetPropertyValue method below it, inside your class, under the existing methods. You could remove the PostExecute and PreExecute methods since we don't need them in this example.

The CreateNewOutputRows method adds new rows to the buffer by calling Output0Buffer.AddRow(); within a loop construction and then it fills all columns with data from the Active Directory. Before the foreach loop you find the code to connect to the Active Directory with some optional filter.
public override void CreateNewOutputRows()
{
    // Specify the connection string of your domain
    // @mycompany.com => LDAP://DC=mycompany,dc=com
    // Consider using a variable or parameter instead
    // of this hardcoded value. On the other hand,
    // how often does your domain change?
    string domainConnectionString = "LDAP://DC=ilionx,dc=com";

    using (DirectorySearcher ds = new DirectorySearcher(new DirectoryEntry(domainConnectionString)))
    {
        ds.Filter = "(&" +
                    "(objectClass=user)" +  // Only users and not groups
                    "(department=*)" +      // All departments
                    "(givenname=j*)" +      // Firstname starts with j
                    ")";
        // See ds. for more options like PageSize.
        //ds.PageSize = 1000;

        // Find all persons matching your filter
        using (SearchResultCollection results = ds.FindAll())
        {
            // Loop through all rows of the search results
            foreach (SearchResult result in results)
            {
                // Add a new row to the buffer
                Output0Buffer.AddRow();
                // Fill all columns with the value from the Active Directory
                Output0Buffer.Fullname = GetPropertyValue(result, "cn");
                Output0Buffer.Surename = GetPropertyValue(result, "sn");
                Output0Buffer.Department = GetPropertyValue(result, "department");
                Output0Buffer.Manager = GetPropertyValue(result, "manager");
                Output0Buffer.Firstname = GetPropertyValue(result, "givenname");
            }
        }
    }
}

// Extra method to avoid having an if construction around each column
// It checks whether it can find a value. If not it returns an empty string.
private static string GetPropertyValue(SearchResult Results, string Property)
{
    // Null value results in count zero
    if (Results.Properties[Property].Count > 0)
    {
        return Results.Properties[Property][0].ToString();
    }
    else
    {
        return "";
    }
}


7) The result
To test the result add a dummy Derived Column and a Data Viewer to see the values. If it works you can add the rest of the components to the Data Flow Task to fill your dimension table.
The result of my script

Summary
A very simple and short script to get data from your Active Directory. For more filter examples visit MSDN. For very large Active Directories you have to play with the PageSize property.
Note that there are also scriptless options, like an ADO.NET source or third-party components. Each solution has its own pros and cons.
