File Encryption And Decryption Using ADF

DESCRIPTION:

Files will be placed on the SFTP server. They should be staged in blob storage, and from the blob they will be loaded into the Azure SQL database. After loading into the SQL DB, they should be archived as encrypted files in the Archive folder.

DATA FLOW:

SOLUTION:

Entire pipeline in ADF



STEP1: 

From the SFTP server, files will be loaded into the staging storage, i.e., blob storage. The files in it are as follows:
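This step can be implemented with a Copy activity whose source dataset points to an SFTP linked service and whose sink dataset points to the staging blob container; since the files are only being staged, a simple file copy as-is is sufficient here (the linked service and dataset setup is not detailed in this post).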


STEP2: 

As a next step, the multiple files with the same structure in blob storage will be loaded into the Azure SQL DB table. The data in the SQL DB is as follows:
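Because the files share the same structure, a single Copy activity can load all of them in one run, for example by pointing the source at the staging container and using a wildcard file path (e.g., *.csv, assuming the files are CSVs) in the source settings, with the Azure SQL DB table as the sink dataset.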


STEP3: 

Using the “Get Metadata” activity, extract the list of files from the staging blob storage as “Child items”.
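With “Child items” selected in the field list, the activity output exposes a childItems array in which each entry carries the file’s name and type, for example (file names here are illustrative): {"childItems": [{"name": "file1.csv", "type": "File"}, {"name": "file2.csv", "type": "File"}]}.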

STEP4: 

Using parallel processing in the “For Each” activity, iterate through the child items extracted in the step above.
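In the For Each activity settings, the Items property is set to the Get Metadata output, e.g. @activity('Get Metadata1').output.childItems (the activity name 'Get Metadata1' is assumed here), and the “Sequential” option is left unchecked so the files are processed in parallel.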

STEP5: 

Using a “Web” activity, call the Azure Function, passing the filename as a parameter by concatenating it to the URL as shown below.

URL: @concat('https://fn-encrypt.azurewebsites.net/api/HttpTrigger1?File_name=',item().name)

METHOD: GET

HEADERS:

NAME: x-functions-key

VALUE: n78BD9RuKkRnYMGJdrNy3YwFh/GLfTXmWBg0PT6bl27ralh2Apoclg==

CODE USED FOR ENCRYPTION:
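Below is a minimal Python sketch of what an HTTP-triggered function such as HttpTrigger1 could look like, assuming Fernet symmetric encryption from the cryptography package and that the file is read from the staging container; the container name, app-setting names, and key handling are illustrative assumptions, not the original implementation.

import os
import azure.functions as func
from azure.storage.blob import BlobServiceClient
from cryptography.fernet import Fernet

def main(req: func.HttpRequest) -> func.HttpResponse:
    # File name is passed by the ADF Web activity as a query parameter.
    file_name = req.params.get('File_name')
    if not file_name:
        return func.HttpResponse("File_name query parameter is required.", status_code=400)

    # Connection string and container name come from app settings (illustrative names).
    blob_service = BlobServiceClient.from_connection_string(os.environ["STAGING_STORAGE_CONNECTION"])
    blob_client = blob_service.get_blob_client(container="staging", blob=file_name)
    plaintext = blob_client.download_blob().readall()

    # Symmetric key stored as an app setting, generated once with Fernet.generate_key().
    fernet = Fernet(os.environ["ENCRYPTION_KEY"].encode())
    encrypted = fernet.encrypt(plaintext)

    # The response body is what the ADF Web activity exposes as output.Response.
    return func.HttpResponse(encrypted.decode(), status_code=200)

The encrypted text returned in the response body is what the Web activity exposes as output.Response, and it is this value that the Copy activity in STEP6 writes to blob storage.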

CODE USED FOR DECRYPTION:
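A companion decryption function, sketched under the same assumptions (same Fernet key; container and setting names are illustrative), would read an encrypted file and return the plaintext:

import os
import azure.functions as func
from azure.storage.blob import BlobServiceClient
from cryptography.fernet import Fernet

def main(req: func.HttpRequest) -> func.HttpResponse:
    # File name of the encrypted file is passed as a query parameter.
    file_name = req.params.get('File_name')
    if not file_name:
        return func.HttpResponse("File_name query parameter is required.", status_code=400)

    # Read the encrypted file from the archive container (illustrative names).
    blob_service = BlobServiceClient.from_connection_string(os.environ["STAGING_STORAGE_CONNECTION"])
    blob_client = blob_service.get_blob_client(container="archive", blob=file_name)
    encrypted = blob_client.download_blob().readall()

    # Decrypt with the same symmetric key that was used for encryption.
    fernet = Fernet(os.environ["ENCRYPTION_KEY"].encode())
    plaintext = fernet.decrypt(encrypted)

    return func.HttpResponse(plaintext.decode(), status_code=200)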

STEP6: 

Once the Web activity returns the encrypted data, use a Copy activity to write the Web activity output into blob storage. The process is as follows:

6.1. Upload a text file (e.g., template.txt) containing some placeholder text such as “my data” (it only acts as a template; any data can be written into it).

6.2. In the Copy activity source, take Azure Blob Storage as the dataset and select the uploaded “template.txt” file in it.

6.3. Create an additional column, give it a name, and assign the Web activity output to it as the value.

Provide the wildcard folder path, and the wildcard file name as “item().name”.

Additional Columns:

NAME: MYDATA

VALUE: @activity('Azure function for file encryption').output.Response

6.4. In the sink, select Azure Blob Storage as the destination where the encrypted data has to be written. In the mapping, import the schema and map the additional column “MYDATA” to the auto-generated “Column1”.

STEP7:

Publish the pipeline and “Debug” it. The encrypted files will be loaded into the specified blob storage, and the data is as follows:
