
I'd like to set up an Azure Data Factory pipeline that performs a move (i.e. copy, verify, delete) rather than just a copy between Blob Storage and a Data Lake Store. I cannot seem to find any detail on how to do this.

1 Answer

  • Azure Data Factory does not have a built-in move operation, but you can achieve the same result with a copy operation followed by a delete of the verified source data. The steps below use the Copy Data tool; a programmatic sketch follows the list.

  • Create a new data factory

  • After the data factory is deployed, click on Author & Monitor

  • Now select Copy Data

  • Set the source data store to your Blob Storage account, select the container or files you want to copy, and create a linked service for it.

  • Set the destination data store to the Data Lake Store and create a linked service for it.

  • Finally, run the pipeline to copy the data from Blob Storage to the Data Lake Store; once you have verified the copy, delete the source blobs to complete the move (see the sketch below).
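If you prefer to define the pipeline in code rather than through the Copy Data tool, here is a minimal sketch using the azure-mgmt-datafactory Python SDK. It assumes the input and output datasets and their linked services already exist; every angle-bracket value and every dataset/pipeline name here is a placeholder, not something from the original answer:

# Minimal sketch: a Blob -> Data Lake Store copy pipeline, created and
# run through the azure-mgmt-datafactory SDK. All names are placeholders.
from azure.identity import ClientSecretCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineResource, CopyActivity, DatasetReference,
    BlobSource, AzureDataLakeStoreSink,
)

credential = ClientSecretCredential(
    tenant_id="<tenant-id>",
    client_id="<service-principal-id>",
    client_secret="<service-principal-key>",
)
adf_client = DataFactoryManagementClient(credential, "<subscription-id>")

# One copy activity: read from the blob dataset, write to the ADLS dataset
copy_activity = CopyActivity(
    name="CopyBlobToAdls",
    inputs=[DatasetReference(reference_name="BlobInputDataset")],
    outputs=[DatasetReference(reference_name="AdlsOutputDataset")],
    source=BlobSource(),
    sink=AzureDataLakeStoreSink(),
)
adf_client.pipelines.create_or_update(
    "<resource-group>", "<factory-name>", "CopyBlobToAdlsPipeline",
    PipelineResource(activities=[copy_activity]),
)

# Trigger the pipeline; keep the run id to monitor and verify the copy
run = adf_client.pipelines.create_run(
    "<resource-group>", "<factory-name>", "CopyBlobToAdlsPipeline", parameters={}
)
print(run.run_id)

After the run succeeds and you have verified the output, you can delete the source blobs (for example with ADF's Delete activity or the azure-storage-blob SDK) to turn the copy into a move.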

 

While creating the linked service for the Data Lake Store, you are required to enter a service principal ID and key. Follow these steps to obtain those credentials:

 

  • Go to Azure dashboard

  • Click on Azure Active Directory

  • Click on App registrations

  • Register a new web application; the Application (client) ID that is generated is the service principal ID

  • Go to Certificates & secrets and create a new client secret; the secret value that is generated is the service principal key

  • Go to the Azure Data Lake Store and grant the registered application the access it needs; you can then plug these credentials into the linked service, as in the sketch below
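Once you have the tenant ID, service principal ID, and key, you can also create the Data Lake Store linked service programmatically. A minimal sketch with the same azure-mgmt-datafactory SDK (adf_client is the management client from the earlier sketch; all angle-bracket values are placeholders):

from azure.mgmt.datafactory.models import (
    LinkedServiceResource, AzureDataLakeStoreLinkedService, SecureString,
)

# Linked service pointing at the Data Lake Store, authenticated with the
# service principal credentials obtained in the steps above
adls_linked_service = AzureDataLakeStoreLinkedService(
    data_lake_store_uri="https://<store-name>.azuredatalakestore.net/webhdfs/v1",
    service_principal_id="<service-principal-id>",
    service_principal_key=SecureString(value="<service-principal-key>"),
    tenant="<tenant-id>",
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
)
adf_client.linked_services.create_or_update(
    "<resource-group>", "<factory-name>", "AdlsLinkedService",
    LinkedServiceResource(properties=adls_linked_service),
)

Wrapping the key in a SecureString keeps it out of plain-text pipeline definitions; in practice you may prefer to reference the secret from Azure Key Vault instead of embedding it.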
