
Using Azure Data Factory, is it possible to perform a scheduled 1:1 copy/clone of all rows from multiple (but not all) tables residing in an Azure SQL DB to another Azure SQL DB (in this case an Azure SQL Data Warehouse), without having to explicitly define every table's schema/structure in the Data Factory JSON datasets?

The reason I'm asking is that the Data Factory tutorials I've seen so far (such as this one) explicitly define the table schemas for their copy activities. If we have hundreds of tables to copy, we would spend days writing table definitions in JSON (or script their generation, I suppose, but either way it seems like unnecessary effort for simply copying data).

(I should add that the on-premises SQL -> Azure DB/DW case with ADF is also of interest. I would really like to be able to move data to the cloud periodically and affordably without having to specify thousands of table schemas.)
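
To illustrate, a single per-table dataset with an explicitly defined schema looks roughly like this (the table, column, and linked-service names here are made up):

    {
      "name": "CustomersDataset",
      "properties": {
        "type": "AzureSqlTable",
        "linkedServiceName": {
          "referenceName": "SourceAzureSqlLinkedService",
          "type": "LinkedServiceReference"
        },
        "structure": [
          { "name": "CustomerId", "type": "Int32" },
          { "name": "Name", "type": "String" },
          { "name": "CreatedAt", "type": "DateTime" }
        ],
        "typeProperties": {
          "tableName": "dbo.Customers"
        }
      }
    }

Multiply that by hundreds of tables and the problem should be obvious.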

1 Answer

Azure Data Factory does not require a schema or structure definition for the tables, but a dataset still has to be defined for every table that is to be copied; there is no way around declaring one dataset per table.
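
For example, a minimal ADF v2 dataset for a single table can be as small as the sketch below (the linked-service and table names are placeholders); note that no "structure" (column list) is needed for a plain 1:1 copy:

    {
      "name": "SourceTableDataset",
      "properties": {
        "type": "AzureSqlTable",
        "linkedServiceName": {
          "referenceName": "SourceAzureSqlLinkedService",
          "type": "LinkedServiceReference"
        },
        "typeProperties": {
          "tableName": "dbo.MyTable"
        }
      }
    }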

Steps for copying from one Azure SQL Database to another Azure SQL Database (a rough JSON sketch of the generated pipeline and trigger follows the list):

  • Create a new data factory.

  • Once the data factory has been deployed, click Author & Monitor.

  • Select Copy Data.

  • Choose Azure SQL Database as the source data store, create a linked service for it, and select the tables you want to copy.

  • Choose Azure SQL Database as the destination data store and create a linked service for it.

  • Finally, run the pipeline to copy the data from the source database to the destination.
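
For reference, the Copy Data tool generates artifacts along these lines. This is only a hand-written sketch with placeholder names, and the exact source/sink type strings can differ between ADF versions:

    {
      "name": "CopyTablesPipeline",
      "properties": {
        "activities": [
          {
            "name": "CopyCustomers",
            "type": "Copy",
            "inputs": [
              { "referenceName": "SourceTableDataset", "type": "DatasetReference" }
            ],
            "outputs": [
              { "referenceName": "SinkTableDataset", "type": "DatasetReference" }
            ],
            "typeProperties": {
              "source": { "type": "AzureSqlSource" },
              "sink": { "type": "AzureSqlSink" }
            }
          }
        ]
      }
    }

To make the copy run on a schedule, as the question asks, a schedule trigger can be attached to the pipeline (again a placeholder sketch, here firing daily):

    {
      "name": "DailyCopyTrigger",
      "properties": {
        "type": "ScheduleTrigger",
        "typeProperties": {
          "recurrence": {
            "frequency": "Day",
            "interval": 1,
            "startTime": "2021-01-01T00:00:00Z"
          }
        },
        "pipelines": [
          {
            "pipelineReference": {
              "referenceName": "CopyTablesPipeline",
              "type": "PipelineReference"
            }
          }
        ]
      }
    }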
