

Source database: PostgreSQL, hosted on an Azure VM (D16s_v3)
Destination database: SQL Server Developer edition, hosted on an Azure VM (D4s_v3)

The source database is around 1 TB in size. The destination database is empty, with an existing schema identical to the source database.

Throughput is only 1 MB/s, and nothing helps (I've already selected the maximum DIU). SQL Server doesn't have any keys or indexes at this point.

The batch size is 10,000.

See screenshot: [screenshot not included]
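For reference, here is roughly how these settings sit in the Copy activity definition, written out as a Python dict that mirrors the pipeline JSON (the activity and dataset names are placeholders, not from the actual pipeline):

    # Sketch of the Copy activity described above; dataset names are
    # placeholders. writeBatchSize and dataIntegrationUnits are the two
    # settings mentioned in the question.
    copy_activity = {
        "name": "CopyPostgresToSqlServer",
        "type": "Copy",
        "inputs": [{"referenceName": "PostgresSourceDataset", "type": "DatasetReference"}],
        "outputs": [{"referenceName": "SqlSinkDataset", "type": "DatasetReference"}],
        "typeProperties": {
            "source": {"type": "PostgreSqlSource"},
            "sink": {"type": "SqlSink", "writeBatchSize": 10000},  # batch size from above
            "dataIntegrationUnits": 32,  # "max DIU"; the actual ceiling varies by scenario
        },
    }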

1 Answer

  • The same task that is done by Azure Data Factory can also be done by SSIS, which can be up to five times faster.

  • Sometimes we are forced to build custom applications that handle each of these processes individually, which is time-consuming, and integrating all of these sources is a huge pain. We need a way to automate this process or create proper workflows.

  • Data Factory helps orchestrate this complete process in a more manageable and organized manner.

  • The Azure Data Factory runtime decimal type has a maximum precision of 28. If a decimal/numeric value from the source has higher precision, Data Factory first casts it to a string, and the string-casting path performs extremely poorly. (A workaround is sketched after this list.)

  • For the Copy activity performance and tuning guide, you can refer to this documentation (the main parallelism settings it covers are sketched below):

https://docs.microsoft.com/en-us/azure/data-factory/copy-activity-performance
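On the decimal precision point above: a common workaround is to cast over-wide numeric columns down inside the source query, so the copy never takes the decimal-to-string fallback. A minimal sketch, assuming a hypothetical table public.orders with an unbounded numeric column amount:

    # Hypothetical source settings for the Copy activity: cast numeric
    # columns to a precision the Data Factory interim decimal type can
    # hold (at most 28) so the slow string-casting path is never taken.
    # The table and column names are placeholders, not from the post.
    source = {
        "type": "PostgreSqlSource",
        "query": (
            "SELECT id, "
            "       CAST(amount AS numeric(28, 6)) AS amount, "
            "       created_at "
            "FROM public.orders"
        ),
    }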
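As for the tuning guide itself, two of the settings it covers for a single large copy are dataIntegrationUnits and parallelCopies, both set in the Copy activity's typeProperties. The values below are illustrative only, not recommendations measured against this workload:

    # Illustrative values only; tune per the performance guide above.
    tuning = {
        "dataIntegrationUnits": 32,  # scales a single copy run up
        "parallelCopies": 8,         # degree of parallelism within the copy
    }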
