Can I have a copy activity with two sinks in an Azure Data Factory pipeline?
I have one source and two sinks (one is an Azure Data Lake Store for downstream processing, the other is Blob Storage for archiving).

1 Answer

Yes, certainly. Simply add a second copy activity to the same pipeline, using the same input dataset but a different output dataset.

The pipeline JSON will look roughly like this (the SQL sinks shown are just an example; use the sink type that matches each of your output datasets):

{
    "$schema": "http://datafactories.schema.management.azure.com/schemas/2015-09-01/Microsoft.DataFactory.Pipeline.json",
    "name": "CopyActivity1",
    "properties": {
        "description": "Copy data from a blob to two Azure SQL tables",
        "activities": [
            {
                "name": "CopyActivityTemplate",
                "type": "Copy",
                "inputs": [
                    {
                        "name": "AzureBlobLocation1"
                    }
                ],
                "outputs": [
                    {
                        "name": "AzureSqlTableLocation1"
                    }
                ],
                "typeProperties": {
                    "source": {
                        "type": "BlobSource"
                    },
                    "sink": {
                        "type": "SqlSink"
                    }
                }
            },
            {
                "name": "CopyActivityTemplate2",
                "type": "Copy",
                "inputs": [
                    {
                        "name": "AzureBlobLocation1"
                    }
                ],
                "outputs": [
                    {
                        "name": "AzureSqlTableLocation2"
                    }
                ],
                "typeProperties": {
                    "source": {
                        "type": "BlobSource"
                    },
                    "sink": {
                        "type": "SqlSink"
                    }
                }
            }
        ],
        "start": "2016-12-05T22:00:00Z",
        "end": "2016-12-06T01:00:00Z"
    }
}
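
Since your two sinks are an Azure Data Lake Store (for downstream processing) and Blob Storage (for archiving), each copy activity simply points at a different output dataset. As a rough sketch only (the dataset name AzureBlobArchiveLocation, the linked service name AzureStorageLinkedService, and the folder path are placeholder assumptions, not from your setup), the Blob archive output dataset could look something like this:

{
    "name": "AzureBlobArchiveLocation",
    "properties": {
        "type": "AzureBlob",
        "linkedServiceName": "AzureStorageLinkedService",
        "typeProperties": {
            "folderPath": "archive/",
            "format": {
                "type": "TextFormat"
            }
        },
        "availability": {
            "frequency": "Hour",
            "interval": 1
        }
    }
}

In the copy activity that writes to this dataset, the sink type would be "BlobSink"; for the Data Lake Store output it would be "AzureDataLakeStoreSink".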

You can enroll for the Azure Data Factory Certification from Intellipaat.
