Yes, certainly. Simply add a second Copy activity to the same pipeline that uses the same input dataset but a different output dataset.
The pipeline JSON will then look like this:
{
  "$schema": "http://datafactories.schema.management.azure.com/schemas/2015-09-01/Microsoft.DataFactory.Pipeline.json",
  "name": "CopyActivity1",
  "properties": {
    "description": "Copy data from blob to a sql server table",
    "activities": [
      {
        "name": "CopyActivityTemplate",
        "type": "Copy",
        "inputs": [ { "name": "AzureBlobLocation1" } ],
        "outputs": [ { "name": "AzureSqlTableLocation1" } ],
        "typeProperties": {
          "source": { "type": "BlobSource" },
          "sink": { "type": "SqlSink" }
        }
      },
      {
        "name": "CopyActivityTemplate2",
        "type": "Copy",
        "inputs": [ { "name": "AzureBlobLocation1" } ],
        "outputs": [ { "name": "AzureSqlTableLocation2" } ],
        "typeProperties": {
          "source": { "type": "BlobSource" },
          "sink": { "type": "SqlSink" }
        }
      }
    ],
    "start": "2016-12-05T22:00:00Z",
    "end": "2016-12-06T01:00:00Z"
  }
}
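If you build pipeline definitions programmatically rather than by hand, the same fan-out pattern (one input dataset feeding two Copy activities with different outputs) can be sketched in Python. This is an illustrative sketch only; the make_copy_activity helper is hypothetical, not part of any Azure SDK, and it just emits the ADF v1 JSON shape shown above:

```python
import json

def make_copy_activity(name, input_ds, output_ds):
    # Hypothetical helper: builds one Copy activity dict
    # (blob source -> SQL sink) in the ADF v1 pipeline JSON shape.
    return {
        "name": name,
        "type": "Copy",
        "inputs": [{"name": input_ds}],
        "outputs": [{"name": output_ds}],
        "typeProperties": {
            "source": {"type": "BlobSource"},
            "sink": {"type": "SqlSink"},
        },
    }

pipeline = {
    "name": "CopyActivity1",
    "properties": {
        "description": "Copy data from blob to a sql server table",
        "activities": [
            # Same input dataset, two different output datasets.
            make_copy_activity("CopyActivityTemplate",
                               "AzureBlobLocation1", "AzureSqlTableLocation1"),
            make_copy_activity("CopyActivityTemplate2",
                               "AzureBlobLocation1", "AzureSqlTableLocation2"),
        ],
        "start": "2016-12-05T22:00:00Z",
        "end": "2016-12-06T01:00:00Z",
    },
}

print(json.dumps(pipeline, indent=2))
```

Generating the activities from a helper like this makes it easy to fan one input out to any number of output tables without copy-pasting JSON.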
You can enroll in the Azure Data Factory Certification from Intellipaat.