in Azure by (17.6k points)

I am getting the following error while running a USQL Activity in the pipeline in ADF:

Error in Activity:

{
    "errorId": "E_CSC_USER_SYNTAXERROR",
    "severity": "Error",
    "component": "CSC",
    "source": "USER",
    "message": "syntax error. Final statement did not end with a semicolon",
    "details": "at token 'txt', line 3\r\nnear the ###:\r\n**************\r\nDECLARE @in string = \"/demo/SearchLog.txt\";\nDECLARE @out string = \"/scripts/Result.txt\";\nSearchLogProcessing.txt ### \n",
    "description": "Invalid syntax found in the script.",
    "resolution": "Correct the script syntax, using expected token(s) as a guide.",
    "helpLink": "",
    "filePath": "",
    "lineNumber": 3,
    "startOffset": 109,
    "endOffset": 112
}
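The "details" field of the error payload actually contains the script body that was submitted to the compiler, which makes the root cause visible: two generated DECLARE statements followed by the literal file name. A small inspection sketch in plain Python (just for reading the payload, not part of ADF):

```python
import json

# The error payload returned by the U-SQL activity (trimmed to the relevant fields).
error = json.loads(r'''{
  "errorId": "E_CSC_USER_SYNTAXERROR",
  "message": "syntax error. Final statement did not end with a semicolon",
  "details": "at token 'txt', line 3\r\nnear the ###:\r\n**************\r\nDECLARE @in string = \"/demo/SearchLog.txt\";\nDECLARE @out string = \"/scripts/Result.txt\";\nSearchLogProcessing.txt ### \n"
}''')

# The text after the asterisk divider is the script ADF actually submitted.
submitted = error["details"].split("**************\r\n", 1)[1]
print(submitted)
# Line 3 of the submitted script is the bare file name "SearchLogProcessing.txt",
# which is the token 'txt' the compiler complains about.
```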

Here are the output dataset, the pipeline, and the U-SQL script that I am trying to execute in the pipeline.

OutputDataset:

{
    "name": "OutputDataLakeTable",
    "properties": {
        "published": false,
        "type": "AzureDataLakeStore",
        "linkedServiceName": "LinkedServiceDestination",
        "typeProperties": {
            "folderPath": "scripts/"
        },
        "availability": {
            "frequency": "Hour",
            "interval": 1
        }
    }
}

Pipeline:

{

    "name": "ComputeEventsByRegionPipeline",

    "properties": {

        "description": "This is a pipeline to compute events for en-gb locale and date less than 2012/02/19.",

        "activities": [

            {

                "type": "DataLakeAnalyticsU-SQL",

                "typeProperties": {

                    "script": "SearchLogProcessing.txt",

                    "scriptPath": "scripts\\",

                    "degreeOfParallelism": 3,

                    "priority": 100,

                    "parameters": {

                        "in": "/demo/SearchLog.txt",

                        "out": "/scripts/Result.txt"

                    }

                },

                "inputs": [

                    {

                        "name": "InputDataLakeTable"

                    }

                ],

                "outputs": [

                    {

                        "name": "OutputDataLakeTable"

                    }

                ],

                "policy": {

                    "timeout": "06:00:00",

                    "concurrency": 1,

                    "executionPriorityOrder": "NewestFirst",

                    "retry": 1

                },

                "scheduler": {

                    "frequency": "Minute",

                    "interval": 15

                },

                "name": "CopybyU-SQL",

                "linkedServiceName": "AzureDataLakeAnalyticsLinkedService"

            }

        ],

        "start": "2017-01-03T12:01:05.53Z",

        "end": "2017-01-03T13:01:05.53Z",

        "isPaused": false,

        "hubName": "denojaidbfactory_hub",

        "pipelineMode": "Scheduled"

    }
}

Here is my U-SQL script, which I am trying to execute using the "DataLakeAnalyticsU-SQL" activity type.

@searchlog =

    EXTRACT UserId          int,

            Start           DateTime,

            Region          string,

            Query           string,

            Duration        int?,

            Urls            string,

            ClickedUrls     string

    FROM @in

    USING Extractors.Text(delimiter:'|');

@rs1 =
    SELECT Start, Region, Duration
    FROM @searchlog
    WHERE Region == "kota";

OUTPUT @rs1
    TO @out
    USING Outputters.Text(delimiter:'|');
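The script never declares @in and @out itself; the activity's "parameters" section is turned into DECLARE statements that ADF prepends to the submitted script, as the generated lines in the error's "details" field show. A sketch of that mapping (illustrative only, not ADF's actual code):

```python
# Sketch: how the U-SQL activity's "parameters" map onto DECLARE statements
# prepended to the submitted script (matching the lines seen in the error details).
parameters = {"in": "/demo/SearchLog.txt", "out": "/scripts/Result.txt"}

declares = [f'DECLARE @{name} string = "{value}";' for name, value in parameters.items()]
print("\n".join(declares))
# DECLARE @in string = "/demo/SearchLog.txt";
# DECLARE @out string = "/scripts/Result.txt";
```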

Please suggest how to resolve this issue.

1 Answer

by (47.2k points)
  • There is an attribute missing from the activity JSON: scriptLinkedService. Because the "script" property held only the file name, ADF submitted the literal text SearchLogProcessing.txt after the generated DECLARE statements, which is why the compiler reports a syntax error at token 'txt' on line 3.

  • We need to place the U-SQL script in Azure Blob Storage to run it successfully, and we also need to create a linked service for that Azure Storage account.

  • For example:

{

    "name": "StorageLinkedService",

    "properties": {

        "description": "",

        "type": "AzureStorage",

        "typeProperties": {

            "connectionString": "DefaultEndpointsProtocol=https;AccountName=myAzureBlobStorageAccount;AccountKey=**********"

        }

    }

}
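As a side note, the "connectionString" above follows the standard Azure Storage format: semicolon-separated key=value pairs. A quick sketch of that shape (the account name and the masked key are the placeholders from the example above):

```python
# Illustration of the AzureStorage connection-string format: semicolon-separated
# key=value pairs. The account name and masked key are placeholders, not real values.
conn = "DefaultEndpointsProtocol=https;AccountName=myAzureBlobStorageAccount;AccountKey=**********"

parts = dict(pair.split("=", 1) for pair in conn.split(";"))
print(parts["AccountName"])
```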

  • We need to create that linked service, replacing the storage account name myAzureBlobStorageAccount with the relevant Blob Storage account, and then place the U-SQL script "SearchLogProcessing.txt" in a container there.

  • In the pipeline example below, we have a container called adlascripts in our Blob store, and the script is placed inside it:

{

    "name": "ComputeEventsByRegionPipeline",

    "properties": {

        "description": "This is a pipeline to compute events for en-gb locale and date less than 2012/02/19.",

        "activities": [

            {

                "type": "DataLakeAnalyticsU-SQL",

                "typeProperties": {

                    "scriptPath": "adlascripts\\SearchLogProcessing.txt",

                    "scriptLinkedService": "StorageLinkedService",

                    "degreeOfParallelism": 3,

                    "priority": 100,

                    "parameters": {

                        "in": "/input/SearchLog.tsv",

                        "out": "/output/Result.tsv"

                    }

                },

  • The input and output files (.tsv) can live in the Data Lake store, with their datasets using AzureDataLakeStoreLinkedService.
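To sanity-check the corrected activity against the broken one, a small hypothetical helper (not part of ADF, just an illustration of the difference the answer describes) can verify that typeProperties carries scriptPath and scriptLinkedService instead of an inline "script" file name:

```python
# Hypothetical sanity check: a U-SQL activity's typeProperties should carry
# scriptPath + scriptLinkedService, not a file name stuffed into "script".
def check_usql_activity(type_properties: dict) -> list:
    problems = []
    if "scriptLinkedService" not in type_properties:
        problems.append("missing scriptLinkedService")
    if "scriptPath" not in type_properties:
        problems.append("missing scriptPath")
    if type_properties.get("script", "").endswith(".txt"):
        problems.append('"script" holds a file name; use scriptPath instead')
    return problems

# The broken typeProperties from the question vs. the corrected one from the answer:
broken = {"script": "SearchLogProcessing.txt", "scriptPath": "scripts\\"}
fixed = {"scriptPath": "adlascripts\\SearchLogProcessing.txt",
         "scriptLinkedService": "StorageLinkedService"}
print(check_usql_activity(broken))  # flags the missing linked service and misused "script"
print(check_usql_activity(fixed))   # []
```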
