
Creating variables and parameters
In this recipe, we will extend the functionality of our Hello World dataflow (see the Understanding the Designer tool recipe from the previous chapter) so that it generates a second row, providing you with the name of the Data Services job that generated the greetings.
This example will not just allow us to get familiar with how variables and parameters are created but also introduce us to one of the Data Services functions.
Getting ready
Launch your Designer tool and open the Job_HelloWorld job created in the previous chapter.
How to do it…
We will parameterize our dataflow so that it can receive the external value of the job name where it is being executed, and create the second row accordingly.
We will also require an extra object in our job, in the form of a script that will be executed before the dataflow and that will initialize our variables before passing their values to the dataflow parameters.
- Using the script button from the right instrument panel, create a script object. Name it scr_init, and place it to the left of your dataflow. Do not forget to link them, as shown in the following screenshot:
- To create dataflow parameters, click on the dataflow object to open it in the main workspace window.
- Open the Variables and Parameters panel. All panels in Designer can be enabled/displayed with the help of the buttons located in the top instrument panel, as in the following screenshot:
- If they are not displayed on your screen, click on the Variables button on the top instrument panel. Then, right-click on Parameters and choose Insert from the context menu. Specify the following values for the new input parameter: name it $p_JobName, set the parameter type to Input, and assign it the varchar(100) data type.
Note
Note that the $ sign is very important when you reference a variable or parameter, as it defines the parameter in Data Services and is required so that the compiler can parse it correctly. Otherwise, it will be interpreted by Data Services as a text string. Data Services automatically puts the dollar sign in when you create a new variable or parameter from the panel menus. However, you should not forget to use it when you are referencing the parameter or variable in your script or in the Calls section of the dataflow.
- Now, let's create a job variable that we will use to pass the value defined in the script to the dataflow parameter. For this, use the Back (Alt + Left) button to go to the job level (so that its content is displayed in the main design window). Then, right-click on Variables in the Variables and Parameters panel and choose Insert from the context menu to insert a new variable. Name it $l_JobName and assign the varchar(100) data type to it, which is the same as the dataflow parameter created earlier.
- To pass variable values from the job to the input parameter of the dataflow, go to the Calls tab of the Variables and Parameters panel on the job design level. Here, you should see the input dataflow $p_JobName parameter with an empty value.
- Double-click on the $p_JobName parameter and reference the $l_JobName variable in the Value field of the Parameter Value window. Click on OK:
- Assign a value to a job variable in the previously created script object. To do this, open the script in the main design window and insert the following code in it (a commented sketch of this script appears after these steps):
$l_JobName = 'Job_HelloWorld';
- Finally, let's modify the dataflow to generate a new column in the target table. For this, open the dataflow in the main design window.
- Open the Query transform and right-click on the TEXT column to go to New Output Column… | Insert Below.
- In the opened Column Properties window, specify JOB_NAME as the name of the new column and assign it the same data type, varchar(100).
- In the Mapping tab of the Query transform for the JOB_NAME column, specify the 'Created by '||$p_JobName string.
- Go back to the job context and create a new global variable, $g_JobName, by right-clicking on the Global Variables section and selecting Insert from the context menu.
- Your final Query output should look like this:
- Now, go back to the job level and execute it. You will be asked to save your work and choose the execution parameters. At this point, we are not interested in modifying them, so just continue with the default ones.
- After executing the job in Designer, go to Management Studio and query the HELLO_WORLD table to see that a new column has appeared with the 'Created by Job_HelloWorld' value.
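For reference, here is a commented sketch of what the whole scr_init script could look like. The hard-coded assignment is the only line the recipe requires; the commented-out alternative uses the built-in job_name() and print() functions to fetch the job name at runtime and write it to the trace log, so check the function reference of your Data Services version before relying on them:
# scr_init: initialize the job variable before the dataflow runs.
# The leading $ sign marks $l_JobName as a variable; without it,
# Data Services would treat the token as plain text.
$l_JobName = 'Job_HelloWorld';
# Optional alternative (sketch): derive the name at runtime with the
# built-in job_name() function and log it to the trace log:
# $l_JobName = job_name();
# print('Initializing variables for job: ' || $l_JobName);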
How it works…
All main objects in Data Services (dataflow, workflow, and job) can have local variables or parameters defined. The difference between an object variable and an object parameter is very subtle. Parameters are created and used to accept values from other objects (input parameters) or pass them outside of the object (output parameters). In other respects, parameters behave in the same way as local variables: you can use them in local functions or use them to store and pass values to other variables or parameters. Dataflow objects can only have parameters defined, but not local variables. See the following screenshot of the earlier example:

Workflow and job objects, on the other hand, can only have local variables defined but not parameters. Local variables are used to store the values locally within the object to perform various operations on them. As you have seen, they can be passed to the objects that are "calling" for them (go to Variables and Parameters | Calls).
There is another type of variable called a global variable. These variables are defined at the job level and shared among all objects that were placed in the job structure.
What you have done in this chapter is a common practice in Data Services ETL development: passing variable values from the parent object (job in our example) to the child object (dataflow) parameters.
To keep things simple, you can specify hard-coded values for the input dataflow parameters, but this is usually considered bad practice.
What we could also do in our example is pass global variable values to dataflow parameters. Global variables are created at the very top job level and are shared by all nested objects, not just the immediate job child objects. That is why they are called global. They can be created only in the job context, as shown here:

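To illustrate this scope difference, consider a hypothetical script placed inside a workflow nested anywhere in the job. Because $g_JobName is a global variable, the script can read and assign it directly, without any input parameters or Calls tab mapping (the message text below is purely illustrative):
# Hypothetical script inside a nested workflow: the global variable
# is visible here without any parameter mapping.
print('The global variable is visible here: ' || $g_JobName);
$g_JobName = 'Job_HelloWorld';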
Also, note that in Data Services, you cannot reference parent object variables directly in child objects. You always have to create input child object parameters and map them on the parent level (using the Calls tab of the Variables and Parameters panel) to local parent variables. Only after doing this can you go into your child object and map its parameters to the child object's local variables.
Now, you can see that parameters are not the same thing as variables, and they carry an extra function of bridging variable scope between parent and child. In fact, you do not have to map them to a local variable inside a child object if you are not going to modify them. You can use parameters directly in your calculations/column mapping.
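The mapping created earlier in this recipe is an example of exactly that: the JOB_NAME column expression references the input dataflow parameter directly, with no intermediate variable involved. You could also wrap the parameter in a built-in function call right in the mapping, for instance (an optional variation, not part of the recipe):
# Column mapping used in the recipe:
'Created by '||$p_JobName
# Hypothetical variation applying a built-in string function to the parameter:
'Created by '||upper($p_JobName)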
The last thing to say here is that dataflows do not have local variables at all. They can only accept values from their parents and use them in function calls/column mappings. That is because you do not write scripts inside a dataflow object. Scripts are only created at the job or workflow level, or inside custom functions, which have their own variable scope.
Data types available in Data Services are similar to common programming language data types. For a more detailed description, reference the official Data Services documentation.
Note
The blob and long data types can only be used by structures created inside a dataflow, in other words, columns. You cannot create script variables or dataflow/workflow parameters of the blob or long data types.
There's more…
Try to modify your Job_HelloWorld job to pass global variable values to dataflow parameters directly. To do this, use the previously created global variable $g_JobName, specify a hard-coded value for it (or assign it a value inside a script, as we did with the local variable), and map it to the input dataflow parameter on the Calls tab of the Variables and Parameters panel in the job context. Do not forget to run the job and see the result.
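A minimal sketch of the modified scr_init script for this exercise, assuming you keep the hard-coded job name, could look like the following; the only remaining step is to map $g_JobName to the $p_JobName input parameter on the Calls tab:
# Assign the value to the global variable instead of the local one;
# being global, it is visible to every object nested in the job.
$g_JobName = 'Job_HelloWorld';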