Parameterizing Dataflows¶
The dataflows you create with Centerprise can be ported to any number of target environments, each of which may use different connections to data providers or other environment-specific settings. For example, you can create and test your dataflow in a staging environment and then deploy it to your production environment. Before deploying to your target environment, you may need to update the dataflow's data connections so that they are appropriate for that environment. The Parameter Information Replacement dialog makes updating these data connections easy.
To open the Parameter Information Replacement dialog, click the icon on the Dataflow toolbar.
The left-hand pane of the Parameter Information Replacement dialog shows the parameter tree with all the objects present in the dataflow. Select an object in the tree, and update its connection in the right-hand pane’s input as necessary.
Objects in the tree can be grouped by category, such as Source, Transformations, etc., or left ungrouped. Use the two icons at the top left of the dialog to toggle the grouping.
Note: For a scheduled dataflow running on a server, similar functionality is available in the Job Parameters tab of the scheduled job interface in the Scheduler. For more information on the Scheduler, see Scheduling and Running Jobs on a Server.
Context Information¶
In addition to aligning your data connections with a target environment, you might also want to ensure that any context information is suitable for the environment in which your dataflow will run. For example, make sure that the server name variable points to the correct server instead of being hard-coded. This can be achieved by using Context Information parameters. Context Information parameters take their values dynamically at dataflow run time. These parameters include ServerName, JobId, UserName, and DroppedFilePath, among others.
To use context information parameters in your dataflow, drag and drop the Context Information object from the Resources group in the Flow toolbox onto the dataflow.
An example of what a Context Information object might look like is shown below.
The following parameters are available:
- Name - name of the dataflow, for example dataflow110.df
- JobId - ID of the job assigned by the server
- ScheduledJobId - ID of the job associated with a schedule, when the dataflow runs as a scheduled job on a server
- ServerName - name of the server running the dataflow
- UserName - user account that runs the dataflow
- DroppedFilePath - path to the file being dropped, in the case of a scheduled job of type 'when file is dropped'
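As an illustration, a context information parameter could be referenced with the $ notation described later in this section, for example to record run details in an audit table. The table name AuditLog, its columns, and the object name ContextInfo below are all assumptions for the sake of the sketch, not part of the product:

```sql
-- Hypothetical audit insert. AuditLog and its columns are illustrative,
-- and ContextInfo is assumed to be the name of the Context Information object.
INSERT INTO [dbo].[AuditLog] (JobId, ServerName, RunBy)
VALUES ('$ContextInfo.JobId', '$ContextInfo.ServerName', '$ContextInfo.UserName')
```

Because the values are resolved dynamically at run time, the same dataflow records the correct server and user in each environment without any hard-coded values.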
Note: You can also assign default values to parameters. These values take effect when no other value is available. To set a default value, open the properties of the Context Information object and enter the value next to the parameter.
Parameters¶
You can also define any number of custom parameters for use throughout the dataflow. Each can initially be assigned a default value, and its value can change in the course of running your dataflow.
Custom parameters can be configured in the Parameters object. To add a Parameters object, drag it from the Resources group in the Flow toolbox and drop it onto the dataflow.
An example of what a Parameters object might look like is shown below.
To add a new parameter, open the Parameters object properties and enter your new parameter in the grid. Each parameter must be assigned a data type and can optionally be given a default value.
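For illustration, a custom parameter could be referenced in any area that accepts $ parameters, such as a SQL Query Source. In this sketch, the object name Parameters and the parameter name StartDate are assumptions:

```sql
-- Hypothetical example: Parameters is the name of the Parameters object and
-- StartDate is a custom parameter with a date data type and a default value.
SELECT [dbo].[Orders].[OrderID],
       [dbo].[Orders].[OrderDate]
FROM [dbo].[Orders]
WHERE OrderDate >= '$Parameters.StartDate'
```

Changing the parameter's value, at design time or at run time, changes the rows the query returns without editing the query itself.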
You can also refer to most objects' property values using the $ (dollar sign) notation, such as $(object_name.field_name). This functionality is available in any area of the dataflow that accepts $ parameters.
Examples of dataflow areas that accept $ parameters are:
- SQL Query Source
- SQL Statement Destination
- SQL Map
- Parameter Information Replacement dialog
Below is an example using a $ parameter in an SQL Query Source object:
SELECT [dbo].[Customers].[CustomerID],
       [dbo].[Customers].[CompanyName],
       [dbo].[Customers].[ContactName],
       [dbo].[Customers].[ContactTitle]
FROM [dbo].[Customers]
WHERE CustomerID = '$SQLQuerySource1.CustomerID'