The Workflow action executes a previously defined workflow as a sub-workflow of the current one.

For ease of use, it is also possible to create a new workflow within the dialog by pressing the New Workflow button.

This allows you to perform "functional decomposition": breaking large workflows into smaller, more manageable units.

For example, rather than writing a data warehouse load as a single workflow containing 500 actions, create smaller workflows and compose them.
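On disk, composition simply means the parent workflow's Workflow action points at the sub-workflow's XML file. The sketch below is illustrative only: the file names, variable, and exact tag layout are assumptions, not copied from a real Hop-generated file.

```
<!-- parent.hwf: illustrative sketch of a parent workflow
     whose Workflow actions reference sub-workflow files -->
<workflow>
  <name>load-data-warehouse</name>
  <actions>
    <action>
      <name>Load dimensions</name>
      <type>WORKFLOW</type>
      <!-- the Workflow Filename option: the sub-workflow's XML file -->
      <filename>${PROJECT_HOME}/workflows/load-dimensions.hwf</filename>
    </action>
    <action>
      <name>Load facts</name>
      <type>WORKFLOW</type>
      <filename>${PROJECT_HOME}/workflows/load-facts.hwf</filename>
    </action>
  </actions>
</workflow>
```

Each sub-workflow can then be developed and tested on its own before being composed into the larger load.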

Main workflow options

  • Action name: The name of the action.

  • Workflow Filename: The XML file name of the workflow to execute. Click to browse through your local files.

  • Run configuration: The workflow run configuration to use for this workflow action.

Options Tab

  • Execute for every input row?: Implements looping; if the previous workflow action returns a set of result rows, the workflow executes once for every row found, and one row is passed to the workflow at each execution. For example, you can execute a workflow for each file found in a directory.

  • Wait for the remote workflow to finish?: Enable to block until the workflow on the Hop Server has finished.
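The "Execute for every input row?" behavior can be sketched generically: the sub-workflow runs once per result row, and each execution sees exactly one row. In this Python sketch, `result_rows` and `run_sub_workflow` are hypothetical stand-ins, not Hop APIs.

```python
# Sketch of the "Execute for every input row?" semantics.
# `result_rows` stands in for the result rows produced by the previous
# action; `run_sub_workflow` is a hypothetical stand-in for launching
# the sub-workflow, not a real Hop API.

def run_sub_workflow(row):
    # In Hop, the sub-workflow would receive this single row as input.
    return f"processed {row['filename']}"

result_rows = [{"filename": "a.csv"}, {"filename": "b.csv"}, {"filename": "c.csv"}]

execute_for_every_input_row = True

if execute_for_every_input_row:
    # One execution per result row; one row passed each time.
    outcomes = [run_sub_workflow(row) for row in result_rows]
else:
    # Single execution of the sub-workflow.
    outcomes = [run_sub_workflow(result_rows[0])] if result_rows else []

print(outcomes)
```

With the option enabled, three files found in a directory would trigger three executions of the sub-workflow, one per file.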


Logging Settings Tab

By default, if you do not configure separate logging, Hop captures the log lines generated by the actions and writes them to the workflow's own log.

For example, suppose a workflow runs three pipelines and you have not configured logging. The pipelines do not write their logging information to separate files or locations.

In this instance, the workflow executes and puts logging information into its master workflow log.

In most instances, it is acceptable for logging information to be available in the workflow log.
For example, if you load dimensions, you want the logs for those dimension loads to appear in the workflow log, and any errors in the pipelines will appear there as well. If, however, you want all your log information kept in one place, you must set up logging.

  • Specify logfile?: Enable to specify a separate logging file for the execution of this workflow.

  • Name of logfile: The directory and base name of the log file; for example C:\logs.

  • Extension of logfile: The file name extension; for example, log or txt.

  • Loglevel: Specifies the logging level for the execution of the workflow.

  • Append logfile?: Enable to append to the log file instead of creating a new one.

  • Create parent folder: Create the parent folder for the log file if it does not exist.

  • Include date in logfile?: Adds the system date to the file name, in the format YYYYMMDD (e.g. 20051231).

  • Include time in logfile?: Adds the system time to the file name, in the format HHMMSS (e.g. 235959). See also the logging window in Logging.
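Putting the file-name options together: the log file name combines the base name, the optional date (YYYYMMDD) and time (HHMMSS) stamps, and the extension. The Python sketch below illustrates the naming scheme only; it is not Hop's implementation, and the underscore separator between the parts is an assumption.

```python
from datetime import datetime

def build_logfile_name(base, extension, include_date=False, include_time=False,
                       now=None):
    """Sketch of how the logging options combine into a file name.

    Mirrors the option descriptions above; not Hop's actual code,
    and the "_" separator between parts is assumed for illustration.
    """
    now = now or datetime.now()
    name = base
    if include_date:
        name += "_" + now.strftime("%Y%m%d")   # e.g. 20051231
    if include_time:
        name += "_" + now.strftime("%H%M%S")   # e.g. 235959
    return f"{name}.{extension}"

# Example: base name C:\logs, extension log, date and time included.
stamp = datetime(2005, 12, 31, 23, 59, 59)
print(build_logfile_name(r"C:\logs", "log", True, True, now=stamp))
# -> C:\logs_20051231_235959.log
```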


Parameters Tab

Specify which parameters will be passed to the sub-workflow:

  • Copy results to parameter: Results from previous workflows or pipelines are passed down to this workflow action's parameters.

  • Pass parameter values down to the sub-workflow: Enable this option to pass all parameters of the workflow down to the sub-workflow.

  • Parameter: The names of the parameters that will be passed to the workflow.

  • Stream column name: Allows you to capture fields of incoming records of a result set as a parameter.

  • Value: Allows you to specify the values for the sub-workflow's parameters. You can do this by:

  • Manually typing some text (e.g. ETL workflow)

  • Using a parameter to set the value (e.g. ${Internal.workflow.Name})

  • Using a combination of manually specified text and parameter values (e.g. ${FILE_PREFIX}_${FILE_DATE}.txt)
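All three value styles above reduce to simple ${NAME} substitution over the available variables. As a sketch of that resolution, Python's standard library `string.Template` happens to use the same ${NAME} syntax; the variable values below are made-up examples, and the subclass only loosens the pattern so dotted names like Internal.workflow.Name match.

```python
from string import Template

# Example variable values available at execution time (made up).
variables = {
    "Internal.workflow.Name": "load-dw",
    "FILE_PREFIX": "sales",
    "FILE_DATE": "20240101",
}

# string.Template only allows identifier characters in ${...} by
# default, so names with dots need a loosened idpattern.
class DottedTemplate(Template):
    idpattern = r"[A-Za-z][A-Za-z0-9_.]*"

for value in ["ETL workflow",                      # plain text
              "${Internal.workflow.Name}",         # a single variable
              "${FILE_PREFIX}_${FILE_DATE}.txt"]:  # text + variables
    print(DottedTemplate(value).safe_substitute(variables))
```

Plain text passes through unchanged, a lone variable resolves to its value, and mixed values interleave literal text with resolved variables (here: sales_20240101.txt).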