DDL

Description

This transform generates DDL (Data Definition Language, such as CREATE TABLE statements) using column metadata taken from the input rows.

Supported Engines

Hop Engine: Supported
Spark: Supported
Flink: Supported
Dataflow: Supported

Options

Transform name

Name of the transform. This name has to be unique in a single pipeline.

Connection name

The name of the database connection to use for the SQL dialect and execution.

Table schema

The database schema for the table.

Table name

The name of the table to generate the DDL for.

Column field name

The name of the input field that contains the table column name.

Column type name

The name of the input field that contains the Apache Hop data type for the column. This type is translated into the correct type for the chosen database.

Column length name

The name of the input field that contains the column length.

Column precision name

The name of the input field that contains the column precision.

Execute DDL?

Check this option to execute the generated DDL on the specified database connection.

DDL output field (optional)

The name of the output field that will contain the generated SQL.

Load all data from table

Pre-loads the cache with all the data present in the lookup table.

Drop table before creation?

Check this option if you want to always re-create the specified table. The table is NOT dropped if the DDL is not executed.

IMPORTANT: USE WITH CAUTION!

Example

We have 2 input rows for the DDL transform:

name   hopType   length   precision
id     Integer   9        0
name   String    100      -1

For a standard H2 database we can generate the DDL for table ddlTest. At runtime, the transform generates:

CREATE TABLE ddlTest
(
  id INT
, name VARCHAR(100)
)
;
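To make the mapping from row metadata to SQL concrete, here is a minimal Python sketch of how such a statement can be assembled from the four metadata fields. This is an illustration, not the transform's actual implementation, and the type mapping shown is a hypothetical subset for H2, not Hop's complete mapping.

```python
# Illustrative sketch: build a CREATE TABLE statement from per-row
# column metadata (name, hopType, length, precision). The type mapping
# below is a hypothetical subset for H2, not Hop's complete mapping.

def hop_type_to_h2(hop_type, length, precision):
    """Map an Apache Hop type name to an H2 column type (subset only)."""
    if hop_type == "Integer":
        return "INT"
    if hop_type == "String":
        return f"VARCHAR({length})"
    if hop_type == "Number":
        return f"DECIMAL({length}, {precision})"
    raise ValueError(f"unmapped Hop type: {hop_type}")

def generate_ddl(table, rows):
    """rows is a list of (name, hopType, length, precision) tuples."""
    cols = [f"{name} {hop_type_to_h2(t, length, p)}"
            for name, t, length, p in rows]
    # Match the leading-comma layout of the generated SQL shown above.
    return f"CREATE TABLE {table}\n(\n  " + "\n, ".join(cols) + "\n)\n;"

print(generate_ddl("ddlTest", [("id", "Integer", 9, 0),
                               ("name", "String", 100, -1)]))
```

Running this prints the same CREATE TABLE statement as the H2 example above.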

Hop Metadata

If you want to re-use field-level metadata, consider using the Metadata Input transform. For example, you can get access to the JSON of Beam File Definition or Static Schema Definition elements. You can then use the JSON Input transform to parse that metadata and extract the four fields you need (column name, type, length and precision).
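The parsing step can be sketched as follows. Note that the JSON layout used here (a top-level "fields" array with name, type, length and precision keys) is an assumption for illustration; inspect the actual JSON your Metadata Input transform emits before relying on any field names.

```python
import json

# Hypothetical sketch: pull the four fields the DDL transform needs out
# of schema-definition JSON. The layout below is an assumption for
# illustration and may differ from the JSON Hop actually emits.
raw = """{
  "fields": [
    {"name": "id",   "type": "Integer", "length": 9,   "precision": 0},
    {"name": "name", "type": "String",  "length": 100, "precision": -1}
  ]
}"""

def extract_columns(schema_json):
    """Return (name, hopType, length, precision) tuples for each field."""
    return [
        (f["name"], f["type"], f.get("length", -1), f.get("precision", -1))
        for f in json.loads(schema_json)["fields"]
    ]

print(extract_columns(raw))
```

In a pipeline you would do the equivalent with the JSON Input transform, mapping each extracted value to the field names configured in the DDL transform's options.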