You can add custom SQL scripts that execute as part of your Trifacta jobs. An output object can include one or more multi-statement SQL scripts, each run against a supported SQL-based connection to which you have access.
As part of job execution, you can define SQL scripts to run before the job, after the job, or both. These scripts are stored as part of the output object definition and can be executed against any database connection to which you have access. SQL scripts can be added to any job execution, regardless of the original data source.
This feature may need to be enabled in your environment.
A workspace administrator can enable the use of SQL scripts. For more information, see Workspace Settings Page.
Example Usage Scenarios
Insert or update log entries in a database log table before or after a job that publishes to file or database destinations.
Perform custom inserts, updates, and delete logic to other database tables based on job output data that is published to a database.
Create and refresh tables or materialized views that join the job's output data with data from other tables using CREATE TABLE ... AS SELECT.
Perform operational tasks, such as disabling/enabling indexes and managing partitions, on supported databases.
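For illustration, the logging and CREATE TABLE ... AS SELECT scenarios above might look like the following sketch. All table and column names (job_audit_log, daily_summary, output_table, region_ref) are hypothetical and depend on your own schema and database dialect:

```sql
-- Hypothetical post-job script: record a log entry after results are published.
INSERT INTO job_audit_log (job_name, completed_at, status)
VALUES ('daily_sales_publish', CURRENT_TIMESTAMP, 'SUCCESS');

-- Hypothetical refresh: rebuild a summary table that joins the job's
-- published output with an existing reference table.
DROP TABLE IF EXISTS daily_summary;
CREATE TABLE daily_summary AS
SELECT o.region, r.region_name, SUM(o.amount) AS total_amount
FROM output_table o
JOIN region_ref r ON r.region_id = o.region
GROUP BY o.region, r.region_name;
```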
Pre-job: After a job has been initiated and before data is ingested into the platform, a SQL script can be executed by the Trifacta application.
Post-job: After job results have been generated and published, a SQL script can be executed.
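A common pattern pairs the two: the pre-job script marks a run as started, and the post-job script marks it complete. A minimal sketch, assuming a hypothetical job_run_log table in your own database:

```sql
-- Pre-job script (runs after the job is initiated, before data is ingested):
INSERT INTO job_run_log (job_name, started_at, state)
VALUES ('daily_sales_publish', CURRENT_TIMESTAMP, 'STARTED');

-- Post-job script (runs after results are generated and published):
UPDATE job_run_log
SET state = 'COMPLETE', finished_at = CURRENT_TIMESTAMP
WHERE job_name = 'daily_sales_publish' AND state = 'STARTED';
```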
Create Output SQL Script
1. In Flow View, select the output object for which you want to create the SQL script. In the Outputs panel on the right, click the Destinations tab.
You can modify one of its manual destinations or a scheduled destination by clicking Edit.
2. In the SQL Scripts panel at the bottom of the screen, click Add new SQL Script.
3. In the Add SQL script dialog, select the target SQL connection, enter your SQL script, and choose when to run the script (before data ingest or after the job run).
You can validate the SQL and then add it to your job settings.
If a pre-job or post-job SQL script fails, all downstream phases of the job also fail.
SQL scripts can also be parameterized using variables (strings) or DateTime system parameters, which can be overridden at runtime.
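As an illustration of parameterization, the sketch below assumes a DateTime parameter referenced in the script as $run_date; the parameter name, the reference syntax, and the daily_summary and output_table tables are all hypothetical and may differ in your environment:

```sql
-- Hypothetical: $run_date is a DateTime parameter defined on the output,
-- overridable at runtime. Replace with the parameter reference syntax
-- used in your product version.
DELETE FROM daily_summary WHERE summary_date = '$run_date';

INSERT INTO daily_summary (summary_date, total_amount)
SELECT '$run_date', SUM(amount)
FROM output_table;
```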
When a flow is shared, users with Editor permissions on the flow can modify its SQL scripts.
For more details on this feature and some example SQL scripts, see this detailed documentation guide.