Creating something reusable that saves time is valuable not only for developers but for businesses too. With SnapLogic you can express your needs and creativity in custom-developed snaps, which is also a great chance to look behind the scenes and understand the core concepts of the platform. If you are interested in snap development and want to expand your knowledge beyond the official SnapLogic documentation, feel free to visit our two previous posts on the Sleep and Flow Reset snaps.
My experience in data integration, especially with databases, was the main motivation for developing a custom snap called Generic JDBC – Table Operation (Figure 1). In this post I will demonstrate the purpose of this snap using two scenarios. In short, the snap serves two main purposes: creating a copy of an existing table, and moving a table entirely to another location specified in the snap properties.
Figure 1 Generic JDBC – Table Operation
Taking a closer look at the Generic JDBC – Table Operation snap functionalities
The development steps for this snap are similar to those used in the Flow Reset and Sleep snaps. The main class implements the "Snap" interface, and in the "defineProperties" method I define the snap properties: Source catalog, schema, and table; Target catalog, schema, and table; Operation; Database; and Create table if not present.
The logic for determining which operation (Copy or Move) was chosen resides in the "execute" method. If Copy is selected, the processExecute method is called; if Move is selected, processDropTable is called after processExecute completes.
Figure 2 Execute method
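A minimal sketch of that dispatch logic, assuming hypothetical method names (the real snap is built on the SnapLogic SDK, which is omitted here to keep the example self-contained):

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of the execute-method dispatch: "Copy" runs the
// copy steps only, while "Move" runs the copy steps and then drops the
// source table. Class and method names here are hypothetical.
public class TableOperationDispatch {

    /** Returns the ordered helper calls that would run for the chosen operation. */
    public static List<String> planSteps(String operation, boolean createIfMissing) {
        List<String> steps = new ArrayList<>();
        if (createIfMissing) {
            steps.add("processCreateTable");   // "Create table if not present" is enabled
        }
        steps.add("processCopyTableData");     // the copy itself (processExecute)
        if ("Move".equals(operation)) {
            steps.add("processDropTable");     // Move = Copy + drop the source table
        }
        return steps;
    }
}
```

The key design point is that Move is not a separate code path: it reuses the Copy logic and simply appends a drop of the source table afterwards.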
For simplicity, I created a few more methods: processCreateTable, processCopyTableData, getCreateTableStatementString, getCopyStatementString, getDropStatementString, and so on.
The methods getCreateTableStatementString, getCreateMSSQLTableStatementString, getCopyStatementString, and getDropStatementString each return a string that is executed as a prepared statement in the processCreateTable, processCopyTableData, and processDropTable methods.
Here is an example of the methods used to build and return SQL statements:
Figure 3 Building SQL statements
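A hedged sketch of what such builder helpers could look like. The SQL shown is generic (the actual snap accounts for per-database syntax differences, such as the separate MSSQL variant mentioned above), and the method bodies are illustrative rather than the snap's real source:

```java
// Hypothetical statement-builder helpers: each assembles a SQL string
// from the source/target identifiers configured in the snap settings.
public class SqlStatements {

    /** CREATE TABLE ... AS SELECT creates an empty structural copy on many
        databases; MSSQL needs a different form, hence the separate variant. */
    public static String getCreateTableStatementString(String target, String source) {
        return "CREATE TABLE " + target + " AS SELECT * FROM " + source + " WHERE 1 = 0";
    }

    /** Copies all rows from the source table into the target table. */
    public static String getCopyStatementString(String target, String source) {
        return "INSERT INTO " + target + " SELECT * FROM " + source;
    }

    /** Removes the source table; used only for the Move operation. */
    public static String getDropStatementString(String source) {
        return "DROP TABLE " + source;
    }
}
```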
The following code snippet (Figure 4) describes how the SQL statement for copying table data is executed:
Figure 4 Execute prepared statement
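Sketched in plain JDBC, the execution step could look like the following. The method names mirror those mentioned earlier but are illustrative; the real snap obtains its Connection from the configured JDBC account:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

// Illustrative sketch: run the copy statement as a prepared statement
// over a JDBC connection supplied by the account configuration.
public class CopyExecutor {

    /** Builds the copy statement (kept separate so it can be inspected). */
    public static String buildCopySql(String target, String source) {
        return "INSERT INTO " + target + " SELECT * FROM " + source;
    }

    /** Executes the copy and returns the number of rows inserted. */
    public static int processCopyTableData(Connection conn, String target, String source)
            throws SQLException {
        try (PreparedStatement ps = conn.prepareStatement(buildCopySql(target, source))) {
            return ps.executeUpdate();  // try-with-resources closes the statement
        }
    }
}
```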
One important step before building the jar and zip files is to include your account class in the pom.xml configuration, as shown in Figure 5 below.
Figure 5 Including the account class in the pom file
Once the snap is developed and uploaded to the desired Snaplex, you should be able to find it under your snap pack's name.
Figure 6 Generic JDBC – Table Operation under the Table Operation snap pack
In order to use this snap, you should create your JDBC account by populating the fields shown in Figure 7 below, either in Manager or directly in the Account tab of the snap.
Figure 7 Setting up the JDBC account
Before moving to the main configuration window of the snap, let's take a look at the database I set up for this demonstration. We will use the employee table under the resources schema in the HumanResource database. The table consists of eight columns and is populated with a few sample records (Figure 8).
Figure 8 employee table
Scenario 1: Operation "Copy"
While working with databases, there is sometimes a need to create a backup copy of a particular table. Traditionally this is done by executing one or more SQL statements, or by using a backup option provided by the tool you use to interact with the database. With this custom snap, you can create a copy of a table just by filling in a few snap settings, which saves time compared to the traditional approach. Suggestions are also available to guide you to the desired source or target schema and table. Another advantage is SnapLogic's expression support, which lets you compose field values dynamically, so you can easily integrate this snap into large integration flows.
To demonstrate the Copy functionality, I use the already created and populated employee table, and I need to create a copy of it using the following naming convention: employee_backup_YYYYMMDD.
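As an illustration of what the naming convention computes, here is a small Java helper producing the date-stamped target name (in the pipeline itself, this value would instead be composed with a SnapLogic expression on the target table field; the class below is hypothetical):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

// Illustration only: compose a target table name in the
// <table>_backup_YYYYMMDD convention used in this scenario.
public class BackupName {

    public static String backupTableName(String table, LocalDate date) {
        // BASIC_ISO_DATE formats a LocalDate as yyyyMMdd, e.g. 20190603
        return table + "_backup_" + date.format(DateTimeFormatter.BASIC_ISO_DATE);
    }
}
```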
Figure 9 Create a backup table
As we can see in Figure 9, we have to choose the source schema and table, the target schema and table, the operation (Copy or Move), and the database. If the target table does not exist and you want the snap to create it automatically, enable "Create table if not present". I also selected "Execute during preview",
Figure 10 employee_backup_20190603 table
and after validating, we can see in Figure 10 that the new table "employee_backup_20190603" is automatically created and populated with the values from the source table "employee" (Figure 11).
Figure 11 Data in the backup table
Scenario 2: Operation "Move"
Similar to the scenario above, let's say we need to move the "employee" table from the "resources" schema to a new "HR" schema, and we also need to name the new table "employee_new".
Figure 12 Move operation
If the snap executes successfully, a Success status message is returned (Figure 13); otherwise the snap returns the execution error.
Figure 13 Output of the execution
After successfully executing the above configuration, we can see (Figure 14) that the old table "resources.employee" has been moved to "HR.employee_new".
From a technical point of view, this snap saves time because you can copy or move tables without writing any SQL. It also hides the syntax differences between databases, so you don't need to worry about them. This is only an initial version of the snap, and it can certainly be updated and optimized, but the most important takeaway is that snap development is enjoyable and makes the platform applicable to virtually any integration.