
Inserting Data from a Table in SQL Client

Description

In this video of our N3uron Academy, we’ll explore how to insert data from a table using the N3uron SQL Client module. Let’s get started!

  • [10:16] Inserting Data from a Table in SQL Client

Transcription

[00:00] Hello everyone! Welcome back. In the previous video, we showed how to configure the SQL Client module to connect to a PostgreSQL database. In this video, we’ll focus solely on executing an INSERT query using a Standard SQL transaction. In upcoming videos, we’ll cover additional query types, such as SELECT, UPDATE, and DELETE, as well as working with historical transactions. Before we begin with the transaction configuration, let’s review our tag model. Here, we have a set of metrics from three wind turbines. These are the values we want to insert into our database at regular intervals. Our PostgreSQL database contains a table named turbine_measurements. If we run a SELECT query, we can see that the table is currently empty.

[01:00] The table includes columns for timestamp, turbine ID, metric name, numerical value, string value, boolean value, and quality.
Now, let’s go back to N3uron. Navigate to Config, then Modules, select the SQL Client module, and choose the channel for the database connector we configured in the previous video. Here, we can see that the connector is configured for a PostgreSQL database, where we’ve set the host IP address, port, username, password, and other connection details. Now, let’s focus on the Transaction settings. As you can see, we already have a standard transaction created for the INSERT query, which we’ll review later. For now, click the three-dot icon to add a new transaction. There are two types of transactions available. The Standard Transaction is typically used for SELECT, UPDATE, DELETE, or snapshot INSERT operations when triggered.
The History Transaction is used to insert historical tag data, buffering events over time before writing them to the database.

[02:00] In this video, we’ll focus on a Standard Transaction for an INSERT. You can also add a new transaction by right-clicking on a channel within the Model section. Now, let’s go ahead and create a new Standard Transaction for our INSERT query. Within the Standard Transaction, you’ll see two main sections: one for the Query Script and another for the Parse Script. Let’s start by looking at the Query Script, where we can see a script template already provided. You can use this template as is, or adapt it to fit your use case. Now, let’s open the Parse Script, where a template is also provided. You can use it as is, or adapt it as you see fit. Next, I’d like to explain how the data flow works between the Query Script, the database, and the Parse Script. For this, we’ll refer to N3uron’s Knowledge Base, which provides a helpful diagram for a Standard Transaction that illustrates this flow. As shown in this diagram, the Query Script generates a SQL query as a string, which is sent to the database as $.output.
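The flow described above can be sketched in a few lines. This is a minimal, illustrative Query Script, assuming a stubbed `$` context object (in N3uron the module provides `$` with `output` and `logger`); the table name is the one from the video, and everything else here is for illustration only.

```javascript
// Stub of the script context N3uron injects (for illustration only).
const $ = { output: null, logger: { debug: (msg) => {} } };

// --- Query Script: build a SQL string and hand it to the engine via $.output ---
const table = "turbine_measurements"; // table name from the video
$.output = `SELECT * FROM ${table} LIMIT 10;`;

// N3uron sends whatever string is in $.output to the database.
$.logger.debug($.output);
```

If the query produces a result set, as this SELECT would, the rows then arrive in the Parse Script as `$.input`.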

[03:00] If the query returns results, such as in a SELECT query, they flow into the Parse Script as $.input. The Parse Script can then process the data, create an array of tag data objects, and pass it to $.output to update the tag model in N3uron for use by any other driver. Since we’re running an INSERT, no data is returned, so the Parse Script isn’t used in this case. Now, let’s go over the rest of the parameters, starting with Enable Data Collection. This option is used to activate or deactivate the transaction. When disabled, the transaction won’t run, even if triggered. Next, we have the Periodic Trigger, which enables the transaction to be executed automatically at a specified time interval. The Periodic Trigger offers three options. With Fixed Time, the transaction runs after a set amount of time has passed since the previous execution, starting from when the module starts. Fixed Interval runs the transaction at specific intervals based on the scan rate, regardless of when the module started.
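To make the Parse Script side of the flow concrete, here is a hedged sketch of what one might look like for a SELECT. The row field names and the tag path layout are assumptions for illustration, not the exact N3uron schema; `$.input` holds the result rows and `$.output` receives the array of tag data objects.

```javascript
// Stubbed context with example rows, as a SELECT might return them.
const $ = {
  input: [
    { turbine_id: "WT-1001", metric: "power_output", numeric_value: 1520.5 }
  ],
  output: null
};

// --- Parse Script: map database rows to tag data objects for the tag model ---
$.output = $.input.map((row) => ({
  tag: `Turbines/${row.turbine_id}/${row.metric}`, // assumed tag path layout
  value: row.numeric_value
}));
```

For the INSERT transaction in this video, no rows come back, so this script stays unused.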

[04:00] And with Cron, you can define a custom schedule using a cron expression. For any parameter, you can always check the quick help at the bottom of the screen for more details. Next is the Scan Rate, which controls how often the transaction runs when using Fixed Time or Fixed Interval. The Rescheduler Timer, used with Fixed Time, will reset the countdown if the transaction is triggered by another trigger in the meantime. Next is Run on Start. When this is enabled, the transaction will run automatically as soon as the module starts. For Fixed Interval, only the Scan Rate and Run on Start parameters are available. Now let’s look at the Tag Condition Trigger. This trigger runs the transaction when a specified tag condition is met. You define the tag, the property to monitor, and the condition to check. Options also let you control when it triggers and whether the tag resets after execution. Next, we have Parameters settings, which lets transactions access tag values, quality, and timestamps. You can define Single Tag, Tag Group, or Multiple Tags (Filter) parameters.

[05:00] Single Tag returns a single tag data object, while Tag Group and Multiple Tags (Filter) return an array of tag data objects. In Filter mode, you can add as many filters as needed, with at least one Include filter, and use regular expressions for more precise selection. Next are the Auto-demotion settings. These temporarily set a transaction to off-scan if the database isn’t responding, helping optimize communication. You can configure how many failures trigger it, and how long the transaction stays off-scan before retrying. Now that we’ve reviewed the Standard Transaction settings, we can delete this new transaction and take a look at the INSERT transaction we have already created. In this case, we’ve created a Tag Condition Trigger and provided the path to a tag named Trigger. This tag is a boolean, and we’re using its Value property to check if it’s true. When the tag becomes true, the transaction is triggered, and its value is then reset to false.
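The shapes of the three parameter types can be illustrated as follows. The field names and parameter keys below are simplified assumptions for demonstration, not the exact N3uron structure; the point is that a Single Tag arrives as one object while Tag Group and Multiple Tags (Filter) arrive as arrays.

```javascript
// Illustrative parameter shapes, as described in the video (names assumed).
const parameters = {
  SinglePower: { path: "WT-1001/power_output", value: 1512.3 },  // Single Tag: one object
  Group1002: [                                                    // Tag Group: array
    { path: "WT-1002/power_output", value: 1498.7 },
    { path: "WT-1002/rotor_speed", value: 12.4 }
  ],
  Filter1003: [                                                   // Filter: array
    { path: "WT-1003/power_output", value: 1533.1 }
  ]
};

// A value can be read from a parameter by key, using dot notation.
const single = parameters.SinglePower.value;
```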

[06:00] Here, we’re using 1 as the true value, which works perfectly for booleans, and 0 to reset it to false, which is also valid. Next, in the Parameters settings, we’re using all three types for demonstration, though in practice it would make more sense to just use a Tag Group for the Turbines folder, since that already includes all the tag value events we need. The first parameter is a Single Tag for power_output in WT-1001. Then we have a Tag Group for all tags within WT-1002. Finally, we use the Multiple Tags (Filter) to include all tags under WT-1003, except for the grid_connected and operating_mode tags. Now, let’s move on to the Query Script. Here, we’re simply adding some debug logging using the $.logger object. This line writes the incoming parameters to the debug log in JSON format, so we can easily inspect the data being passed into the transaction. Now that we’ve defined the table and columns, we move on to extracting the tag data from the parameters.

[07:00] This part of the script loops through all parameters, whether they’re single tag data objects or arrays of tag data objects, and flattens them into one array called parameterValues, making it easy to process everything in the same way. You can access tag data object values through the parameters by key, using dot notation. Next, the parameterValues array is passed into the composeRowValues() function. For each tag object, we extract the tag path, value, timestamp, and quality. We split the tag path to extract the turbine ID and metric name from it. The timestamp is formatted as an ISO string. Then, depending on the value type, number, string, or Boolean, we place it in the corresponding column, setting the others to NULL. Finally, we build a SQL value tuple for each row and return the array of tuples, ready to insert into the database. Now we move to the insert() function, which brings everything together. First, it calls getValuesFromParameters() to collect all tag data, and then composeRowValues() to generate the value tuples.
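The flattening step and the composeRowValues() helper described above can be sketched like this. The tag data field names, the tag path layout, and the quoting helper are assumptions that follow the narration, not the exact script from the video; real code should also prefer parameterized queries over string interpolation where the driver allows it.

```javascript
// Example parameters: one Single Tag object and one Tag Group array (assumed shapes).
const parameters = {
  SinglePower: { path: "Turbines/WT-1001/power_output", value: 1512.3,
                 ts: 1700000000000, quality: 192 },
  Group1002: [
    { path: "Turbines/WT-1002/grid_connected", value: true,
      ts: 1700000000000, quality: 192 }
  ]
};

// Flatten single objects and arrays into one array of tag data objects.
const parameterValues = Object.values(parameters).flat();

// Minimal SQL string-literal quoting (illustrative only).
const q = (s) => `'${String(s).replace(/'/g, "''")}'`;

function composeRowValues(tags) {
  return tags.map((t) => {
    // Split the tag path to recover the turbine ID and metric name.
    const parts = t.path.split("/");
    const turbineId = parts[parts.length - 2]; // e.g. "WT-1001"
    const metric = parts[parts.length - 1];    // e.g. "power_output"
    const timestamp = new Date(t.ts).toISOString();
    // Route the value to the column matching its type; the others stay NULL.
    let num = "NULL", str = "NULL", bool = "NULL";
    if (typeof t.value === "number") num = t.value;
    else if (typeof t.value === "boolean") bool = t.value;
    else str = q(t.value);
    return `(${q(timestamp)}, ${q(turbineId)}, ${q(metric)}, ${num}, ${str}, ${bool}, ${t.quality})`;
  });
}

const rows = composeRowValues(parameterValues);
```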

[08:00] It also builds the list of column names using the join method on the COLUMNS array. Finally, it assembles a complete INSERT statement, including the column names and all rows, and returns it as a SQL string ready to be sent to the database. The last step is to call the insert() function and assign its result to $.output, so the complete SQL INSERT statement can be sent to the database. We also log this full SQL payload to the debug log, which is useful to verify exactly what will be executed. Now let’s close the code editor and manually trigger the execution of this transaction, so we can observe what happens in a controlled way. Let’s write a value of true to the Trigger tag to activate the transaction. Next, let’s check the debug log to see the parameters passed to the transaction. As we can see, it’s a JSON object where each key matches the name of a parameter. The Single Tag is represented as a single tag data object, while the Tag Group and Multiple Tags (Filter) are arrays of tag data objects.
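The final assembly step can be sketched as follows. The `$` context is stubbed, the column names are inferred from the table description earlier in the video, and the row tuple is precomputed; the insert() function here mirrors the behavior described, not the exact script shown.

```javascript
// Stub of the script context (for illustration only).
const $ = { output: null, logger: { debug: (msg) => {} } };

const TABLE = "turbine_measurements";
const COLUMNS = ["timestamp", "turbine_id", "metric", "numeric_value",
                 "string_value", "boolean_value", "quality"]; // assumed names
const rowTuples = [ // normally produced by composeRowValues()
  "('2024-01-01T00:00:00.000Z', 'WT-1001', 'power_output', 1512.3, NULL, NULL, 192)"
];

function insert() {
  const cols = COLUMNS.join(", "); // build the column list
  return `INSERT INTO ${TABLE} (${cols}) VALUES ${rowTuples.join(", ")};`;
}

$.output = insert();      // the complete SQL string sent to the database
$.logger.debug($.output); // log the full payload for verification
```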


[09:00] On the other hand, the output of the transaction is a simple string, a regular SQL INSERT statement, with the column names, and the actual row values already filled in, ready to be executed by the database. Next, let’s run a SELECT query in another SQL client to check the values that were inserted into the database. As we can see, everything worked correctly — the values were inserted as expected.
Now that everything looks good, let’s set up a Periodic Trigger so the transaction will automatically insert tag values into the database at a defined rate. After creating the new trigger, let’s check the insertions again in another SQL client to confirm that the data is being written automatically. Once again, everything looks good. As we can see, depending on each tag’s data type — whether numerical, string, or boolean — the values are inserted correctly into their corresponding columns.

[10:00] And that wraps up this video. We recommend visiting N3uron’s Knowledge Base for more detailed information on transactions and SQL integration. Thanks for watching, and see you in the next video!
