
Transforming Industrial Data Access with N3uron, TimescaleDB and GraphQL

The GraphQL logo displayed alongside Hasura, Docker, Timescale, and N3uron logos with a battery plant underneath them.

Overview

The Unified Namespace (UNS) architectural approach to data management is proving to be a game-changer for industrial companies aiming to establish robust foundations for their digital transformation initiatives. It offers a single source of truth for all of an organization’s data and information by providing one centralized hub through which every data source and consumer connects.

Although UNS is an architectural paradigm not tied to any specific technology, MQTT is by far the most commonly used protocol for implementing it. MQTT is lightweight and scalable, uses an event-driven publish-subscribe messaging model, and manages data transfers between nodes efficiently by implementing report-by-exception.

While the UNS is a great solution for providing access to the current state of the business, making all real-time data and information accessible across the entire organization, it does not on its own allow data to be used to its full potential because it lacks a simple, frictionless mechanism for accessing historical data. Therefore, the UNS needs to be expanded with a parallel framework to unlock the full potential of operational data and meet the analytical needs of companies to make informed decisions.

Infographic illustrating the concept of Unified Namespace (UNS) from the perspective of the water, solar, wind, batteries, oil, and gas industries.

The parallel framework should seamlessly integrate with UNS to enable interoperability among historical data sources and consumers, historize each event in a digital infrastructure to a long-term, durable and scalable storage system and then provide a single endpoint for data retrieval using open and standard protocols. Thus, it would serve as a “historical” data hub for analytics, machine learning and other data-intensive workloads.

This article explores the details of integrating historical data into the UNS and proposes a solution to address this specific challenge using the N3uron IIoT and DataOps Platform, Timescale and Hasura to build a scalable and cost-efficient “historical data hub” for any UNS, accessible using standard SQL and a GraphQL API.

Illustration of integrating historical data into UNS using Timescale and Hasura. The image features data flow diagrams, Timescale, Hasura logos, and GraphQL API integration.

Getting started

To get started, you’ll need a working Docker environment installed on your computer. If you don’t already have Docker installed, you can follow the instructions provided in the official Docker documentation at https://docs.docker.com/get-docker/.

Once you have Docker installed and running, you can proceed to clone our demo repository using Git. If Git is already installed on your computer, you can use the following command to clone the repository to your local machine:

$ git clone https://github.com/n3uron/demos

You’ll find the material for this article at “/2024/02/timescale-graphql” inside the repository.

If you don’t have Git installed or prefer to download the repository as a ZIP file, you can do so by visiting the GitHub repository at https://github.com/n3uron/demos and selecting the Download ZIP option from the Code dropdown menu. Once the ZIP file has finished downloading, you can extract its contents to a directory on your local machine.

Screenshot of N3uron's GitHub repository page showing how to download the repository as a ZIP file.

Introducing the Components

Before we get into the steps, let’s briefly introduce the components comprising our stack:

  • N3uron is a lightweight industrial platform for IIoT and DataOps that runs on devices at the edge of the network. We’ll use the OPC UA Client and SQL Client modules to collect data from our public datasim.n3uron.com endpoint and store it inside the database.
  • TimescaleDB is a PostgreSQL-based time-series database designed to efficiently store and query data. It provides automatic partitioning, data compression and real-time aggregation to scale even the most demanding workloads.
  • Hasura is an open-source solution to instantly create a GraphQL API over your data. It connects to the Timescale database, builds the schema and exposes the data through a GraphQL API.
Infographic showing Hasura and GraphQL integrating client apps, public APIs, and data sources.

Deploy with Docker

Open a terminal at the timescale-graphql directory inside the repository and deploy the stack using Docker Compose:

$ docker compose up -d

Use the `docker ps` command to display the running containers, their status and exposed ports.

Screenshot showing the running containers, their status and exposed ports displayed in a terminal.

Configure N3uron

To begin with the configuration, open http://localhost:8003 in your browser to access the N3uron WebUI and proceed to log in with the default credentials:

  • Username: admin
  • Password: n3uron
Screenshot displaying the N3uron WebUI login page.

Create and configure the OPC UA Client

Once inside N3uron, we’ll create an OPC UA Client module to retrieve the data used for this demo from our public Datasim endpoint, which is an instance of our OPC UA Server.

  • Step 1: Navigate to Config > Modules and create a new module named OpcUaClient.
Screenshot showing how to create a new OPC UA Client module.
  • Step 2: Select OpcUaClient as the module type and click Save to instantiate the new module.
Screenshot displaying how to instantiate a new OPC UA Client module.
  • Step 3: Proceed to save both the Logger and Module configuration.

Screenshot showing how to save the Logger configuration.
Screenshot displaying how to save the Module configuration.
  • Step 4: To create a new client, click on the burger menu inside Model, then on New Client and name it Datasim.

Screenshot showing how to create a new OPC UA Client.
  • Step 5: Configure the Endpoint URL with the address of our public Datasim server: datasim.n3uron.com:4840

Screenshot displaying how to configure the Endpoint URL with the address of N3uron's public Datasim server.
  • Step 6: Open the “OPC UA server discovery” dialog to explore the remote server and select None:None.

Screenshot showing the OPC UA Server Discovery dialog to explore the remote server.
  • Step 7: Enable and configure authentication with the following credentials:

    • User: sunn3rgy
    • Password: n3uron
Screenshot displaying how to enable and configure authentication with credentials.
  • Step 8: Click on Save and reload the module to apply the new configuration.

Screenshot showing how to apply the new configuration by saving and reloading the module.
  • Now we can use the built-in OPC Browser to explore the data model exposed by the OPC server.

    Step 9: Navigate to OpcUaClient->OPC Browser and select the previously configured Datasim client.

Screenshot displaying how to select the previously configured Datasim client.
  • Step 10: Use drag-and-drop to create tags in N3uron from the data available at the OPC UA server. The GIF below displays the process.

GIF showing how to create tags in N3uron from the data available at the OPC UA Server.
  • Step 11: Go back to the Real-Time section to visualize all the data signals received from the remote server and watch them change in real time.

Screenshot displaying the Real-Time section to visualize all the data signals received from the remote server.

Integrate TimescaleDB with the SQL Client

In this section, we’ll create a SQL Client module to historize everything in the TimescaleDB instance running in a separate container.

Our SQL Client has preset configurations for TimescaleDB, leveraging the power of Hypertables with built-in partitioning and data compression for time-series data to efficiently scale our historian.

  • Step 1: Go back to Config->Modules, create a new module and name it SqlClient.
Screenshot showing how to create a new SQL Client module.
  • Step 2: Select SqlClient as the module type and save its default configuration as shown in the first section of this article.
Screenshot displaying how to instantiate a new SQL Client module.
  • Step 3: Create a New Channel and name it TimescaleDB

Screenshot showing how to create a new channel for TimescaleDB.
  • Step 4: Proceed to configure the database connection. First, select TimescaleDB as the database type.

Screenshot displaying how to select TimescaleDB as the database type.
  • Step 5: Set the property fields as follows:

    • Connection:
      • Host: timescale
      • Port: 5432
      • Default database: n3-history
    • Authentication:
      • Username: postgres
      • Password: n3uron
Screenshot showing how to set the Connection and Authentication properties.
  • Step 6: Click on Transactions and select New HistoryTransaction. A HistoryTransaction transforms any SQL database into a data historian: it subscribes to all events at the configured path, buffers them in memory and inserts them in batches into the target database.

Screenshot displaying how to select a New History Transaction.
  • Step 7: Configure a new trigger of type Periodic. This trigger tells the SqlClient when to insert the buffered events into the database.

Screenshot showing how to configure a new trigger of type Periodic.
  • Step 8: For this demo, we’ll use a Periodic trigger of type Fixed interval and keep the default values.

Screenshot displaying how to select Fixed interval type for a Periodic trigger
  • Step 9: Edit the query builder script at Query->Options->Script to select “timescale” as the query syntax.

Screenshot showing how to select Timescale as the query syntax.
Screenshot displaying how to edit the query builder script.
  • Step 10: As the final step in the SqlClient configuration, we need to create a New TagFilterPath. A tag filter is used to select which tags from the data model will be historized.

Screenshot showing how to create a new tag filter path.
  • Step 11: Click on Save and reload the module to apply the new configuration.

Screenshot displaying how to apply the new configuration by saving and reloading the module.
  • To verify that the module is working as expected and transactions are being executed against the database, we can use N3uron’s built-in Diagnostics feature to subscribe to and watch the logs emitted by the module in real time.

    Step 1: Go to Diagnostics->Realtime logs and select the SqlClient module.

Screenshot showing the Real-Time logs section in the built-in Diagnostics feature.
Screenshot displaying how to select the SQL Client module.
  • Step 2: Mark the Enabled checkbox and wait a few seconds for the logs to appear. Make sure the log level is set to DEBUG.

Screenshot showing how to show the real-time logs.
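
You can also verify the data independently of N3uron by querying TimescaleDB directly. Below is a minimal sketch using Python and the psycopg2 driver, assuming the Timescale container publishes port 5432 on localhost and using the credentials and database name from Step 5; the history table and its tag, ts and number_value columns are the ones Hasura will expose in the next section.

import psycopg2

# Connection details taken from Step 5. Assumption: the Timescale container
# publishes port 5432 on localhost (check `docker ps` if the connection fails).
conn = psycopg2.connect(
    host="localhost",
    port=5432,
    dbname="n3-history",
    user="postgres",
    password="n3uron",
)

with conn, conn.cursor() as cur:
    # Count the rows historized so far.
    cur.execute("SELECT count(*) FROM history;")
    print("rows:", cur.fetchone()[0])

    # TimescaleDB's time_bucket() makes time-series aggregation straightforward:
    # 1-minute averages for a single tag from the demo data model
    # (assumes ts is stored as a timestamp column).
    cur.execute(
        """
        SELECT time_bucket('1 minute', ts) AS bucket, avg(number_value)
        FROM history
        WHERE tag = '/PVSIM/BLUELAKE/PVG001/PST_10/INV001/ACTIVE_POWER'
        GROUP BY bucket
        ORDER BY bucket DESC
        LIMIT 5;
        """
    )
    for bucket, avg_value in cur.fetchall():
        print(bucket, avg_value)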

Create a GraphQL API with Hasura

Hasura is an open-source project that simplifies and accelerates the process of building scalable and real-time applications by automatically generating a GraphQL API from existing databases. This decouples databases from client applications.

We’ll configure Hasura in two simple steps to generate a GraphQL API to access the historical data inside TimescaleDB.

Open http://localhost:8080 to access the Hasura container.

  • Step 1: Navigate to the Data page.
Screenshot displaying how to navigate to Hasura's Data page.
  • Step 2: Open public inside the default database and click Track on the history table. This tells Hasura to auto-generate the GraphQL schema for the table.
Screenshot showing where to click Track on the history table.
  • Step 3: Navigate back to the API page.

Screenshot displaying how to navigate to Hasura's API page.
  • Step 4: Use the built-in query tool to build, generate and run GraphQL queries against your API.

    Here is an example query:

query GetHistory {
  history(where: {tag: {_eq: "/PVSIM/BLUELAKE/PVG001/PST_10/INV001/ACTIVE_POWER"}, _and: {quality: {_eq: "192"}}}) {
    number_value
    ts
  }
}

Screenshot showing how to build, generate and run a GraphQL query against your API.
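
The same query can also be executed programmatically. Here is a minimal sketch using Python and the requests library, assuming Hasura is reachable at http://localhost:8080 and that the demo does not configure an admin secret (if yours does, add an x-hasura-admin-secret header to the request):

import requests

# Hasura serves its GraphQL API at /v1/graphql.
HASURA_URL = "http://localhost:8080/v1/graphql"

# The same GetHistory query shown above, embedded as a string.
QUERY = """
query GetHistory {
  history(where: {tag: {_eq: "/PVSIM/BLUELAKE/PVG001/PST_10/INV001/ACTIVE_POWER"},
                  _and: {quality: {_eq: "192"}}}) {
    number_value
    ts
  }
}
"""

response = requests.post(HASURA_URL, json={"query": QUERY}, timeout=10)
response.raise_for_status()

for row in response.json().get("data", {}).get("history", []):
    print(row["ts"], row["number_value"])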

The following query uses the real-time subscription feature of GraphQL to receive changes from the server instead of constantly polling.

subscription GetHistory {
  history(where: {tag: {_eq: "/PVSIM/BLUELAKE/PVG001/PST_10/INV001/ACTIVE_POWER"}, _and: {quality: {_eq: "192"}}}) {
    number_value
    ts
  }
}

Screenshot displaying the real-time subscription feature of GraphQL.
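
To consume the subscription from code instead of the Hasura console, a GraphQL client with WebSocket support is needed. Below is a minimal sketch using the open-source gql Python library, which is an illustrative choice rather than part of the demo stack, again assuming Hasura is reachable on localhost without an admin secret:

# pip install "gql[websockets]"
from gql import Client, gql
from gql.transport.websockets import WebsocketsTransport

# Hasura serves subscriptions over the same /v1/graphql endpoint via WebSockets.
transport = WebsocketsTransport(url="ws://localhost:8080/v1/graphql")
client = Client(transport=transport)

SUBSCRIPTION = gql("""
subscription GetHistory {
  history(where: {tag: {_eq: "/PVSIM/BLUELAKE/PVG001/PST_10/INV001/ACTIVE_POWER"},
                  _and: {quality: {_eq: "192"}}}) {
    number_value
    ts
  }
}
""")

# Each iteration yields a fresh result set whenever the underlying data changes.
for result in client.subscribe(SUBSCRIPTION):
    rows = result["history"]
    print(rows[-1] if rows else "no rows yet")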

Conclusion

In summary, we have demonstrated the potential of combining Hasura, TimescaleDB and GraphQL with the N3uron IIoT and DataOps Platform to create a powerful, scalable, and efficient industrial data management solution.

By leveraging N3uron’s capabilities for data collection and standardization, TimescaleDB’s robust time-series storage, and Hasura’s instant GraphQL API generation, organizations can transform their data management and analytics capabilities.

This solution enables real-time and historical data access in a standardized and scalable manner, paving the way for advanced analytics, machine learning, and other data-driven applications.

Go ahead and try it yourself! Download N3uron today and accelerate your digital transformation.


Daniel Paesa

Customer Success Engineer at N3uron Connectivity Systems. Daniel Paesa is a self-taught computer programmer interested in Cloud Computing, Cybersecurity and Distributed Systems. Since joining N3uron Connectivity Systems in 2023, he has worked to provide our customers with modern container and cloud-based IIoT solutions.
