Integrations

Integrations let your data flow in and out of the platform. Most importantly, you want to make sure your sensor data continuously flows into the platform. To achieve this you can use our built-in connectors for a range of commonly used technologies described below.

Additionally, you may have other systems, like a work order management system, that you want to integrate. To achieve this you can make use of our HTTP (REST) API or receive events in real-time via webhooks.

Sending measurements

There are multiple ways of sending measurements. Measurements can be sent in batches over HTTP, streamed over MQTT, or published over CoAP. Another option is to install the Edge Agent in your own environment, which can read several commonly used protocols (like OPC, BACnet, Modbus or I²C). Before we dive into the different protocols, we would first like to make an important note about measurement time to keep in mind when setting up an integration.

Important note about measurement time

We discourage rewriting history, as it can become hard to trace why something led to an event. To achieve this we recommend that the measurements sent to the platform for a subject are in strictly increasing time order. At the series level we go one step further and enforce this time order. We also disallow sending measurements with a future timestamp, as timestamps in the distant future could effectively render a series unusable since history can never be rewritten.

To be specific, a measurement is marked invalid when:

  • The timestamp is identical (down to the millisecond) to that of an existing measurement within the series.
  • The timestamp is older than that of the latest measurement within the series.
  • The timestamp is more than thirty minutes in the future.

These best practices and requirements prevent scenarios in which incoming old measurements could invalidate an event. Since an event could already have led to e-mails and webhook calls, invalidating it afterwards would be highly impractical and hard to follow.

The following example messages describe how this works in practice:

Message order | Series | Measurement timestamp | Validity
1 | Temperature of subject #1 | 2019-05-11, 09:15 | Valid
2 | Temperature of subject #1 | 2019-05-11, 09:17 | Valid
3 | Temperature of subject #1 | 2019-05-11, 09:16 | Invalid
4 | Humidity of subject #1 | 2019-05-11, 09:15 | Valid, but not recommended
5 | Temperature of subject #1 | 2019-05-11, 09:18 | Valid

Important to note is that in the example each measurement is sent as a separate message. It is better to make smart use of batching. Within a single message the platform fixes the sorting for you, so by batching the measurements of all metrics for each subject, for one or multiple timestamps, you achieve a strictly increasing time order per subject. In the example this means that at least the first and the fourth message would be combined into a single message, as sketched below.
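
As a sketch, combining the first and fourth message from the table above into a single batch could look as follows, using the JSON message format described in the MQTT section below. The ingestion IDs and measurement values here are hypothetical examples.

{
  "series": [
    {
      "ingestionId": "subject1$temperature",
      "measurements": [
        {
          "date": "2019-05-11T09:15:00.000+00:00",
          "number": 21.5
        }
      ]
    },
    {
      "ingestionId": "subject1$humidity",
      "measurements": [
        {
          "date": "2019-05-11T09:15:00.000+00:00",
          "number": 48.2
        }
      ]
    }
  ]
}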

The platform filters out invalid measurements, but they still count towards your usage statistics. Make sure you avoid sending invalid measurements to prevent unpleasant surprises in your usage details.

HTTP

Batches of up to 500 measurements can be sent via our HTTP API. To map measurements to subjects and metrics you can specify your own IDs, called ingestion IDs. By default these are derived from the subjects' and metrics' external IDs (e.g. subjectExternalId$metricExternalId), but you can also override them with custom ones. More information can be found in the API documentation.
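
As a minimal sketch, such a batch could be sent with curl roughly as follows. The base URL, endpoint path and Authorization header shown here are assumptions based on the project-scoped paths used for MQTT and CoAP below; consult the API documentation for the exact endpoint and authentication header. measurements.json contains a body in the format shown in the MQTT section.

# Hypothetical sketch; verify the exact endpoint and auth header in the API documentation.
curl -X POST "https://api.blockbax.com/v1/projects/<YOUR_PROJECT_ID>/measurements" \
  -H "Authorization: ApiKey <ACCESS_TOKEN_SECRET>" \
  -H "Content-Type: application/json" \
  --data-binary "@measurements.json"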

MQTT

MQTT is a reliable, lightweight protocol with a small code and network bandwidth footprint. The platform provides a native integration to publish measurements over MQTT. To connect an MQTT client, the following configuration can be used:

Property | Value
Broker host | mqtt.blockbax.com
Port number | 8883 (default secure MQTT port)
Protocol | Only TLS version 1.2 connections are accepted.
Username | The public ID of your access token.
Password | The secret of your access token.
Client ID | Must be prefixed with the public ID of your access token. Only one connection is allowed per client ID; when a connection is attempted while another connection with the same client ID is still open, it is refused until the open connection is properly closed or times out after five seconds.
Topic | v1/projects/<YOUR_PROJECT_ID>/measurements. Note: your project ID can be found in the URL when you open your project in the web app, e.g. app.blockbax.com/projects/40edc099-7a41-4af3-9fa4-2fa4ac23a87a/

Messages need to be a UTF-8 encoded JSON string and need to contain the ingestion ID, either derived from the subject's external ID and metric's external ID (e.g. subjectExternalId$metricExternalId) or a custom one that has been defined. Additionally, each series needs to contain one or more measurements with a date (as an epoch timestamp in milliseconds or as an ISO 8601 date string) and a number or location value. Providing a date is optional; the maximum supported date is 9999-12-31 23:59:59.999 UTC. If no date is provided, the receive time of the server is used. This is an example of a valid message:

{
  "series": [
    {
      "ingestionId": "MyHouse$LivingRoomTemperature",
      "measurements": [
        {
          "date": "2019-04-09T12:00:00.000+00:00",
          "number": 20.3
        }
      ]
    },
    {
      "ingestionId": "MyCar$Location",
      "measurements": [
        {
          "date": 1563049273812,
          "location": {
            "lat": 51.9250768,
            "lon": 4.4735718,
            "alt": 12.1
          }
        }
      ]
    }
  ]
}
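
As an illustration, publishing a message like the one above with the open source mosquitto_pub command-line client could look roughly as follows. The CA bundle path and the client ID suffix are assumptions for illustration; measurements.json contains the JSON message.

# Sketch using mosquitto_pub; the CA bundle path below is typical for Debian/Ubuntu.
mosquitto_pub \
  -h mqtt.blockbax.com -p 8883 \
  --cafile /etc/ssl/certs/ca-certificates.crt \
  --tls-version tlsv1.2 \
  -u "<ACCESS_TOKEN_PUBLIC_ID>" \
  -P "<ACCESS_TOKEN_SECRET>" \
  -i "<ACCESS_TOKEN_PUBLIC_ID>-client1" \
  -t "v1/projects/<YOUR_PROJECT_ID>/measurements" \
  -f measurements.json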

CoAP (Beta)

The Constrained Application Protocol (CoAP) is a simple protocol with low overhead, designed for devices with constrained resources such as wireless sensors operating on a battery. The Blockbax platform provides a native integration to publish measurements over CoAP. To connect your CoAP device the following configuration can be used:

Property | Value
CoAP server URL | coaps://coap.blockbax.com
Port number | 5684 (default secure CoAP port)
Protocol | Only DTLS version 1.2 connections are accepted. Note: DTLS is based on TLS 1.2 but uses UDP instead of TCP. No certificates or pre-shared keys need to be configured for the client. We use one-way authentication using X.509 certificates.
URL path | /v0/projects/<YOUR_PROJECT_ID>/measurements. Note: your project ID can be found in the URL when you open your project in the web app, e.g. app.blockbax.com/projects/40edc099-7a41-4af3-9fa4-2fa4ac23a87a/
Request method | POST
Content type | application/json
Query parameters | apiKey=<ACCESS_TOKEN_SECRET>. Note: <ACCESS_TOKEN_SECRET> is the secret of your access token.

The body of the POST request is identical to the HTTP API. An example can be found in the HTTP API documentation.

An example of a valid request using the libcoap CLI client looks as follows:

coap-client -m post -t "application/json" -f ./measurements.json "coaps://coap.blockbax.com/v0/projects/40edc099-7a41-4af3-9fa4-2fa4ac23a87a/measurements?apiKey=dYNpZq48BwBiCfdAmeDzFMRLv9HtyaSg"

Edge Agent

Another option to connect your sensor data is to install and configure the Blockbax Edge Agent. The Agent is a lightweight reader that transforms measurements, often exposed via an industry-specific protocol, to JSON and sends them over MQTT to our platform. The agent can be installed with one click on a Windows or Linux machine and is easy to configure. Most importantly, it does not interfere with the existing OT infrastructure.

Currently we support OPC UA and OPC DA. We constantly improve and update the protocols implemented in the Edge Agent, and we are working on support for other industry-specific standards, like BACnet, Modbus and CANbus.

General configuration

The configuration of the Edge Agent is straightforward. Download the Agent and you will find a configuration file in which you need to fill in several fields. What you have to fill in depends on the source protocol (e.g. OPC DA), but some Blockbax-specific information is always required.

Configuration field | Description
Project ID | Your project ID can be found in the URL when you open your project in the web app, e.g. app.blockbax.com/projects/40edc099-7a41-4af3-9fa4-2fa4ac23a87a/
Public ID | The public part of the access token.
Secret | The secret part of the access token.

Example configuration:

agent:
  project_id: "22a0e61c-22h9-454c-b8ad-fcebf4341ekg"
  access_token:
    public_id: "x9pwRj36"
    secret: "tqxOISYvyUNa6wMHTTafV4du03LN0fyA"

integrations:
  # configuration for specific integrations needs to be added here

OPC UA

OPC Unified Architecture (OPC UA) is a protocol for communicating real-time data from devices, mainly in industrial automation, such as PLCs, HMIs and SCADA systems. For OPC UA you can configure the options below.

Configuration field | Description
Endpoint Address | Server IP address or URL and port, for example opc.tcp://10.0.0.2:49320
Security Policy | Choose between None (insecure), Basic128Rsa15 (deprecated), Basic256 (deprecated) and Basic256Sha256.
Security Mode | Choose between None, Sign and SignAndEncrypt.
Authentication Type | Choose between Anonymous, Credentials and Certificate.
Certificate | File locations of the certificate and private key used to sign messages, encrypt communication and/or authenticate. If no certificate is provided the agent generates one for you (which then needs to be trusted by the server).
Credentials | Username and password, if the authentication type is Credentials.
Fetch Type | Choose between Subscription and ReadInterval.
Measurement Interval | Interval in milliseconds at which measurements are received (fetch type Subscription) or polled (fetch type ReadInterval).
Node ID | ID of a node on the server.
Ingestion ID | Ingestion ID set for the subject and metric combination that relates to this specific node.

Example configuration:

integrations:
  opc_ua:
    - endpoint_address: "opc.tcp://10.0.0.2:49320"
      security_policy: "Basic256Sha256"
      security_mode: "SignAndEncrypt"
      authentication_type: "Credentials"
      credentials:
        username: "ExampleUser"
        password: "ExamplePassword"
      certificate:
        certificate_file: "example-certificate.pem"
        private_key_file: "example-private-key.pem"
      fetch_type: "Subscribe"
      measurement_interval_ms: 1000
      nodes:
        - node_id: "ns=2;s=Channel1.Device1.Sinus1"
          ingestion_id: "device1$sinus1"
        - node_id: "ns=2;s=Channel1.Device1.Ramp1"
          ingestion_id: "device1$ramp1"

OPC DA

OPC Data Access (OPC DA) is OPC UA’s predecessor. For OPC DA you can configure the options below.

Configuration field | Description
Endpoint Address | Server IP address, for example 10.0.0.2
Program ID | Program ID of the server.
Measurement Interval | Rate of polling for measurements in milliseconds.
Tag ID | ID of a tag on the server.
Ingestion ID | Ingestion ID set for the subject and metric combination that relates to this specific tag.

Example configuration:

integrations:
  opc_da:
    - endpoint_address: "10.0.0.2"
      program_id: "Kepware.KEPServerEX.V6"
      measurement_interval_ms: 1000
      tags:
        - tag_id: "Channel1.Device1.Tag1"
          ingestion_id: "device1$tag1"
        - tag_id: "Channel1.Device1.Tag2"
          ingestion_id: "device2$tag2"

Connecting your apps

You may have other systems, like a work order management system, that you want to integrate. To achieve this you can make use of our HTTP API. Additionally webhooks can be used to receive events in real-time.

HTTP API

The main Blockbax API is an HTTP API which provides programmatic access to all of the platform's operations. It is described in a separate section of the documentation and requires knowledge of HTTP API concepts. In the case of our API this means that URLs are resource-oriented and HTTP status codes are used to indicate success or failure. Data returned from our API is in JSON format for all requests.

More info can be found in the separate API documentation.

Webhooks

Webhooks send events to your application(s) in real-time. You can see them as a reversed API: provide your endpoint and the platform makes sure you get the events you are interested in once they occur. The configuration possibilities via the web client can be found in the project settings section of this documentation. Another option is to configure webhooks programmatically via our HTTP API. A webhook message looks like this:

{
  "sequenceNumber": 25618,
  "deliveryAttempt": 1,
  "eventId": "06811617-4836-48d1-a09b-42e624db9ekg",
  "eventTriggerId": "cb99d792-e994-4a18-bf75-5d39f09d7ekg",
  "eventTriggerVersion": 10,
  "eventLevel": "WARNING",
  "subjectId": "ad55bc6a-094c-4f6b-9cfe-871168cfeekg",
  "metricId": "ekgc853c-2e61-494e-ba35-d2de3ff75581",
  "date": "2019-09-17T09:33:15.075+0000"
}

Retry logic

In case a call fails (due to receiving a non-2xx response code or exceeding a timeout of 5 seconds), we will try 4 more times: after 5 seconds, 1 minute, 5 minutes and 15 minutes. If all of those attempts also fail, it is counted as a non-successful delivery. To make this transparent to the receiving system, all messages contain the field deliveryAttempt, which can be used to monitor at which delivery attempt the call was received. If no retries were needed, deliveryAttempt is equal to 1. Calls that failed can be retrieved programmatically via our HTTP API, for example to reprocess them after a long period of downtime.

Ordering guarantees

Date ordering is guaranteed for all events related to a specific event trigger and subject. We enforce that calls are made in the right order; however, the order in which they are received can differ due to network problems and the asynchronous nature of the platform. To make this transparent to the receiving system, all messages contain an incrementing sequenceNumber which can be used to derive the order in which messages were sent.