
Project settings

Aside from the specific settings for subjects and metrics there are also general project settings. You can find these under the general section in the settings menu. This page explains what these settings are about.


General

The general settings apply to each and every person in the project. So, changing the Theme will result in a Dark or Light theme for everyone. This is what you can set:

Field | Description
Name | The name of the project so you can easily find it among your list of projects.
Timezone | The timezone of the project. Be aware that dates are always displayed in the timezone of the project, which might differ from the timezone of your computer.
Description (optional) | You can add a description to tell (new) project members what your project is about.
Theme | You can choose a Dark or Light theme, cool right?

SSO settings

This setting is only available once SSO is configured in your organization’s SSO settings. For first-time users, a default role can be defined to determine the level of access these users are granted.

Leave project

You can remove yourself as a member of the project. Be aware that you will lose access to the project right after you hit the ‘Leave project’ button. If you want back in, you will need someone else to invite you to the project again.

Change owner

This action can only be executed by the Owner of the project. As an owner of the project, you can transfer the project to someone else within the project. This is especially useful when you want to give someone else responsibility for the project while you are on a holiday or a well-deserved sabbatical of 3 months.

Remove project

This action can only be executed by the Owner of the project. As an owner of the project, you can remove the project and all its resources. This is a dangerous action, so be very, very, very careful.

Here is an example of the General settings of the Blockbax Playground project:

Project settings


Roles

Roles give you control over what kind of access you grant to the members of your project. From default roles to fine-grained custom roles, it’s all configurable in the project settings.

Default roles

Each project comes with default roles with fixed permissions. The permissions for these roles are the ones we see most often.

Role | Permissions
Owner | Full access
Administrator | Full access but no possibility to remove a project or transfer ownership
Expert | Read-only, with write access to event triggers and notification settings. No access to project settings and usage
Observer | Read-only, with only write access to own notification settings. No access to project settings and usage

Custom roles

Custom user roles can be used to define more fine-grained permissions for your members compared to the default roles.

Configurable permissions
Resource | Permission | Filter(s)
Actions (beta) | Run | All or include certain actions
Dashboards | View, Edit, Manage | All or include certain dashboards
Inbound connectors | Use | All or include certain inbound connectors, with an additional filter to select for which subjects (all, include and/or exclude) it can be used. When an inbound connector is set to automatically create subjects, only subjects that match the selection can be created.
Event triggers | View, Edit, Manage | All or include certain event triggers
Properties | View, Edit | All or include certain properties
Subjects | View, Edit, Manage | All, include and/or exclude certain subjects
View, edit and manage

The platform provides three permission levels.

Level | Permission
View | The resource is only viewable (read-only) for this role
Edit | The resource is viewable and editable (write access) for this role
Manage | The resource is viewable, editable, creatable and removable for this role
Important to keep in mind:
  • On a dashboard, the resource access to event triggers, properties and subjects still applies. Consequently, granting only dashboard permissions has no effect when the user has no permissions to the underlying resources.
  • To edit a resource you need at least view permissions to the other resources it uses. For example, to edit an event trigger you need at least view permissions to the subjects in scope and/or the properties used.
  • Editing subjects applies only to the properties the user has edit permissions to.
Include and exclude subject permissions based on properties

The behavior of including and excluding subjects based on a property differs from selecting individual subjects.

Filter type | Result
Include | The user gets access to the subjects that match all properties
Exclude | The user gets no access to the subjects that match one of the properties
Non-configurable permissions

Some resource permissions underlying certain sections in the interface are not configurable, but are derived or only accessible via system-defined roles.

Section | Permission | Explanation
Subject types | Derived | All subject types related to the subjects you have access to are viewable
Events | Derived | Events are visible for the subject(s) and event trigger(s) you have access to
Notifications | Derived | Notifications are constrained to the subject(s) and event trigger(s) you have access to
Explorer | Derived | You can always use the Explorer for the resources you have permission to
Usage | Owner and admin only | The default owner and admin roles have permission to this section
Project settings | Owner and admin only | The default owner and admin roles have permission to this section

Here you see a Blockbaxer creating a custom role.

Creating custom role


Members

Member management empowers you to keep an eye on the people and their permissions in your project. In this part of the platform you can easily invite new people to your project, change roles for your project members and remove an account from the project.

The owner of a project can create an invite link in the members section. This makes it easy to invite people to the project without knowing and typing in their email addresses.

People that use the invite link to join the project will automatically get the observer role.

The invite link can be revoked by the owner. After revoking, the invite link is invalid and cannot be used anymore. The owner can always create a new one, but never activate an old one again.


Properties

Properties can be used to label subjects. This is what you need to configure when adding or editing a property:

Attribute | Description
Name | Give the property a descriptive name so you can easily recognize and find it.
Data type | Choose whether the value of the property is of type text, image, number, location, map layer or area.
Values | Provide pre-defined values or let project members come up with values they want.

Access keys

Access keys are needed to authenticate with our APIs. You can easily create one by giving the key a descriptive name so you can recognize and find it later. Next, you can set the permissions to constrain what the person or system can do with your data. Once created, you have to copy the key information and use it straight away, because you won’t be able to see it again.

Permission set | Description
Full access | Full access, the key can be used to read and write all data
Only send measurements | Partial access, the key can be used to send new measurements using all inbound connectors
Read only | Read-only, the key can be used to read all data
Custom | Custom permissions can be assigned for more fine-grained control. In addition to the permissions configurable for roles, the webhooks permission is also available to give access to the webhooks API.

Here you see a Blockbaxer creating an access key with specific subject permissions.

Creating access key


Webhooks

Webhooks are developed to send events to other systems in real time. You can see them as a reversed API: provide your endpoint and the platform makes sure you get the events you are interested in once they occur. The following information needs to be provided to make use of a webhook.

General information

Field | Description
State | Active or Inactive
Name | A descriptive name to recognize and find your webhook
Endpoint | The URL where the platform needs to POST the events to
Authorization header | Provide an optional header to authorize the webhook for your external system

Event levels

Here you can select the event levels for which the webhook should be called. For example, you might only want Problem and Warning events to be sent to your external system.

Event triggers

Here you have some filter options to only have the webhook called for certain event triggers.

Subjects

Here you have some filter options to only have the webhook called for certain subjects.

The structure of the JSON message that is sent to the endpoint and more details can be found in the integrations section of the docs.
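To illustrate the receiving side, here is a minimal Node.js sketch of an endpoint that accepts the webhook calls. The shared secret and the way the event body is handled are assumptions for illustration only; the actual JSON structure is documented in the integrations section.

// Minimal sketch of a webhook receiver. EXPECTED_AUTH is a hypothetical value
// that must match the Authorization header configured on the webhook.
const http = require("http");

const EXPECTED_AUTH = "my-shared-secret";

http
    .createServer((req, res) => {
        if (req.method !== "POST" || req.headers.authorization !== EXPECTED_AUTH) {
            res.statusCode = 401;
            return res.end();
        }
        let body = "";
        req.on("data", (chunk) => (body += chunk));
        req.on("end", () => {
            const event = JSON.parse(body); // exact structure: see the integrations docs
            console.log("Received event:", event);
            res.statusCode = 200;
            res.end(); // acknowledge quickly, do any heavy processing asynchronously
        });
    })
    .listen(8080);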

Configuring webhooks

Inbound connectors

Inbound connectors are the integrations to ingest measurements into the Blockbax Platform. You can find the configuration and logs of your project’s default connectors here, and you can also create custom connectors. This is very useful when either the ingestion method or the payload format differs from the standard HTTP, MQTT or CoAP connectors.

General information

Field | Description
State | Active or Inactive
Name | A descriptive name to recognize and find your inbound connector
Protocol | Means of sending or fetching measurement payloads
Auto-create subjects | Automatically creates subjects based on the incoming payload
Protocol

Protocols are divided into two categories: data that gets sent to Blockbax (push) and data that Blockbax fetches from an external server (pull).

The following push-based protocols are available:

HTTP POST

Convert payloads sent to: https://api.blockbax.com/v1/projects/<PROJECT_ID>/measurements?inboundConnectorId=<INBOUND_CONNECTOR_ID>. Further details about integrating over HTTP can be found here.
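As a sketch, sending a payload to such a connector from Node.js (version 18+, using the built-in fetch) could look like the following. The Authorization scheme and the payload shape are assumptions; the linked HTTP integration docs describe the exact authentication details.

// Hypothetical sketch: POST a device payload to a custom HTTP inbound connector.
const PROJECT_ID = "<PROJECT_ID>";
const INBOUND_CONNECTOR_ID = "<INBOUND_CONNECTOR_ID>";
const ACCESS_KEY = "<ACCESS_KEY>"; // an access key allowed to send measurements

const url =
    `https://api.blockbax.com/v1/projects/${PROJECT_ID}/measurements` +
    `?inboundConnectorId=${INBOUND_CONNECTOR_ID}`;

// Payload shape is up to your device; it must match your payload conversion script.
const payload = { id: "MyCar", timestamp: Date.now(), data: { Speed: 87.5 } };

const response = await fetch(url, {
    method: "POST",
    headers: {
        "Content-Type": "application/json",
        Authorization: `ApiKey ${ACCESS_KEY}`, // assumption: verify the exact scheme in the docs
    },
    body: JSON.stringify(payload),
});
console.log("Ingestion response status:", response.status);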

MQTT publish

Convert payloads sent to topic: v1/projects/<YOUR_PROJECT_ID>/inboundConnectors/<INBOUND_CONNECTOR_ID>/measurements. Further details about integrating over MQTT can be found here.
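A sketch of publishing to that topic with the mqtt npm package is shown below. The broker host and credentials are placeholders; check the MQTT integration docs for the actual connection details.

// Hypothetical sketch: publish a device payload to a custom MQTT inbound connector.
const mqtt = require("mqtt");

const client = mqtt.connect("mqtts://<MQTT_BROKER_HOST>", {
    username: "<USERNAME>", // placeholder; see the MQTT integration docs
    password: "<PASSWORD>",
});

const topic =
    "v1/projects/<YOUR_PROJECT_ID>/inboundConnectors/<INBOUND_CONNECTOR_ID>/measurements";

client.on("connect", () => {
    const payload = { id: "MyCar", timestamp: Date.now(), data: { Speed: 87.5 } };
    client.publish(topic, JSON.stringify(payload), { qos: 1 }, () => client.end());
});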

CoAP POST

Convert payloads sent to: coaps://coap.blockbax.com/v0/projects/<PROJECT_ID>/measurements?apiKey=<API_KEY>&inboundConnectorId=<INBOUND_CONNECTOR_ID>. Further details about integrating over CoAP can be found here.

The following pull-based protocols are available:

MQTT subscribe

Convert payloads that are retrieved from an external MQTT broker. The following configuration options are available:

Field | Description
Broker host | The broker host to connect to, e.g. mqtt.example.com
Broker port | The broker port to connect to
Topic | The topic to subscribe to, e.g. room/+/temperature/+
Client ID | The client ID the client will connect with
Username | The username that the client will connect with
Password | The password that the client will connect with
Kafka consume

Convert payloads that are retrieved from an external Kafka broker. The following configuration options are available:

Field | Description
Bootstrap servers | A comma-separated list of the bootstrap servers, e.g. my-bootstrapserver-1.com:9026,my-bootstrapserver-2.com:9026
Topic | The topic that the inbound connector will consume from
Consumer group ID | The consumer group ID that the client will use
Authentication | Authentication method to use; SASL/PLAIN and SSL are supported (this translates to the SASL_SSL and SSL values for the security.protocol Kafka client setting)

Configuration options when authenticating using SASL/PLAIN:

Field | Description
Username | The username that the client will connect with
Password | The password that the client will connect with

Configuration options when authenticating using SSL:

Field | Description
Client certificate | The client certificate that the client will use to connect with (has to be in the pfx format)
Trust store certificate chain | The server certificate that will have to be trusted by the client (has to be in the pem format)
Azure Event Hub consume

Convert payloads that are retrieved from an external Azure Event Hub. The following configuration options are available:

Field | Description
Namespace | The prefix of the Azure Event Hub Namespace, e.g. my-namespace
Topic | The topic that the inbound connector will consume from
Consumer group ID | The consumer group ID that the client will use
Shared access key name | The name of the access key that the client will connect with
Shared access key | The access key that the client will connect with
AWS Kinesis consume

Convert payloads that are retrieved from an external AWS Kinesis stream. The following configuration options are available:

Field | Description
Region | The AWS region where the Kinesis stream is located
Stream name | The name of the stream
Consumer group ID | The consumer group ID that the client will use
Access key ID | The ID of the access key that the client will connect with
Secret access key | The secret of the corresponding access key
Auto-create subjects

To map measurements to subjects and metrics you can specify your own IDs. We call these ingestion IDs. By default these are derived from the subjects’ external IDs and metrics’ external IDs (e.g. MyCar$Location), but you can also override them with custom ones. When you use this option, a subject is created if the external ID derived from the part of the ingestion ID before the $ sign does not exist yet (i.e. for the ingestion ID MyCar$Location a subject with external ID MyCar will be created, provided the metric with external ID Location can be linked to exactly one subject type).

Payload conversion

In addition to a protocol an inbound connector requires specifying a payload conversion. The payload conversion is a script that is used to transform an arbitrary received payload to measurements that are ingested in the Blockbax Platform. A payload conversion has an expected payload format and is defined as a plain JavaScript function.

Field | Description
Payload Format | The type of payload format that is sent to Blockbax. The received payload bytes are decoded based on the specified format.
Script | A JavaScript (ECMAScript 2022 compliant) function with signature convertPayload(payload, context) { ... } that is used to convert the payload to measurements
Payload formats

Currently, five different payload formats are supported by Inbound Connectors. Payloads must always be sent to the inbound connector as bytes, but the format specifies how these bytes are decoded and passed to the payload conversion function.

Payload Format | Description | Passed to convertPayload as
JSON | Received payload is decoded as a UTF-8 JSON string | JavaScript Object
CBOR | Received payload is decoded following the CBOR specification | JavaScript Object
String | Received payload is decoded as a UTF-8 string | JavaScript String
Bytes | Received payload is passed as received | JavaScript ArrayBuffer
Avro | Received payload is decoded according to the schema in the specified schema registry. This is exclusive to the Kafka consume connector | JavaScript Object
Payload conversion script

The payload conversion script is a user-defined script that is used to transform the decoded payload to measurements that are to be ingested by the Blockbax Platform. Additionally, log messages can be generated to provide feedback to the user from the payload conversion script.

The payload conversion is defined as a JavaScript (ECMAScript 2022 compliant) function with required signature convertPayload(payload, context) { ... }. The web client provides a convenient editor with auto-completion and method signatures on hover. An example script is shown below:

function convertPayload(payload, context) {
    const ingestionIdPrefix = payload.id + "$";
    const timestamp = date(payload.timestamp);
    for (const [key, value] of Object.entries(payload.data)) {
        context.addMeasurement(ingestionIdPrefix + key, value, timestamp);
    }
}
Payload Conversion

This script takes a JSON payload and uses a top-level property to prefix the ingestion IDs. The timestamp for all measurements contained in the payload is parsed from the top-level object as well. The script then iterates over all key-value pairs in the data field to ingest them as measurements.
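To make that mapping concrete, here is a hypothetical input payload and the calls the script above would make for it:

// Hypothetical JSON payload received by the inbound connector:
// { "id": "MyCar", "timestamp": 1700000000000, "data": { "Speed": 87.5, "Temperature": 21.3 } }
//
// For that payload the script above would call:
//   context.addMeasurement("MyCar$Speed", 87.5, date(1700000000000));
//   context.addMeasurement("MyCar$Temperature", 21.3, date(1700000000000));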

Payload parameter

The payload parameter is the decoded payload sent to the inbound connector. This parameter can be either a JavaScript object, a string or an ArrayBuffer. The type of the payload depends on the specified payload format. The payload object is immutable and will throw a TypeError on modification.
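For example, with the Bytes format the script receives an ArrayBuffer and can decode it with a DataView. The byte layout used below (a 2-byte device index followed by a big-endian 4-byte float) is purely hypothetical:

// Hypothetical sketch for the Bytes payload format.
function convertPayload(payload, context) {
    const view = new DataView(payload);      // payload is an ArrayBuffer
    const deviceIndex = view.getUint16(0);   // bytes 0-1: device index
    const temperature = view.getFloat32(2);  // bytes 2-5: temperature (big-endian)
    context.addMeasurement("device-" + deviceIndex + "$Temperature", temperature);
}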

Context parameter

The context parameter is an object used to pass measurements and logs to the Blockbax Platform. The context parameter cannot be modified. Its functionalities are described below:

addMeasurement(ingestionId, value, date)

Adds a measurement to be ingested by the Blockbax Platform. An example is shown after the parameter table.

Parameter | Description
ingestionId | Ingestion ID of the target series for the measurement.
value | Either a number, string or a location object ({"lat": <number>, "lon": <number>, "alt": <number>}). Note that the alt field of the location object is optional. You can use the location(lat, lon, alt?) library function to create a location object.
date (optional) | Date object with the date of the measurement. You can use the date(input, format?) library function to create a date object from various input formats.
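For example, a sketch that ingests a location measurement using the location() and date() library functions (the payload field names are hypothetical):

// Hypothetical payload: { "id": "MyCar", "ts": "2024-05-01T12:00:00Z", "lat": 51.92, "lon": 4.47 }
function convertPayload(payload, context) {
    context.addMeasurement(
        payload.id + "$Location",
        location(payload.lat, payload.lon),  // altitude omitted, it is optional
        date(payload.ts)                     // ISO8601 string
    );
}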
logInfo(msg)

Logs a user message at INFO level.

Parameter | Description
msg | A string representing the message to be logged.
logWarn(msg)

Logs a user message at WARN level.

Parameter | Description
msg | A string representing the message to be logged.
logError(msg)

Logs a user message at ERROR level.

Parameter | Description
msg | A string representing the message to be logged.

When the MQTT subscribe connector is selected, the context parameter additionally provides the following functionality:

getTopic()

Returns the topic on which the payload was received. E.g. when subscribed to topic temperature/#, the function could return temperature/room-1/device-1.
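A sketch of using getTopic() to derive the ingestion ID from the topic instead of the payload; the topic layout and the payload shape are assumptions:

// Hypothetical sketch for an MQTT subscribe connector, e.g. payload { "value": 21.5 }
// received on topic "room/room-1/temperature/device-1".
function convertPayload(payload, context) {
    const segments = context.getTopic().split("/"); // ["room", "room-1", "temperature", "device-1"]
    const roomId = segments[1];
    context.addMeasurement(roomId + "$Temperature", payload.value);
}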

Library functions

Several library functions are available when writing your conversion function to make writing a conversion easier and more concise. The available functions are described below.

number(value)

Parses provided input string value to a JavaScript number and rounds it to 8 decimal places if needed.

Parameter | Description
value | A string value containing a number.
location(lat, lon, alt?)

Creates a location object from the provided latitude, longitude and optionally altitude.

Parameter | Description
lat | Number value representing the latitude of the location in degrees.
lon | Number value representing the longitude of the location in degrees.
alt (optional) | Number value representing the altitude of the location above the earth surface in meters.
date(value, format?)

Attempts to parse the provided value to a JavaScript date object. If a date format string is provided, it will be used to parse the value. A few example calls are shown below.

Parameter | Description
value | If no format is specified: an input value containing a Unix timestamp in milliseconds or seconds since Epoch, or an ISO8601 string. If a format is specified: a date string that matches the provided date format string.
format (optional) | A date format string, or an array of date format strings if multiple formats could apply.

Without a format specified this function accepts by default:

  • Unix timestamps in milliseconds (13 digit number, since epoch 1 January 1970 00:00 UTC)
  • Unix timestamps in seconds (10 digit number, since epoch 1 January 1970 00:00 UTC)
  • ISO8601 strings
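
A few hypothetical calls to illustrate:

// Without a format string:
date(1410715640579);                      // Unix timestamp in milliseconds
date(1410715640);                         // Unix timestamp in seconds
date("2014-09-14T17:27:20.579Z");         // ISO8601 string

// With one or more format strings:
date("14-09-2014 17:27", "DD-MM-YYYY HH:mm");
date("14/09/2014", ["DD/MM/YYYY", "DD-MM-YYYY"]);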

For the date format the following parsing tokens are supported:

Input | Example | Description
YY | 01 | Two-digit year
YYYY | 2001 | Four-digit year
M | 1-12 | Month, beginning at 1
MM | 01-12 | Month, 2-digits
MMM | Jan-Dec | The abbreviated month name
MMMM | January-December | The full month name
D | 1-31 | Day of month
DD | 01-31 | Day of month, 2-digits
H | 0-23 | Hours
HH | 00-23 | Hours, 2-digits
h | 1-12 | Hours, 12-hour clock
hh | 01-12 | Hours, 12-hour clock, 2-digits
m | 0-59 | Minutes
mm | 00-59 | Minutes, 2-digits
s | 0-59 | Seconds
ss | 00-59 | Seconds, 2-digits
S | 0-9 | Hundreds of milliseconds, 1-digit
SS | 00-99 | Tens of milliseconds, 2-digits
SSS | 000-999 | Milliseconds, 3-digits
Z | -05:00 | Offset from UTC
ZZ | -0500 | Compact offset from UTC, 2-digits
A | AM PM | Post or ante meridiem, upper-case
a | am pm | Post or ante meridiem, lower-case
Do | 1st… 31st | Day of month with ordinal
X | 1410715640.579 | Unix timestamp
x | 1410715640579 | Unix ms timestamp
Testing payload conversion scripts

Payload conversion scripts can be tested in the Blockbax Web Client. Depending on the payload format, a test payload can be provided and the conversion script is executed for that payload. For the JSON, CBOR, String and Avro formats a string can be provided as the test payload; for the CBOR and Bytes formats a hex string can be provided.

The test payload can be saved together with the conversion script. This allows basic regression testing when making changes to the script, so you can verify that the output measurements are still correct for that payload. When the conversion script is executed for the payload, the web client shows the measurements and logs produced by the payload conversion. Please note that this is only a visual representation of the outcome of the payload conversion; no actual measurements are ingested. Any problems with the script or the resulting measurements are shown as log messages.

Payload testing

Outbound connectors

Outbound connectors give you the opportunity to stream your data out of the Blockbax Platform and into your own infrastructure. You can configure two outbound connectors, one for events and the other for measurements. When these are set up, data is forwarded whenever an event is triggered or a measurement is received. There are two protocols to choose from: either have your data produced to an Azure Event Hub or to a Kafka cluster.

Kafka produce

Data can be produced into your Kafka cluster. For authentication there are two options: SASL/PLAIN (username and password) and SSL (certificates), one of which needs to be supplied. All the settings can be seen below:

Field | Description
Bootstrap servers | A comma-separated list of the bootstrap servers, e.g. my-bootstrapserver-1.com:9026
Topic | The topic you would like to have your data streamed to
Username (SASL/PLAIN) | The username belonging to the Kafka user that Blockbax will connect with
Password (SASL/PLAIN) | The password belonging to the Kafka user that Blockbax will connect with
Client certificate (SSL) | The certificate (pfx format) that contains the client certificate
Truststore certificate chain (SSL) | The certificate chain (pem format) of the certificate that the broker uses
Outbound connector creation
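On the receiving side, a minimal sketch of consuming the forwarded data with the kafkajs npm package and SASL/PLAIN authentication is shown below. The topic name, credentials and message structure are placeholders and assumptions.

// Hypothetical sketch: consume the measurements/events that the outbound connector
// produces to your own Kafka cluster.
const { Kafka } = require("kafkajs");

const kafka = new Kafka({
    clientId: "blockbax-outbound-consumer",
    brokers: ["my-bootstrapserver-1.com:9026"],
    ssl: true,
    sasl: { mechanism: "plain", username: "<USERNAME>", password: "<PASSWORD>" },
});

const consumer = kafka.consumer({ groupId: "my-consumer-group" });

async function run() {
    await consumer.connect();
    await consumer.subscribe({ topic: "blockbax-data", fromBeginning: false });
    await consumer.run({
        eachMessage: async ({ message }) => {
            // Each message value holds a forwarded measurement or event as JSON.
            console.log(JSON.parse(message.value.toString()));
        },
    });
}

run().catch(console.error);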

Azure Event Hub produce

The data can be produced to your Azure Event Hub. The connector needs the following settings:

Field | Description
Namespace | The prefix of your Azure Event Hub namespace
Topic | The topic you would like to have your data streamed to
Shared access key name | The name of your access key
Shared access key | The access key needed for permissions to stream data into your infrastructure
Outbound connector creation