CoAP Source

Download connector: CoAP Connector 1.2 for Kafka | CoAP Connector 1.1 for Kafka

A Connector and Source to stream messages from a CoAP server and write them to a Kafka topic.

Prerequisites

  • Apache Kafka 0.11.x or above
  • Kafka Connect 0.11.x or above
  • Java 1.8

Features

  1. DTLS secure clients
  2. Observable resources
  3. KCQL routing queries - resource to topic mapping.

The Source Connector automatically converts the CoAP response into a Kafka Connect Struct to be stored in Kafka as AVRO or JSON, depending on the Converters used in Connect. The schema can be found here.

The key of the Struct message sent to Kafka is made from the source defined in the message, the resource on the CoAP server and the message id.
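
For example, a key for a message received from the public Californium server might look like the following (field names follow the key schema later in this page; the values are illustrative):

{"source": "californium.eclipse.org:5683", "source_resource": "/obs-pumping-non", "message_id": 4803}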

KCQL Support

INSERT INTO kafka_topic SELECT * FROM coap_resource

Tip

You can specify multiple KCQL statements separated by ; to have the connector read from multiple resources and write to multiple topics.
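
For example, a single connect.coap.kcql value could route two resources to two topics (the resource and topic names are illustrative):

connect.coap.kcql=INSERT INTO topicA SELECT * FROM resourceA;INSERT INTO topicB SELECT * FROM resourceB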

The CoAP source supports KCQL, the Kafka Connect Query Language. The following KCQL support is available:

  1. Selection of source resources and target topics.

Kafka messages are stored as JSON strings in CoAP.

Example:

-- Insert mode, select all fields from resourceA and write to topicA
INSERT INTO topicA SELECT * FROM resourceA

This is set in the connect.coap.kcql option.

DTLS Secure connections

The Connector uses the Californium Java API and, for secure connections, the Scandium security module provided by Californium. Scandium (Sc) is an implementation of Datagram Transport Layer Security 1.2, also known as RFC 6347.

Please refer to the Californium certification repo page for more information.

The connector supports:

  1. SSL trust and key stores
  2. Public/Private PEM keys and PSK client/identity
  3. PSK Client Identity

The Source will attempt secure connections in the following order if the URI scheme of connect.coap.uri is set to secure, i.e. coaps. If connect.coap.username is set, PSK client identity authentication is used; if connect.coap.private.key.file is also set, public/private key authentication will additionally be attempted. Otherwise the SSL trust and key stores are used.
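
A minimal PSK configuration sketch (the host, identity and secret are illustrative; 5684 is the standard coaps port):

connect.coap.uri=coaps://coap-server:5684
connect.coap.username=my-psk-identity
connect.coap.password=my-psk-secret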

The private key must be supplied in PKCS8 rather than PKCS1 format. Use openssl to convert it:

openssl pkcs8 -in privatekey.pem -topk8 -nocrypt -out privatekey-pkcs8.pem

Only cipher suites TLS_PSK_WITH_AES_128_CCM_8 and TLS_PSK_WITH_AES_128_CBC_SHA256 are currently supported.

Warning

The keystore, truststore, public and private files must be available on the local disk of the worker task.

Loading specific certificates can be achieved by providing a comma separated list for the connect.coap.certs configuration option. The certificate chain can be set by the connect.coap.cert.chain.key configuration option.
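
For example, a trust/key store configuration sketch (paths and certificate names are illustrative; rootPass and client are the documented defaults):

connect.coap.uri=coaps://coap-server:5684
connect.coap.keystore.path=/var/private/ssl/coap-keystore.jks
connect.coap.keystore.pass=rootPass
connect.coap.truststore.path=/var/private/ssl/coap-truststore.jks
connect.coap.truststore.pass=rootPass
connect.coap.certs=cert1,cert2
connect.coap.cert.chain.key=client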

Lenses QuickStart

The easiest way to try out this connector is using Lenses Box, the pre-configured docker that comes with this connector pre-installed. Navigate to Connectors –> New Connector –> Source –> CoAP and paste your configuration.


CoAP Setup

The connector uses the Californium Java API under the hood. A test resource is hosted at coap://californium.eclipse.org:5683. Copper, a Firefox browser add-on, is available so you can browse the server and its resources. You can view messages via the browser add-on by selecting a resource and clicking the observe button on the top menu.
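
You can also check a resource programmatically. Below is a minimal sketch using Californium's CoapClient; the obs-pumping-non resource name is taken from the quickstart later in this page, and the sleep duration is arbitrary:

import org.eclipse.californium.core.CoapClient;
import org.eclipse.californium.core.CoapHandler;
import org.eclipse.californium.core.CoapResponse;

public class CoapResourceCheck {

    public static void main(String[] args) throws InterruptedException {
        // Point the client at the resource the connector will read from
        CoapClient client = new CoapClient("coap://californium.eclipse.org:5683/obs-pumping-non");

        // One-off GET to confirm the resource responds
        CoapResponse response = client.get();
        if (response != null) {
            System.out.println(response.getCode() + " " + response.getResponseText());
        }

        // Register an observe relation, mirroring what the connector
        // does for observable resources
        client.observe(new CoapHandler() {
            public void onLoad(CoapResponse notification) {
                System.out.println("notification: " + notification.getResponseText());
            }

            public void onError() {
                System.err.println("observe request failed");
            }
        });

        Thread.sleep(30_000); // watch notifications for 30 seconds
    }
}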

Installing the Connector

Connect, in production, should be run in distributed mode.

  1. Install and configure a Kafka Connect cluster
  2. Create a folder on each server called plugins/lib
  3. Copy into the above folder the required connector jars from the stream reactor download
  4. Edit connect-avro-distributed.properties in the etc/schema-registry folder and uncomment the plugin.path option. Set it to the root directory, i.e. plugins, you deployed the stream reactor connector jars to in step 3.
  5. Start Connect, bin/connect-distributed etc/schema-registry/connect-avro-distributed.properties

Connect Workers are long-running processes, so set up an init.d or systemctl service accordingly.
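
For example, a minimal systemd unit sketch for a worker (the installation path and user are assumptions for illustration):

[Unit]
Description=Kafka Connect distributed worker
After=network.target

[Service]
User=kafka
ExecStart=/opt/confluent/bin/connect-distributed /opt/confluent/etc/schema-registry/connect-avro-distributed.properties
Restart=on-failure

[Install]
WantedBy=multi-user.target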

Source Connector QuickStart

Start Kafka Connect in distributed mode (see install). In this mode a REST endpoint on port 8083 is exposed to accept connector configurations. We developed a Command Line Interface (CLI) to make interacting with the Connect REST API easier. The CLI can be found in the Stream Reactor download under the bin folder. Alternatively the jar can be pulled from our GitHub releases page.

Starting the Connector

Download, and install Stream Reactor. Follow the instructions here if you haven’t already done so. All paths in the quickstart are based on the location you installed the Stream Reactor.

Once Connect has started we can use the kafka-connect-tools CLI to post in our distributed properties file for CoAP. For the CLI to work, including when using the dockers, you will have to set the following environment variable to point to the Kafka Connect REST API.

export KAFKA_CONNECT_REST="http://myserver:myport"
➜  bin/connect-cli create coap-source < conf/coap-source.properties

name=coap-source
connector.class=com.datamountaineer.streamreactor.connect.coap.source.CoapSourceConnector
tasks.max=1
connect.coap.uri=coap://californium.eclipse.org:5683
connect.coap.kcql= INSERT INTO coap-topic SELECT * FROM obs-pumping-non

The coap-source.properties file defines:

  1. The name of the source.
  2. The number of tasks.
  3. The class containing the connector.
  4. The URI of the CoAP server and port to connect to.
  5. The KCQL routing query. This specifies the target topic and source resource on the CoAP server.

We can use the CLI to check if the connector is up, but you should be able to see this in the logs as well.

#check for running connectors with the CLI
➜ bin/connect-cli ps
coap-source
  INFO
    __                    __
   / /   ____ _____  ____/ /___  ____  ____
  / /   / __ `/ __ \/ __  / __ \/ __ \/ __ \
 / /___/ /_/ / / / / /_/ / /_/ / /_/ / /_/ /
/_____/\__,_/_/ /_/\__,_/\____/\____/ .___/
                                   /_/
     ______                 _____
    / ____/___  ____ _____ / ___/____  __  _______________
   / /   / __ \/ __ `/ __ \\__ \/ __ \/ / / / ___/ ___/ _ \ By Andrew Stevenson
  / /___/ /_/ / /_/ / /_/ /__/ / /_/ / /_/ / /  / /__/  __/
  \____/\____/\__,_/ .___/____/\____/\__,_/_/   \___/\___/
                  /_/ (com.datamountaineer.streamreactor.connect.coap.source.CoapSourceTask:54)

Check for records in Kafka

Check for records in Kafka with the console consumer.

➜  bin/kafka-avro-console-consumer \
   --zookeeper localhost:2181 \
   --topic coap-topic \
   --from-beginning
{"message_id":{"int":4803},"type":{"string":"ACK"},"code":"4.04","raw_code":{"int":132},"rtt":{"long":35},"is_last":{"boolean":true},"is_notification":{"boolean":false},"source":{"string":"idvm-infk-mattern04.inf.ethz.ch:5683"},"destination":{"string":""},"timestamp":{"long":0},"token":{"string":"b24774e37c2314a4"},"is_duplicate":{"boolean":false},"is_confirmable":{"boolean":false},"is_rejected":{"boolean":false},"is_acknowledged":{"boolean":false},"is_canceled":{"boolean":false},"accept":{"int":-1},"block1":{"string":""},"block2":{"string":""},"content_format":{"int":-1},"etags":[],"location_path":{"string":""},"location_query":{"string":""},"max_age":{"long":60},"observe":null,"proxy_uri":null,"size_1":null,"size_2":null,"uri_host":null,"uri_port":null,"uri_path":{"string":""},"uri_query":{"string":""},"payload":{"string":""}}
{"message_id":{"int":4804},"type":{"string":"ACK"},"code":"4.04","raw_code":{"int":132},"rtt":{"long":34},"is_last":{"boolean":true},"is_notification":{"boolean":false},"source":{"string":"idvm-infk-mattern04.inf.ethz.ch:5683"},"destination":{"string":""},"timestamp":{"long":0},"token":{"string":"b24774e37c2314a4"},"is_duplicate":{"boolean":false},"is_confirmable":{"boolean":false},"is_rejected":{"boolean":false},"is_acknowledged":{"boolean":false},"is_canceled":{"boolean":false},"accept":{"int":-1},"block1":{"string":""},"block2":{"string":""},"content_format":{"int":-1},"etags":[],"location_path":{"string":""},"location_query":{"string":""},"max_age":{"long":60},"observe":null,"proxy_uri":null,"size_1":null,"size_2":null,"uri_host":null,"uri_port":null,"uri_path":{"string":""},"uri_query":{"string":""},"payload":{"string":""}}
{"message_id":{"int":4805},"type":{"string":"ACK"},"code":"4.04","raw_code":{"int":132},"rtt":{"long":35},"is_last":{"boolean":true},"is_notification":{"boolean":false},"source":{"string":"idvm-infk-mattern04.inf.ethz.ch:5683"},"destination":{"string":""},"timestamp":{"long":0},"token":{"string":"b24774e37c2314a4"},"is_duplicate":{"boolean":false},"is_confirmable":{"boolean":false},"is_rejected":{"boolean":false},"is_acknowledged":{"boolean":false},"is_canceled":{"boolean":false},"accept":{"int":-1},"block1":{"string":""},"block2":{"string":""},"content_format":{"int":-1},"etags":[],"location_path":{"string":""},"location_query":{"string":""},"max_age":{"long":60},"observe":null,"proxy_uri":null,"size_1":null,"size_2":null,"uri_host":null,"uri_port":null,"uri_path":{"string":""},"uri_query":{"string":""},"payload":{"string":""}}
{"message_id":{"int":4806},"type":{"string":"ACK"},"code":"4.04","raw_code":{"int":132},"rtt":{"long":35},"is_last":{"boolean":true},"is_notification":{"boolean":false},"source":{"string":"idvm-infk-mattern04.inf.ethz.ch:5683"},"destination":{"string":""},"timestamp":{"long":0},"token":{"string":"b24774e37c2314a4"},"is_duplicate":{"boolean":false},"is_confirmable":{"boolean":false},"is_rejected":{"boolean":false},"is_acknowledged":{"boolean":false},"is_canceled":{"boolean":false},"accept":{"int":-1},"block1":{"string":""},"block2":{"string":""},"content_format":{"int":-1},"etags":[],"location_path":{"string":""},"location_query":{"string":""},"max_age":{"long":60},"observe":null,"proxy_uri":null,"size_1":null,"size_2":null,"uri_host":null,"uri_port":null,"uri_path":{"string":""},"uri_query":{"string":""},"payload":{"string":""}}

Configurations

The Kafka Connect framework requires the following in addition to any connectors specific configurations:

Config Description Type Value
name Name of the connector string  
tasks.max The number of tasks to scale output int 1
connector.class Name of the connector class string com.datamountaineer.streamreactor.connect.coap.source.CoapSourceConnector

Connector Configurations

Config Description Type
connect.coap.uri URI of the CoAP server string
connect.coap.kcql The KCQL statement to select and route resources to topics string

Optional Configurations

Config Description Type
connect.coap.port The port the DTLS connector will bind to on the Connector host. Default : 0 int
connect.coap.host The hostname the DTLS connector will bind to on the Connector host. Default : localhost string
connect.coap.username CoAP PSK identity string
connect.coap.password CoAP PSK secret password
connect.coap.public.key.file Path to the public key file for use with PSK credentials string
connect.coap.keystore.path The path to the keystore string
connect.coap.keystore.pass The password of the key store. Default : rootPass string
connect.coap.truststore.path The path to the truststore string
connect.coap.truststore.pass The password of the trust store. Default : rootPass string
connect.coap.certs The certificates to load from the trust store list
connect.coap.cert.chain.key The key to use to get the certificate chain. Default : client string
connect.coap.batch.size The number of events to take from the internal queue to batch together to send to Kafka. Default : 100 int
connect.progress.enabled Enables the output for how many records have been processed. Default : false boolean
connect.coap.private.key.file Path to the private key file for use with PSK credentials, in PKCS8 rather than PKCS1 format. Use openssl to convert: openssl pkcs8 -in privatekey.pem -topk8 -nocrypt -out privatekey-pkcs8.pem. Only cipher suites TLS_PSK_WITH_AES_128_CCM_8 and TLS_PSK_WITH_AES_128_CBC_SHA256 are currently supported. string

Schema Evolution

The schema is fixed.

The following schema is used for the key:

Name Type
source Optional String
source_resource Optional String
message_id Optional int32

The following schema is used for the payload:

Name Type
message_id Optional int32
type Optional String
code Optional String
raw_code Optional int32
rtt Optional int64
is_last Optional boolean
is_notification Optional boolean
source Optional String
destination Optional String
timestamp Optional int64
token Optional String
is_duplicate Optional boolean
is_confirmable Optional boolean
is_rejected Optional boolean
is_acknowledged Optional boolean
is_canceled Optional boolean
accept Optional int32
block1 Optional String
block2 Optional String
content_format Optional int32
etags Array of Optional Strings
location_path Optional String
location_query Optional String
max_age Optional int64
observe Optional int32
proxy_uri Optional String
size_1 Optional String
size_2 Optional String
uri_host Optional String
uri_port Optional int32
uri_path Optional String
uri_query Optional String
payload Optional String

Kubernetes

Helm Charts are provided at our repo, add the repo to your Helm instance and install. We recommend using the Landscaper to manage Helm Values since typically each Connector instance has its own deployment.

Add the Helm charts to your Helm instance:

helm repo add landoop https://landoop.github.io/kafka-helm-charts/
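
Then install the connector chart in the usual way; the chart name below is an assumption, so check the repo for the exact published name:

helm install landoop/kafka-connect-coap-source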

Troubleshooting

Please review the FAQs and join our Slack channel.