A Kafka Connect source connector that subscribes to Bloomberg feeds via the Bloomberg open API and writes the data to Kafka.

Requires a Bloomberg BPIPE subscription.

KCQL support 

KCQL is not supported.


Launch the stack 

  1. Copy the docker-compose file.
  2. Bring up the stack.
docker-compose up -d fastdata
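
If you do not already have the compose file, a minimal sketch is shown below. The image name, ports, and environment variable are assumptions based on the common fast-data-dev development stack; verify them against the compose file you copied in step 1.

```yaml
# Minimal docker-compose sketch (image and ports are assumptions)
version: '2'
services:
  fastdata:
    image: lensesio/fast-data-dev
    ports:
      - "9092:9092"   # Kafka broker
      - "8081:8081"   # Schema Registry
      - "8083:8083"   # Kafka Connect
    environment:
      - ADV_HOST=127.0.0.1
```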

Start the connector 

If you are using Lenses, log in, navigate to the connectors page, select Bloomberg as the source, and paste the following:
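
As a sketch of what the configuration might look like, here is an example properties set. The connector class and all `connect.bloomberg.*` key names and values below are assumptions and should be verified against the connector's documented options before use:

```properties
# Illustrative only: class name and connect.bloomberg.* keys are assumed
name=bloomberg-source
connector.class=com.datamountaineer.streamreactor.connect.bloomberg.BloombergSourceConnector
tasks.max=1
connect.bloomberg.server.host=localhost
connect.bloomberg.server.port=8124
connect.bloomberg.service.uri=//blp/mktdata
connect.bloomberg.authentication.mode=APPLICATION_ONLY
connect.bloomberg.subscriptions=AAPL US Equity:LAST_PRICE,BID,ASK;IBM US Equity:BID,ASK
connect.bloomberg.kafka.topic=bloomberg
connect.bloomberg.payload.type=json
```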


To start the connector without using Lenses, log into the fastdatadev container:

export CONNECTOR=bloomberg
docker exec -ti fastdata /bin/bash

and create a file, for example connector.properties, containing the properties above.

Create the connector, with the connect-cli:

connect-cli create bloomberg < connector.properties

Wait for the connector to start and check that it is running:

connect-cli status bloomberg

Check for records in Kafka 

Check the records in Lenses or via the console:

kafka-avro-console-consumer \
    --bootstrap-server localhost:9092 \
    --topic bloomberg

Clean up 

Bring down the stack:

docker-compose down


| Name | Description | Type | Default Value |
| --- | --- | --- | --- |
| — | Hostname running the Bloomberg service | string | |
| — | Port on which the Bloomberg service runs (8124 is the default) | int | |
| — | Bloomberg service type: Market Data (//blp/mktdata); Message scrape (//blp/msgscrape) | string | |
| — | How authentication should be done. It can be APPLICATION_ONLY or USER_AND_APPLICATION. Follow the Bloomberg API documentation on how to configure this | string | |
| — | Size of the queue holding updates received from Bloomberg. If the buffer is full, it won't accept new items until it has free capacity | int | |
| — | Whether this service needs authorization: true in the normal case, false if running against a simulator. Example: true/false | boolean | |
| connect.progress.enabled | Enables the output for how many records have been processed | boolean | false |
| — | The list of securities and the fields to subscribe to. Example: "AAPL US Equity:LAST_PRICE,BID,ASK;IBM US Equity:BID,ASK,HIGH,LOW,OPEN" | string | |
| — | Name of the Kafka topic to which the data from Bloomberg will be sent | string | |
| — | How the information is serialized and sent over Kafka. Two modes are supported: json (default) and avro | string | json |
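
To illustrate the subscription-list format used above (securities separated by `;`, each followed by `:` and a comma-separated field list), here is a small Python sketch that parses such a string. The parser is illustrative only and is not part of the connector:

```python
def parse_subscriptions(spec: str) -> dict:
    """Parse 'SECURITY:FIELD1,FIELD2;SECURITY2:FIELD3' into {security: [fields]}."""
    result = {}
    for entry in spec.split(";"):
        entry = entry.strip()
        if not entry:
            continue
        # Split each entry into the security name and its field list
        security, _, fields = entry.partition(":")
        result[security.strip()] = [f.strip() for f in fields.split(",") if f.strip()]
    return result

subs = parse_subscriptions(
    "AAPL US Equity:LAST_PRICE,BID,ASK;IBM US Equity:BID,ASK,HIGH,LOW,OPEN"
)
print(subs["AAPL US Equity"])  # ['LAST_PRICE', 'BID', 'ASK']
```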