CSV

The engine has been extended to support CSV payloads, allowing users to browse the data as well as do stream processing on top of comma separated payloads. Using the CSV decoder is as simple as writing the following:

SELECT *
FROM  `temperature_values`

The payload is expected to be a comma separated list of values. The resulting payload is a logical structure equivalent to having the following JSON:

{
    "content":[
        value1,
        value2,
        ...
        valueN
    ]
}
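
For example, a hypothetical payload of 23.4,21.1,25.0 on the temperature_values topic would surface as:

{
    "content":[
        "23.4",
        "21.1",
        "25.0"
    ]
}

Because CSV carries no type information, each value is exposed as a string (see the CSV rules below).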

Stream processing on CSV payloads is now possible through LSQL. Here is an example of joining two topics containing CSV data:

-- LEFT JOIN two topics with CSV data
INSERT INTO target_topic
SELECT STREAM
    CAST(o.content[0] AS LONG) AS orderNumber,
    o.content[4] AS status,
    SUM(CAST(od.content[2] AS INT) * CAST(od.content[3] AS DECIMAL)) AS total
FROM  orders AS o LEFT JOIN orders_details AS od ON o.content[0] = od.content[0]
GROUP BY tumble(2,s), o.content[0]
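
In this example the first CSV field on both topics holds the order number, so the join condition matches on content[0], while the two second tumbling window bounds each aggregation.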

CSV rules

With CSV there is no type inferred for a field; therefore all fields are considered to be STRING values. Using the CAST function a field can be converted to a different type, for example INT, LONG, DOUBLE or DECIMAL. To reference the values in the incoming payload, the user is expected to provide the index into the incoming data. The index is always zero based; this means that to retrieve the first element, content[0] has to be provided. Making the data accessible requires the presence of the content keyword.
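
For instance, reusing the temperature_values topic from earlier, the first field can be selected both as the raw string and cast to a numeric type (a minimal sketch; the aliases are illustrative):

SELECT content[0] AS raw_reading,
       CAST(content[0] AS DOUBLE) AS reading
FROM  `temperature_values`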

The following LSQL statement computes the total value for a given order entry.

SELECT CAST(content[1] AS DECIMAL) * CAST(content[2] AS INT) AS total
FROM  `order_item`
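
For a hypothetical order_item payload of 1001,19.99,3, where field 1 holds the unit price and field 2 the quantity, this query would return a total of 59.97.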