# InsertRecordTimestampHeaders

This SMT inserts headers derived from the record's timestamp; the inserted headers are of type STRING. With a single SMT you can, for example, partition the data by `yyyy-MM-dd/HH` or `yyyy/MM/dd/HH`.

The headers inserted are:

* date
* year
* month
* day
* hour
* minute
* second
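As an illustration of what these header values look like, the sketch below derives each component from a sample record timestamp using Python `strftime` equivalents of the documented default Java patterns (`yyyy-MM-dd`, `yyyy`, `MM`, `dd`, `HH`, `mm`, `ss`). This is an assumed behavior sketch, not the SMT's actual code:

```python
from datetime import datetime, timezone

# Hypothetical Kafka record timestamp (epoch milliseconds).
record_timestamp_ms = 1696154096000
ts = datetime.fromtimestamp(record_timestamp_ms / 1000, tz=timezone.utc)

# strftime equivalents of the default Java date-time patterns.
headers = {
    "date":   ts.strftime("%Y-%m-%d"),  # yyyy-MM-dd
    "year":   ts.strftime("%Y"),        # yyyy
    "month":  ts.strftime("%m"),        # MM
    "day":    ts.strftime("%d"),        # dd
    "hour":   ts.strftime("%H"),        # HH
    "minute": ts.strftime("%M"),        # mm
    "second": ts.strftime("%S"),        # ss
}
print(headers)
# {'date': '2023-10-01', 'year': '2023', 'month': '10', 'day': '01',
#  'hour': '09', 'minute': '54', 'second': '56'}
```

All values are strings, which is what makes them directly usable as partition path segments by the sink connectors.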

All headers can be prefixed with a custom prefix. For example, if the prefix is `wallclock_`, then the headers will be:

* wallclock\_date
* wallclock\_year
* wallclock\_month
* wallclock\_day
* wallclock\_hour
* wallclock\_minute
* wallclock\_second

When used with the Lenses connectors for S3, GCS, or Azure Data Lake, the headers can be used to partition the data. Assuming the headers have been prefixed with `_`, here are a few KCQL examples:

{% code fullWidth="false" %}

```sql
connect.s3.kcql=INSERT INTO $bucket:prefix SELECT * FROM kafka_topic PARTITIONBY _header._date, _header._hour
connect.s3.kcql=INSERT INTO $bucket:prefix SELECT * FROM kafka_topic PARTITIONBY _header._year, _header._month, _header._day, _header._hour
```

{% endcode %}

## Transform Type Class

```
io.lenses.connect.smt.header.InsertRecordTimestampHeaders
```

## Configuration

<table><thead><tr><th width="244">Name</th><th width="149">Description</th><th width="92">Type</th><th width="137">Default</th><th>Importance</th></tr></thead><tbody><tr><td><code>header.prefix.name</code></td><td>Optional header prefix.</td><td>String</td><td></td><td>Low</td></tr><tr><td><code>date.format</code></td><td>Optional Java date time formatter.</td><td>String</td><td>yyyy-MM-dd</td><td>Low</td></tr><tr><td><code>year.format</code></td><td>Optional Java date time formatter for the year component.</td><td>String</td><td>yyyy</td><td>Low</td></tr><tr><td><code>month.format</code></td><td>Optional Java date time formatter for the month component.</td><td>String</td><td>MM</td><td>Low</td></tr><tr><td><code>day.format</code></td><td>Optional Java date time formatter for the day component.</td><td>String</td><td>dd</td><td>Low</td></tr><tr><td><code>hour.format</code></td><td>Optional Java date time formatter for the hour component.</td><td>String</td><td>HH</td><td>Low</td></tr><tr><td><code>minute.format</code></td><td>Optional Java date time formatter for the minute component.</td><td>String</td><td>mm</td><td>Low</td></tr><tr><td><code>second.format</code></td><td>Optional Java date time formatter for the second component.</td><td>String</td><td>ss</td><td>Low</td></tr><tr><td><code>timezone</code></td><td>Optional. Sets the timezone. It can be any valid Java timezone.</td><td>String</td><td>UTC</td><td>Low</td></tr><tr><td><code>locale</code></td><td>Optional. Sets the locale. It can be any valid Java locale.</td><td>String</td><td>en</td><td>Low</td></tr></tbody></table>

## Example

To insert the headers using the default settings, use the following configuration:

```properties
transforms=InsertWallclock
transforms.InsertWallclock.type=io.lenses.connect.smt.header.InsertRecordTimestampHeaders
```

To prefix the headers with `wallclock_`, use the following:

```properties
transforms=InsertWallclock
transforms.InsertWallclock.type=io.lenses.connect.smt.header.InsertRecordTimestampHeaders
transforms.InsertWallclock.header.prefix.name=wallclock_
```

To change the date format, use the following:

```properties
transforms=InsertWallclock
transforms.InsertWallclock.type=io.lenses.connect.smt.header.InsertRecordTimestampHeaders
transforms.InsertWallclock.date.format=yyyy-MM-dd
```

To use the timezone `Asia/Kolkata`, use the following:

```properties
transforms=InsertWallclock
transforms.InsertWallclock.type=io.lenses.connect.smt.header.InsertRecordTimestampHeaders
transforms.InsertWallclock.timezone=Asia/Kolkata
```

To support S3, GCS, or Azure Data Lake partitioning with Hive-style partition names, such as `date=yyyy-MM-dd/hour=HH`, use the following SMT configuration:

```properties
transforms=InsertWallclock
transforms.InsertWallclock.type=io.lenses.connect.smt.header.InsertRecordTimestampHeaders
transforms.InsertWallclock.date.format="date=yyyy-MM-dd"
transforms.InsertWallclock.hour.format="hour=HH"
```

and in the KCQL setting, use the headers as partitioning keys:

{% code fullWidth="false" %}

```sql
connect.s3.kcql=INSERT INTO $bucket:prefix SELECT * FROM kafka_topic PARTITIONBY _header.date, _header.hour
```

{% endcode %}
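With formats like these, the header values themselves carry the `key=value` text, so the sink simply joins them into the object path. The sketch below illustrates the assumed result with hypothetical sample values:

```python
# Hypothetical header values produced by date.format="date=yyyy-MM-dd"
# and hour.format="hour=HH" for a record timestamped 2023-10-01T09:xx:xxZ.
headers = {"date": "date=2023-10-01", "hour": "hour=09"}

# The sink concatenates the partition keys into a Hive-style path segment.
partition_path = "/".join(headers[k] for k in ("date", "hour"))
print(partition_path)  # date=2023-10-01/hour=09
```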

