How to install kafkacat on Fedora
Converting from AsciiDoc to Google Docs and MS Word
A quick and dirty way to monitor data arriving on Kafka
I’ve been poking around recently with capturing Wi-Fi packet data and streaming it into Apache Kafka, from where I’m processing and analysing it. Kafka itself is rock-solid - because I’m using ☁️ Confluent Cloud and someone else worries about provisioning it, scaling it, and keeping it running for me. But whilst Kafka works just great, my side of the setup - tshark running on a Raspberry Pi - is less than stable. For whatever reason it sometimes stalls and I have to restart the Raspberry Pi and restart the capture process.
Are Tech Conferences Dead?
Streaming Wi-Fi trace data from Raspberry Pi to Apache Kafka with Confluent Cloud
Kafka Connect JDBC Sink - setting the key field name
I wanted to get some data from a Kafka topic:
ksql> PRINT PERSON_STATS FROM BEGINNING;
Key format: KAFKA (STRING)
Value format: AVRO
rowtime: 2/25/20 1:12:51 PM UTC, key: robin, value: {"PERSON": "robin",
"LOCATION_CHANGES":1, "UNIQUE_LOCATIONS": 1}
into Postgres, so I did the easy thing and used Kafka Connect with the JDBC Sink connector.
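The connector can be created straight from ksqlDB. Here’s a minimal sketch - the connection details are made up, and it’s the pk.mode/pk.fields settings that control which column the message key ends up in:

-- Minimal sketch: JDBC sink created from ksqlDB. Connection details are hypothetical.
CREATE SINK CONNECTOR SINK_POSTGRES_PERSON_STATS WITH (
  'connector.class'     = 'io.confluent.connect.jdbc.JdbcSinkConnector',
  'connection.url'      = 'jdbc:postgresql://postgres:5432/demo',
  'connection.user'     = 'postgres',
  'connection.password' = 'postgres',
  'topics'              = 'PERSON_STATS',
  'key.converter'       = 'org.apache.kafka.connect.storage.StringConverter',
  'auto.create'         = 'true',
  -- write the message key into a column, and name that column PERSON
  'pk.mode'             = 'record_key',
  'pk.fields'           = 'PERSON'
);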
Adventures in the Cloud, Part 94: ECS
My name’s Robin, and I’m a Developer Advocate. What that means in part is that I build a ton of demos, and Docker Compose is my jam. I love using Docker Compose for the same reasons that many people do:
- Spin up and tear down fully-functioning multi-component environments with ease. No bespoke builds, no cloning of VMs to preserve "that magic state where everything works".
- Repeatability. It’s the same each time.
- Portability. I can point someone at a docker-compose.yml that I’ve written and they can run the same on their machine with the same results almost guaranteed.
Primitive Keys in ksqlDB
ksqlDB 0.7 will add support for message keys as primitive data types beyond just STRING (which is all we’ve had to date). That means that Kafka messages are going to be much easier to work with, and require less wrangling to get into the form in which you need them. Take an example of a database table that you’ve ingested into a Kafka topic, and want to join to a stream of events. Previously you’d have had to take the Kafka topic into which the table had been ingested and run a ksqlDB processor to re-key the messages such that ksqlDB could join on them. Friends, I am here to tell you that this is no longer needed!
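To illustrate - stream, table, and column names here are all hypothetical - with 0.7 you can declare the key type in the schema and join on it directly:

-- ksqlDB 0.7+: declare the key as an INT directly in the schema
CREATE TABLE CUSTOMERS (ROWKEY INT KEY, NAME VARCHAR)
  WITH (KAFKA_TOPIC='customers', VALUE_FORMAT='AVRO');

-- Join a stream of events straight onto the INT-keyed table; no re-key step
SELECT O.ORDER_ID, C.NAME
  FROM ORDERS O
  JOIN CUSTOMERS C ON O.CUSTOMER_ID = C.ROWKEY
  EMIT CHANGES;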
Fantastical / Mac Calendar not showing Google Shared Calendar
Very simple to fix: go to https://calendar.google.com/calendar/syncselect and select the calendars that you want. Click save.
Notes on getting data into InfluxDB from Kafka with Kafka Connect
When a message from your source Kafka topic is written to InfluxDB, the InfluxDB values are set thus:

- Timestamp is taken from the Kafka message timestamp (which is either set by your producer, or the time at which it was received by the broker).
- Tag(s) are taken from the tags field in the message. This field must be a map type - see below.
- Value fields are taken from the rest of the message, and must be numeric or boolean.
- Measurement name can be specified as a field of the message, or hardcoded in the connector config (as in the sketch below).
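For reference, a sketch of the connector definition created from ksqlDB - the connector class is the Confluent InfluxDB sink, while the topic, URL, and database names are illustrative:

-- Sketch: InfluxDB sink from ksqlDB. Topic/URL/database names are illustrative.
CREATE SINK CONNECTOR SINK_INFLUXDB WITH (
  'connector.class'         = 'io.confluent.influxdb.InfluxDBSinkConnector',
  'topics'                  = 'wifi_stats',
  'influxdb.url'            = 'http://influxdb:8086',
  'influxdb.db'             = 'wifi',
  -- take the measurement name from the topic, rather than from a message field
  'measurement.name.format' = '${topic}'
);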
Kafka Connect and Schemas
Here’s a fun one that Kafka Connect can sometimes throw out:
java.lang.ClassCastException: java.util.HashMap cannot be cast to org.apache.kafka.connect.data.Struct
HashMap? Struct? HUH?
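In my experience the usual trigger is a value converter handing the connector a schemaless map (e.g. JSON with schemas.enable=false) when the connector needs a Struct with a schema. A sketch of the kind of converter settings that avoid it - the connector details and Schema Registry URL are hypothetical:

-- Sketch: give the sink a schema'd format (Avro + Schema Registry) so that it
-- receives a Struct rather than a bare HashMap. Names/URLs are hypothetical.
CREATE SINK CONNECTOR SINK_WITH_SCHEMA WITH (
  'connector.class' = 'io.confluent.connect.jdbc.JdbcSinkConnector',
  'topics'          = 'my_topic',
  'connection.url'  = 'jdbc:postgresql://postgres:5432/demo',
  'value.converter' = 'io.confluent.connect.avro.AvroConverter',
  'value.converter.schema.registry.url' = 'http://schema-registry:8081'
);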
Monitoring Sonos with ksqlDB, InfluxDB, and Grafana
UnsupportedClassVersionError: <x> has been compiled by a more recent version of the Java Runtime
This article is just for Googlers and my future self encountering this error. Recently I was building a Docker image from the ksqlDB code base, and whilst it built successfully, the ksqlDB server process in the Docker container failed on instantiation with an UnsupportedClassVersionError: