IBM MQ on Docker - Channel was blocked
Running IBM MQ in a Docker container and the client connecting to it was throwing repeated Channel was blocked errors.
One of my favourite hacks for getting data into Kafka is using kafkacat and stdin, often from jq. You can see this in action with Wi-Fi data, IoT data, and data from a REST endpoint. This is fine for getting values into a Kafka message - but Kafka messages are key/value, and being able to specify a key can often be important.
Here’s a way to do that, using a separator and some jq magic. Note that at the moment kafkacat only supports single-byte separator characters, so you need to choose carefully. If you pick a separator that also appears in your data, it’s possibly going to have unintended consequences.
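As a rough sketch of the idea (broker address, topic name, and the id field are all illustrative): use jq to prefix each record with the field you want as the key plus the separator, then tell kafkacat about that separator with the -K flag:

$ echo '{"id":"sensor-1","temp":22.5}' | \
    jq -rc '[.id, tostring] | join("|")' | \
    kafkacat -b broker:29092 -t mytestopic -P -K '|'

This produces a message with key sensor-1 and the full JSON object as the value - and the | separator doesn’t appear anywhere in the sample data.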
Readers of a certain age and RDBMS background will probably remember northwind, or HR, or OE databases - or quite possibly not just remember them but still be using them. Hardcoded sample data is fine, and it’s great for repeatable tutorials and examples - but it’s boring as heck if you want to build an example with something that isn’t using the same data set for the 100th time.
Here’s a collection of Kafka-related talks, just for you.
Each one has 🍿🎥 a recording, 📔 slides, and 👾 code to go and try out.
Kafka Connect is the integration API for Apache Kafka. Check out this video for an overview of what Kafka Connect enables you to do, and how to do it.
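As a hedged taste of what that looks like in practice (assuming a Connect worker listening on localhost:8083), creating a connector is a single REST call - here using the built-in FileStream sink to stream a topic out to a flat file:

$ curl -X POST http://localhost:8083/connectors \
    -H "Content-Type: application/json" \
    -d '{
          "name": "file-sink",
          "config": {
            "connector.class": "org.apache.kafka.connect.file.FileStreamSinkConnector",
            "topics": "mytestopic",
            "file": "/tmp/mytestopic.txt"
          }
        }'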
There’s ways, and then there’s ways, to count the number of records/events/messages in a Kafka topic. Most of them are potentially inaccurate, or inefficient, or both. Here’s one that falls into the potentially inefficient category: using kafkacat to read all the messages and piping them to wc, which with the -l flag will tell you how many lines there are - and since each message is a line, how many messages you have in the Kafka topic:
$ kafkacat -b broker:29092 -t mytestopic -C -e -q | wc -l
3
Google Chrome automagically adds sites that you visit which support searching to a list of custom search engines. For each one you can set a keyword which activates it, so based on the above list if I want to search Amazon I can just type a <tab> and then my search term.
I had the pleasure of presenting at DataEngBytes recently, and am delighted to share with you the 🗒️ slides, 👾 code, and 🎥 recording of my ✨brand new talk✨:
A tiny snippet since I wasted 10 minutes going around the houses on this one…
tl;dr: If you try to create a command that is not in lower case (e.g. Alert not alert) then the setMyCommands API will return BOT_COMMAND_INVALID.
The server is sending Transfer-Encoding: chunked data, and you want to work with the data as you get it, instead of waiting for the server to finish and the EOF to fire before processing it?
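Here’s a minimal sketch of the approach in Go (the streaming endpoint URL is hypothetical): read the response body as it arrives, rather than slurping it all once the connection closes.

package main

import (
	"bufio"
	"fmt"
	"log"
	"net/http"
)

func main() {
	// Hypothetical endpoint that returns Transfer-Encoding: chunked data.
	resp, err := http.Get("http://localhost:8080/stream")
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	// resp.Body is a stream: the Scanner yields each line as soon as
	// the chunk containing it arrives, without waiting for EOF.
	scanner := bufio.NewScanner(resp.Body)
	for scanner.Scan() {
		fmt.Println("got:", scanner.Text())
	}
	if err := scanner.Err(); err != nil {
		log.Fatal(err)
	}
}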
When I set out to learn Go one of the aims I had in mind was to write a version of this little Python utility which accompanies a blog I wrote recently about understanding and diagnosing problems with Kafka advertised listeners. Having successfully got Producer, Consumer, and AdminClient API examples working, it is now time to turn to that task.
Having written my first Kafka producer in Go, and even added error handling to it, the next step was to write a consumer. It closely follows the pattern of the Producer code I finished up with previously, using the channel-based approach for the Consumer.
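In outline it looks something like this - a sketch assuming confluent-kafka-go and a local broker, with go.events.channel.enable turned on so that messages arrive on the Events() channel:

package main

import (
	"fmt"
	"log"

	"github.com/confluentinc/confluent-kafka-go/kafka"
)

func main() {
	c, err := kafka.NewConsumer(&kafka.ConfigMap{
		"bootstrap.servers":        "localhost:9092",
		"group.id":                 "test-group",
		"auto.offset.reset":        "earliest",
		"go.events.channel.enable": true,
	})
	if err != nil {
		log.Fatalf("Failed to create consumer: %v", err)
	}
	defer c.Close()

	if err := c.Subscribe("mytestopic", nil); err != nil {
		log.Fatalf("Failed to subscribe: %v", err)
	}

	// With go.events.channel.enable set, messages and errors arrive
	// on the Events() channel rather than via repeated Poll() calls.
	for ev := range c.Events() {
		switch e := ev.(type) {
		case *kafka.Message:
			fmt.Printf("Message on %s: %s\n", e.TopicPartition, string(e.Value))
		case kafka.Error:
			log.Printf("Consumer error: %v", e)
		}
	}
}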
I looked last time at the very bare basics of writing a Kafka producer using Go. It worked, but only with everything lined up and pointing the right way. There was no error handling of any sort. Let’s see about fixing this now.
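The shape of the fix is roughly this - a sketch (broker address and topic name are illustrative) that checks both the synchronous error from Produce() and the asynchronous delivery report that comes back on the Events() channel:

package main

import (
	"fmt"
	"log"

	"github.com/confluentinc/confluent-kafka-go/kafka"
)

func main() {
	p, err := kafka.NewProducer(&kafka.ConfigMap{
		"bootstrap.servers": "localhost:9092",
	})
	if err != nil {
		log.Fatalf("Failed to create producer: %v", err)
	}
	defer p.Close()

	topic := "mytestopic"
	// Produce is asynchronous; a non-nil error here means the message
	// could not even be enqueued (e.g. the local queue is full).
	if err := p.Produce(&kafka.Message{
		TopicPartition: kafka.TopicPartition{Topic: &topic, Partition: kafka.PartitionAny},
		Value:          []byte("Hello world"),
	}, nil); err != nil {
		log.Fatalf("Failed to enqueue message: %v", err)
	}

	// The delivery report arrives on the Events() channel; check it
	// to find out whether the broker actually accepted the message.
	e := <-p.Events()
	if m, ok := e.(*kafka.Message); ok {
		if m.TopicPartition.Error != nil {
			log.Printf("Delivery failed: %v", m.TopicPartition.Error)
		} else {
			fmt.Printf("Delivered to %v\n", m.TopicPartition)
		}
	}
}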