How does consumer rebalancing work in Kafka?

It depends on what you mean by “blocked”. If you mean “are existing connections closed when a rebalance is triggered?”, then the answer is yes. Kafka’s current rebalancing algorithm is unfortunately imperfect. Here is what happens during a consumer rebalance. Assume we have a topic with 10 partitions (0-9) and one consumer (let’s name it consumer1) … Read more
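The answer’s full walkthrough is cut off above, but as a hedged illustration of where that revoke/assign cycle becomes visible to application code: the standard Kafka Java client reports it through a ConsumerRebalanceListener. A minimal sketch, assuming a topic named my-topic and a group named my-group (both hypothetical):

import java.time.Duration;
import java.util.Collection;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRebalanceListener;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;

public class RebalanceAwareConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "my-group");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("my-topic"), new ConsumerRebalanceListener() {
                @Override
                public void onPartitionsRevoked(Collection<TopicPartition> partitions) {
                    // Fires before the rebalance takes these partitions away:
                    // commit offsets / flush in-flight work here.
                    System.out.println("Revoked: " + partitions);
                }

                @Override
                public void onPartitionsAssigned(Collection<TopicPartition> partitions) {
                    // Fires after the rebalance with the consumer's new assignment.
                    System.out.println("Assigned: " + partitions);
                }
            });

            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("partition=%d offset=%d%n", record.partition(), record.offset());
                }
            }
        }
    }
}

During a rebalance, onPartitionsRevoked is the last chance to act on the old assignment, and onPartitionsAssigned is the first look at the new one.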

Job queue as SQL table with multiple consumers (PostgreSQL)

I use Postgres for a FIFO queue as well. I originally used ACCESS EXCLUSIVE, which yields correct results under high concurrency, but has the unfortunate effect of being mutually exclusive with pg_dump, which acquires an ACCESS SHARE lock for the duration of its run. This causes my next() function to block for a very long time (the duration … Read more
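The excerpt stops before the author’s eventual fix, so the following is not their next() function, just a common alternative: since PostgreSQL 9.5, row-level locking with SELECT … FOR UPDATE SKIP LOCKED lets multiple consumers pop jobs without taking a table-level lock, so it does not conflict with pg_dump’s ACCESS SHARE lock. A minimal JDBC sketch, assuming a hypothetical jobs table with id and payload columns:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class JobQueue {
    // Claims and removes the oldest available job, or returns null if the queue is empty.
    // FOR UPDATE SKIP LOCKED takes only row-level locks, so concurrent workers
    // and pg_dump are never blocked by the dequeue.
    static String nextJob(Connection conn) throws SQLException {
        conn.setAutoCommit(false);
        String sql =
            "DELETE FROM jobs " +
            "WHERE id = (SELECT id FROM jobs ORDER BY id LIMIT 1 FOR UPDATE SKIP LOCKED) " +
            "RETURNING payload";
        try (PreparedStatement ps = conn.prepareStatement(sql);
             ResultSet rs = ps.executeQuery()) {
            String payload = rs.next() ? rs.getString("payload") : null;
            conn.commit();
            return payload;
        }
    }

    public static void main(String[] args) throws SQLException {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost/queue_db", "worker", "secret")) {
            System.out.println("next job: " + nextJob(conn));
        }
    }
}

Each worker deletes and returns the oldest row it can lock; rows already locked by other workers are skipped rather than waited on.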

Blocking queue and multi-threaded consumer, how to know when to stop

You should continue to take() from the queue. You can use a poison pill to tell the worker to stop. For example: private final Object POISON_PILL = new Object(); @Override public void run() { // worker loop keeps taking an element from the queue as long as the producer is still running or as // long as … Read more
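The snippet above is truncated, so here is a self-contained sketch of the poison-pill pattern it describes (class and variable names are illustrative, not from the original answer). One detail worth noting: a worker that takes the pill puts it back before exiting, so a single pill can stop several workers.

import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class PoisonPillDemo {
    // Sentinel object: when a worker takes it, it knows the producer is done.
    private static final Object POISON_PILL = new Object();

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<Object> queue = new LinkedBlockingQueue<>();

        Runnable worker = () -> {
            try {
                while (true) {
                    Object item = queue.take();     // blocks until an element is available
                    if (item == POISON_PILL) {
                        queue.put(POISON_PILL);     // re-insert so other workers also stop
                        break;
                    }
                    System.out.println(Thread.currentThread().getName() + " processed " + item);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        };

        Thread w1 = new Thread(worker, "worker-1");
        Thread w2 = new Thread(worker, "worker-2");
        w1.start();
        w2.start();

        // Producer: enqueue some work, then the pill to signal "no more elements".
        for (int i = 0; i < 5; i++) {
            queue.put("job-" + i);
        }
        queue.put(POISON_PILL);

        w1.join();
        w2.join();
    }
}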

Is Zookeeper a must for Kafka? [closed]

Yes, Zookeeper is required for running Kafka. From the Kafka Getting Started documentation:

Step 2: Start the server
Kafka uses zookeeper so you need to first start a zookeeper server if you don’t already have one. You can use the convenience script packaged with kafka to get a quick-and-dirty single-node zookeeper instance.

As to why, … Read more
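For reference, the convenience scripts that quickstart refers to ship with the Kafka distribution; the quick-and-dirty single-node setup it describes amounts to starting ZooKeeper first and the broker second:

bin/zookeeper-server-start.sh config/zookeeper.properties
bin/kafka-server-start.sh config/server.properties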