RabbitMQ: persistent message with Topic exchange

Julien · May 27, 2011

I am very new to RabbitMQ.

I have set up a 'topic' exchange. The consumers may be started after the publisher. I'd like the consumers to receive messages that were published before they started and that have not yet been consumed.

The exchange is set up with the following parameters:

exchange_type => 'topic'
durable => 1
auto_delete => 0
passive => 0

The messages are published with this parameter:

delivery_mode => 2
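
Putting those parameters together, the publishing side looks roughly like this with Net::RabbitMQ (a simplified sketch; the exchange name "events" and the routing key are just placeholders):

use Net::RabbitMQ;

my $mq = Net::RabbitMQ->new();
$mq->connect("localhost", { user => "guest", password => "guest" });
$mq->channel_open(1);

# Declare the topic exchange with the parameters listed above
$mq->exchange_declare(1, "events", {
    exchange_type => 'topic',
    durable       => 1,
    auto_delete   => 0,
    passive       => 0,
});

# Publish a persistent message (delivery_mode => 2)
$mq->publish(1, "events.user.created", "hello",
    { exchange => "events" },
    { delivery_mode => 2 });

$mq->disconnect();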

Consumers use get() to retrieve the messages from the exchange.

Unfortunately, any message published before a consumer was up is lost. I have tried different combinations of these parameters.

I guess my problem is that the exchange does not hold messages. Maybe I need to have a queue between the publisher and the consumer. But this does not seem to work with a 'topic' exchange where messages are routed by a key.

How should I proceed? I use the Perl binding Net::RabbitMQ (shouldn't matter) and RabbitMQ 2.2.0.

Answer

Brian Kelly · May 27, 2011

You need a durable queue to store messages if there are no connected consumers available to process the messages at the time they are published.

An exchange doesn't store messages, but a queue can. The confusing part is that exchanges can be marked as "durable", but all that really means is that the exchange itself will still be there if you restart your broker; it does not mean that messages sent to that exchange are automatically persisted.

Given that, here are two options:

  1. Perform an administrative step before you start your publishers to create the queue(s) yourself. You could use the web UI or the command line tools to do this. Make sure you create it as a durable queue so that it will store any messages that are routed to it even if there are no active consumers.
  2. Assuming your consumers are coded to always declare (and therefore auto-create) their exchanges and queues on startup, and to declare them as durable, just run all your consumers at least once before starting any publishers. That will ensure all your queues get created correctly. You can then shut the consumers down until they're really needed, because the queues will persistently store any future messages routed to them.

I would go for #1. There shouldn't be many steps to perform, and you can always script them so they can be repeated. Plus, if all your consumers pull from the same single queue (rather than each having a dedicated queue), it's really a minimal piece of administrative overhead.
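
For example, the whole administrative step can be a tiny script along these lines with Net::RabbitMQ (a rough sketch; the queue name "events_queue", the exchange name "events", and the binding key are placeholders to adapt to your own routing keys):

use Net::RabbitMQ;

my $mq = Net::RabbitMQ->new();
$mq->connect("localhost", { user => "guest", password => "guest" });
$mq->channel_open(1);

# Re-declaring the exchange is harmless as long as the arguments
# match what the publisher uses.
$mq->exchange_declare(1, "events", {
    exchange_type => 'topic',
    durable       => 1,
    auto_delete   => 0,
});

# Durable, non-auto-delete queue: it survives a broker restart and
# stores routed messages even while no consumer is connected.
$mq->queue_declare(1, "events_queue", {
    durable     => 1,
    auto_delete => 0,
    exclusive   => 0,
});

# Bind the queue to the topic exchange; '#' matches any words after
# "events.", so all matching routing keys land in this queue.
$mq->queue_bind(1, "events_queue", "events", "events.#");

$mq->disconnect();

A consumer started later can then drain whatever has accumulated with $mq->get(1, "events_queue"), and because the queue is durable and the messages were published with delivery_mode => 2, they will also survive a broker restart.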

Queues are something to be managed and controlled properly. Otherwise you could end up with rogue consumers declaring durable queues, using them for a few minutes but never again. Soon after you'll have a permanently-growing queue with nothing reducing its size, and an impending broker apocalypse.