Wednesday, June 30, 2010

The six thousand topic man: hosting many topics on the same ActiveMQ broker

An ActiveMQ user was enquiring about whether ActiveMQ (get it from fusesource.com!) could handle a publish-subscribe messaging architecture with six thousand topics. I've often seen production deployments of FUSE support tens to hundreds of JMS destinations; however, I wasn't sure how it would perform with a huge number of topics. Of course, you could reduce the number of topics by introducing message selectors on a smaller set of topics: but that sidesteps the question rather than answering it up front.

Throwing some questions at the FUSE engineering team brought back a lot of confidence that it would indeed work just fine. Still, I always like to try things out and see for myself. So, I slapped together a JMS client that wrote 1,000,000 non-persistent messages to 6,000 JMS topics. Then, I put together another JMS client with 6,000 consumers, with appropriate session and connection pooling in place. The result? Alarmingly straightforward: it worked just fine! While I'm quietly content with this outcome, it's worth mentioning some background things I did on the broker...
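The producer side of a test like this can be sketched as follows. The broker URL and topic-naming scheme are my own placeholders, not details from the actual test, and you'd need activemq-all on the classpath:

```java
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.DeliveryMode;
import javax.jms.JMSException;
import javax.jms.MessageProducer;
import javax.jms.Session;
import javax.jms.Topic;
import org.apache.activemq.ActiveMQConnectionFactory;

public class ManyTopicsProducer {
    public static void main(String[] args) throws JMSException {
        // Broker URL is an assumption; point this at your own broker.
        ConnectionFactory factory =
                new ActiveMQConnectionFactory("tcp://localhost:61616");
        Connection connection = factory.createConnection();
        connection.start();
        Session session =
                connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
        // An "anonymous" producer (destination null at creation time)
        // lets one producer publish to all 6,000 topics in turn,
        // rather than creating a producer per destination.
        MessageProducer producer = session.createProducer(null);
        // Non-persistent delivery, as in the test described above.
        producer.setDeliveryMode(DeliveryMode.NON_PERSISTENT);
        for (int i = 0; i < 1000000; i++) {
            // Hypothetical naming scheme: spread messages round-robin
            // across 6,000 topics.
            Topic topic = session.createTopic("test.topic." + (i % 6000));
            producer.send(topic, session.createTextMessage("message " + i));
        }
        connection.close();
    }
}
```

On the consumer side, the pooling the test relied on amounts to sharing a small number of connections and sessions across the 6,000 MessageConsumer instances, rather than one connection each.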

First, I needed to switch off the default 'thread-per-consumer' model in ActiveMQ. This is done by setting the JVM system property -Dorg.apache.activemq.UseDedicatedTaskRunner=false; by default, this is set to 'true' in the ./bin/activemq[.bat] startup script. I tested what happens if I leave it set to 'true', and sure enough I ended up with 6,000 threads in the broker. Ouch. With the setting disabled, the consumers are served from a pool of threads, and the total thread count never rose above sixty.
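Concretely, this means editing the startup script so the property is passed to the JVM; the exact variable name varies between ActiveMQ versions, so treat this as a sketch:

```shell
# In ./bin/activemq (or set it in bin/activemq.bat on Windows):
# switch the dedicated task runner off so consumers share a thread pool.
ACTIVEMQ_OPTS="$ACTIVEMQ_OPTS -Dorg.apache.activemq.UseDedicatedTaskRunner=false"
```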

Next, I configured the broker's transport connector to use 'nio:' rather than 'tcp:'; with NIO, connections are multiplexed over a small pool of selector threads instead of each connection getting its own thread, which gives a cleaner, more scalable threading model within the broker.
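In conf/activemq.xml, that change is a one-line swap in the transport connector URI; a minimal fragment (port and connector name are just the usual defaults) looks like:

```xml
<broker xmlns="http://activemq.apache.org/schema/core">
  <transportConnectors>
    <!-- 'nio://' instead of 'tcp://' serves many connections
         from a small pool of selector threads -->
    <transportConnector name="nio" uri="nio://0.0.0.0:61616"/>
  </transportConnectors>
</broker>
```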

And so, it all works just fine. Dejan Bosanac's article on Python messaging with ActiveMQ and RabbitMQ suggests that you can host as many as 32,000 JMS destinations on a single broker; that's good to know, but I can't think of a situation where I'd need that right now.
