One of the really nice features of Event Store is Persistent Subscriptions, introduced in v3.2.0. I had previously been using catch-up subscriptions, but needed the ability to have many worker processes handle events from the same stream.
First, let's take a look at a couple of the subscription models Event Store supports.
Catch-Up Subscriptions
As mentioned, I previously used catch-up subscriptions for various use cases. One of them was sending emails when specific events occurred in the system.
A worker process would subscribe to the $all event stream and handle incoming messages accordingly.
The issue is that catch-up subscriptions are controlled by the client: the start position of the subscription is stored by, and must be maintained by, the client.
This means I did not have a built-in way to have multiple worker processes handle events from the same stream while processing each event only once.
Notice I wrote built-in above. Yes, I could have created a shared data source that each worker process would use to determine which messages were being handled by which process.
I didn’t want to go down this road.
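To make that burden concrete, here is a minimal sketch (plain Python, not the actual Event Store client API) of what a catch-up subscriber has to do itself: persist its last-processed position and reload it on restart. All names here are hypothetical.

```python
import json
import os
import tempfile

# Hypothetical checkpoint file for one worker process.
CHECKPOINT_FILE = os.path.join(tempfile.gettempdir(), "email-worker.checkpoint")

def load_checkpoint() -> int:
    """Return the last processed event number, or -1 if starting fresh."""
    try:
        with open(CHECKPOINT_FILE) as f:
            return json.load(f)["position"]
    except FileNotFoundError:
        return -1

def save_checkpoint(position: int) -> None:
    with open(CHECKPOINT_FILE, "w") as f:
        json.dump({"position": position}, f)

def run_catch_up(stream):
    """Process events newer than the checkpoint; the client, not the
    server, decides where to resume."""
    start = load_checkpoint()
    handled = []
    for event in stream:
        if event["number"] <= start:
            continue  # already processed before a restart
        handled.append(event["number"])  # stand-in for real work, e.g. sending an email
        save_checkpoint(event["number"])
    return handled
```

Multiply this by several worker processes and you also need coordination over who handles what, which is exactly the shared data source I wanted to avoid.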
The next option would be to have multiple worker processes, with each worker process handling only a single event or a small subset of events.
Persistent Subscriptions (Competing Consumers)
Persistent subscriptions differ from catch-up subscriptions in that the position of which event to publish is kept on the server rather than the client.
If you have used RabbitMQ or ActiveMQ, the Persistent Subscriptions in Event Store will feel similar.
I think the name Competing Consumers really does illustrate how it works.
You can have many worker processes (consumers), but only one of them will receive a given event from the server.
Let's say we have 3 worker processes, all subscribed to the same subscription. When an event is published from the server, only one worker process will receive that specific event, not all 3.
In the catch-up subscription model, all 3 would receive the same event.
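The delivery difference can be shown with a toy simulation (illustrative Python, not Event Store itself; the real server's consumer selection is more involved than simple round-robin):

```python
from collections import defaultdict
from itertools import cycle

events = [f"event-{n}" for n in range(5)]
consumers = ["worker-a", "worker-b", "worker-c"]

# Catch-up model: every subscriber tracks its own position,
# so each one sees every event.
catch_up_deliveries = {c: list(events) for c in consumers}

# Competing consumers: the server hands each event to exactly one
# consumer (round-robin here, purely for illustration).
competing_deliveries = defaultdict(list)
for event, consumer in zip(events, cycle(consumers)):
    competing_deliveries[consumer].append(event)
```

In the competing-consumers case the work is spread across the group, and no event is handled twice.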
In my email example, this allows me to have multiple worker processes handle specific events. And if I want to create multiple subscriptions, each for a single event or a small subset of events, as I could with catch-up subscriptions, I can do the same with persistent subscriptions, but now with multiple worker processes as well.
When you create a persistent subscription, you will define a group name and the event stream you want to subscribe to.
When your clients (consumers) want to use this subscription, they specify the same group name and stream.
Originally I wrote that the server will dispatch only one event to a consumer at any given time, with the event marked as in-process until the client acknowledges it has processed the message.
I've rewritten the above thanks to Greg Young's correction in the comments:
The server will dispatch N events to the consumer at any given time. The number of events is configurable when connecting to the subscription, as you can define a buffer size, which is the number of in-flight messages the client is allowed.
The client can also choose not to acknowledge the message, and if the timeout period expires without an acknowledgement either way, a retry can occur.
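The dispatch, acknowledge, and retry behaviour described above can be sketched as a small simulation (hypothetical Python modelling the server-side bookkeeping, not the actual Event Store implementation):

```python
import time

class PersistentSubscription:
    """Toy model: up to `buffer_size` in-flight messages, where ack removes
    a message for good and nack (or a timeout) queues it for retry."""

    def __init__(self, events, buffer_size=2, timeout=30.0):
        self.pending = list(events)   # not yet dispatched
        self.in_flight = {}           # event -> dispatch timestamp
        self.buffer_size = buffer_size
        self.timeout = timeout

    def dispatch(self, now=None):
        """Hand out events until the in-flight buffer is full."""
        now = time.monotonic() if now is None else now
        batch = []
        while self.pending and len(self.in_flight) < self.buffer_size:
            event = self.pending.pop(0)
            self.in_flight[event] = now
            batch.append(event)
        return batch

    def ack(self, event):
        del self.in_flight[event]     # processed; never redelivered

    def nack(self, event):
        del self.in_flight[event]
        self.pending.append(event)    # explicit failure: retry later

    def reap_timeouts(self, now):
        """Events neither acked nor nacked within the timeout are retried."""
        for event, sent_at in list(self.in_flight.items()):
            if now - sent_at > self.timeout:
                self.nack(event)
```

The buffer size here plays the same role as the in-flight message limit you set when connecting to a real persistent subscription.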
I'm in the middle of putting together a demo of persistent subscriptions in action. If this is something you have been looking for, let me know, as I would love to discuss more about some of the use cases and a few of the quirks I have discovered while starting to dive into the .NET API.