Couchbase Linq Provider

Couchbase

I recently decided to use Couchbase for a personal side project.  The primary reason I chose it was that I hadn’t used it before.  Simple as that.

Again, this is a personal side project where I try out different technologies.  The second reason was that I was interested in N1QL (pronounced “nickel”), which is the Couchbase Server query language.

I’m not covering installing the Couchbase server.  If you don’t have a Couchbase server installed, check out the official docs.

Couchbase .NET SDK

The most obvious route for accessing a Couchbase server is to use the official Couchbase .NET Client.  Looking at the docs, it’s very simple to store and retrieve documents.  However, one of the features I was most interested in was a Linq provider.  It does not have one.  Bummer.

Linq2Couchbase

Thankfully I discovered Linq2Couchbase by Couchbase Labs, a Linq provider that produces N1QL and runs on top of the official Couchbase .NET SDK.

Grab the NuGet Package

Install-Package Linq2Couchbase

Owin/Katana

My side project is a web app that is self-hosted using Katana and NancyFX.  In order to initialize the Couchbase client, you can use the ClusterHelper in your Startup class.  The reason it is done here is that it only needs to be initialized once.  The ClusterHelper will create a Cluster object which will later be used to create a BucketContext.
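
Here is a minimal sketch of what that Startup can look like, assuming a single local Couchbase node.  The server URI is a placeholder, and app.UseNancy() comes from the Nancy.Owin package.

using System;
using System.Collections.Generic;
using Couchbase;
using Couchbase.Configuration.Client;
using Owin;

public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        // Initialize the cluster once at startup; ClusterHelper manages the
        // Cluster instance for the lifetime of the application.
        ClusterHelper.Initialize(new ClientConfiguration
        {
            Servers = new List<Uri> { new Uri("http://localhost:8091") }
        });

        // Hand requests off to NancyFX.
        app.UseNancy();
    }
}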

BucketContext

If you are familiar with Entity Framework, the BucketContext from Linq2Couchbase is similar to a DbContext.  It provides an API for querying a Couchbase bucket.
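
Creating one is a matter of handing it a bucket obtained from the ClusterHelper.  A minimal sketch, where the bucket name “default” is a placeholder:

using Couchbase;
using Couchbase.Linq;

// Create a context over a bucket opened through the ClusterHelper.
var context = new BucketContext(ClusterHelper.GetBucket("default"));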

You can now inject IBucketContext into your Nancy modules, or into your controllers if you are using ASP.NET MVC/Web API.
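
For example, a Nancy module can take the context through its constructor.  This is just a sketch, and it assumes the User document class shown later in the post:

using System.Linq;
using Couchbase.Linq;
using Nancy;

public class UsersModule : NancyModule
{
    // IBucketContext is resolved by the container and injected by Nancy.
    public UsersModule(IBucketContext context)
    {
        Get["/users"] = _ => context.Query<User>().ToList();
    }
}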

Registration

Whichever DI container you are using, you will want to register IBucketContext.  I’m using Unity.
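
A registration along these lines does the job.  The “default” bucket name is a placeholder, and you may want a different lifetime manager depending on your app:

using Couchbase;
using Couchbase.Linq;
using Microsoft.Practices.Unity;

var container = new UnityContainer();

// Resolve IBucketContext as a BucketContext built over the shared bucket.
container.RegisterType<IBucketContext, BucketContext>(
    new InjectionConstructor(ClusterHelper.GetBucket("default")));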

POCO

For this example, I’ve created a simple User class.  One important thing to note is that you must specify which property in your class is the key.  You can do so by adding the KeyAttribute.  Also, the key must be a string.
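
A sketch of such a class; the key property is the important part, and the other properties are just illustrative:

using System.ComponentModel.DataAnnotations;

public class User
{
    // The document key: must be a string and marked with [Key].
    [Key]
    public string Id { get; set; }

    public string FirstName { get; set; }
    public string LastName { get; set; }
    public string Email { get; set; }
}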

Save

Adding a new object is pretty straightforward.  Just pass your new object to the Save() method on the IBucketContext.
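
For instance, using the hypothetical User class above:

var user = new User
{
    Id = "user::1",   // the document key; the "user::" prefix is just a convention
    FirstName = "Jane",
    LastName = "Doe",
    Email = "jane@example.com"
};

// Persist the new document to the bucket.
context.Save(user);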

Querying

Using the BucketContext, you can query your bucket, which is also very straightforward.
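
Again sketched against the hypothetical User class:

using System.Linq;

// Translates to a N1QL SELECT against the bucket.
var users = context.Query<User>()
    .Where(u => u.LastName == "Doe")
    .ToList();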

Note: If you have just installed Couchbase, you must create a primary index on the bucket you are querying.  Refer to the documentation on how to do so.  This stumped me a bit at the beginning.  Thankfully, Brant Burnett was kind enough to help me out.
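
For reference, the index only needs to be created once.  It can be created from the Couchbase web console or, as a rough sketch, through the SDK; the bucket name “default” is a placeholder:

// One-off setup: N1QL queries require a primary index on the bucket.
var bucket = ClusterHelper.GetBucket("default");
bucket.Query<dynamic>("CREATE PRIMARY INDEX ON `default`");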

Documentation

The GitHub page for Linq2Couchbase has a ton of goodness.  I highly recommend checking it out if you are interested in Couchbase or are looking for Linq support.

As of this post, the project looks very active (it just recently added async support) and hopefully I can contribute back.


Event Stream as a Message Queue

I was recently having a discussion around a system being built using Microsoft Azure.  Some of the concepts being discussed for this system were CQRS, Event Sourcing, and a message queue.

The diagram below is fairly typical when discussing CQRS and Event Sourcing.

Message Queue

One of the first things that stood out to me was the use of the Message Queue and Azure Service Bus.

For this blog post, I want to focus on the Service Bus, which is used for the publish-subscribe pattern.  The domain will emit events that are stored to the event stream and then published to the Service Bus.  Subscribers, such as projections or other bounded contexts, will process these events.

Forgotten Option

There is nothing wrong with using a service bus to publish domain events.

However, one option is seemingly always overlooked by developers who are new to CQRS and Event Sourcing:

Your event stream can be used as a message queue

Other bounded contexts or projections can query/poll your event storage to retrieve new events that have been persisted.

At regular intervals, the event consumer could poll your event storage, requesting any number of new events after the last event it processed.

The consumer would be required to keep track of the last event it processed (a checkpoint).  This provides some benefits, as the consumer does not need to be a continuously running process.
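
As a rough illustration, here is what such a consumer could look like.  IEventStreamReader and ICheckpointStore are hypothetical abstractions standing in for whatever your event storage and checkpoint persistence actually are:

using System;
using System.Collections.Generic;
using System.Threading;

// Hypothetical abstraction over your event storage.
public interface IEventStreamReader
{
    // Returns events with a sequence number greater than the checkpoint.
    IReadOnlyList<StoredEvent> ReadEventsAfter(long checkpoint, int batchSize);
}

// Hypothetical abstraction over wherever the consumer keeps its checkpoint.
public interface ICheckpointStore
{
    long Load();
    void Save(long checkpoint);
}

public class StoredEvent
{
    public long Sequence { get; set; }
    public object Payload { get; set; }
}

public class PollingEventConsumer
{
    private readonly IEventStreamReader _reader;
    private readonly ICheckpointStore _checkpoints;

    public PollingEventConsumer(IEventStreamReader reader, ICheckpointStore checkpoints)
    {
        _reader = reader;
        _checkpoints = checkpoints;
    }

    public void Run(CancellationToken cancellation)
    {
        var checkpoint = _checkpoints.Load();

        while (!cancellation.IsCancellationRequested)
        {
            // Ask storage for the next batch of events after the last one processed.
            var events = _reader.ReadEventsAfter(checkpoint, batchSize: 100);

            foreach (var storedEvent in events)
            {
                Handle(storedEvent.Payload);
                checkpoint = storedEvent.Sequence;
            }

            // Persist the checkpoint so the consumer can stop and resume later.
            _checkpoints.Save(checkpoint);

            if (events.Count == 0)
            {
                // Nothing new; wait before polling again.
                Thread.Sleep(TimeSpan.FromSeconds(5));
            }
        }
    }

    private void Handle(object payload)
    {
        // Dispatch to projections or other bounded-context handlers here.
    }
}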

You may be thinking: Polling? Really?

If you rolled your own event storage, I could understand how this might be problematic and it would likely be easier to use a service bus.  Or you may want your event consumers to process events as soon as possible.  How you handle this depends on how you are currently storing your events.

But the point still remains: Your event stream is a message queue.

As always, context is king.  Requirements and many other factors will play into how you want to handle messaging.

It is another option that may fit a scenario that you run into.

Event Store

If you are just thinking about getting into Event Sourcing, I would highly recommend looking at Event Store by Greg Young as your event storage.

Event Store supports multiple types of subscriptions, including persistent subscriptions for the competing consumers messaging pattern.
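
As a rough sketch using the EventStore.ClientAPI package (method signatures vary between client versions, so treat this as illustrative rather than definitive), a catch-up subscription replays a stream from a checkpoint and then keeps delivering new events; a persistent subscription is set up in a similar way via ConnectToPersistentSubscription.

using System;
using System.Net;
using EventStore.ClientAPI;

class Subscriber
{
    static void Main()
    {
        // Connect to a local Event Store node on the default TCP port.
        var connection = EventStoreConnection.Create(new IPEndPoint(IPAddress.Loopback, 1113));
        connection.ConnectAsync().Wait();

        // Catch-up subscription: replay "orders" (a placeholder stream name)
        // from the beginning, then keep receiving new events as they arrive.
        connection.SubscribeToStreamFrom(
            "orders",
            null,      // last checkpoint; null means start from the beginning
            false,     // resolveLinkTos
            (subscription, resolvedEvent) =>
                Console.WriteLine("Received: " + resolvedEvent.Event.EventType));

        Console.ReadLine();
    }
}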

SQL Server Transaction Log File (LDF) Misconception

A common misconception is that setting the recovery model to simple will cause SQL Server not to use the transaction log file (LDF), preventing it from growing to an abnormal size.

In fact, the simple recovery model still uses the transaction log when performing transactions; however, it reclaims log space to keep space requirements small.

Reclaiming space does not reduce the actual file size, just as performing a transaction log backup does not reduce the actual file size.

This is important because if you have a long-running transaction that performs many insert/update/delete statements, the transaction log file could grow (depending on your settings) to a very large size.  Once it has grown, it will stay that size until you shrink the log file.