Archive for June, 2010

26
Jun
2010

A simple IRepository implementation for MongoDB and NoRM

by Stuart Harris

David’s at the World Cup in South Africa, Cain’s been to Sonar in Barcelona and I’ve been coding!  Mostly around OpenId and OAuth, Raven DB, MongoDB and NoRM.  I reckon I’ve had just as much fun!

I absolutely love NoRM, it makes working with Mongo (which is a brilliant document database) into a dream.  Andrew Theken and the team are doing a fantastic job with it.  And they are extremely responsive – a small change on my fork was integrated in just a few hours, really bringing home the agility of Open Source Software.

So I needed a simple IRepository<T> implementation to make my application unaware that it was working with Mongo – like getting data from down a tunnel where I don’t care what’s at the other end.  And, of course, I want to make my life even easier.  This is what I came up with.

Unlike with Raven, there really isn’t a concept of Unit of Work (or transactions) with Mongo.  Documents are updated atomically, though, which takes away half the situations where you would normally need transactions.  This means that the Repository pattern is easier to implement – its sessions don’t need to represent a UoW that is tied to the lifetime of the web request.  We don’t need to track instances or do change tracking in the session.  And NoRM also provides connection pooling, so it’s perfectly reasonable to grab a Mongo connection every time you need to do something.

That’s great for atomic methods like Save or Remove that don’t return anything, or that return a single entity (like FindOne).  But it’s a problem for methods that return an IEnumerable<T> or IQueryable<T>, because by the time the query actually instigates a connection to the database, the session is long gone – and it’s not a good idea to “materialize” these into an array or list in order to return data from the method, because then all the flexibility of LINQ is lost.

So there seems to be no alternative but to pass the session out to client code so that the user can control when it is released (after the query has been used).  Passing it out as IDisposable is great, though, so that consumers of the repository don’t need to know that the implementation is using Mongo.  And other Repository implementations can be slotted in instead.
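To make the failure mode concrete, here’s a hypothetical sketch (using the repository API described below) of what goes wrong when a lazy query escapes the session that created it:

public IEnumerable<string> GetRoleNames(IRepository<Role> repository)
{
    using (repository.NewSession())
    {
        // Nothing has hit the database yet - the query only executes when
        // the caller enumerates it, and by then the session (and its
        // connection) has already been disposed.  Calling .ToArray() here
        // would avoid that, at the cost of the composability we want to keep.
        return repository.AsQueryable().Select(role => role.Name);
    }
}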

I wanted to be able to use the repository in two ways.  Firstly for atomic operations I wanted to be able to call the relevant method on my repository instance like this (note that an instance of the repository would normally be injected into the constructor rather than calling the IoC container directly):

var repository = ObjectFactory.GetInstance<IRepository<User>>();
User user = repository.FindOne(userId);

And, secondly, for methods that return an enumerable I need to be able to materialize the query before disposing of the session:

var repository = ObjectFactory.GetInstance<IRepository<Role>>();
using (repository.NewSession())
{
    return repository.AsQueryable().Select(role => role.Name).ToArray();
}

I decided to use NoRM’s HiLo Id generator for my entities so that URLs are more friendly (Mongo’s ObjectId and Guids are not human-friendly – although I understand how they can be useful sometimes).  NoRM sees that I have a property called Id, and because it is Nullable<int>, NoRM uses the HiLo algorithm for me by default.  Objects have a null Id until they are saved by NoRM:

namespace RedBadger.Data
{
    public interface IEntity
    {
        int? Id { get; set; }
    }
}
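As an illustration (this class isn’t part of the repository code, and the Name and Email properties are invented for the examples), the User entity referred to elsewhere in this post might look like this:

namespace RedBadger.Data
{
    // A hypothetical entity: the nullable Id satisfies IEntity and lets
    // NoRM apply its HiLo Id generation when the object is first saved.
    public class User : IEntity
    {
        public int? Id { get; set; }

        public string Name { get; set; }

        public string Email { get; set; }
    }
}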

So at the top of the Repository hierarchy is IRepository which simply exposes a method for creating a new session (for use by clients in their using blocks):

namespace RedBadger.Data
{
    using System;

    public interface IRepository
    {
        IDisposable NewSession();
    }
}

Then, from that, is derived an interface that adds familiar methods for interacting with the data:

namespace RedBadger.Data
{
    using System;
    using System.Collections.Generic;
    using System.Diagnostics.Contracts;
    using System.Linq;

    [ContractClass(typeof(IRepositoryContracts<>))]
    public interface IRepository<TEntity> : IRepository
        where TEntity : class, IEntity
    {
        IQueryable<TEntity> AsQueryable();

        void Drop();

        IEnumerable<TEntity> Find(Func<TEntity, bool> selector);

        TEntity FindOne(int? id);

        TEntity FindOne(Func<TEntity, bool> selector);

        void Insert(TEntity instance);

        void Remove(int? id);

        void Remove(Func<TEntity, bool> selector);

        void Save(TEntity instance);
    }
}

And the contracts for this interface are in a supporting class:

namespace RedBadger.Data
{
    using System;
    using System.Collections.Generic;
    using System.Diagnostics.Contracts;
    using System.Linq;

    [ContractClassFor(typeof(IRepository<>))]
    public abstract class IRepositoryContracts<T> : IRepository<T>
        where T : class, IEntity
    {
        public IDisposable NewSession()
        {
            Contract.Ensures(Contract.Result<IDisposable>() != null);
            return default(IDisposable);
        }

        public IQueryable<T> AsQueryable()
        {
            return default(IQueryable<T>);
        }

        public void Drop()
        {
        }

        public IEnumerable<T> Find(Func<T, bool> selector)
        {
            Contract.Requires(selector != null);
            return default(IEnumerable<T>);
        }

        public T FindOne(int? id)
        {
            Contract.Requires(id.HasValue);
            return default(T);
        }

        public T FindOne(Func<T, bool> selector)
        {
            Contract.Requires(selector != null);
            return default(T);
        }

        public void Insert(T instance)
        {
            Contract.Requires(instance != null);
            Contract.Requires(instance.Id == null);
        }

        public void Remove(int? id)
        {
            Contract.Requires(id.HasValue);
        }

        public void Remove(Func<T, bool> selector)
        {
            Contract.Requires(selector != null);
        }

        public void Save(T instance)
        {
            Contract.Requires(instance != null);
        }
    }
}

Now to the implementation.  Note that all the atomic methods (like Save) create their own session and dispose of it as soon as it’s finished with.  This allows the connection to return to the pool for re-use.  However, the methods that return an IEnumerable (like Find) simply ensure that the user has set up an external session and then use that.  Unfortunately, it’s not really possible to make sure that the user creates and disposes of these sessions in a responsible way.

namespace RedBadger.Data.Mongo
{
    using System;
    using System.Collections.Generic;
    using System.Diagnostics.Contracts;
    using System.Linq;

    using Norm;

    public class MongoRepository<TEntity> : IRepository<TEntity>
        where TEntity : class, IEntity
    {
        private readonly MongoConnectionStringBuilder mongoConnectionStringBuilder;

        private Mongo externalSession;

        public MongoRepository(MongoConnectionStringBuilder mongoConnectionStringBuilder)
        {
            this.mongoConnectionStringBuilder = mongoConnectionStringBuilder;
        }

        public IDisposable NewSession()
        {
            return this.externalSession = this.CreateSession();
        }

        public IQueryable<TEntity> AsQueryable()
        {
            this.EnsureSession();

            return this.externalSession.GetCollection<TEntity>().AsQueryable();
        }

        public void Drop()
        {
            try
            {
                using (Mongo session = this.CreateSession())
                {
                    session.Database.DropCollection(typeof(TEntity).Name);
                }
            }
            catch (MongoException e)
            {
                if (e.Message != "ns not found")
                {
                    throw;
                }
            }
        }

        public IEnumerable<TEntity> Find(Func<TEntity, bool> selector)
        {
            this.EnsureSession();

            return Find(this.externalSession, selector);
        }

        public TEntity FindOne(int? id)
        {
            return this.FindOne(x => x.Id == id);
        }

        public TEntity FindOne(Func<TEntity, bool> selector)
        {
            using (Mongo session = this.CreateSession())
            {
                return Find(session, selector).FirstOrDefault();
            }
        }

        public void Insert(TEntity instance)
        {
            using (Mongo session = this.CreateSession())
            {
                session.GetCollection<TEntity>().Insert(instance);
            }
        }

        public void Remove(int? id)
        {
            this.Remove(i => i.Id == id);
        }

        public void Remove(Func<TEntity, bool> selector)
        {
            using (Mongo session = this.CreateSession())
            {
                // Use the static Find with the local session: this.Find would
                // throw here, because it requires an external session created
                // via NewSession().
                session.GetCollection<TEntity>().Delete(Find(session, selector));
            }
        }

        public void Save(TEntity instance)
        {
            using (Mongo session = this.CreateSession())
            {
                session.GetCollection<TEntity>().Save(instance);
            }
        }

        private static IEnumerable<TEntity> Find(Mongo session, Func<TEntity, bool> selector)
        {
            Contract.Requires(selector != null);

            IQueryable<TEntity> queryable = session.GetCollection<TEntity>().AsQueryable();
            Contract.Assume(queryable != null);

            return queryable.Where(selector);
        }

        private Mongo CreateSession()
        {
            return Mongo.Create(this.mongoConnectionStringBuilder.ToString());
        }

        private void EnsureSession()
        {
            if (this.externalSession == null)
            {
                throw new InvalidOperationException(
                    "This Repository has no current session.  Create one like this: using(repository.NewSession()) { ... }");
            }
        }
    }
}

I love the way that NoRM uses a URL for the connection string, which means that I can use a class derived from UriBuilder to set defaults for me:

namespace RedBadger.Data.Mongo
{
    using System;

    public class MongoConnectionStringBuilder : UriBuilder
    {
        public MongoConnectionStringBuilder()
            : base("mongodb", "localhost", 27017)
        {
        }

        public string Database
        {
            get
            {
                return this.Path;
            }

            set
            {
                this.Path = value;
            }
        }
    }
}
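For example (the output shown is just standard UriBuilder behaviour), pointing at a database on a local server is a one-liner:

var connectionString = new MongoConnectionStringBuilder { Database = "MyTestDB" };
Console.WriteLine(connectionString);  // mongodb://localhost:27017/MyTestDB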

Finally, wiring it all up using StructureMap is really easy:

ObjectFactory.Configure(
    x =>
        {
            x.For<MongoConnectionStringBuilder>().Singleton().Use(
                () => new MongoConnectionStringBuilder { Database = "MyTestDB" });

            x.For(typeof(IRepository<>)).Use(typeof(MongoRepository<>));
        });

Incidentally, with StructureMap, you can also add more specialised repositories to the container.  At the point of use you don’t need to know whether you’re getting a specialised repository or a more generic one, which is really cool:

ObjectFactory.Configure(
    x =>
        {
            x.For<MongoConnectionStringBuilder>().Singleton().Use(
                () => new MongoConnectionStringBuilder { Database = "MyTestDB" });

            x.For(typeof(IRepository<>)).Use(typeof(MongoRepository<>));
            x.For<IRepository<User>>().Use<UserRepository>();
        });
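A specialised repository can be a very thin layer over the generic one.  Here’s a hypothetical sketch (FindByEmail, and the Email property on User, are invented purely for illustration):

namespace RedBadger.Data.Mongo
{
    // A hypothetical specialised repository: it inherits all the generic
    // Mongo behaviour and adds one domain-specific query.
    public class UserRepository : MongoRepository<User>
    {
        public UserRepository(MongoConnectionStringBuilder mongoConnectionStringBuilder)
            : base(mongoConnectionStringBuilder)
        {
        }

        public User FindByEmail(string email)
        {
            return this.FindOne(user => user.Email == email);
        }
    }
}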

So that’s all straightforward and totally abstracts away the database driver implementation (the specialised UserRepository could easily be an implementation over another data store).  Anyway, enjoy.

8
Jun
2010

Ethical Consulting: a Red Badger principle

by Cain Ullah

Now that Red Badger is up and running, I thought I would write my first blog post on one of the underlying principles of Red Badger: Ethical Consulting.

If you search Google for Ethical Consulting, your top three results will be a change management company, a company actually called Ethical Consulting, and some guiding principles defining what ethical behaviour is. When I mentioned ethical consulting to my friends and family, their first instinct was that Red Badger only delivered software for international development programmes, charities or green sustainability!

Ethical Consulting can mean different things to different people depending on the context. Red Badger did not define the term, but to us it has a specific meaning.

In my experience of running projects, if the relationship with the client ever becomes fraught, it is rarely down to the incompetence of the project team or, in fact, of the client. Projects can, however, run over time and over budget, and a blame culture ensues. To the project team, the client can appear unreasonable because they want everything and they want it yesterday. The client, on the other hand, feels they are not getting value for money. In this situation pressure builds, relationships become difficult, morale is affected and, as a result, so is productivity.

If you look at the underlying reasons for this, it should be fairly easy to resolve. The issues outlined above are, more often than not, a symptom of behaviour that started at the very beginning of the project lifecycle: sales. Most companies in this business are driven by margin. The internal sales process is masked from the client, and sales staff are encouraged to push the rates of each resource as high as they can get away with. They are driven to be aggressive in this approach by large commission incentives based on the size of the sale rather than on the final margin. The result is sales (as large as possible, please!) driven by the personal gain of a sales force out to make as much money as possible for themselves as well as for the company they work for. Once the project team tries to deliver, even with the aid of lean methodologies, it is not surprising that the client is ready to play hardball.

Now to Red Badger’s approach…

Our intention is to be transparent and fair from the start. We want our clients to feel like they are going to get value for money before we have delivered anything. We want to make a profit too, don’t get me wrong, but we want relationships built on a mutual understanding of what both our clients and we want. If we can work together from the beginning, so that by the time we get to delivery we are already a collaborative team built on fairness, it will benefit everyone. We can then leverage Scrum to empower the client in deciding what gets delivered, and when. The client gets great value for money, the relationship is built on honesty, the working environment is infinitely better, morale is high, and productivity and quality are increased as a result. This is our idea of the meaning of Ethical Consulting.

Great software is delivered. A solid relationship is built.

7
Jun
2010

ReWork: A great way to spend a fiver

by Stuart Harris

I’ve just read “ReWork: Change the Way You Work Forever” by Jason Fried and David Heinemeier Hansson of 37signals.  It cost £5.49.  Not bad for a refreshing confirmation of our philosophy at Red Badger.  And lots of new insights and succinct rationale for doing business in a radically different way.  It takes about 3 hours to read – an investment that will pay out over and over.

Some of the ideas I found most interesting in the book are around building a business that is open.  Speaking, blogging, tweeting, writing, teaching, making videos.  About everything you do.  A behind-the-scenes documentary, if you like.  Lots of progressive companies do this now and it’s incredible, today, how easily the Internet enables us to do this.

We’ve talked about this a lot as a team and are committed to sharing all our experiences with the community – good and bad, warts and all.  This “open book” policy fits well with our belief that honesty is the best way to do business.  Everyone wins.  A blog, for example, is a great place to talk about ideas and to crystallise them in your own head.  It forces you to research an idea thoroughly and to fully understand everything about it.  Building a community by giving away product, IP and knowledge is also a win-win, because it builds strong brand loyalty and a ready-made channel for super-effective marketing.

ReWork introduced me to the Japanese concept of wabi-sabi which describes an aesthetic that is derived from the characteristics of being "imperfect, impermanent, and incomplete".  In essence it’s the cracks and scratches that show you what’s really inside and allow the observer to appreciate the object for what it really is.  Beauty, derived from honesty. And openness. The lines in our faces show the world how we have laughed and cried. Shoot me if I ever have botox!

I recently enjoyed working for 2 years at Conchango, which was known for its early adoption of Agile principles.  The name Conchango is derived from “continuous change”, which embodies the wabi-sabi values of impermanence and incompleteness.  It was a real company, with an honest surface that reflected its core.  Small companies can do that well.  When it was taken over by a large multinational, its wabi-sabi was lost.  It’s much, much harder for big companies to be agile, open and honest.

Papering over the cracks is the way that most businesses today operate.  They create a facade that is sterile, plastic and opaque – it doesn’t let the real company through.  The authors of ReWork remind us of how many times we have waited in a call queue and been told (by a machine) that we are important to them.  And yet they are willing to let us waste half an hour of our life waiting in a queue.  Just be honest with us, tell us you are under-resourced and call us back.  We would rather see the crack.  Then we can see your beauty, trust you and enjoy doing business.

6
Jun
2010

Cloud Folder Backup: A Rackspace Cloud Files Experiment

by David Wynne

Recently I needed to put in place a temporary (and simple) backup solution on a remote server.  I just needed to ensure that a single folder on the machine got backed up to an external location on a daily basis.

I’d read about Rackspace’s new-ish Cloud Files service (previously Mosso), which is similar to Amazon S3 and comes with a RESTful API.  Rackspace have been developing a bunch of language-specific bindings on top of this API, making it easier for developers to work against it in their preferred language.

The source code for each binding is open source and the C# .Net binding is hosted on GitHub – so I thought I’d set about writing something über simple to accomplish the task and use Cloud Files as the backing store in order to check it out.

The result is Cloud Folder Backup, a little utility app driven simply by its configuration.  It uses the DotNetZip library to create an in-memory zip of the source folder you want to back up and then uses the Rackspace C# .Net binding to put the stream into a predefined container within Cloud Files.
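The heart of that flow is only a few lines.  Here’s a rough sketch of the in-memory zip step, assuming DotNetZip’s Ionic.Zip.ZipFile API (this is illustrative, not the actual Cloud Folder Backup source; sourceFolder is an assumed variable and the Cloud Files upload itself is elided):

using (var zip = new ZipFile())
using (var stream = new MemoryStream())
{
    // Zip the source folder entirely in memory - nothing touches disk.
    zip.AddDirectory(sourceFolder);
    zip.Save(stream);
    stream.Position = 0;

    // The stream would then be handed to the Rackspace C# .Net binding,
    // which PUTs it into the predefined Cloud Files container.
}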

I intended to use Windows Task Scheduler to run it daily, so I wrote a simple console app that by default provides a little feedback, but can be run in silent mode (for use with Windows Task Scheduler et al.).

Basic logging is provided by log4net, so you can keep an eye on how often it’s run and/or any exceptions that may have been thrown.

Specs are written in MSpec and supported by Rhino.Mocks.

It’s a painfully simple little app – but does exactly what I need it to do.  All the code and documentation is up on GitHub, if you want to take a look/use it: http://github.com/redbadger/CloudFolderBackup
