ORMBattle.NET: The ORM tool shootout


About ORMBattle.NET


ORMBattle.NET is a free ORM tool comparison and benchmarking project.

The goal of this web site is to provide an honest comparison of the essential features of different ORM tools for .NET, and moreover to answer the question: "Why does it work this way, and how can it be made to work better?"

There are no advertising slogans or banners, no salespeople; just an equitable comparison, benchmarking, and analysis.

Who are you?

Currently our team consists of developers from X-tensive.com:

  • Alexis Kochetov (author of the test suite)
  • Alex Kofman (author of this web site)
  • Dmitry Maximov (author of the site name and co-author of the project idea)
  • Alex Yakunin (co-author of the test suite, author of this web site)

We are experts in ORM tools for .NET, relational databases, and related technologies. Moreover, our company has a product competing in this niche (DataObjects.Net), so we know what needs to be tested.

"Currently", because this web site has only just launched. Anyone can join the project; we're glad to receive any contributions to it.

Why did you decide to start the project?

Here is the full story: in July 2009 we were polishing the performance of various parts of DataObjects.Net, and in the process created a set of really hard tests for it (later they were converted into the Performance tests for all the other participants here). "Really hard" means the tests were designed to expose the worst overhead our ORM might produce relative to direct usage of SqlClient. In effect, we were measuring the performance of basic operations on tiny entities: normally any ORM carries a large per-entity overhead versus a relatively small per-field one, and for DataObjects.Net, with its separated entity state, this is even more important.

So we created a set of CRUD tests where DataObjects.Net might fail in comparison to the others.
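The actual suite is written in C# against real databases, but the measurement idea above (tiny entities, basic operations in a loop, so that per-entity overhead dominates) can be illustrated with a minimal sketch. This is not the ORMBattle code; it's a hypothetical Python analogue using sqlite3, where the `Entity` wrapper stands in for an ORM's object layer and the raw parameterized inserts stand in for direct SqlClient usage:

```python
import sqlite3
import time

N = 10_000  # number of tiny entities; large enough that per-entity cost dominates

def fresh_db():
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, value TEXT)")
    return conn

def bench(label, fn):
    """Run fn against a fresh database and report the insert rate."""
    conn = fresh_db()
    start = time.perf_counter()
    fn(conn)
    conn.commit()
    elapsed = time.perf_counter() - start
    print(f"{label}: {N / elapsed:,.0f} inserts/sec")
    conn.close()

# "Direct SqlClient" analogue: parameterized SQL with no object layer.
def direct(conn):
    cur = conn.cursor()
    for i in range(N):
        cur.execute("INSERT INTO t (id, value) VALUES (?, ?)", (i, "x"))

# Naive "ORM-style" layer: one object per row, mapped field by field.
class Entity:
    def __init__(self, id, value):
        self.id, self.value = id, value

    def insert(self, conn):
        conn.execute("INSERT INTO t (id, value) VALUES (?, ?)",
                     (self.id, self.value))

def mapped(conn):
    for i in range(N):
        Entity(i, "x").insert(conn)

bench("direct SQL", direct)
bench("object-mapped", mapped)
```

The gap between the two numbers is the per-entity overhead of the object layer; because the entities are tiny and the SQL is identical, nothing else contributes to the difference.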

Later, at the beginning of August, Alexis Kochetov started working on his upcoming talk for a .NET User Group: "LINQ implementation in DataObjects.Net". He knew many of the details of LINQ implementation at that point, because he was the main developer of the LINQ translator for DataObjects.Net. But while discussing the planned content with us, he almost immediately realized it wouldn't be interesting for the audience: it was simply too technical. On the other hand, the talk was already scheduled. So we decided to steer it in a more practical direction: it would be much more interesting for the audience to learn how complete LINQ support is in the ORM tools available for .NET.

That's how our LINQ implementation test sequence appeared. The idea of adding performance tests to the comparison came up at the same moment. Together, these two sequences could give some impression of each tool's essential features.

The idea of creating this web site actually appeared after we got our first results: they looked really impressive for DataObjects.Net. We knew we had kept performance in mind while developing it, and we already knew how it compared to ADO.NET Entity Framework, so it was nice to see this confirmed in practice.

So the original idea was to create a web site exposing this: to set up a kind of "landmark" for ORM performance. Yes, originally it was a purely promotional idea ;) On the other hand, we clearly understood such a site could be really interesting only if it was fully honest. And it could be really interesting: we noticed that it is almost impossible to find a good ORM comparison or benchmark on the web. Everything we could find (they are listed here, by the way) was not what we wanted; for example, we saw:

  • Feature-oriented comparisons, which we explain are much less useful.
  • Benchmarks involving rather complex operations and queries, which don't give a clear picture of the costs of the ORM itself. For example, if the queries are complex, their translations to SQL can differ so much that different query plans are used for them, so the comparison would mainly show differences in the quality of query translation.

If all of the above is true, the only option left for people choosing an ORM tool is manual testing, and it looks like this really is the path chosen by many developers. But it takes a lot of time to test every available framework!

Finally, imagine you have already chosen an ORM tool and implemented some solution with it. Do you know whether it performs well enough? Can it be optimized further, or is it already achieving the best results it can? How fast should it be? Should it perform 100, 1,000, or 1,000,000 operations per second at peak? This web site helps to answer all these questions.

Quick summary:

This web site appeared because we:

  • Have been studying & comparing quality & performance of our own framework for our internal needs
  • Understood there is a lack of information related to comparison of ORM frameworks, so we adopted our internal tests for a set of other ORM frameworks

So we decided to share our tests. If you need to compare the efficiency of your own solution on a particular ORM against its limits, or to choose an ORM tool, this resource might be helpful for you.

P.S. Why are we writing all this, even though it may look a bit compromising? To be honest.

Is the comparison really fully honest?

We hope so. At least, we did all we could to make this true. Obviously, such a resource is attractive only if it is fully honest.

Then why "we hope so" instead of "yes"? Because software is complex, and we're only human. Although we've tried our best, we could have made a mistake. Moreover, while we know some ORM tools well and have a very good background in ORM tools and databases, we could fail with a particular ORM simply because of some unusual feature of its design.

On the other hand, we have done a lot to help you find and fix such a mistake:

  • We shared the complete source code of our test suite.
  • We created a discussion forum here. It is fully open, although we'll remove spam.
  • Most likely we'll enable article commenting later; it currently doesn't work simply for lack of time.

So we hope it's easy to identify and eliminate any shortcomings.

How frequently are the results updated?

So far they've been updated just once, at the launch of this web site. But we're planning to update them on a monthly basis.

Moreover, you can shortly expect some analysis of the results.

How can I help you? I'd like to participate in the project.

First of all, you can download and study the source code of our test projects. We would be very grateful if you confirmed the correctness of the tests there, or suggested how to make the results more credible.

If you find that your favorite ORM is not represented in our tests, you can implement the test suite for it. We'll integrate your tests into our project and publish the results in the next test iteration.

Please post any remarks and proposals in the ORMBattle.NET forum. If there is something personal, please write to us by e-mail.

Last Updated on Monday, 17 August 2009 19:28  
