CodeFluent Entities: Optimizing loading performances

In yesterday’s post, “CodeFluent Entities Performance Comparison”, we saw that CodeFluent Entities performed better than ADO.NET Entity Framework and NHibernate, but a little slower than LINQ to SQL and dapper-dot-net because of the extra features it provides: it retrieves values by column name rather than by ordinal (making the code easier to maintain), and it uses default values so developers don’t have to deal with DbNulls.

In this post we’ll see how to lower these loading times using a few techniques.

Step #1: Removing extra features

If all you need is raw performance and you don’t care about the extra features the tool provides by default, disable them! You’re sure to gain milliseconds by turning off:

  • concurrency management (select your entity and set “Concurrency Mode” to “None” in the property grid),
  • property tracking (select your entity and set “Default Property Tracking Modes” to “None” in the property grid),
  • entity tracking (select your entity and set “Tracking Modes” to “None” in the advanced view of the property grid),
  • ensure the collection type is set to “ListCollection” or “Collection” rather than “List”, as the latter wastes time each time its Contains method is called.

Doing this, without writing any custom code yet, you should already gain a few precious milliseconds.

Step #2: Squeeze strongly typed collections

In the “CodeFluent Entities Performance Comparison” post, the code used in the test was as follows:

Stopwatch sw = Stopwatch.StartNew();

for (int ct = 0; ct < 100; ct++)
{
    OrderCollection orders = OrderCollection.LoadAll();
    foreach (Order o in orders)
    {
        int i = o.OrderID;
    }
}

Console.WriteLine("Elapsed milliseconds: " + sw.ElapsedMilliseconds);

From the database rows an OrderCollection was built, but you might not even need an OrderCollection object. If so, you can use the PageData<MethodName> method, which returns an IDataReader, and do something like this:

Stopwatch sw = Stopwatch.StartNew();

for (int ct = 0; ct < 100; ct++)
{
    foreach (Order o in LoadOrders())
    {
        int i = o.OrderID;
    }
}

Console.WriteLine("Elapsed milliseconds: " + sw.ElapsedMilliseconds);

As you can see, we replaced the OrderCollection.LoadAll() call with a call to a custom method named LoadOrders. Here’s the method:

static IEnumerable<Order> LoadOrders()
{
    using (IDataReader reader = OrderCollection.PageDataLoadAll(null))
    {
        while (reader.Read())
        {
            Order o = new Order();
            o.RaisePropertyChangedEvents = false;
            o.ReadRecord(reader); // map the current row onto the entity
            yield return o;
        }
    }
}

In the code above we’ve:

  • used the PageDataLoadAll and ReadRecord methods generated by CodeFluent Entities to load and map results to .NET objects,
  • yet we didn’t build an OrderCollection but returned an IEnumerable<Order> instead,
  • extra tip: we disabled RaisePropertyChangedEvents before filling in our objects to gain some more valuable milliseconds.

This last change gained around 140 ms, which is pretty good given the reasonable amount of code involved (14 lines).

Step #3: Do it yourself!

The last piece of code that keeps our times higher than in yesterday’s handmade test (a SqlCommand with hand-written mapping) is the generated ReadRecord method. This method provides two features:

  • it retrieves values by column name instead of using an ordinal to make the code more readable and robust (changing the column order won’t break your code),
  • it uses default values for value types so you don’t have to struggle with DbNulls.
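For comparison, here is roughly what the column-name approach looks like when written by hand. This is only a sketch of the technique, not the code CodeFluent Entities actually generates; the field and column names are taken from the Northwind sample used in this post:

public void NameBasedReadRecord(IDataReader reader)
{
    if (reader == null)
        throw new System.ArgumentNullException("reader");

    // GetOrdinal resolves each column by name, so reordering columns
    // in the SELECT won't break the mapping; DBNull falls back to a default.
    int index = reader.GetOrdinal("OrderID");
    _orderID = reader.IsDBNull(index) ? 0 : reader.GetInt32(index);

    index = reader.GetOrdinal("ShipName");
    _shipName = reader.IsDBNull(index) ? null : reader.GetString(index);

    // ...and so on for the remaining columns.
}

Each GetOrdinal call is a per-row string lookup, and that is exactly the cost the ordinal-based version avoids.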

Consequently, if all you need is performance and you don’t care about those two features, just write your own ReadRecord method such as this one:

public void CustomReadRecord(IDataReader reader)
{
    if (reader == null)
        throw new System.ArgumentNullException("reader");

    if (!reader.IsDBNull(0)) _orderID = reader.GetInt32(0);
    if (!reader.IsDBNull(1)) _orderDate = reader.GetDateTime(1);
    if (!reader.IsDBNull(2)) _requiredDate = reader.GetDateTime(2);
    if (!reader.IsDBNull(3)) _shippedDate = reader.GetDateTime(3);
    if (!reader.IsDBNull(4)) _freight = reader.GetDecimal(4);
    if (!reader.IsDBNull(5)) _shipName = reader.GetString(5);
    if (!reader.IsDBNull(6)) _shipAddress = reader.GetString(6);
    if (!reader.IsDBNull(7)) _shipCity = reader.GetString(7);
    if (!reader.IsDBNull(8)) _shipRegion = reader.GetString(8);
    if (!reader.IsDBNull(9)) _shipPostalCode = reader.GetString(9);
    if (!reader.IsDBNull(10)) _shipCountry = reader.GetString(10);
    if (!reader.IsDBNull(11)) _customerCustomerID = reader.GetString(11);
    if (!reader.IsDBNull(12)) _employeeEmployeeID = reader.GetInt32(12);
    if (!reader.IsDBNull(13)) _shipViaShipperID = reader.GetInt32(13);
}

Using this CustomReadRecord instead of the standard ReadRecord, I obtained even better times than in the handmade data access test (possibly because CodeFluent Entities executes stored procedures whereas my handmade test used a SqlCommand with inline SQL?).

Hope this helps,

Carl Anderson
