This post shows you the performance considerations to keep in mind when developing an application that uses Entity Framework with the CData ADO.NET Provider for Dynamics CRM 2015, gives you an overview of the Entity Framework internals that can affect performance, and offers a few tips for improving performance in applications that combine Entity Framework with CData's ADO.NET provider. As I wrote in my previous post, Getting Started with LINQ and CData ADO.NET Provider for Dynamics CRM 2015, CData's provider allows developers to build .NET applications with connectivity to Dynamics CRM using the familiar ADO.NET interfaces. The provider abstracts the underlying data source into tables, views, and stored procedures that can be used to both retrieve and update data.

“Writing code often leads to errors, and errors leads to learning. Learning leads to understanding” (Hosk)

What is Entity Framework?

Microsoft’s Entity Framework is an extended Object-Relational Mapping (ORM) framework. ORM frameworks are a convenient way to provide an abstraction for data access in an object-oriented application. As with any abstraction, though, performance matters!

Executing “Cold” and “Warm” Queries

The first time any query is executed against a given model, Entity Framework has to do a lot of work behind the scenes to load and validate the model. Following Microsoft’s terminology, we refer to this first query as a “cold” query. Further queries against an already loaded model are called “warm” queries and are much faster. Yesterday I showed you some LINQ queries; now let’s analyze one of them from a performance point of view. This is a simple query:

MSDynCRMEntities context = new MSDynCRMEntities();

var LeadQuery = from Lead in context.Leads
                orderby Lead.FirstName
                select Lead;

var result = LeadQuery.First();

Let me give you a deeper insight into the query execution:

First Execution – COLD!!!

Code: MSDynCRMEntities context = new MSDynCRMEntities();
Action: Context creation
Performance impact: Low

Code: var LeadQuery = from Lead in context.Leads
                      orderby Lead.FirstName
                      select Lead;
Action: Query expression creation
Performance impact: Low

Code: var result = LeadQuery.First();
Action: Execute LINQ query
Performance impact:
  • Load metadata: High, cached
  • Generate views: Medium, cached
  • Evaluate parameters: Low
  • Translate query: Medium, cached
  • Execute query against the database: High
    (Connection.Open(), DynamicsCRMCommand.ExecuteReader(), DynamicsCRMDataReader.Read())
  • Materialize objects: Medium
  • Identity lookup: Medium
  • Close connection: Low

Second Execution – WARM!!!

Code: MSDynCRMEntities context = new MSDynCRMEntities();
Action: Context creation
Performance impact: Low

Code: var LeadQuery = from Lead in context.Leads
                      orderby Lead.FirstName
                      select Lead;
Action: Query expression creation
Performance impact: Low

Code: var result = LeadQuery.First();
Action: Execute LINQ query
Performance impact:
  • Lookup metadata: Low
  • Lookup views: Low
  • Evaluate parameters: Low
  • Lookup query: Low
  • Execute query against the database: High
    (Connection.Open(), DynamicsCRMCommand.ExecuteReader(), DynamicsCRMDataReader.Read())
  • Materialize objects: Medium
  • Identity lookup: Medium
  • Close connection: Low
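To see the difference in practice, you can time the two executions yourself. The snippet below is only a minimal sketch that assumes the MSDynCRMEntities model and Lead entity from the previous post; because the metadata, views, and translated query are cached per AppDomain, the second execution is warm even though it uses a brand-new context instance.

using System;
using System.Diagnostics;
using System.Linq;

class ColdWarmQueryDemo
{
    static void Main()
    {
        // First run is "cold": it pays for metadata loading, view generation and query translation.
        Console.WriteLine("Cold query: {0} ms", TimeFirstLeadQuery());
        // Second run is "warm": it is served from the caches built during the first run.
        Console.WriteLine("Warm query: {0} ms", TimeFirstLeadQuery());
    }

    static long TimeFirstLeadQuery()
    {
        var watch = Stopwatch.StartNew();
        using (var context = new MSDynCRMEntities())
        {
            var LeadQuery = from Lead in context.Leads
                            orderby Lead.FirstName
                            select Lead;
            var result = LeadQuery.First();
        }
        watch.Stop();
        return watch.ElapsedMilliseconds;
    }
}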

In the following sections we’ll take a look at how to reduce the performance cost of both cold and warm queries.

To reduce the cost of model loading in cold queries, we can use pre-generated views, which spare you the performance pain of view generation at run time. For warm queries, we can rely on query plan caching, disable change tracking, and choose between the different query execution options.

Use Pre-Generated Views to reduce the cost of view generation

In order to understand how pre-generated views work, let me first give you a very short overview of view generation.
Entity Framework is not tied to any specific database; instead it uses a provider model (in this article we use the CData ADO.NET Provider for Dynamics CRM 2015 to connect to the CRM database) that allows it to connect to various databases. To be able to do this, Entity Framework builds a set of views:

  • Query views – the transformation necessary from the database schema to the conceptual model.
  • Update views – the transformation necessary from the conceptual model to the database schema.

The process of computing these views based on the mapping specification is what we call view generation. View generation can either take place dynamically when a model is loaded, or at build time by using “pre-generated views”; the latter are serialized in the form of Entity SQL statements to a C# file.

After view generation, the next step is view validation. This is a costly operation because it ensures that the connections between the entities make sense and have the correct cardinality for all the supported operations.

To reduce the response time of the first query, you should take advantage of pre-generated views. You can avoid this run-time overhead by pre-generating the views for the EDMX file with the EdmGen.exe command-line tool or with T4 templates.

If you’re using Entity Framework 6, you can get the view generation T4 templates from the Visual Studio Gallery: EF6 CodeFirst View Generation T4 Template for C#.
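If you take the EdmGen.exe route, a command along the following lines generates the views into a C# file that you then add to the model project. This is only a sketch: EdmGen.exe consumes the .csdl/.ssdl/.msl files rather than the .edmx itself (the designer can write them out for you by setting Metadata Artifact Processing to “Copy to Output Directory”), and the file names below are placeholders for your own model.

"%windir%\Microsoft.NET\Framework\v4.0.30319\EdmGen.exe" /mode:ViewGeneration /language:CSharp ^
    /inssdl:MSDynCRMModel.ssdl /incsdl:MSDynCRMModel.csdl /inmsl:MSDynCRMModel.msl ^
    /outviews:MSDynCRMModel.Views.cs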

Move your model to a separate assembly

If you include the model directly in your application’s project, view generation and validation take place whenever the project is rebuilt, even if the model has not changed. Move your model to a separate assembly and reference that assembly from your application instead. Do not forget to copy the connection string into the application configuration file of the client application (a sketch of what that entry looks like follows below), and remember that if the model changes, the model assembly has to be rebuilt.
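For illustration only, the Entity Framework connection string in the client’s App.config might look roughly like the template below. Treat every value as a placeholder: the res:// metadata resource names must match your model assembly, the provider invariant name must match the one registered by the installed CData provider, and the inner provider connection string is elided here on purpose.

<connectionStrings>
  <!-- Sketch only: adjust the metadata resource names, the CData provider's
       invariant name, and the provider connection string to your environment. -->
  <add name="MSDynCRMEntities"
       connectionString="metadata=res://*/MSDynCRMModel.csdl|res://*/MSDynCRMModel.ssdl|res://*/MSDynCRMModel.msl;provider=System.Data.CData.DynamicsCRM;provider connection string=&quot;...&quot;"
       providerName="System.Data.EntityClient" />
</connectionStrings>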

Break up a large entity data model into smaller ones

Ensure that the Entity Data Model represents a single unit of work rather than the entire database, with its many objects (tables, views, stored procedures, etc.) that are disconnected from or not needed for a particular unit of work. If an Entity Data Model represents the entire database when it does not need to, it can decrease the application’s performance because many unnecessary objects have to be loaded into memory. You should break up a large entity data model into smaller ones, each representing a single unit of work, as sketched below.
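As a purely hypothetical illustration, the goal is to end up with focused contexts like the ones below instead of one context that maps every CRM table. The context names and all entity types except Lead are made up, and the sketch assumes DbContext-based models; with an .edmx model the equivalent is to create several smaller .edmx files, each generating its own context.

using System.Data.Entity;

// Context used only by the lead-qualification part of the application.
public class LeadManagementEntities : DbContext
{
    public DbSet<Lead> Leads { get; set; }
    public DbSet<Account> Accounts { get; set; }
}

// Context used only by the marketing part of the application.
public class MarketingEntities : DbContext
{
    public DbSet<Contact> Contacts { get; set; }
    public DbSet<MarketingList> MarketingLists { get; set; }
}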

Disabling change tracking

In a “read-only” scenario – you just want to retrieve data and do not need to update what you read – you can avoid the overhead of loading the objects into the ObjectStateManager by issuing “no tracking” queries. Change tracking can be disabled at the query level: to switch a query to NoTracking, chain a call to the AsNoTracking() method into the query:

MSDynCRMEntities context = new MSDynCRMEntities();

var LeadQuery = from Lead in context.Leads.AsNoTracking()
                orderby Lead.FirstName
                select Lead;

var result = LeadQuery.First();

Use Compiled Query to improve performance with LINQ queries

When a query is issued against the Dynamics CRM database using Entity Framework and the CData provider, it must go through a series of steps; one of these steps is query compilation. Entity Framework 5 introduced automatic caching of compiled LINQ queries. Earlier versions of Entity Framework required you to create a CompiledQuery to get this speed-up; now the caching is done automatically, without the use of CompiledQuery.
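For completeness, the snippet below is roughly what the explicit approach looked like with the older ObjectContext API. It is only a sketch and assumes MSDynCRMEntities derives from ObjectContext; CompiledQuery does not apply to DbContext-based models, where EF 5/6 caches the translated query automatically.

using System;
using System.Data.Objects;   // System.Data.Entity.Core.Objects in EF6
using System.Linq;

static class LeadQueries
{
    // Compiled once; every later call reuses the already translated store command
    // instead of translating the LINQ expression tree again.
    public static readonly Func<MSDynCRMEntities, IQueryable<Lead>> OrderedByFirstName =
        CompiledQuery.Compile((MSDynCRMEntities context) =>
            from Lead in context.Leads
            orderby Lead.FirstName
            select Lead);
}

// Usage:
// var result = LeadQueries.OrderedByFirstName(context).First();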

Cheers!

