Friday, 7 October 2016

Performance: Metadata calls vs Exceptions

Today I was on a mission to check the existence of a field on an entity. If it exists, do something; if not... well, do nothing! My first thought was "metadata... urgh!". As all of us in the CRM development world know, the metadata is notoriously slow. Reading Stack Overflow, I found that the recommended answer was always to use the metadata. So I thought to myself, there's got to be a better way to do this. Which gave me an idea: which performs faster, exceptions or the metadata? To hit the metadata you need to run something like this:
RetrieveEntityRequest request = new RetrieveEntityRequest
{
    EntityFilters = EntityFilters.Attributes,
    LogicalName = "account"
};
var response = (RetrieveEntityResponse)service.Execute(request);
AttributeMetadata first = null;
foreach (var element in response.EntityMetadata.Attributes)
{
    if (element.LogicalName == "xyz_fieldname")
    {
        first = element;
        break;
    }
}
var fieldExists = first != null;
I chose not to use LINQ/expressions when checking for the field, just to try to keep it as performant as possible. This code is fine in general, but boy is it slow. So I came up with this as an alternative instead:
try
{
    var query = new QueryExpression("account");
    query.Criteria.AddCondition("accountid", ConditionOperator.Equal, "294450db-46c9-447e-a642-3babf913d800");
    query.NoLock = true;
    query.ColumnSet = new ColumnSet("xyz_fieldname");
    service.RetrieveMultiple(query);
}
catch
{
    // ignored
}
Using a query expression has two advantages. Firstly, you're running the query against the primary key (accountid). You don't care about the id itself; the call will either throw an exception if the field doesn't exist, or return no records if it succeeds (one record if you are the unluckiest guy on the planet and hit a matching guid... but even then it would be faster). Secondly, you can run the query with a nolock. You really don't care about the result set; the purpose isn't to find a record, it's to see whether including the column forces an exception. So how do you test the execution of this? I wrote a console app that used a stopwatch to wrap each call. The first time around I ran each console app independently, so as not to skew the results by code optimisation on whichever call ran first. For a single call, the exception approach generally executed about 1.5 times faster. Sample code is this:
private static void RunExceptionTests(IOrganizationService service, int steps)
{
    Console.WriteLine("Testing exception with {0} steps", steps);
    var stopwatch = Stopwatch.StartNew();
    for (int i = 0; i < steps; i++)
    {
        try
        {
            var query = new QueryExpression("account");
            query.Criteria.AddCondition("accountid", ConditionOperator.Equal, "294450db-46c9-447e-a642-3babf913d800");
            query.NoLock = true;
            query.ColumnSet = new ColumnSet("xyz_fieldname");
            service.RetrieveMultiple(query);
        }
        catch
        {
            // ignored
        }
    }
    stopwatch.Stop();
    Console.WriteLine("Milliseconds taken: {0}", stopwatch.ElapsedMilliseconds);
}

private static void RunMetadataTest(IOrganizationService service, int steps)
{
    Console.WriteLine("Testing metadata with {0} steps", steps);
    var stopwatch = Stopwatch.StartNew();
    for (int i = 0; i < steps; i++)
    {
        RetrieveEntityRequest request = new RetrieveEntityRequest
        {
            EntityFilters = EntityFilters.Attributes,
            LogicalName = "account"
        };
        var response = (RetrieveEntityResponse)service.Execute(request);
        AttributeMetadata first = null;
        foreach (var element in response.EntityMetadata.Attributes)
        {
            if (element.LogicalName == "xyz_fieldname")
            {
                first = element;
                break;
            }
        }
        var fieldExists = first != null;
    }
    stopwatch.Stop();
    Console.WriteLine("Milliseconds taken: {0}", stopwatch.ElapsedMilliseconds);
}
I ran several tests on CRM Online, varying between multiple/single calls to the metadata vs multiple/single RetrieveMultiple calls throwing an exception. Exceptions always outperformed the metadata. If you're performing multiple calls within the same code it jumps to about three times faster (I'm guessing optimisations come into play). Because of how plugins are loaded into memory for faster execution, I would expect it to regularly perform two to three times faster than a metadata call. Either way, the bottleneck is the call to the metadata service, which cannot be optimised unless you introduce caching and more code complexity. Also, if the field exists you won't get an exception, which means yet another performance bonus. The only scenario I haven't tested is running it against a massively heavily used entity... but if you're hitting your database so hard that a nolock retrieve cannot return in an acceptable time frame, you probably have bigger problems to worry about! To conclude, can I just say the following: Microsoft, will you fix your metadata service already! It's been slow for donkey's years and is a real annoyance when you have to use it.
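
For anyone wanting to use the technique for real rather than benchmark it, the check can be wrapped into a small reusable helper. This is only a sketch based on the code above; the random guid simply guarantees we don't care about the result set, and the catch assumes the failure is CRM rejecting the unknown column:

```csharp
// Sketch: reusable exception-based field existence check.
// Assumes the usual SDK usings (Microsoft.Xrm.Sdk, Microsoft.Xrm.Sdk.Query)
// plus System.ServiceModel for FaultException.
private static bool FieldExists(IOrganizationService service, string entityName,
    string primaryKeyName, string fieldName)
{
    try
    {
        var query = new QueryExpression(entityName) { NoLock = true };
        // Filter on the primary key with a random guid so (almost certainly) no rows match.
        query.Criteria.AddCondition(primaryKeyName, ConditionOperator.Equal, Guid.NewGuid());
        query.ColumnSet = new ColumnSet(fieldName);
        service.RetrieveMultiple(query);
        return true; // The query succeeded, so the column exists.
    }
    catch (FaultException<OrganizationServiceFault>)
    {
        return false; // CRM rejected the unknown column.
    }
}
```

Calling it is then a one-liner along the lines of FieldExists(service, "account", "accountid", "xyz_fieldname").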

Thursday, 6 October 2016

Extended CrmSvcUtil - Exporting an attribute list

A little feature of the Extended CrmSvcUtil I neglected to mention in my previous post (it was a late feature!) is the ability to export a list of strongly typed attribute names. This helps remove "magic strings" from your source and introduces some strongly typed attribute checking. This is useful even when using Early Bound, as you often need to check for the existence of an attribute.

For example, let's say you are writing a plugin that fires on update of contact and you wish to include a pre image containing the parent account. What you will often see is code like this:

if (preImage.Contains("parentcustomerid") == false)
{
    //trace / throw an exception stating the parent account hasn't been provided...
}

Checking for null is not the same as checking for existence, because some contacts might not have a parent account set. So the check for existence is often quite important. Using an attribute list allows you to strongly type this instead as follows:

if (preImage.Contains(ContactAttributes.ParentCustomer) == false)
{
    //trace / throw an exception stating the parent account hasn't been provided...
}
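
For context, the exported attribute list is simply a class of string constants, so there is no runtime cost over the magic string. A hypothetical shape (the real generated names depend on your configuration) would be:

```csharp
// Illustrative only: the shape of an exported attribute list.
public static class ContactAttributes
{
    public const string ParentCustomer = "parentcustomerid";
    public const string FirstName = "firstname";
}
```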

Another area where this is incredibly useful is when building queries using Query Expressions or Fetch Expressions. If you want to include a set of columns, or set a condition on an attribute you will end up with this type of code:

QueryExpression qe = new QueryExpression();
qe.EntityName = "contact";
qe.ColumnSet = new ColumnSet();
qe.ColumnSet.Columns.Add("parentcustomerid");

Being able to specify the attribute strongly, like the following, looks much better:

qe.ColumnSet.Columns.Add(ContactAttributes.ParentCustomer);

This helps work around many issues, like typo bugs or name changes, as in cases where somebody accidentally creates a field called "new_ProjjectType" and wishes to fix the name of the field. If five or six plugins already reference this field and perform some logic based on its value, you might end up with multiple "magic strings" to fix across your code. Using an attribute list is a one-fix solution to the problem.

The source for the Extended CrmSvcUtil can be downloaded from GitHub, with the latest release available to download from here.

Wednesday, 28 September 2016

Extended CrmSvcUtil - A Neater Early Bound Generator

For quite some time I persisted with Late Bound objects, because within plugins they make life a bit simpler. The idea of Early Bound objects is good in theory, but the Microsoft-provided CrmSvcUtil just doesn't cut it in terms of how it gives you the code (one big SDK file is not my idea of fun). You used to be able to split different aspects of this out, but that functionality has been removed in recent versions. This has probably been the main reason I stuck with Late Bound until recently.

Issues with CrmSvcUtil

The biggest issue I have with the existing tool is the lack of naming ability. Say you have a custom entity called "new_project" with an OptionSet attribute on it called "new_projecttype". Ignoring all the other entities that will get exported regardless of whether you actually need them or not, and ignoring all the standard attributes that will be exported, the naming convention you'll end up with is this:

// Class...
public partial class new_project...

// Property...
public Microsoft.Xrm.Sdk.OptionSetValue new_projecttype

// Enum
public enum new_project_new_projecttype

I've chosen the type OptionSet in particular to convey an additional problem, but there are a host of issues with what is generated.
  1. We end up with 1 huge unmanageable file.
  2. Really bad naming convention by default that is not easy to change. Sure, you could manually edit the names, but they will get overwritten with each generation of the metadata.
  3. OptionSet properties created as the type OptionSetValue. Surely if we are using Early Bound then shouldn't our option set properties be the equivalent enum type? 

So the tool is quite lazy in what it does. It's a bit of a "bare minimum" to get you over the fence, and then you're left to your own devices. Quite frankly, you'd be quicker writing the classes yourself, and as long as you honour the correct attributes it would work perfectly fine.

I have investigated many tools, and all of them fell short. So this and all of the above issues caused the birth of my own pet project which has made my life a lot easier.

Extended Svc Util

It's named simply so, because all it does is extend and build on top of what the existing CrmSvcUtil does. It does not intercept the generation of the "monster file"; instead, it piggybacks on the code generated for that to produce its own files. Let's take a look at what you can do.

To fix the problems in the above files you could set up a configuration like this:

<configuration>
 <configSections>
  <section name="schemaDefinition" type="CodeGenerator.Config.SchemaDefinition, CodeGenerator" allowLocation="true" allowDefinition="Everywhere"/>
 </configSections>
 <schemaDefinition groupOptionSetsByEntity="true" exportAttributeNames="true" entitiesFolder="..\MyProject.DomainModels" enumsFolder="..\MyProject.DomainModels">
  <entities>
   <entity name="new_project" friendlyName="Project">
    <attributes>
     <attribute name="new_name"  friendlyName="Name"/>
     <attribute name="new_projecttype" friendlyName="ProjectType" />
    </attributes>
   </entity>
  </entities>

  <optionSets>
   <!-- Global OptionSets-->
   <optionSet name="new_someglobaloptionset" friendlyName="SomeGlobalOptionSet" entity="Global" />

   <!-- Project OptionSets-->
   <optionSet name="new_project_new_projecttype" friendlyName="Project_ProjectType" entity="new_project" />
  </optionSets>
 </schemaDefinition>
</configuration>

So what does this do? Firstly, you can add friendly names to your entities, attributes and option sets. Secondly, the export will use the correct enum for your option sets rather than using the out of the box OptionSetValue. So what you'll end up with instead is this:

// Class...
public partial class Project

// Property...
public Project_ProjectType? ProjectType

// Enum
public enum Project_ProjectType
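
Because the property is now a nullable enum, consuming code gets compile-time checking plus a natural way to test whether the option set is populated. A minimal sketch using the generated names from above:

```csharp
// Sketch: consuming the generated early bound class and enum-typed property.
var project = new Project();

// An unset option set is simply null on the nullable enum property.
if (project.ProjectType == null)
{
    // no project type chosen yet...
}
```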

The next configuration item I'd like to point out is that not only can you dictate where the files are generated, but you can also decide to group all of your option sets into one file per entity rather than separate classes. These are defined at the top of the configuration under the schema definition. All of this causes two files to be generated, named:

  • Project.cs
  • Project.Enums.cs


Finally, only entities you have specified within the list will be exported to their corresponding files; all others will be ignored. All of the code will still be exported to the output file you specify, so if you want to double-check that source against what this tool exports you can do so.

All of this makes life in the Early Bound world much easier and more readable, and makes it much quicker to generate the classes exactly as you want. I have included a global option set in there just as an example of how to deal with that; in effect, all of the global option sets in this example will be exported to a file called Global.Enums.cs. You can rename out of the box fields, status fields and their accompanying enums too, so you're not just stuck with your custom entities and fields.

Source

I have uploaded the source to GitHub (https://github.com/conorjgallagher/Dynamics.ExtendedSvcUtil). There are further instructions up there on how to utilise the DLL it builds with CrmSvcUtil. It's fully open source, so feel free to download, edit, and use to your heart's delight. In the root folder of the project I have included the latest built version of the DLL, so if you just want that feel free to download it.

If you find bugs please feel free to submit a comment. I have not fully decided on how best to manage contributions, so if you are interested please contact me and we can discuss.

Enjoy!

Wednesday, 14 September 2016

CRM Actions - the mysterious erroneous prefix

Actions were introduced in CRM 2013 and are a very powerful feature of Dynamics CRM. Quite frequently you might have had a particular piece of custom code you wanted to make available in different areas within your code base, such as both JavaScript and plugins. Previously you could not easily encapsulate this code without writing something quite custom and quite frankly a little hacky. For an example of how you could previously achieve this see my blog post about Executing stand alone C# code within crm.

Actions have made this redundant as you can now very easily encapsulate a common piece of functionality that you can call in many different ways and places (See MSDN)
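
As a quick illustration, calling an action from C# is just another OrganizationRequest keyed on the action's unique name. The action name and the "Target" parameter below are hypothetical; substitute your own action's definition:

```csharp
// Sketch: invoking a custom action via the SDK. "new_CustomActionName" and
// the "Target" input parameter are placeholders for your action's definition.
var request = new OrganizationRequest("new_CustomActionName");
request["Target"] = new EntityReference("account", accountId);

var response = service.Execute(request);
// Any output parameters come back in response.Results.
```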

So that's all sweet and awesome, right?

Not quite. As with all new features of CRM, you eventually weed out a few bugs, some a little more complex than others, and actions are unfortunately no different. Recently we ran into a mysterious issue on our dev organisation: it started throwing errors any time we tried to execute an action. The issue did not surface on any of our other environments. On closer inspection of the error message I spotted something:
 Request not supported: new_CustomActionName
...Hang on there a second tiger, "new_" is not our default prefix! Where did this suddenly come from?

One thing dawned on us. We had recently recreated our dev environment from scratch due to some issues with it. One of the tasks we performed before the reset was to take a backup of the default solution. This was then imported directly after the reset to get us back to base. It all looked hunky dory until we noticed that actions had been impacted: they mysteriously got imported with the "new" prefix instead of honouring our original default prefix.

The fix?

Luckily this issue had not yet infected our test environment. And doubly luckily, the actions hadn't changed since we reset the environment. That meant we could delete the actions on dev, create a fresh solution on our test environment with the (still correct) actions in it, and import this back into dev.

I would hazard a guess that if we had changed the default prefix before we imported the backed-up default solution we may have avoided the problem. But now I will never know! To be completely safe, my advice is to back up all actions in a solution with the correct publisher before you embark on an environment reset during which you intend to restore a default solution.

Friday, 17 June 2016

How to get the parent windows Xrm context in JavaScript

In the latest version (CRM 2016) the Xrm context seems to be buried in a frame. So to get it you need to do this:

window.top.opener.frames[0].Xrm.Page.data.entity.getEntityName()

Locating the parent context seems to change with different releases of CRM, so I'd guess it's technically not supported. At the very least it has the potential of not surviving an upgrade, so something to be aware of!

CRM Online - rsProcessingError

Some day in the future I intend to write a more detailed post for CRM Online on diagnosing why you might get rsProcessingAborted or rsProcessingError messages on SSRS reports. It's a bit more involved than on premise due to the logging access limitations of CRM Online. In an on premise environment you can simply log on to your SSRS server and grab the log to get more detail; you don't get access to the same level of logging with CRM Online.

For now I will describe a very rudimentary, old-school way to analyse a report I had deployed to CRM Online that was throwing the following error:



Not much to go on: you get a Retry and a Close, but unfortunately no download log. Searching around CRM gave no hints as to what was causing this.

What was my particular issue?

Firstly, you might be wondering what the exact problem with my report was. It was an issue with images we had attached to an entity; we use these to download and display images on the report. The SSRS report was set up to render a JPEG, but somebody uploaded a BMP file. This oddly worked fine from within Visual Studio, but the Report Viewer did not like it. I would have preferred the image simply not to render instead of throwing an error like the above. But how did I discover the problem?

Step 1 - It works on my machine!

I started with the old developer trick of seeing if it ran on my machine. Open up Visual Studio and run the report for the same report/record combination. You can do this by modifying the parameter called CRM_Filterednew_entity, where "new_entity" refers to the entity you run the report against. This parameter is simply some fetch XML that you can build using an advanced find to filter to the record in question.
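
As an illustration, the fetch XML you paste into that parameter might look something like the following; the entity name, attribute and guid are all placeholders for your own record:

```xml
<!-- Hypothetical filter: restrict the report to a single new_entity record -->
<fetch>
  <entity name="new_entity">
    <filter>
      <condition attribute="new_entityid" operator="eq"
                 value="00000000-0000-0000-0000-000000000000" />
    </filter>
  </entity>
</fetch>
```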

Once you have performed the above, does the report run? If not, you should get an error a bit more substantial than what the Report Viewer was displaying.

If the report does indeed run, check instead for warnings. These can often surface as a processing error in Report Viewer. Nuke all of your warnings where possible!

None of the above worked for me.

Step 2 - Trim the hedges

It's working within Visual Studio, so what can you do? Create a copy of the report and start trimming pieces off it to find the root cause. Start with the most likely culprits, for example:

  • Remove a complex table. Deploy the report. Does it run?
  • Remove some complex expressions. Deploy the report. Does it run?
  • Remove sorts. Deploy the report. Does it run?
  • Remove show/hide expressions etc. Does it run?
  • Remove images. Does it run?
  • ...

This is where I stopped as it highlighted my issue, but you see where I'm going with this. Keep stripping bits and pieces until you hit what's making the report fail. You can undertake this process of elimination in quite a binary fashion as well to speed up the process. For example, remove half of the report, does the error go away? Remove the other half, does the error go away? Drill into the half with the error and repeat the process.




Thursday, 16 June 2016

Dynamics CRM - Attribute Mapping Issue

Dynamics CRM, like most software products, will have bugs. Most aren't world ending to be fair, but often those slightly less visible ones linger for quite some time. Take this issue I came across today.

We have a custom entity in our system which has 3 lookups to the account entity. Each lookup serves a different purpose:


  • Company
  • PR Company
  • Joint Broker
All of the above are a different type of account in our system, but they are accounts all the same. You can see the problem straight off if you take a look under the hood: open up one of the relationships and check the mappings section. It contains all three of the above as part of the attribute mapping.



The issue with this is that if you open an account and create one of these custom records (from any relationship), it will populate ALL three of the lookups with the same account. You cannot delete these mappings, or disable them; unfortunately you are stuck with them. This bug has existed for at least two years, going by an issue raised in the CRM Community forum back in 2014.

So what is the workaround?

If you need the lookup populated based on the relationship the record was created through, you might be in a bit of a pickle. I currently know of no supported way to do this without writing your own add buttons for the ribbons, and that is quite a lot of effort for very little reward.

On the other hand, if you only ever want one (or particular) lookups populated, there is a way to fix it. Write a JavaScript function a bit like the following:


function FixAttributeMappings() {
    // If we are in a create form blank the attribute mappings
    if (Xrm.Page.ui.getFormType() == 1) {
        Xrm.Page.getAttribute("new_jointbroker").setValue(null);
        Xrm.Page.getAttribute("new_prcompany").setValue(null);
    }
}

Call that function in the OnLoad of your entity and the lookups are blanked whenever you create a new record of that type.