Microsoft Dynamics 365 Blog

*** Author note *** As mentioned in the white paper itself, much of the information in the document does NOT apply to upgrading to Dynamics AX 2012 R2 and beyond. For example, if you disable the updateGABDateTimeFields method in Dynamics AX 2012 R2, the changes to the script itself and to the R2 table structures mean the upgrade would end up deleting all rows from the DirPartyLocation table (thanks to Nicolas Granjon for that tip).

 

Hello all – I recently posted a new white paper on CustomerSource, Common Performance Optimizations for Dynamics AX2012 Upgrade Code, which covers some very common areas of the upgrade code that customers and partners have needed to optimize in order to get acceptable performance. If the link isn't working for you, log on to CustomerSource and search the Documentation – White Papers area for the document. The majority of these issues are being considered for the base upgrade code in our next version, but for customers and partners working on an upgrade to Dynamics AX 2012 now, the specific script changes in this document can save quite a bit of time with larger datasets. The areas covered in the new white paper are:

 

Preprocessing Considerations
- Cleaning up your data
- Install and configure Dynamics Performance Analyzer
- Upgrading from AX 4.0 – Address pre-processing scripts (generally named updateParty_XXXXX)
- LedgerTrans pre-processing

Delta Processing Suggestions
- Temporary Indexes on Original Dynamics Tables for delete_from Statements
- Address Delta Processing Script Enhancements

Microsoft Dynamics AX 2012 Upgrade Suggestions
- Index Changes for Upgrade
- Recordset Update/Inserts that can help optimize runs

 

If you have run across other areas in your own testing that you would like to see similar content for, please leave a comment here on the blog with some details and I will look into adding them to the white paper. As a sample of the content, here is a very common scenario that occurs during delta processing on larger tables. The routine may seem to stall with no indication of activity, but looking at the SQL Activity Monitor you can see a DELETE statement being executed.

Temporary Indexes on Original Dynamics Tables for delete_from Statements

Most of the delta scripts use a delete_from statement with a notexists clause to remove rows from the Shadow tables when the original record in the Dynamics table has been deleted or modified. These delete_from statements will almost always include a where condition similar to the following:

                custTrans.RecID == Shadow_CustTrans.RefRecID && custTrans.RecVersion == Shadow_CustTrans.RecVersionID

For some of the larger Dynamics tables, the delta script may appear to be “stalled” without ever changing the iteration value to indicate that it is processing rows. This can happen even if no records have been altered or deleted in the table being processed. For these “stalled” jobs, check the Processes list of the SQL Activity Monitor inside SQL Server Management Studio and see if a query similar to the following is running:

 

declare @P1 nvarchar(5), @P2 nvarchar(5);

set @P1 = 'dmo';
set @P2 = 'dmo';

DELETE FROM shadow_inventtransposting
WHERE  (dataareaid = @P1)
AND NOT EXISTS
   (SELECT 'x' FROM inventtransposting b
    WHERE  ((b.dataareaid = @P2)
    AND ((b.recid = shadow_inventtransposting.refrecid)
    AND (b.recversion = shadow_inventtransposting.recversionid))))
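
If you prefer to check from a query window rather than the Activity Monitor UI, the standard sys.dm_exec_requests and sys.dm_exec_sql_text dynamic management views can surface the same information. This is just a minimal sketch; the LIKE filter on the shadow-table DELETE text is an illustrative assumption and can be adjusted for the table being processed.

-- Sketch: find a long-running shadow-table DELETE without the Activity Monitor UI.
-- The LIKE filter is illustrative; adjust it for the table currently being processed.
SELECT r.session_id,
       r.status,
       r.wait_type,
       r.total_elapsed_time / 1000 AS elapsed_seconds,
       t.text AS running_statement
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE t.text LIKE '%DELETE FROM shadow_%';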

If that type of DELETE query is found in the Processes list of the SQL Activity Monitor, then the recommendation would be to do the following:

1. Kill the process by right-clicking the line in the Processes list and choosing Kill Process.

2. This step depends on whether you are running in a test environment or in live production with other users in the system.

a. If running in a test environment with no other users, then within the AOT, add an index to the original Dynamics table (in the example above it would be InventTransPosting) consisting of RecID followed by RecVersion (the order is important). Save and synchronize the table to create the new index.

b. If running in the live environment with other users, go to SQL Server Management Studio and create a new index directly in SQL Server on the original Dynamics table containing the columns DataAreaID, RecID, RecVersion (the order is important); a sketch of such an index is shown after this list.

3. Refresh the upgrade cockpit; your batch job should have ended with an error (this might take a little time if a rollback is needed).

4. Choose to rerun the job that ended in error, and the script should progress fairly quickly past the delete statement.
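
For step 2b, a minimal T-SQL sketch of the kind of temporary index described, using the InventTransPosting table from the example query and assuming the default dbo schema; the index name here is just an illustration, any unique name that is easy to find and drop later will do:

-- Temporary helper index to speed up the delta delete_from pass.
-- Index name is an example; column order (DataAreaID, RecID, RecVersion) matters.
CREATE NONCLUSTERED INDEX IX_TMP_InventTransPosting_Delta
    ON dbo.INVENTTRANSPOSTING (DATAAREAID, RECID, RECVERSION);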

Once the delta processing is done, those indexes can be removed from the system so that they don’t add more time to inserts and updates done through normal business processing. However, those indexes should be put back into place during the Single-User phase of the upgrade to guarantee quick performance during that critical down-time window.
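
Dropping the temporary index is a one-line statement along these lines (again, the index name is just the example used in the sketch above); run the matching CREATE INDEX again before the single-user phase:

-- Remove the temporary index once delta processing is complete;
-- re-create it before the single-user phase of the upgrade.
DROP INDEX IX_TMP_InventTransPosting_Delta ON dbo.INVENTTRANSPOSTING;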
