Tips and Tricks: Importing Master Data

August 26, 2019

The Dynamics 365 for Finance and Operations Data Import Export Framework is the principal means of populating master data from legacy systems. Microsoft provides a number of predefined data entities to support the effort.

Data entities apply business logic during import to ensure the integrity of the data.

Below are some tips and tricks we learned during the process.


Preparation is key. The export, transformation, and import process will certainly see revisions, but thorough preparation at the beginning will minimize issues coming out of system testing and user acceptance testing.

  • Communication – One of the most important aspects of a successful import is communicating with the client about ambiguities in the requirements. Never make assumptions about how data should be transformed from the legacy system. Questions will arise during the export, transformation, and import process; resolve them before user acceptance testing.
  • Understand the requirements – Keep in mind the client may not understand the underlying data relationships for the data they need imported. It's important to convey the prerequisite data requirements for the master data. Microsoft provides data entity templates to assist in determining data dependencies and the load sequence to support them. Understanding the dependencies will help define effort and changes to scope.
  • Define the approach – Define the export, transformation, and import strategy as much as possible up front to give a clear estimate of the effort involved. This also provides a road map and checklist during the import process.

Export process

There is a good chance data in the legacy system won’t align 100% with the target system. Below are some tips towards a successful export strategy.

  • Business logic – It's possible the raw data in the legacy database requires business logic to be applied. If the legacy system has the Microsoft Data Import Export Framework with supporting data entities for the underlying data, that business logic is incorporated during the export process. Don't assume that exporting directly from tables is an accurate representation of the data.
  • Virtual companies – Be aware that Dynamics 365 for Finance and Operations doesn't support virtual companies. If the legacy system uses them, a decision will need to be made on how to handle the transformation. Also keep in mind the legacy system may not have started with virtual companies, so entities such as Customers and Vendors may only need to be exported for the virtual companies, not the original data entity.
  • Export method – There will be cases where the legacy system won't align 100% with the target system, even if the legacy system has Microsoft Data Import Export Framework entities, so exporting data through SQL may be the better option for some entities. Just be aware of the business logic surrounding the records being exported. Whenever possible, use the Data Import Export Framework.
  • Export format – Comma-delimited format has been our best approach, with some caveats. Be aware that exported data can contain unbalanced quotation marks and embedded carriage returns and line feeds, which can affect the transformation process. This is especially true for address and notes fields.
  • Download DMF entities and mapping – If the Microsoft Data Import Export Framework is being used for export, be sure to download the DMF processing groups as backups should the source system need to be refreshed.
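The export-format caveats above can be checked mechanically before transformation begins. Below is a minimal sketch in Python; the function name and the expected column count are our own assumptions, not part of any DMF tooling. It flags records with embedded line breaks, and records whose column count is off, which is the usual symptom of unbalanced quotes:

```python
import csv
import io

def scan_export(csv_text, expected_cols):
    """Flag records that commonly break downstream transformation:
    embedded line breaks inside quoted fields, and column-count
    mismatches (a symptom of unbalanced quotes)."""
    problems = []
    for i, record in enumerate(csv.reader(io.StringIO(csv_text)), start=1):
        if len(record) != expected_cols:
            problems.append((i, "column count %d != %d" % (len(record), expected_cols)))
        if any("\n" in field or "\r" in field for field in record):
            problems.append((i, "embedded line break"))
    return problems

sample = 'Id,Name,Notes\n1,Acme,"line one\nline two"\n2,Beta,plain\n'
print(scan_export(sample, 3))
# → [(2, 'embedded line break')]
```

Running a scan like this against each exported file lets you hand problem records back to the client before the import, rather than discovering them in staging.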


Transformation process

Most likely the source data structure isn't going to match the target system; this is resolved through a transformation process. Below are some tips for that process.

  • Field lengths – Some legacy data fields are not the same length as the corresponding fields in Microsoft Dynamics 365 for Finance and Operations. Be prepared to truncate or otherwise accommodate the differences.
  • Transform during export – A number of tools help with the transformation process, such as Microsoft Excel, Notepad++, and Microsoft Visual Studio, but these tools imply a separate transformation step, especially if the primary export method is the Microsoft Data Import Export Framework. Because of issues with data containing unbalanced quotation marks and embedded CR/LF combinations, it's possible these tools won't align the data record for record and will instead split one data record into multiple records, which causes problems during the import process. It's best to embed transformation within a single export step.
    • The Microsoft Data Import Export Framework – This method offers the ability to substitute values from the source system within the entity mapping. It's the best approach to mitigating the need for a secondary transformation step.
    • SQL Server Management Studio – If the export process requires a SQL query to produce the data, it's ideal to exclude records with embedded CR/LF or unbalanced quotes from the main data set, and work with the client to handle the exclusions manually. The idea is to minimize additional steps in the transformation process and work with minimal exception data.
    • SSIS package – Another approach is Microsoft's SQL Server Integration Services, which allows transformations while the data is collected from the source system.
  • Transform as a separate process – Transforming data as a separate process is the least favored approach; it often leads to manual manipulation of data and is difficult to automate. The following tools are useful for manual transformations.
    • Transform using Excel – Be careful performing transformations in Excel. A common approach is to open a CSV file in Excel, make some changes, then import into the target system, but this often has undesirable results. Excel is good for identifying issues within the data, but caution needs to be applied when changing data.
      • Numeric data – By default, Excel formats columns containing only numeric data as numeric fields, but there are times when leading zeros are important, and there is a good chance a long number will be rendered in scientific notation. The best way around this is to convert all columns to TEXT while importing the data into Excel.
      • CR/LF vs LF – Editing a record in Excel that has a CR/LF in any of its columns will produce a separate row, even if the CR/LF is surrounded by quotes. One way to spot this is to turn on filtering for the first row and check the filter values on the first column to see whether they match the column's naming convention.
      • Commas within quotes – Editing a record in Excel that has a comma inside a quoted column can likewise split the record at each comma, which can have undesirable results during the import process.
      • Unbalanced quotes – Excel is excellent for identifying unbalanced quotes, which can cause issues with the import process, especially if quotes, commas, and CR/LF are part of the destination record formatting parameters. One way to find an unbalanced quote is to import the data into Excel and use Find to search for quotes; if any are found, the import process will have problems with that record. Resolve the issue by removing or balancing the quote in a text editor.
    • Transform using a text editor – If manual transformation is required, there are some good text editors that can preserve data containing both CR/LF and LF combinations within a file. Be careful with copy and paste: some editors will translate an LF to a CR/LF, or vice versa, depending on the operating system. Some text editors also have macro functionality to assist with a degree of automation.
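The Excel pitfalls above are one reason to script the cleanup instead. The sketch below is a hedged example, assuming the simple rule that any run of whitespace inside a field, including embedded CR/LF, can collapse to a single space; whether that rule fits your data is a judgment call per entity. Every field is handled as text, so leading zeros survive:

```python
import csv
import io

def clean_csv(csv_text):
    """Re-emit a CSV export with one physical line per record. Every field
    is treated as text (leading zeros survive), and any run of whitespace
    inside a field -- including embedded CR/LF -- collapses to one space."""
    out = io.StringIO()
    writer = csv.writer(out, quoting=csv.QUOTE_ALL, lineterminator="\n")
    for record in csv.reader(io.StringIO(csv_text)):
        writer.writerow([" ".join(field.split()) for field in record])
    return out.getvalue()

dirty = 'Account,City\n"00042","line one\r\nline two"\n'
print(clean_csv(dirty))
# "Account","City"
# "00042","line one line two"
```

Because the `csv` module parses quoted fields correctly, the embedded CR/LF never splits the record the way an Excel round trip can, and the quoted output keeps "00042" from becoming 42.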

Import process

The principal means of populating tables in Microsoft Dynamics 365 for Finance and Operations is the Data Import Export Framework using data entities. Below are some tips and tricks for that process.

  • Number sequences – During the configuration and setup of Microsoft Dynamics 365 for Finance and Operations, it's possible number sequences have already been consumed. If the source's export data contains references to invoice numbers, customers, vendors, or any identifying field that uses number sequence generation in the destination system, there may be some overlap between the source and destination. If so, a couple of actions can be taken.
    • Increase the next number value – Increase the next number value to be higher than the last number assigned in the legacy system. Ideally this is done as a first step in setup and configuration; if it can't be, the transformation may need to accommodate the overlap by mapping source references past those already assigned in the destination system.
    • Set number sequences to manual – In some cases the number sequences in the destination system may need to be set to manual prior to importing. This is true for Vendors, Customers, Logistics locations, etc., if you want to preserve the legacy relationships and references between the entities.
  • Source referential integrity – Postal addresses, contact information, Vendors, Customers, Opportunities, Prospects, and the Global address book have complex relationships surrounding the DirPartyTable, LogisticsLocations, postal address, and contact information tables. When importing data from a Microsoft Dynamics legacy system, the import strategy can accommodate this complexity by preserving the PartyNumbers and LocationIds from the source system. This negates the need for nested imports of the data relationships and allows them to be imported separately.
  • Download project groups – Once the entity mapping is complete and proven to work, download the project groups so they can be imported into production when it's time to go live. Validate the mapping once imported to make sure all fields are mapped as expected.
  • Record processing times – For each manual import step, record the elapsed time in the UAT environment so the client has an idea of the downtime needed when performing the same functions in production.
  • Resolve dependencies first – The client may prefer manual data entry for underlying dependencies, depending on volume and effort. If so, be sure to resolve import dependencies before proceeding through the remaining imports, to minimize resubmissions from staging. For instance, if the customer import entity fails for some customers, resolve those customers first before importing their addresses.
  • Staging errors – During the import to staging there may be errors. The form that gives visibility to the failing records only shows the error description for the selected record. We found that reviewing the DMFStagingValidationLog table through SSMS during an import to a non-production environment gives better visibility when more than a few errors are encountered.
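Along those lines, a small sketch of the kind of query to paste into SSMS. `RecId` is a standard column on D365 for Finance and Operations tables, but the rest of the staging log's schema varies by version, so this selects everything rather than guessing at column names; the function and its `top` parameter are our own convenience, not part of any Microsoft tooling:

```python
def staging_error_query(top=200):
    """Build a T-SQL statement, for pasting into SSMS, that lists the most
    recently written rows of the DMFStagingValidationLog table."""
    return (
        f"SELECT TOP ({top}) * "
        "FROM dbo.DMFStagingValidationLog "
        "ORDER BY RecId DESC;"
    )

print(staging_error_query())
# SELECT TOP (200) * FROM dbo.DMFStagingValidationLog ORDER BY RecId DESC;
```

Seeing all recent validation errors in one grid, instead of clicking through the form record by record, makes it much faster to group errors by cause and fix them in bulk.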