Bulk Data Loader for Dynamics 365 is a new cloud service built by the Dynamics team. Its main purpose is to enable bulk import and export of data for Dynamics 365 Online. The tool lets you upload large data files to cloud staging tables, where you can perform data quality functions before pushing the data into Dynamics 365 Online. Using Data Loader, you can move your data between flat files and Dynamics 365 and cut down on implementation costs.
- Quick and easy to configure
- Eliminates writing custom code against the CRM SDK for importing data, cutting down time and cost
- Supports bulk loading of data
- Available at no cost
- All uploaded data is encrypted
- Support for create and update operations
- Support for flat files with any delimiter
- Editable, reusable data mappings
- Excel app for fixing invalid data in the staging database and iterating over the data
- Parallel processing to support bulk loads
- Import of multiple entities in one data project
- Automatic detection of insert order and relationships
- Import of historical data, such as closed activities and records with older Created On dates
- High throughput
Configuring Data Loader Service
Two steps must be completed before the service can be used.
Step 1: Deploy the Data Loader runtime for a specific CRM organization
For every CRM organization, Data Loader needs a new runtime module deployed. This ensures data isolation across organizations and keeps the runtime close to the data center of the specified CRM organization.
1. Click the “Deploy runtimes” tile.
2. Click the “+” button and fill in the data in the pop-up screen. Select the CRM organization that you would like to import data to and click the “Deploy” button on the page.
3. Deployment takes approximately 15–30 minutes.
4. The deployment is ready when the grid shows the “Running” status.
Step 2: Configure the flat file format
1. On the main dashboard, click the “Configure file format” tile.
2. On the “Configure file format” page, click “+” and enter the necessary information pertaining to the flat file format, for example the standard CSV format.
3. Click Save.
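A file format configuration typically covers the delimiter, text qualifier, and header row. As a rough sketch (the sample data below is made up for illustration), here is the kind of flat file a standard CSV configuration describes, parsed with Python’s built-in `csv` module:

```python
import csv
import io

# Hypothetical sample flat file in the standard CSV format:
# comma delimiter, double-quote text qualifier, first row as headers.
sample = '''"name","emailaddress1","telephone1"
"Contoso Ltd","info@contoso.com","555-0100"
"Fabrikam, Inc.","hello@fabrikam.com","555-0101"
'''

reader = csv.reader(io.StringIO(sample), delimiter=",", quotechar='"')
rows = list(reader)
header, records = rows[0], rows[1:]

print(header)        # column names, used later when mapping fields
print(len(records))  # 2 data rows
```

Note that the text qualifier matters: without it, the embedded comma in “Fabrikam, Inc.” would split that value into two columns.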
Now you are ready to start importing data into your CRM organization.
1. Click the “Import” tile on the dashboard.
2. This starts a wizard; follow the steps. Here you can upload data for multiple entities as needed.
3. On the “3. Map fields” step, the service does a best-effort match of the source file columns to the target entity’s fields. For any unmapped fields, you can either complete the mapping yourself or ignore them.
Another key point: the drop-down for target fields also displays any alternate keys defined on the lookup entities, so you can map the alternate keys for your lookup columns.
4. At the end of the wizard, give the project a name and click “Start data job”. This starts processing the uploaded files and imports them into the cloud staging environment. Note that this does not start the “Import to CRM” step yet.
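The best-effort matching in the mapping step can be pictured roughly like this. The column and field names below are illustrative, not the service’s actual schema, and the matching rule is a simplified guess at the behavior:

```python
# Hypothetical sketch of best-effort column matching in the "Map fields" step.
source_columns = ["Full Name", "Email Address 1", "Parent Account Number"]
target_fields = ["fullname", "emailaddress1", "accountnumber", "telephone1"]

def normalize(name):
    """Lower-case and strip separators so 'Full Name' matches 'fullname'."""
    return "".join(ch for ch in name.lower() if ch.isalnum())

# Best-effort pass: exact match after normalization, else leave unmapped.
mapping = {}
for col in source_columns:
    key = normalize(col)
    mapping[col] = next((f for f in target_fields if normalize(f) == key), None)

# 'Parent Account Number' found no automatic match; you would map it manually,
# for example to an alternate key ('accountnumber') defined on the Account
# lookup entity, as described above.
mapping["Parent Account Number"] = "accountnumber"
print(mapping)
```

The point of the alternate-key mapping is that lookup columns in a flat file usually hold a business key (like an account number) rather than a GUID, and the alternate key lets the service resolve the reference.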
Viewing the Job execution details
1. Click on the specific job’s card view on the dashboard.
2. This opens the Job Details page, which is divided into three tabs:
- Source tab. Displays the successes and errors encountered while processing the files into the staging table. Any records the service was not able to import into staging are shown as errors with details.
- Staging tab. One of the most important parts of the service: it shows the status of every record, whether successfully imported into cloud staging, imported into CRM, errored, and so on. In this view, you can run data quality validations and fix errors.
- “Import to CRM” tab. Displays the progress status after the import to CRM has started.
Run data quality services
1. On the Job Details page, click the “Staging” tab.
2. To run the data quality services, click “Validate”. This runs two kinds of checks: metadata validation and lookup validation. Depending on the number of records, this may take a few minutes, and you can run validation on all the entities in parallel if needed.
3. When validation finishes, any records that fail are set to “Not Valid” status.
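The two validation passes can be sketched as follows. The entity metadata, field names, and records here are invented for the example and are not the service’s real schema:

```python
# Illustrative sketch of metadata validation and lookup validation.
metadata = {
    "required": {"fullname"},                           # metadata validation
    "max_length": {"fullname": 100, "emailaddress1": 50},
}
known_account_numbers = {"ACC-001", "ACC-002"}          # lookup validation

def validate(record):
    """Return a list of validation errors for one staging record."""
    errors = []
    for field in metadata["required"]:
        if not record.get(field):
            errors.append(f"{field} is required")
    for field, limit in metadata["max_length"].items():
        if len(record.get(field, "")) > limit:
            errors.append(f"{field} exceeds {limit} characters")
    lookup = record.get("parent_account")
    if lookup and lookup not in known_account_numbers:
        errors.append(f"unresolved lookup: {lookup}")
    return errors

staging = [
    {"fullname": "Ann Lee", "parent_account": "ACC-001"},
    {"fullname": "", "parent_account": "ACC-999"},
]
statuses = ["Not Valid" if validate(r) else "Valid" for r in staging]
print(statuses)  # the second record fails both checks
```

Metadata validation catches structural problems (missing required fields, overlong values), while lookup validation catches references that cannot be resolved against the target entity.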
Fix errors in Excel
1. You can fix the errors in Excel by clicking the “Download to Excel” button on the Staging tab.
2. This opens Excel. Click “Sign in” and enter your CRM credentials. Once authenticated, all the errored records are loaded.
3. Review the error messages and fix the records. After the records are fixed, click “Publish”. This publishes the fixed records back to staging. Refresh the data in the staging grid to see the changes.
Start Import to CRM
After the data has been validated and fixed in staging, you are ready to start the import to CRM. This starts the import for all the entities in the current data job.
1. Click “Import to CRM”.
2. Once the import to CRM has started, its progress can be viewed on the “Import to CRM” tab.
Any records that did not make it into CRM remain in the staging grid in “Errored” status, with the detailed error message returned by the CRM web service. You can review the error messages, fix the records again in Excel, and retry the import. The service supports iterative imports until all the required records have been imported.
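The fix-and-retry cycle above can be sketched as a loop. Here `submit_to_crm` is a stand-in for the actual CRM web service call, not a real API, and the failure condition is invented for the example:

```python
# Hedged sketch of the iterative "Import to CRM" cycle.
def submit_to_crm(record):
    """Stand-in for the CRM web service call: return an error message,
    or None on success. (Illustrative rule: email is required.)"""
    return None if record.get("emailaddress1") else "emailaddress1 is required"

def run_import(staging):
    """One import pass: successes leave staging, failures stay 'Errored'."""
    remaining = []
    for record in staging:
        error = submit_to_crm(record)
        if error:
            record["status"], record["error"] = "Errored", error
            remaining.append(record)
    return remaining

staging = [
    {"fullname": "Ann Lee", "emailaddress1": "ann@example.com"},
    {"fullname": "Bob Ray"},  # missing email, will end up Errored
]
staging = run_import(staging)                    # first pass: one failure
staging[0]["emailaddress1"] = "bob@example.com"  # fix in Excel, publish back
staging = run_import(staging)                    # retry: everything imports
print(len(staging))  # 0 records left in Errored status
```

Each pass only resubmits the errored records, which is what makes the import safely repeatable until staging is empty.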
Refresh CRM Metadata in Data Loader
Once the Data Loader runtime is deployed, any subsequent metadata changes or customizations on the CRM side require a metadata refresh in Data Loader. Follow the steps below:
1. Click “Deployed runtimes” on the main dashboard.
2. Select the CRM organization in the grid and click the refresh icon above the grid.
This starts the metadata refresh. It may take several minutes before the refreshed metadata is reflected.