Dynamics 365 Finance and Operations: BYODB – Bring your own database / data warehouse
Introduction
Microsoft Dynamics 365 Finance and Operations is a great ERP product. However, no matter how strong it is, it is not the right place to store big volumes of the company’s historical transactional data (also think about the associated storage cost of a typical D365-FinOps cloud deployment). When faced with that sort of requirement, your focus should be to store such historical transactions in your favourite data warehouse, not directly in your ERP. This definitely does not mean that your data warehouse data should remain disconnected from your ‘more current’ ERP data. Microsoft is working very hard, especially in the latest releases, to blur the ‘divide’ between these two typical types of data-sets in an organization.
By now, everyone is familiar with the fact that FinOps comes with a separate data warehouse for cubes which can be used for your Power BI reporting. If you go to System administration > Setup > Entity store, you will find the list of aggregate measurements which are available in your data warehouse (if you can see a measurement in the AOT but not in this Entity store page, ask a developer to set its StagedEntityStore property to true). Clicking the Refresh button in this form will refresh your entity store. This is all well and good, but this database lives within the same eco-system provisioned by Microsoft. It is called ‘AxDW’ and sits alongside your ‘AxDB’. It is also visible and can be managed (in terms of authentication etc.) from LCS, as shown in the LCS screenshot below.
But back to the original story: what if the company wants to implement its own data warehouse solution outside of the LCS eco-system, where it integrates all historical transactions and statistics with the data newly generated by its current Dynamics 365 FinOps ERP? What if you want to use other reporting tools apart from Power BI, and those tools require direct T-SQL queries against the database? You cannot fire direct T-SQL statements at a Microsoft-managed D365-FO production environment. What if you want to use the data warehouse to integrate with other systems? Well, here comes the recent addition of Bring your own database (BYOD) to the rescue! And the good news is that it is insanely easy to configure a base-line meta-data and data export from FinOps, and you can configure it from the front-end without even needing to involve a developer (all you need to understand is the concept of a connection string). If you can’t believe it’s that simple, check out the “How-To” section within this same blog post.
Entity Store vs BYOD
While the ‘weakness’ (if you can call it a weakness) of the Entity Store is that it is tied to Power BI, its strength comes from the fact that it is more tightly coupled to D365-FinOps. This means that access to data reported through this mechanism can also be secured using the standard security principles of D365-FO. The following slide conveniently summarizes when to consider the Entity Store and when to consider BYOD.
Please also consider any GDPR implications there might be if you publish any personally-identifiable data to your data warehouse, as it might have an implication on “the right to be forgotten”.
SQL Server vs SQL Azure
As things stand at the time of writing, BYOD only works with SQL Server and SQL Azure. I cannot find any reference to support for other databases such as the NoSQL Azure Cosmos DB, but hopefully we’ll get there some day. Also keep in mind that a local SQL Server should only be used for testing on a OneBox. For Tier-2+ environments it looks like it is mandatory to use a SQL Azure database to publish your information from FinOps.
How-To
Now let’s go through a quick walk-through of how to publish to an Azure SQL BYOD database. For other topologies (e.g. a OneBox) you can create a blank local SQL Server database from SQL Server Management Studio (a one-liner sketch follows) and start at step 7.
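For the local case, the “blank database” really is just that. A minimal sketch, assuming you call the database FinOpsBYOD (the name is my own example; pick whatever fits your naming standards):

```sql
-- OneBox / local SQL Server only: create an empty target database from SSMS.
-- 'FinOpsBYOD' is just an example name.
CREATE DATABASE FinOpsBYOD;
```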
Step 1: Browse to https://portal.azure.com and sign in (with the appropriate rights) to the same tenant where you are running your D365-FinOps (Tier 2+) environment.
Step 2: Click on “SQL Databases” (or search for it if it is not pinned to your favourites bar) and click on “Add”.
Step 3: Select a database name, subscription, resource group and collation, and you can start off with a blank SQL database in the “Select source” field. As a pricing tier you can select “Standard”, but please take note of something related to this which I will mention later in this blog-post. If this is the first SQL Azure database you are creating in this tenant, you must also create a SQL Server.
Step 4: Click on the “Create” button and Azure should start provisioning your SQL Azure database.
Step 5: When deployment is complete you should end up with a blank database (no meta-data and no data) and you can proceed to step 6. Please ensure you create an Azure SQL username and password, as these will be part of the connection string. For the non-gurus out there, this is called SQL authentication.
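You can simply reuse the server admin login you specified when creating the SQL Server, but if you prefer a dedicated account for the export, a minimal sketch would be something like the following (the user name byod_export is my own invention), run while connected to the new database itself, not master:

```sql
-- Hypothetical dedicated SQL-authentication user for the BYOD export.
-- A contained database user, so no login in master is required.
CREATE USER [byod_export] WITH PASSWORD = 'ReplaceWithAStrongPassword1!';
-- The Data management framework creates, truncates and writes tables,
-- so it needs broad rights on this database.
ALTER ROLE db_owner ADD MEMBER [byod_export];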
Step 6: Login to your freshly provisioned SQL Azure database using the credentials you created and click on Settings > Connection strings. Use the copy function to copy the connection string presented in the ADO.NET tab, as shown in the screenshot below.
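For reference, the ADO.NET string the portal hands you looks roughly like this (server, database and user are placeholders for your own values). Note that the portal leaves {your_password} as a literal placeholder, so remember to substitute the real password before pasting the string into FinOps:

```
Server=tcp:<your-server>.database.windows.net,1433;Initial Catalog=<your-database>;Persist Security Info=False;User ID=<your-user>;Password={your_password};MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;
```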
Step 7: Browse to the D365-FinOps instance you would like to publish data from. Go to the “Data management” workspace and click on the “Configure entity export to database” tile, as shown in the screenshot below.
Please note that the label of this form says “Entity store” but fear not: this is not the standard Power BI entity store; we will be configuring a BYOD here. Click on the “New” button in the ribbon and you will be presented with the screen shown in the screenshot below. The fields highlighted in yellow are the most important:
- Name and description: Self-explanatory
- Type: Seems to be hard-coded to Azure SQL in Tier-2+ environments
- Connection string: Paste the Azure SQL connection string you copied from the Azure Portal
- Create clustered column store indexes: Now this is a tricky one. If you selected the Standard pricing tier for your Azure SQL DB, make sure you switch off this flag, or you will get errors when you try to publish D365-FinOps data entities to the entity store. Why? The Azure SQL Standard tier does not support the creation of clustered columnstore indexes; this is only supported in the Premium tier of Azure SQL. Unfortunately the flag is switched on by default, so if you miss it and you are running on the Standard pricing tier you will get stuck in the next steps. Please beware (a quick query to check your tier follows this list).
- Enable triggers in target database: This option, as per Microsoft Docs, makes export jobs enable SQL triggers in the target database. It lets you hook downstream processes into a trigger to orchestrate actions that must be started after records have been inserted. One trigger is supported per bulk insert operation, and the size of the bulk insert is determined by the Maximum insert commit size parameter in the Data management framework (a trigger sketch also follows this list).
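If you are not sure which pricing tier your BYOD database ended up on, you can check with a quick query from the Azure portal’s query editor:

```sql
-- Returns e.g. 'Standard' / 'Premium' and a service objective such as 'S1' or 'P2'.
SELECT DATABASEPROPERTYEX(DB_NAME(), 'Edition')          AS Edition,
       DATABASEPROPERTYEX(DB_NAME(), 'ServiceObjective') AS ServiceObjective;
```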
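As for the triggers option, here is a minimal sketch of what such a downstream hook could look like. The entity table name CustomersV3 and the log table are my own assumptions; BYOD creates one table per published entity, so substitute the name of an entity you actually publish:

```sql
-- Hypothetical log table that a downstream process can poll.
CREATE TABLE dbo.BYODExportLog (
    TableName    sysname   NOT NULL,
    RowsInserted int       NOT NULL,
    InsertedAt   datetime2 NOT NULL
);
GO
-- Fires once per bulk insert performed by the export job.
CREATE TRIGGER trg_CustomersV3_Inserted
ON dbo.CustomersV3
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.BYODExportLog (TableName, RowsInserted, InsertedAt)
    SELECT 'CustomersV3', COUNT(*), SYSUTCDATETIME()
    FROM inserted;
END;
```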
When you’ve finished setting up the above fields, click the “Validate” button. If all goes well you should get a message that your connection string is valid, and it will automatically get hidden (masked with ****). If need be, you can make the connection string temporarily visible again using the new button which becomes visible. Save the record and exit to the previous screen with the grid listing your data warehouses.
Step 8: Click on the “Publish” button in the ribbon and you will be presented with a list of your data entities. Select the data entities you would like to publish to SQL Azure and click on the “Publish” button in the ribbon. Please note that at this point we are just publishing the meta-data (i.e. the SQL schema) to Azure, not actually sending any data. Please also note that some of the entities do not support tracking deletes, so check with your legal team for any GDPR implications there. When publishing is finished you should see the table schema published to SQL Azure. You can see this by browsing to your SQL Azure database, clicking on “Query editor” and expanding the “Tables” branch of the tree. You can run any SQL query on the right-hand side of the screen within the query editor.
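For example, a quick way to list the entity tables the Publish step just created:

```sql
-- List the entity tables that the Publish step created in the BYOD database.
SELECT TABLE_SCHEMA, TABLE_NAME
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE'
ORDER BY TABLE_NAME;
```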
Step 9: Browse back to the “Data management” workspace and click on the “Export” tile. Enter a “Group name” and “Description” for your export project and click on the “Add entity” button at the lines level. In the drop-down dialog’s “Target data format” field you should find an option pointing to the Azure SQL data warehouse we just created in the previous steps. Select it. You can also specify whether you want an incremental push or a full push. When ready, click “Add” in the dialog and “Export” in the ribbon to run the export project. You can also set up scheduled batch exports as per the standard functionality of the Data management module.
Step 10: When the export is ready, your data has been successfully exported to your SQL Azure database. You can confirm this by running a simple T-SQL query on the SQL Azure database from the query editor window in the Azure portal and clicking the “Run” button.
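Something as simple as the following will do (again, CustomersV3 is a placeholder for an entity you actually exported):

```sql
-- Sanity-check: confirm rows actually landed in the exported entity table.
SELECT COUNT(*) AS ExportedRows FROM dbo.CustomersV3;
SELECT TOP (10) * FROM dbo.CustomersV3;
```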
That was really easy, right? I think Microsoft did a very good job in terms of ease of configuration for the BYOD feature, and I am looking forward to more!
Happy (BYOD) Data Warehousing!