When Microsoft announced one year ago that XRM would become CDS v2.0 (officially Common Data Service for Apps), there wasn’t yet any big system redesign implemented to make this a physical reality. Today we are much further down that road where CDS truly becomes a Service that has less and less to do with the familiar XRM databases that we’ve previously been working with. In this blog post I’ll explore the three data related dimensions that give us an indication of where CDS is heading as a part of the Microsoft Power Platform.
As a part of the April 2019 release train, MS is changing how data storage is managed for both Dynamics 365 and PowerApps customers. It hasn’t been an official feature bullet in the release notes document, but that doesn’t make it any less significant than the shiny apps demonstrated in the April 2nd Virtual Launch event.
A new version of the licensing guides for Dynamics 365 and also for PowerApps and Flow (for the first time ever!) was released in April. This outlines the commercial impact of the new model on customers, which is probably what most of us paid attention to first. Yeah, whenever the pricing mechanism of a widely used MS cloud service changes, it’s a big deal. What makes it even trickier is that MS considers storage a “subscription add-on” for which they don’t publicly disclose any per-GB list prices. I’m not entirely sure this model is beneficial for their ambition of turning Power Platform into an actual foundation for building third-party and customer-specific apps, but I guess the shadow of the old CRM and ERP world still looms over this one when it comes to licensing and pricing practices.
Let’s forget licensing for a moment and focus on the technical changes for Dynamics 365 online environments. All of the existing data that used to be stored in the Azure SQL relational database will in the future be divided into three specific storage types: database, file and log. This should have no immediate impact on customers, as the migration will be taken care of by MS. Their promise is that nothing should change in how users and developers work with data, since the APIs that govern access to this data will remain unaffected.
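To make the “APIs remain unaffected” point concrete, here’s a minimal sketch of how a client builds Common Data Service Web API (OData) query URLs. The org name is a placeholder, and the point is simply that relational rows, attachments and audit records are all reached through the same endpoint shape, regardless of which backing store now physically holds them:

```python
# Minimal sketch: the OData query a client sends looks the same no matter
# which storage type (database, file or log) actually holds the data.
# "contoso" and the entity set names below are illustrative placeholders.

def cds_query_url(org: str, entity_set: str, select=None, top=None) -> str:
    """Build a Common Data Service Web API (OData v4) query URL."""
    base = f"https://{org}.crm.dynamics.com/api/data/v9.1/{entity_set}"
    params = []
    if select:
        params.append("$select=" + ",".join(select))
    if top:
        params.append(f"$top={top}")
    return base + ("?" + "&".join(params) if params else "")

# Database-backed rows, file-backed attachments, log-backed audit records:
print(cds_query_url("contoso", "accounts", select=["name"], top=10))
print(cds_query_url("contoso", "annotations", select=["filename"]))
print(cds_query_url("contoso", "audits"))
```

Whatever MS shuffles around underneath, the client-visible contract stays put.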
File data will be in Azure blob storage, as this is the most efficient way to handle the miscellaneous documents, images and other “stuff” that may end up inside a typical Dynamics 365 system via features like email tracking that carries over the attachments. Why would you ever store this in a relational SQL database to begin with? Well, the simple reason is that the original on-prem architecture of XRM had no other secure place to put these items, so it was all lumped in there. Now that CDS is a native cloud service, many more options are available.
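The classic arrangement this replaces is worth seeing: note attachments live in the annotation entity with the file content base64-encoded into the documentbody attribute, which is exactly the kind of payload a relational engine handles poorly. A small sketch, using a hand-made stand-in for a Web API response rather than real tenant data:

```python
import base64

# Stand-in for an annotation record as returned by the Web API; in the
# classic XRM schema the raw file bytes travel base64-encoded inside the
# documentbody attribute of the relational row.
annotation = {
    "filename": "quote.pdf",
    "mimetype": "application/pdf",
    "documentbody": base64.b64encode(b"%PDF-1.4 ...").decode("ascii"),
}

def extract_attachment(record: dict) -> tuple[str, bytes]:
    """Decode an annotation record back into (filename, raw bytes)."""
    return record["filename"], base64.b64decode(record["documentbody"])

name, payload = extract_attachment(annotation)
print(name, len(payload), "bytes")
```

Blob storage lets the bytes live where bytes belong, while the API keeps presenting the same record shape.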
Log data will be in Cosmos DB. This will probably offer a more suitable architecture for managing things like plugin trace logs, audit data and other items of a similar nature. What should be noted is that Microsoft’s plans don’t stop at the level of IT admin activities. In a recent podcast by MVP Mark Smith, we heard the General Manager of Power Platform, Charles Lamanna, describe this storage type as the designed future home for other types of observational data, too. Charles referred to things like IoT device sensor data, which should give you an idea of how this again is data that is A) relevant to many CRM use cases and B) in no way optimal to store inside that relational XRM database.
One significant and very welcome change introduced as a part of this new model is that there will no longer be any license cost tied to the number of instances you have in the cloud. Previously you had to buy add-on licenses to acquire production and non-production (sandbox) instances for developing, testing, training and, in general, managing your complex Dynamics 365 online environment. Once the new subscription terms kick in, you’ll have the ability to create as many instances as you like, provided that you have sufficient database capacity available. A major driver behind this change is surely the PowerApps side, where the licensing terms already granted any user with a PowerApps P2 license the right to create 2 CDS environments for their applications. (For more details, see my presentation on Demystifying Dynamics 365 & Power Platform licensing.)
In the short term, this storage model change should not result in many functional changes for Dynamics 365 customers. Depending on your current subscription renewal date, the new terms will be applied either at that point in time or at the renewal after that (if you choose to hold on to the old model for one more subscription period). Any new customer will likely be on the new pricing model starting from April 2019.
It’s important to understand that the actual data storage technology change and the commercial terms that are applied are not tied to one another. Migration of your Dynamics 365 data to the new database/file/log model will probably take place much sooner than what you’ll see in your subscription fees. Refer to the admin documentation on Common Data Service storage capacity for details on how you’ll be able to analyze and manage your storage consumption in this new model.
Looked at purely through the lens of the storage and license model changes for Dynamics 365 customers, the story would end here, with the three storage types. However, the bigger picture of how data is used as a part of Customer Engagement systems covering various digital touchpoints is much broader. Or should I say “bigger” as in Big Data? As much as I dislike the casual use of tech marketing hype terms like Big Data and Artificial Intelligence, there’s no escaping the fact that the familiar world of CRM systems founded on SQL databases is being disrupted by what machine learning models and big data systems can offer today.
If you’ve taken a look at the Microsoft Business Applications product portfolio recently, you will have come across many new additions that have been built on top of Azure Data Lake. Products like Dynamics 365 Sales Insights or Customer Service Insights don’t even start with the assumption that you are using a Microsoft based CRM system, but instead promise to integrate with a growing number of systems from competitors like Salesforce or Zendesk.
Not only is this a clever way to approach organizations that are not yet heavily invested in the MS cloud ecosystem, it also reflects the reality of where customer-related data is actually located and managed. New systems are popping up to serve the growing number of digital touchpoints that companies need in order to offer competitive end customer experiences. The process of integrating all of this new information into well-structured data warehouses isn’t scaling to meet business needs anymore, hence the rise of data lakes that don’t even try to build the single “right” data model to cover every aspect of the customer journey but rather allow data to be poured in in the natural format of its source system.
The Azure Data Platform promises to do many wonderful things on top of the latest Azure Data Lake Storage (ADLS) Gen 2, at least for those who are fluent in the universal language of mathematics. The barriers for organizations to start leveraging technologies like Azure Data Lake used to be fairly high, since you needed experts who were comfortable with swimming in the sea of data science terminology to get your project up to speed. What we’re seeing from the MS Business Applications group now is a set of productized offerings that make it possible for the same people who’ve worked with customer data in the traditional CRM era to take steps towards data lake based systems on their own.
Dynamics 365 Customer Insights (the successor to an earlier product with the exact same name) was launched in the April 2019 release wave. CI offers a toolkit for connecting to various data sources (powered by Power Query), mapping fields to a customer centric data model (based on CDM), matching customer profiles across various systems without exact keys, enriching them with activities from the source systems, building KPI measures and customer segments, and finally acting upon this new insight derived from the data. Having an application like this in the Dynamics 365 product portfolio, designed to address marketing segmentation scenarios by working on data residing in Azure Data Lake, is a great example of how these technologies will be infused into the business processes managed via Dynamics 365.
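The profile matching step, pairing records across systems without exact keys, can be sketched with a toy similarity measure. The fields and threshold below are invented for illustration and are in no way the product’s actual algorithm:

```python
from difflib import SequenceMatcher

# Toy illustration of key-less profile matching: no shared ID exists, so
# records are paired by similarity of normalized attributes. The field
# list ("name", "city") and the 0.8 threshold are made up for this sketch.

def normalize(profile: dict) -> str:
    return " ".join(str(profile.get(k, "")).strip().lower()
                    for k in ("name", "city"))

def similarity(a: dict, b: dict) -> float:
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio()

crm = {"name": "Jane T. Doe", "city": "Helsinki"}      # from the CRM system
support = {"name": "jane doe", "city": "helsinki"}     # from a support tool
other = {"name": "John Smith", "city": "Oslo"}

print(similarity(crm, support) > 0.8)   # likely the same customer
print(similarity(crm, other) > 0.8)     # likely a different customer
```

The real product presumably layers far more sophistication (ML models, configurable match rules) on top, but the core idea of probabilistic unification across sources is the same.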
Power BI Dataflows, on the other hand, have replaced what was initially called “Common Data Service for Analytics” one year ago. If you missed the memo on this, I don’t blame you, since I don’t think MS ever sent one out (at least many of their own employees still haven’t updated their slide decks). Anyway, just because there isn’t a product called CDS for Analytics anymore, that doesn’t mean the capabilities included in Power BI Dataflows are any less relevant in the big picture of what Common Data Service as a concept represents. Now Generally Available, Dataflows promise to be “the Excel for ETL”, meaning they enable business analysts to design reusable extract-transform-load processes for common data preparation steps that would previously have been out of their reach. This is an example of a more generic capability for any data analysis need, whereas the Dynamics 365 products are targeted at more specific business scenarios.
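The extract-transform-load pattern behind Dataflows can be sketched as a chain of small, reusable steps. Dataflows are of course authored in Power Query rather than code, so the snippet below (with invented column names and data) only illustrates the shape of the idea:

```python
import csv
import io

# Invented sample input standing in for a messy source extract.
raw = "customer,revenue\nContoso, 1200 \nFabrikam,950\n"

def extract(text: str) -> list[dict]:
    """Extract: read raw rows out of the source."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: trim whitespace and cast revenue to int (typical prep)."""
    return [{"customer": r["customer"].strip(),
             "revenue": int(r["revenue"].strip())} for r in rows]

def load(rows: list[dict], store: dict) -> None:
    """Load: write the cleaned rows into the destination store."""
    for r in rows:
        store[r["customer"]] = r["revenue"]

store: dict = {}
load(transform(extract(raw)), store)
print(store)
```

The “Excel for ETL” promise is essentially that an analyst composes steps like these in a visual editor, then reuses the whole pipeline across entities.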
One thing that so clearly sets apart the old XRM world and the brave new Power Platform world is their approach to data location. The traditional XRM way used to require that all of the data that we process within the application’s business logic must reside in the single SQL database. The approach of modern PowerApps based application development is that the data is in N different services and we pull it in via the APIs that are presented as Connectors to the application developer. The end user of a PowerApps Canvas App will not know that you’re actually hitting both Dynamics 365 CE and Customer Insights APIs when rendering the UI – or any other system that has either standard or custom connectors available.
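That N-data-sources pattern can be sketched as a view layer composing records from several connectors behind a single merged result. Both “connectors” below are local stubs standing in for, say, Dynamics 365 CE and Customer Insights calls; the field names are invented:

```python
# Sketch of the connector composition pattern: the UI renders one record,
# unaware that it was assembled from several backend services.

def crm_connector(customer_id: str) -> dict:
    return {"id": customer_id, "name": "Contoso Ltd"}   # stubbed CE data

def insights_connector(customer_id: str) -> dict:
    return {"id": customer_id, "churn_risk": 0.12}      # stubbed CI data

def customer_view(customer_id: str) -> dict:
    """What the canvas app renders: one merged record, sources invisible."""
    merged: dict = {}
    for source in (crm_connector, insights_connector):
        merged.update(source(customer_id))
    return merged

print(customer_view("42"))
```

Swapping a stub for a real standard or custom connector changes nothing about how the app consumes the result, which is the whole point.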
While there will still be many benefits to keeping customer data inside CDS, this new pattern for MS business application architecture design means that the makers of what used to be XRM apps will now need to learn how to break free from their old Model-driven way of thinking in entities and forms. The real power of PowerApps isn’t in the pixel-perfect tweaking of the UI, to implement all those conditional formatting and color coding requirements that previously fell into the unsupported territory of Dynamics 365 CE. It’s the flexibility of working with various different data sources for both reading AND updating data that makes these apps capable of replacing previously tedious data integration efforts with no-code/low-code solutions.
A while ago the concept of a Virtual Entity was introduced into what was still XRM platform at the time. The documentation explains the scenario for VE as follows:
In the past, to integrate the disparate data sources you would need to create a connector to move data or develop a custom plug-in, either client or server-side. However, with virtual entities you can connect directly with an external data source at runtime so that specific data from the external data source is available in an environment, without the need for data replication.
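The runtime pattern the docs describe can be sketched as a read-through data provider: the platform holds no copy of the rows, and every retrieve is translated into a call against the external source at the moment it happens. The in-memory “external system” below is a stand-in for whatever OData endpoint or line-of-business API would really sit there:

```python
# Sketch of the virtual entity idea: nothing is replicated locally; a
# provider resolves each read against the external source at runtime.
# EXTERNAL_SYSTEM and its fields are invented for this illustration.

EXTERNAL_SYSTEM = {
    "a1": {"part": "Gear", "stock": 14},
    "b2": {"part": "Bolt", "stock": 230},
}

class VirtualEntityProvider:
    """Read-only provider: retrieve by key straight from the source."""

    def __init__(self, source: dict):
        self.source = source

    def retrieve(self, key: str) -> dict:
        record = self.source.get(key)
        if record is None:
            raise KeyError(f"no external record {key!r}")
        return {"id": key, **record}   # surface it as a platform-shaped record

provider = VirtualEntityProvider(EXTERNAL_SYSTEM)
print(provider.retrieve("a1"))
```

Note the read-only nature of the sketch mirrors the initial VE release; the connector convergence discussed below is what would bring writes into the picture.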
Based on my understanding, Virtual Entities have so far not been widely adopted among Dynamics 365 CE customers, due to some limitations in their initial release (read-only data, requirements for GUID keys in source data, limited OoB data source support, etc.). The merging of XRM with PowerApps to give birth to CDS has certainly had an impact on the short-term development of VE, but the future looks bright for working with data from external sources precisely because of this merger. Matt Barbour, principal platform architect of CDS, stated in his Ignite 2018 session on PowerApps custom code extensions that the Power Platform Connectors will essentially be the next evolution of what Virtual Entities were planned to be used for. “You will see Virtual Entities and Connectors become one thing” were his exact words.
In an earlier blog post I described the stages of Microsoft Cloud Business Apps evolution, from the birth of CRM Online to the rise of AI, by using the following illustration:
Both the Expansion and Intelligence stages depend on the application platform having capabilities to work with data in a far more flexible manner than what the IIS + SQL Server origins of the Dynamics CRM architecture offer. Breaking free from these limitations is a very natural part of the evolution from XRM to Common Data Service for Apps, since the platform must be able to connect the new generation of business apps with the latest innovation coming from the Azure data platform.
The post The Real Common Data Service Emerges appeared first on Surviving CRM.