
Power Platform | Extend your Environment strategy by data policies

I was recently talking and sharing experiences around the importance of having a scalable environment strategy these days. This is driven by new features the product team keeps adding that may become part of specific company requirements, such as the recent EU Data Boundary. A Power Platform environment is a space to store, manage, and share your organization's business data, apps, chatbots, and flows. It also serves as a container to separate use cases that might have different roles, security requirements, or target audiences. Based on this definition, it's easy to see environments as the foundation of your data strategy as well. So why do I bring up this topic?

By 2025, 70% of orgs will need to shift their focus from big data to small and wide data in order to take full advantage of their available data sources

Gartner via Twitter

We have to understand that with each environment in Power Platform there's data floating around, and data actively living inside the environment as well (for example, when using Dataverse or Azure Synapse Link for Dataverse). So if we take Gartner's prediction seriously and shift the focus from big data to small and wide data, we should think of Dataverse as a central part of that focus, and therefore as an essential element of an environment strategy that includes data policies.

Power Platform Admin Center – DLP Policies dialog

Talking about data policies, you may first think of the Power Platform Admin Center interface for setting up your policies. Policies can be scoped at the tenant level or at the individual environment level. There have been a lot of improvements from the UI perspective, such as allowing control over custom connectors in addition to the prebuilt ones. The visual above outlines the dialog path to follow.
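Besides the Admin Center UI, the same policies can be inspected from PowerShell with the Microsoft.PowerApps.Administration.PowerShell module. The sketch below simply lists the tenant's DLP policies; the property names on the returned objects may vary by module version, so treat this as a starting point rather than a definitive script.

```powershell
# Requires the Microsoft.PowerApps.Administration.PowerShell module
Install-Module -Name Microsoft.PowerApps.Administration.PowerShell -Scope CurrentUser

# Sign in with a Power Platform admin account
Add-PowerAppsAccount

# List all DLP policies in the tenant (assumes displayName/name properties;
# inspect one object with Get-DlpPolicy | Get-Member to confirm)
Get-DlpPolicy | ForEach-Object {
    Write-Host "Policy: $($_.displayName) ($($_.name))"
}
```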

DLP Policies – Dialog – Custom connectors

This can become quite flexible, though intense, work when managing several environments inside your tenant and understanding the various combinations of policies in practice. In fact, I've seen many times that the DevOps team gets contacted because a DLP policy blocked a maker's work from being saved or executed.

Of course, you could consider deploying a new environment each time you have special requirements for either a prebuilt or a custom connector, but that could easily end up in quite a number of environments. So what's the alternative? Let's say you have a shared environment where a combination of DLP policies is in place. Let's continue the journey by saying there's a special team working in that environment that would love to use one of the SAP connectors. As the SAP data should be used to enrich existing data from Dataverse, you could now consider modifying the DLP policies to allow the SAP connector. But you don't want it to be used by the other users of this environment. You're in a dilemma, aren't you?

Well, there's an almost unknown feature of DLP policies. What? You've missed something? Well, did you know there's the possibility of DLP resource exemption? It's available only via PowerShell cmdlets at the moment, and I would really love to see this feature become available via the UI as well. In the previous link shared, you'll also find an example for an app and a flow that you want to launch by making them DLP exempt.
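To make the scenario above concrete: a sketch of exempting a single cloud flow from a policy via the `New-PowerAppDlpPolicyExemptResources` cmdlet. The GUIDs and the `{environmentId}`/`{flowId}` segments below are placeholders you would replace with your own values, and the exact parameter names and resource-path format should be verified against the Microsoft.PowerApps.Administration.PowerShell documentation for your installed module version.

```powershell
# Sign in with a Power Platform admin account
Add-PowerAppsAccount

# Placeholder values - replace with your own tenant and DLP policy IDs
$TenantId   = "00000000-0000-0000-0000-000000000000"
$PolicyName = "11111111-1111-1111-1111-111111111111"

# Build the list of resources to exempt from the policy
# (here: one cloud flow, identified by its full resource path)
$exemptResources = [pscustomobject]@{
    exemptResources = @(
        [pscustomobject]@{
            id   = "/providers/Microsoft.ProcessSimple/environments/{environmentId}/flows/{flowId}"
            type = "Microsoft.ProcessSimple/environments/flows"
        }
    )
}

# Register the exemption on the policy
New-PowerAppDlpPolicyExemptResources -TenantId $TenantId -PolicyName $PolicyName `
    -NewDlpPolicyExemptResources $exemptResources
```

With an exemption in place, the flow (or app) is evaluated outside the policy, so the rest of the environment keeps the stricter connector rules — exactly the granularity the SAP-team scenario calls for.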

Given this example, you can imagine that your data strategy can become both flexible and governed in a granular way. Let me know in the comments if you've been using this DLP feature via PowerShell and what your experiences with it are. Until then,…


This was originally posted here.
