
Data migration in agile Dynamics 365 projects

Neil Benson, User Group Leader

Matt Venamore, scrum master at RACQ, asks how to handle data migration on agile Dynamics 365 projects.
My recommended approach is for the data migration team to conduct a proof-of-concept or prototyping exercise (just like many business apps teams do) to uncover challenges, test their assumptions and learn as much as possible about the data.
The team should then work closely with the business applications team to iteratively and incrementally prepare the data and the migration procedures.

Resources

Snapshot! for Dynamics CRM/365
XrmToolBox: ERD Visio Builder

Transcript

How do you handle data migration in an agile business applications project?

Welcome to the Amazing Apps show -- for Microsoft Business Applications creators who want to build amazing applications that everyone will love.

Hi, I'm your host Neil Benson. My goal on this show is to help you slash your project budgets, reduce your delivery timelines, mitigate technical risks and create amazing, agile Microsoft Dynamics 365 and Power Platform applications.

The question in this episode comes from Matthew Venamore. Matt’s a scrum master at the Royal Automobile Club of Queensland where we worked together on the implementation of Dynamics 365 for engaging RACQ’s 2 million banking, insurance, and assistance members.

Matt: “Hey, Neil. It’s Matt here. I just thought I’d drop you this little idea or topic of conversation. It’s around the data migration. It’s come to light over Jupiter, as you’re probably aware, that data migration happens only after the build is done and complete. So the data migration is constantly having to change based on the changing requirements and refactoring of data and entity setups, etc. OK. Thanks, mate. Hope you’re doing well. Cheerio!”

In this episode, you’ll learn how I handle data migration in an agile business applications project. You’ll find show notes at customery.com/015.

Matt and I worked together on a large Dynamics 365 project, called Jupiter, where we had multiple teams working together to deliver a new member engagement platform: the business and change team, a development team from another ISV, the architecture team, the systems integration team, the data migration team and, finally, our Dynamics 365 delivery team.

Data migration efforts have traditionally stuck to the waterfall delivery approach. They gather requirements, design migration solutions, build those solutions, test and validate them, and finally migrate the data out of your legacy systems during a big bang release of Dynamics 365.

This waterfall approach is suitable for projects with small volumes of good-quality data, a straightforward path between the source systems and your Microsoft business application, and a generous data migration time slot.

If you’re building a Power App for a team or department and migrating data from spreadsheets, Access or Lotus Notes, then migrating data in a waterfall fashion might be a suitable approach.

If you’re building a complex, enterprise-wide Power Apps app or a Dynamics 365 app, then you’re probably going to contend with multiple legacy data sources that have sophisticated, often antiquated, data models and years of historical data quality issues that have been ignored, covered up and worked around.

And you’ll probably get laughed out of the room for requesting a one-week data migration window that involves a shutdown of your production systems. You might get 48 hours. If you’re lucky. Usually over a holiday weekend when your spouse had planned a family trip.

Dealing with this kind of complexity is where an agile approach, like Scrum, really pays off. Scrum is based on empiricism: the theory that knowledge can only be acquired through experience. You can analyse the requirements all day long if you like, but you won’t truly know how to migrate the data or how long it’ll take until you learn by actually doing it.

To get started, some agile data migration teams run a proof-of-concept or prototype exercise. This is similar to the proof-of-concept or prototype exercises that our application teams might run when evaluating Power Apps or Dynamics 365 designs. They are short, timeboxed exercises run with the objective of testing our assumptions, learning as much as possible and discovering challenges, so that future work is better informed and less risky.

For example, you might run a PoC to evaluate Power Apps dataflows but discover they don’t have an incremental refresh option. Or you might discover that your source SAP system stores organisations and people in the same table (parties) and then you’re going to need to perform additional work to split them into accounts and contacts in Dataverse.
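
To make that second example concrete, here’s a minimal Python sketch of the kind of party-splitting transform you might prototype during a PoC. The column names (party_type, party_name) and the ORG flag are illustrative assumptions, not the real SAP schema:

```python
# Hypothetical sketch: split an SAP-style "parties" extract into
# Dataverse-shaped accounts and contacts. Column names are assumed.
import csv

with open("parties.csv", newline="") as src, \
     open("accounts.csv", "w", newline="") as acc, \
     open("contacts.csv", "w", newline="") as con:
    accounts = csv.DictWriter(acc, fieldnames=["name"])
    contacts = csv.DictWriter(con, fieldnames=["firstname", "lastname"])
    accounts.writeheader()
    contacts.writeheader()
    for row in csv.DictReader(src):
        if row["party_type"] == "ORG":            # organisations -> accounts
            accounts.writerow({"name": row["party_name"]})
        else:                                     # people -> contacts
            first, _, last = row["party_name"].partition(" ")
            contacts.writerow({"firstname": first, "lastname": last})
```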

After your PoC, you should have a much better sense of how difficult the migration is going to be, how much effort and funding it’ll take, the technical and business skills required, and where the big challenges lie. It should also create requirements for the data migration team’s product backlog, and perhaps for your business applications team’s product backlog too.

Once your project kicks off then the early sprints can shift towards incrementally discovering, profiling, and assessing your data sources. Usually, data migration teams will translate requirements from the business application team into data quality rules.

For example:

· Contact Lastname has at least 1 alphabetic character

· Account Type matches one of these enumerated choices…

The data quality rules can be run against legacy data sources to give you an assessment or score of the quality of the data in them. Some data sources may get rejected altogether; others might require significant remediation.
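
To illustrate, here’s a small Python sketch of how those two rules might be written down and used to score a legacy extract. The field names, the enumerated account types and the sample records are assumptions for the example, not a real rule engine:

```python
# Illustrative sketch: express requirements as data quality rules and
# score a legacy extract against them. Rules and fields are examples.
import re

ACCOUNT_TYPES = {"Member", "Prospect", "Partner"}  # assumed choices

rules = {
    "lastname has at least 1 alphabetic character":
        lambda r: bool(re.search(r"[A-Za-z]", r.get("lastname", ""))),
    "account type matches an enumerated choice":
        lambda r: r.get("accounttype") in ACCOUNT_TYPES,
}

def score(records):
    """Return the pass rate per rule: a rough quality score for a source."""
    totals = {name: 0 for name in rules}
    for record in records:
        for name, check in rules.items():
            totals[name] += check(record)
    return {name: passed / len(records) for name, passed in totals.items()}

sample = [
    {"lastname": "Venamore", "accounttype": "Member"},
    {"lastname": "123", "accounttype": "Gold"},  # fails both rules
]
print(score(sample))  # both rules score 0.5 on this sample
```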

Forcing the legacy data to fit your business application is usually a bad idea: it just kicks the can of bad data down the road, where it lands and spills its toxic contents into your pristine new business application.

Fixing the data quality issues at the source is usually the better answer. It’s hard because you’ve now made data quality someone else’s problem, and if they don’t care about it as much as you do then you’re stuck. It will take a collaborative approach between legacy system owners, users and your data migration team. These remediation efforts need to be uncovered as early as possible so that your business stakeholders can evaluate their options and take appropriate action.

Later sprints focus on designing, building and running data migration procedures that can be evaluated and tested both from quality and performance viewpoints.

One of the challenges that our data migration team had on the Jupiter project was that the target data model in Dataverse wasn’t fixed. The data migration team had to frequently refine their data migration procedures to respond to tweaks in our data model. It might be a small change, like changing a column’s data type, or a more significant change, like redesigning a series of related tables that included calculated columns.
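
One way a data migration team can spot those tweaks early is to diff the current table metadata against a snapshot taken after the previous sprint. Here’s a rough Python sketch using the Dataverse Web API’s EntityDefinitions endpoint; the org URL and bearer token are placeholders, and acquiring the token (for example, via MSAL) is out of scope:

```python
# Rough sketch, not production code: detect data model drift by pulling
# table metadata from the Dataverse Web API and diffing it against the
# migration team's last-known snapshot. ORG_URL and the token are placeholders.
import requests

ORG_URL = "https://yourorg.crm.dynamics.com"   # placeholder org URL
HEADERS = {"Authorization": "Bearer <token>"}  # placeholder token

def column_types(table):
    """Return {column logical name: attribute type} for one Dataverse table."""
    url = f"{ORG_URL}/api/data/v9.2/EntityDefinitions(LogicalName='{table}')/Attributes"
    resp = requests.get(url, params={"$select": "LogicalName,AttributeType"},
                        headers=HEADERS)
    resp.raise_for_status()
    return {a["LogicalName"]: a["AttributeType"] for a in resp.json()["value"]}

def drift(table, snapshot):
    """Columns added, removed, or retyped since the snapshot was taken."""
    current = column_types(table)
    return {
        "added":   sorted(current.keys() - snapshot.keys()),
        "removed": sorted(snapshot.keys() - current.keys()),
        "retyped": sorted(c for c in current.keys() & snapshot.keys()
                          if current[c] != snapshot[c]),
    }
```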

The key here is transparency, and frequent inspection and adaptation. That sounds a lot like the characteristics of Scrum, right? The data migration team was invited to attend the business application team’s sprint reviews, and the data migration team’s requirements were considered during the business apps team’s sprint planning.

We updated our data model documentation towards the end of every sprint using Snapshot for Dynamics 365, but you could also use the data model document generation tools within the XrmToolBox. We kept our models published on wiki pages monitored by the data migration team so that they received notifications when updates were available.

During our first release to production, during which all the customer data was migrated from the legacy sources, the Dynamics 365 team turned off all the automations, like plugins, workflows and Logic Apps. Any calculated data was pre-calculated by the data migration team in their scripts; all those calculations had been identified by the Dynamics 365 team and created as requirements on the data migration team’s backlog.
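
As a hedged illustration, and not the Jupiter team’s actual tooling: plugin steps can be disabled in bulk by PATCHing their state through the Dataverse Web API. Workflows and Logic Apps would be switched off separately. The org URL and token are placeholders:

```python
# Hedged sketch: bulk-disable visible (custom) plugin steps before a
# big-bang load via the Dataverse Web API. Placeholders, not real tooling.
import requests

ORG_URL = "https://yourorg.crm.dynamics.com"   # placeholder org URL
HEADERS = {"Authorization": "Bearer <token>",  # placeholder token
           "Content-Type": "application/json"}
API = f"{ORG_URL}/api/data/v9.2"

# Find enabled, visible plugin steps: statecode 0 = enabled.
resp = requests.get(
    f"{API}/sdkmessageprocessingsteps",
    params={"$select": "sdkmessageprocessingstepid,name",
            "$filter": "statecode eq 0 and ishidden/Value eq false"},
    headers=HEADERS)
resp.raise_for_status()

# Disable each step: statecode 1 / statuscode 2 = disabled.
for step in resp.json()["value"]:
    step_id = step["sdkmessageprocessingstepid"]
    requests.patch(f"{API}/sdkmessageprocessingsteps({step_id})",
                   json={"statecode": 1, "statuscode": 2},
                   headers=HEADERS).raise_for_status()
```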

Through these measures, and a bunch of iterative performance-tuning runs, the data migration team was able to migrate over 2 million customers, about 1TB of data, into Dynamics 365 in about 24 hours during our initial release.
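
If you’re curious what those tuning runs can involve, here’s a minimal, hypothetical Python sketch of one common lever: parallelising the load across worker threads. The endpoint, payloads and pool size are assumptions for illustration; a real migration would also batch requests and respect Dataverse’s service protection limits:

```python
# Hedged sketch of one tuning lever: load records through parallel workers
# instead of a single thread. Endpoint and payloads are illustrative only.
from concurrent.futures import ThreadPoolExecutor
import requests

ORG_URL = "https://yourorg.crm.dynamics.com"   # placeholder org URL
HEADERS = {"Authorization": "Bearer <token>",  # placeholder token
           "Content-Type": "application/json"}

def create_contact(record):
    resp = requests.post(f"{ORG_URL}/api/data/v9.2/contacts",
                         json=record, headers=HEADERS)
    resp.raise_for_status()

records = [{"firstname": "Pat", "lastname": f"Member{i}"} for i in range(1000)]

# Fan the load out over a pool of workers; the right pool size is found
# empirically through exactly the kind of tuning runs described above.
with ThreadPoolExecutor(max_workers=8) as pool:
    pool.map(create_contact, records)
```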

Matt, I hope my recollection of how our Dynamics 365 and data migration teams worked together on the Jupiter project helps.

I hope you, my Amazing Apps show listener, found it useful too. If you did, remember to subscribe to the Amazing Applications podcast on your favourite podcast player so that the latest episodes get squirted directly into your ears as soon as they’re published.

If you have a challenge adopting an agile approach to building amazing business applications, you can leave a voicemail, just like Matt did: go to customery.com and click on ‘Send Voicemail’. I’ll do my best to share my experience and insights with you. There are several more Q&A episodes lined up, and some more interview shows with the people building amazing applications coming early next year.

I’ll see you next time. Until then, keep sprinting.


This was originally posted here.
