D365 Finance & Operations ISV Solution Development Best Practices (Part-1) - Develop & Customize Design
In this blog series we will cover key decision points and best practices in the development and maintenance lifecycle of an ISV solution for Dynamics 365 Finance & Operations (F&O). The intent of this series is to provide best practice recommendations that potential ISV solution developers can evaluate during the design, development, and maintenance journey of their product.
ISVs new to the Dynamics 365 Ecosystem:
D365 FO / F&O refers to the Dynamics 365 business suite of products such as D365 Finance, D365 SCM, and D365 Commerce. If you are new to this ecosystem, please take a moment to go through the Service description for Finance & Operations apps, which will help you holistically familiarize yourself with the D365 F&O product family, its operating model, and the division of service activity responsibilities between Microsoft and the customer.
A successful ISV solution requires alignment with the development and architecture frameworks of the base D365 F&O business application. An ISV typically brings a key feature or enhancement into the whitespaces where a partner can extend, connect, or build on top of the Dynamics 365 business application suite. Taking the right steps toward this alignment helps the ISV future-proof its investment through the quality stage gates of application publication, maintenance, and updates.
Components of Develop & Customize Design & Recommended Practices:
When developing an ISV solution or functionality, focus often goes into building the ideated use cases (core functionality) that fill a functionality gap or exchange data through an integration. A scalable and successful ISV must also select the right development approach and provide the scaffolding of a truly cloud-native, responsive, and nimble application that eases implementation and support activities down the road. Below are the key areas an architect should brainstorm to make the appropriate choices when blueprinting a successful product.
- Models and Packages: Development for D365 F&O is done in Visual Studio. A model is a group of elements, such as metadata and source files, that typically constitute a distributable software solution, including customizations of an existing solution. A model is a design-time concept, for example a warehouse management model or a project accounting model. A model always belongs to a package. A package is a deployment and compilation unit of one or more models. It includes model metadata, binaries, and other associated resources. One or more packages can be combined into a deployable package, which is the vehicle used for deployment on runtime environments.
- MVP Approach:
- Always plan the right set of models for your solution; this depends on the functionality of the ISV solution. For example, if the ISV solution is composed of 3-10 forms, a handful of reports, 5-10 classes, and a few data entities, one model might be enough for an MVP.
- It is also recommended that if the ISV solution is mostly a connector / integration with a 3rd party and the ISV needs to deploy some supporting functionality, then for deployment simplicity keep one model.
- ISV with Composable Functionalities:
- If you are building an ISV solution composed of functionalities that can be deployed in a composable way (each functionality deployable and usable separately), it is recommended to split it into multiple models as follows, so that you can deploy a subset or all of the models on different projects:
- ISV-Base Model (Carrying common classes, ISV licensing logic, Utility Classes)
- ISV-Functionality A
- ISV-Functionality B
- ISV-Functionality N
- ISV with Country Specific Localizations:
- If you are planning an ISV with a base solution plus industry-, customer-, or country-specific localization add-ons that are optional at deployment, the following model strategy can be used:
- ISV-Base Model
- ISV-Core Functionality Model (It carries your main ISV solution)
- ISV-Industry X Specific Enhancements Model
- ISV-Industry Y Specific Enhancements Model
- ISV-Country A Localization Model
- ISV-Country B Localization Model
- Forms & Business Logic Development: Extending base functionality in D365 F&O is achieved by following the D365 extension model framework. For a complete development guide, you can refer to the following documentation.
- Where to place menu items?
- D365 F&O is composed of many modules, each with menu and sub-menu sections. If the ISV is building functionality with many forms, independent features, and reports, it is recommended to create a separate module with its own workspaces, forms, reports, and setup areas.
- If the ISV solution is extension functionality, meaning it appends to existing functionality (forms), then we recommend adding to each affected module a separate sub-area named after your ISV solution and placing your forms / parameter pages in it. When extending existing forms, it is better to place all fields of your ISV solution in a separate field group / tab on the main form. On the menu ribbon, it is also better to create your own ISV sub-area and put your action menu items there.
- This clear pattern of menu placement helps extensibility and keeps training and readability simple for the end customer.
- Form Creations:
- D365 F&O forms are based on purpose-driven patterns such as the Workspace, Details Master, and Wizard patterns; it is recommended to identify and apply the relevant form pattern. Please refer to the following link for guidelines on form patterns.
- Classes / Batch Jobs:
- Always factor reusable functions and reusable business logic into classes so they can be accessed from multiple endpoints. If long-running processing is required, such as updating transactions or creating transactions based on certain computations, it is recommended to create a batch job that can run on demand or at a pre-scheduled time.
- Keep try / catch exception blocks and ttsBegin / ttsCommit transaction blocks as short as possible.
- While writing code, use record-set operations (insert_recordset, update_recordset, delete_from) for insertions and updates instead of operating on individual records.
- Refer to Develop and Customize page for more guidelines.
- Naming Convention:
- Please refer to naming guidelines for extension for more details. More information on ISV development resources can be found here.
- Data Modeling: The basics of physical data tables can be learned in Build tables in Finance and Operations apps. The most important interaction for exchanging data, internally or with outside applications, is done via data entities. A data entity is an abstraction from the physical implementation of database tables. For example, in normalized tables, a lot of the data for each customer might be stored in a customer table, with the rest spread across a small set of related tables. In this case, the data entity for the customer concept appears as one de-normalized view, in which each row contains all the data from the customer table and its related tables. A data entity encapsulates a business concept into a format that makes development and integration easier. The abstracted nature of a data entity can simplify application development and customization. Later, the abstraction also insulates application code from the inevitable churn of the physical tables between versions. To summarize: a data entity provides conceptual abstraction and encapsulation (a de-normalized view) of underlying table schemas to represent key data concepts and functionality.
- Tables:
- Define indexes on tables where high-volume transactions are expected.
- Do not enable the auditing fields CreatedOn, CreatedBy, ModifiedBy, and ModifiedOn on transaction tables unless absolutely necessary; there is a performance hit to maintaining them.
- Data Entities:
- Create data entities for integration and data import / export. A data entity should provide a holistic object that encapsulates the relevant business logic in a single consumable contract. The contract is then exposed through application programming interfaces (APIs), such as OData, import and export, integration, and the programming model. More details can be found at design principles and best practices for data entities.
- Gain familiarity with OData data exchange concepts and with the nomenclature of API URIs and supported operations. The following article can serve as a quick refresher on the OData capabilities of D365 Finance & Operations APIs.
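To make the URI nomenclature concrete, here is a minimal caller-side sketch of composing an F&O OData request URL with common query options ($select, $filter, $top, and the cross-company parameter). The environment URL and entity values are hypothetical; the /data/{EntityPublicCollectionName} path pattern follows the F&O OData documentation.

```python
from urllib.parse import quote

def build_odata_url(base_url, entity, select=None, filter_expr=None,
                    top=None, cross_company=False):
    """Compose an F&O OData request URL from common query options."""
    params = []
    if select:
        params.append("$select=" + ",".join(select))
    if filter_expr:
        # Percent-encode spaces but keep OData tokens such as quotes readable.
        params.append("$filter=" + quote(filter_expr, safe="$'(),="))
    if top is not None:
        params.append(f"$top={top}")
    if cross_company:
        params.append("cross-company=true")
    query = "&".join(params)
    return f"{base_url.rstrip('/')}/data/{entity}" + (f"?{query}" if query else "")

url = build_odata_url(
    "https://contoso.operations.dynamics.com",   # hypothetical environment URL
    "CustomersV3",
    select=["CustomerAccount", "OrganizationName"],
    filter_expr="dataAreaId eq 'usmf'",
    top=10,
)
print(url)
```

A real caller would send this URL with an OAuth bearer token; the sketch stops at URL construction to keep the nomenclature visible.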
- Integration Patterns & Options: D365 F&O offers many options to integrate, connect, and exchange data with 3rd-party applications. It is pivotal in your ISV connect journey to select the most suitable mechanism for integrating the first-party application with external apps. Typical decision points include whether the requirement is real-time, near real-time, or batch integration; the transactional volume (transaction-by-transaction or bulk); message durability and queuing needs; and whether it is a point-to-point integration or a publisher & subscriber pattern, to name a few. Below are some suggestions on when a given approach is useful.
| Measure | OData APIs | Batch Data APIs | Power Platform / Dataverse | Custom Services |
| --- | --- | --- | --- | --- |
| Sync. vs. Async. | Suitable for real-time / synchronous integrations | Suitable for asynchronous and batch (bulk) records | Mostly near real-time | Primarily used to trigger a business process action, passing a payload request record |
| Data Volumes | A few hundred calls / hour; for complex OData entities such as Sales Order or Purchase Order, under 100 calls / hour | For heavy volumes, DIXF-based batch data APIs are suitable; integration packages can be enqueued and monitored via APIs | Relatively newer integration option compared to the others, so caution is advised; use it where the data entity involved is stable without data errors and the scenario tolerates a few seconds to a few minutes of lag in data sync | Suitable for low volumes; if frequent execution is required, consider creating a batch job within D365 F&O instead |
| Error Handling & Logging | Caller app must handle error handling and logging; the OData call only provides exception details | The DIXF Data management workspace provides basic logging, retry, and error visibility | Dataverse has an extensive model in place to provide actionable logs and error reporting | Caller app must handle error handling and logging |
| Development Effort | OData can be consumed directly from any programming language / your LOB ISV solution | A Logic Apps workflow is recommended to sequence the steps: pick the files, execute the package, and collect the results; this workload runs independently from the base app | Low-code / no-code platform with many connectors and triggers; suitable if the ISV sees alignment with other D365 products down the road and wants a consistent approach across solutions | Can be consumed directly from any programming language / your LOB ISV solution |
| Retry Mechanism | Caller app has to manage | Logic Apps can include retry logic, or retries can be orchestrated via Logic Apps / Azure Service Bus | Power Automate / Dataverse have built-in retry logic for basic scenarios | Caller app has to manage |
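For the OData API and Custom Services columns, the caller app has to manage retries itself. A minimal sketch of exponential backoff around an arbitrary integration call follows; the function and the simulated flaky endpoint are illustrative, not part of any F&O SDK.

```python
import time

def call_with_retry(operation, max_attempts=4, base_delay=0.5,
                    retriable=(TimeoutError, ConnectionError)):
    """Retry an integration call with exponential backoff; re-raise after the last attempt."""
    for attempt in range(1, max_attempts + 1):
        try:
            return operation()
        except retriable:
            if attempt == max_attempts:
                raise
            # Back off: base_delay, 2x, 4x, ...
            time.sleep(base_delay * 2 ** (attempt - 1))

# Simulated flaky endpoint: fails twice with a transient error, then succeeds.
attempts = {"n": 0}
def flaky_call():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("transient network error")
    return "ok"

result = call_with_retry(flaky_call, base_delay=0.01)
print(result, attempts["n"])  # succeeds on the third attempt
```

In production the retriable exception set would be narrowed to transient HTTP failures (e.g. 429 / 503 responses), ideally honoring any Retry-After header the service returns.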
- How to capture D365 F&O data events:
D365 F&O offers Business Events, a framework where events such as a sales order invoice, workflow actions, and many others can be captured in near real time via a batch job. An out-of-box catalog of business events is available; however, in many use cases developers must extend the business events framework to start generating the required events. Microsoft recently extended the business events concept to all OData-based data entities using Data Events. Data events are events based on changes to data in Finance and Operations apps. Create, update, and delete (CUD) events can be enabled for each entity. For example, if the Create event is enabled for the Purchase order headers V2 entity, an event notification is emitted every time a new purchase order is created in the database. All standard and custom entities in Finance and Operations apps that are enabled for Open Data Protocol (OData) can emit data events. In the data event catalog, each event for an entity is listed as a data event that subscriptions can be established for. The concept of activating a data event and associating it with an endpoint resembles that of business events. When a data event occurs, the payload of the event contains the corresponding entity record. For more detail on how to consume data events, please read the article "Generate CUD Data Events in D365 Finance & Operations and Capture in 3rd Party Applications".
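On the subscriber side, handling such a notification mostly amounts to parsing the JSON payload and routing on the event identifier. The sketch below is a hedged illustration: the payload shape and field names (BusinessEventId, Payload, etc.) are simplified examples, not the exact schema emitted by F&O, so verify against the actual event catalog before relying on them.

```python
import json

# Illustrative notification; a real data-event payload carries event metadata
# plus the corresponding entity record.
raw = json.dumps({
    "BusinessEventId": "PurchaseOrderHeaderV2Entity-Created",  # hypothetical id
    "EventTime": "2024-01-15T10:30:00Z",
    "Payload": {
        "PurchaseOrderNumber": "PO-000123",
        "OrderVendorAccountNumber": "V-001",
    },
})

def handle_data_event(message: str) -> str:
    """Parse a CUD data-event notification and route on the event id."""
    event = json.loads(message)
    entity_record = event.get("Payload", {})
    # A real subscriber might enqueue downstream work here instead of returning.
    if event.get("BusinessEventId", "").endswith("-Created"):
        return f"created: {entity_record.get('PurchaseOrderNumber')}"
    return "ignored"

print(handle_data_event(raw))
```

The same handler shape works whether the endpoint is an Azure Service Bus queue, an Event Grid subscription, or an HTTPS webhook; only the transport that delivers `raw` changes.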
- Dashboard & Reporting: Evaluate the reporting needs of the ISV solution, which could range from transactional reporting and dashboard reporting for specific roles all the way to providing an aggregate data model for ad hoc end-user reporting. You can refer to the Creating Reporting Solutions guide to evaluate and select the appropriate reporting tool for your needs.
- Transactional Reports: For D365 F&O and ISV-only data, it is better to build SSRS reports internally within D365 F&O.
- MVP Phase: If the data requires a mashup from multiple sources and D365 does not generate a lot of the data required for reporting, then instead of using Data Lake it is better to use the virtual entity pattern of D365 F&O and Dataverse: access the data from Dataverse and create the report on it.
- Enterprise Customers / Sizeable ISV Solution: If the solution is extensive and the potential customers are usually enterprises, it is better to build dashboards / BI reporting on the data that is exported to the Data Lake.
- Application Insights / Telemetry: Does the application need to collect telemetry data for feature usage, stack flows, etc., for logging / debugging purposes? If so, is the application using an appropriate logging mechanism? Please go through the Application Tracker – Code Quality Tooling article for more details. Familiarize yourself with the monitoring & diagnostic tools and telemetry available with D365 F&O; this will help you define the right level of telemetry and how to access the information during actual implementations.
- Data Scripts / Seed Data: Plan whether you need to get the system ready with configuration / initial data when deploying the ISV solution or implementing it at actual customers. There may also be a need for data import templates to collect and upload data into the system. One way to achieve this is to create DIXF import projects or process packages. The ISV should also plan the details of creating and consuming data packages, which can be referred to here.
- If a lot of configuration data must be set up before D365 F&O can use the ISV solution, then creating and consuming data packages is essential to speed up solution implementation.
- Use the default data packages available in the LCS shared asset library for modules as applicable.
- Use the Data management workspace in the System administration module to check module-based DIXF data import and export templates, and build new ones for your ISV solution along the same lines.
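Data packages can also be imported programmatically via the Data management package REST API, which exposes OData actions such as ImportFromPackage. The sketch below only builds the request URL and body (it does not send them); the parameter names follow the package API documentation, while the environment URL, blob URL, and definition group are hypothetical values.

```python
def build_import_from_package_request(fo_url, package_url,
                                      definition_group_id, legal_entity_id):
    """Build the URL and JSON body for the DMF ImportFromPackage OData action."""
    action_url = (fo_url.rstrip("/") +
                  "/data/DataManagementDefinitionGroups"
                  "/Microsoft.Dynamics.DataEntities.ImportFromPackage")
    body = {
        "packageUrl": package_url,            # blob URL of the uploaded data package
        "definitionGroupId": definition_group_id,
        "executionId": "",                    # blank lets F&O generate an execution id
        "execute": True,                      # run the import, not just stage it
        "overwrite": True,
        "legalEntityId": legal_entity_id,
    }
    return action_url, body

action_url, body = build_import_from_package_request(
    "https://contoso.operations.dynamics.com",                       # hypothetical
    "https://storage.blob.core.windows.net/packages/isv-setup.zip",  # hypothetical
    "ISVSetupData",
    "USMF",
)
```

A real client would POST this body to the action URL with an OAuth token, then poll GetExecutionSummaryStatus with the returned execution id to confirm the import completed.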
- Securing Sensitive Credentials: Any credentials or sensitive information (key-value pairs) used for integration or by the application must be maintained in a secure way. D365 F&O uses Azure Key Vault to store sensitive information for scenarios requiring secured information access. This document explains how to use Azure Key Vault to enable this access in D365 F&O apps.
- Do not put any server name, URL, or connection string in code or on parameter screens; use Azure Key Vault to store this information and retrieve it securely in code as per the above example.
- If certain fields contain sensitive data whose access you would like to restrict to certain user groups, set "AOS Authorization" to Yes on those table fields and use the Table Permission Framework to map exclusive privileges to the user roles.
- POS Development:
- Use the new Commerce SDK for development instead of the legacy Retail SDK, as the latter will be phased out in the coming months to a year.
- Align your development to principles of Store Commerce App.
- Design the functionality based on the extension points available in the D365 Store Commerce App, such as view extensions, UI modification extensibility points, request handlers, and available triggers.
- Use operations to perform custom business logic.
- Design dialogs so their look and feel matches existing POS dialogs.
- Keep the POS extension package list in the headless commerce ExtensionPackageDefinitions instead of the local extensions.json file, to avoid code merges across multiple ISV / custom solutions.
- Evaluate implementing the solution via Power Apps or Power Automate instead of a POS component in order to keep the deployment footprint smaller.
- Evaluate whether the Task Management process can streamline the tasks to be done instead of customizing the POS.
Summary:
In this blog we compiled a few design and development considerations for building an ISV solution. Always refer to the D365 F&O development and extension page for the latest features and updates. Evaluate the above suggestions in the context of your solution's future roadmap and subsequent builds. Following these suggestions will help keep maintenance cost low and build the solution on the right footing for alignment with D365 F&O Application Lifecycle Management practices.
Full Blog Series Links:
- D365 Finance & Operations ISV Solution Development Best Practices (Part-1) - Develop & Customize Design
- D365 Finance & Operations ISV Solution Development Best Practices (Part-2) - Environment Planning & Code Repository
- D365 Finance & Operations ISV Solution Development Best Practices (Part-3) - Testing Strategy & Release Updates
- D365 Finance & Operations ISV Solution Development Best Practices (Part-4) - Customer Implementation Readiness