
Business Central data exported to Azure Data Lake

What is Azure Data Lake?

Azure Data Lake includes all the capabilities required to make it easy for developers, data scientists, and analysts to store data of any size, shape, and speed, and do all types of processing and analytics across platforms and languages. It removes the complexities of ingesting and storing all of your data while making it faster to get up and running with batch, streaming, and interactive analytics. Azure Data Lake solves many of the productivity and scalability challenges that prevent you from maximizing the value of your data assets with a service that’s ready to meet your current and future business needs.

Data Lake | Microsoft Azure

ADLS Gen2 = Azure Data Lake Storage Gen2

Azure Data Lake Storage Gen2 is a set of capabilities dedicated to big data analytics, built on Azure Blob Storage.

Data Lake Storage Gen2 converges the capabilities of Azure Data Lake Storage Gen1 with Azure Blob Storage. For example, Data Lake Storage Gen2 provides file system semantics, file-level security, and scale. Because these capabilities are built on Blob storage, you’ll also get low-cost, tiered storage, with high availability/disaster recovery capabilities.

Designed for enterprise big data analytics

Data Lake Storage Gen2 makes Azure Storage the foundation for building enterprise data lakes on Azure. Designed from the start to service multiple petabytes of information while sustaining hundreds of gigabits of throughput, Data Lake Storage Gen2 allows you to easily manage massive amounts of data.

Azure Data Lake Storage Gen2 Introduction | Microsoft Docs
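To make the "file system semantics" mentioned above concrete, here is a minimal Python sketch (not from the linked docs) that connects to a Gen2 account and creates a file system, a directory and a file. The account name, file system name and paths are placeholders I chose for illustration.

```python
# Minimal sketch, assuming placeholder names: connect to an ADLS Gen2 account
# and use its file-system semantics (real directories, not flat blob names).
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

credential = DefaultAzureCredential()
service = DataLakeServiceClient(
    account_url="https://mydatalake.dfs.core.windows.net",  # placeholder account
    credential=credential,
)

# Hierarchical namespace: create a file system and a nested directory.
file_system = service.create_file_system(file_system="bcdata")      # placeholder
directory = file_system.create_directory("deltas/Customer-18")      # placeholder

# Upload a small file into the directory.
file_client = directory.create_file("sample.csv")
data = b"No,Name\n10000,Adatum Corporation\n"
file_client.append_data(data, offset=0, length=len(data))
file_client.flush_data(len(data))
```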

Capabilities

Create a Data Lake Storage Gen2 account

To create a Data Lake Storage Gen2 account, follow the steps at the link below; you can also convert an existing Gen1 storage account to Gen2.

Create a storage account for Azure Data Lake Storage Gen2 | Microsoft Docs

Example

Premium block blob option

Enable the hierarchical namespace option

Example
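The same account can also be created programmatically. The sketch below is only an illustration under assumed names (subscription, resource group, account name and region are placeholders): it creates a StorageV2 account with the hierarchical namespace enabled, which is what the portal's "Enable hierarchical namespace" option does.

```python
# Minimal sketch, assuming placeholder names: create a Gen2-capable storage
# account by enabling the hierarchical namespace on a StorageV2 account.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import Sku, StorageAccountCreateParameters

subscription_id = "<your-subscription-id>"          # placeholder
client = StorageManagementClient(DefaultAzureCredential(), subscription_id)

poller = client.storage_accounts.begin_create(
    resource_group_name="rg-datalake",               # placeholder resource group
    account_name="mydatalake",                       # placeholder account name
    parameters=StorageAccountCreateParameters(
        sku=Sku(name="Standard_LRS"),                # a premium block blob account would
                                                     # use Premium_LRS with kind "BlockBlobStorage"
        kind="StorageV2",
        location="westeurope",                       # placeholder region
        is_hns_enabled=True,                         # hierarchical namespace = Gen2
    ),
)
account = poller.result()
print(account.name, account.primary_endpoints.dfs)
```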

Azure Storage Explorer… also works with Storage Gen2

It is also possible to browse the storage (Gen1 and Gen2) from your PC using Azure Storage Explorer.


Connect to your resources from Azure Storage Explorer – an ADLS Gen2 blob container in this case.

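If you prefer scripting over a GUI, the following sketch (account and file system names are placeholders) lists the contents of a Gen2 file system from Python, which is roughly what you would otherwise browse in Azure Storage Explorer.

```python
# Minimal sketch, assuming placeholder names: list paths in a Gen2 file system.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://mydatalake.dfs.core.windows.net",  # placeholder account
    credential=DefaultAzureCredential(),
)

file_system = service.get_file_system_client("bcdata")      # placeholder
for path in file_system.get_paths(path="deltas", recursive=True):
    kind = "dir " if path.is_directory else "file"
    print(f"{kind}  {path.name}  ({path.content_length or 0} bytes)")
```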

Load data into Azure Data Lake Storage Gen2 with Azure Data Factory
How to load data? Follow the examples at the links below.

Load data into Azure Data Lake Storage Gen2 – Azure Data Factory | Microsoft Docs

Azure Data Factory (ADF) is a fully managed cloud-based data integration service. You can use the service to populate the lake with data from a rich set of on-premises and cloud-based data stores and save time when building your analytics solutions. For a detailed list of supported connectors, see the table of Supported data stores.
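Once a copy pipeline has been built by following the walkthrough linked above, it can also be triggered and monitored from code. The sketch below is my own illustration, not part of the Microsoft walkthrough; the factory, resource group and pipeline names are placeholders.

```python
# Minimal sketch, assuming placeholder names: trigger an existing ADF pipeline
# run and poll it until it finishes.
import time
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

subscription_id = "<your-subscription-id>"       # placeholder
adf = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

run = adf.pipelines.create_run(
    resource_group_name="rg-datalake",           # placeholder
    factory_name="adf-bc-to-lake",               # placeholder
    pipeline_name="CopyToADLSGen2",              # placeholder
)

# Poll the run status until it is no longer queued or in progress.
while True:
    status = adf.pipeline_runs.get("rg-datalake", "adf-bc-to-lake", run.run_id)
    if status.status not in ("InProgress", "Queued"):
        break
    time.sleep(15)
print("Pipeline run finished with status:", status.status)
```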

Tables in a customer’s Data Lake Storage Gen2 data lake – Dynamics 365 Release Plan | Microsoft Docs

Upgrade your Gen1 storage to Gen2

Converting a storage account from Gen1 to Gen2 is possible and very simple.

Warning! Page blobs are not supported!
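Before upgrading, it may be worth scanning the account for page blobs, since they are not supported in accounts with the hierarchical namespace enabled. Below is a minimal, hypothetical check; the connection string is a placeholder.

```python
# Minimal sketch, assuming a placeholder connection string: find any page blobs
# in the account before attempting the Gen2 upgrade.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<your-connection-string>")

page_blobs = []
for container in service.list_containers():
    container_client = service.get_container_client(container.name)
    for blob in container_client.list_blobs():
        # blob_type is a string enum: "BlockBlob", "PageBlob" or "AppendBlob".
        if blob.blob_type == "PageBlob":
            page_blobs.append(f"{container.name}/{blob.name}")

if page_blobs:
    print("Found page blobs that would block the upgrade:")
    print("\n".join(page_blobs))
else:
    print("No page blobs found; the account can be upgraded to Gen2.")
```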

Now you can use both Gen1 and Gen2 storage accounts…

Some useful links

Data Lake Storage for Big Data Analytics | Microsoft Azure

Migrate from Azure Data Lake Storage Gen1 to Gen2 using the Azure portal | Microsoft Docs

General availability: Azure Data Lake Storage Gen1 to Gen2 using Azure Portal | Azure updates | Microsoft Azure

…And now let’s take a look at this pilot program from Microsoft. Below you will find (for those who have not seen it) the link to the official presentation video.

Microsoft Pilot Program: an inspiration for exporting data from Business Central to Azure Data Lake

Microsoft is piloting a new way for Dynamics 365 Business Central customers to host their ERP data in an Azure Data Lake for data warehouse and analytics needs.


Data Lake to archive and analyze Business Central data – the bc2adls tool

Managing the data inside the Business Central database is very important: you need to keep the database small to maintain good performance. At the same time, you want to analyze your data in Power BI.

In this session/video you learn how to extract the data inside Business Central to an Azure Data Lake using the bc2adls tool, and how to maintain your data there. In addition, it shows some tips and tricks for using the tool and extending it together with the newly added Data Archive Tool.

Benefits

One of the benefits of using a data lake as a staging area for your BI and reporting is the ability to store disparate data types in a single location. Donahoe also made the point that a data lake strategy does not always replace traditional data warehouses but can augment them with added benefits like simpler access to other Azure data services.

The Business Central team notes three primary use cases for their export strategy:

  • Periodic exports of Business Central data, with the flexibility to define the frequency of recurrence
  • Creating a Business Central data archive to clear older data out of Business Central while maintaining it for audit and historical purposes. This scenario would call on the ability to skip the export of deletions from Business Central.
  • Performing data analysis on Business Central data outside of the operational ERP database. The CDM format in the data lake can be analyzed by Power BI and other services including Azure Synapse.
The Business Case

Comparing with OData APIs


How to export data from Business Central to the Data Lake?
The Microsoft experimental project “bc2adls”

The bc2adls tool is used to export data from Dynamics 365 Business Central (BC) to Azure Data Lake Storage and expose it in the CDM folder format. The components involved are the following:

  • the businessCentral folder holds a BC extension called Azure Data Lake Storage Export (ADLSE) which enables export of incremental data updates to a container on the data lake. The increments are stored in the CDM folder format described by the deltas.cdm.manifest.json manifest.
  • the synapse folder holds the templates needed to create an Azure Synapse pipeline that consolidates the increments into a final data CDM folder.

The following diagram illustrates the flow of data through a usage scenario – the main points being:

  • Incremental update data from BC is moved to Azure Data Lake Storage through the ADLSE extension into the deltas folder.
  • Triggering the Synapse pipeline(s) consolidates the increments into the data folder.
  • The data is now ready for consumption by analytics apps like Power BI,
    • via the data.cdm.manifest.json manifest file, or
    • via the raw CSV files for each individual entity inside the data folder (a small read sketch follows below).
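As a rough illustration of the last point, the sketch below (not part of bc2adls itself; the account, container and entity path are hypothetical) downloads one of the consolidated CSV files from the data folder and reads it in Python instead of Power BI.

```python
# Minimal sketch, assuming placeholder names: read a consolidated entity CSV
# produced by the Synapse pipeline from the "data" folder.
import csv
import io
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://mydatalake.dfs.core.windows.net",   # placeholder account
    credential=DefaultAzureCredential(),
)
file_system = service.get_file_system_client("bc-container")  # placeholder container

# Placeholder path to one of the per-entity CSV files in the data folder.
file_client = file_system.get_file_client("data/Customer-18/Customer-18.csv")
content = file_client.download_file().readall().decode("utf-8-sig")

rows = list(csv.reader(io.StringIO(content)))
print(f"Downloaded {len(rows)} rows; first row: {rows[0] if rows else 'empty'}")
```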

microsoft/bc2adls: Exporting data from Dynamics 365 Business Central to Azure data lake storage (github.com)

Microsoft full video

By following the video at the link below you will be able to export your tables to a Gen2 storage account. It is well done – watch it.


https://www.microsoft.com/en-us/videoplayer/embed/RWSHHG

