
Real time Journey data

Posted on 12 Mar 2025 11:23:43 by NK-12031121-0
I want to know which tables hold Real-Time Journey data in Dynamics 365, and how we can bring them into a Fabric Lakehouse.
 
I want to know the names of the tables.
  • JAIN on 22 Mar 2025 at 10:53:54
    Real time Journey data
     
    We are looking to obtain real-time website tracking data, including information such as Session ID, etc. Could you please advise on which tables contain this information?
     
     
    Thank you
  • Suggested answer
    Daivat Vartak (v-9davar), Super User 2025 Season 1, on 18 Mar 2025 at 03:03:38
    Real time Journey data
    Hello NK-12031121-0,
     
    You're looking to extract Real-Time Journey data from Dynamics 365 Customer Insights - Journeys (formerly Real-Time Marketing) into Fabric Lakehouse. This involves identifying the relevant Dataverse tables and setting up the data pipeline.
     
    Understanding Real-Time Journey Data:
    Real-Time Journey data in Dynamics 365 is stored in Dataverse tables related to customer interactions, journey executions, and analytics. Here are the key tables you'll likely need (a sample query you can use to verify them in your environment follows the list):
     
    Key Dataverse Tables for Real-Time Journeys:
    • msdynmkt_customerjourney (Customer Journey):
      • Contains the definition and metadata of your Real-Time Journeys.
      • Provides information about journey goals, segments, and steps.
    • msdynmkt_customerjourneyinstance (Customer Journey Instance):
      • Stores instances of journey executions.
      • Tracks the progress of individual contacts through a journey.
    • msdynmkt_customerjourneyactivity (Customer Journey Activity):
      • Records the activities performed within a journey (e.g., sending an email, triggering a custom event).
      • Provides details about activity outcomes and timings.
    • msdynmkt_emailengagement (Email Engagement):
      • Tracks email engagement metrics (opens, clicks, bounces).
      • Provides insights into email performance.
    • msdynmkt_textmessageengagement (Text Message Engagement):
      • Tracks text message engagement metrics (deliveries, clicks, etc.).
    • msdynmkt_pushnotificationengagement (Push Notification Engagement):
      • Tracks push notification engagement metrics (deliveries, clicks, etc.).
    • msdynmkt_contactpoint (Contact Point):
      • Stores information about contact points used in journeys (e.g., email addresses, phone numbers).
    • Contact (Contact):
      • Stores contact information, which is essential for identifying and segmenting customers.
    • msdynmkt_segment (Segment):
      • Stores segment definitions used in journeys.
    • msdynmkt_segmentmembership (Segment Membership):
      • Tracks contact membership in segments.
    • msdynmkt_customchannelactivity (Custom Channel Activity):
      • Stores data from custom channels that have been set up.
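 
    To confirm these logical names and inspect their columns in your own environment before wiring up any export, you can query the Dataverse Web API directly. The following is only a minimal Python sketch: the org URL and access token are placeholders, and the plural entity set name (msdynmkt_emailengagements) is an assumption you should verify in your org.
 
    import requests
 
    ORG_URL = "https://yourorg.crm.dynamics.com"   # placeholder: your environment URL
    ACCESS_TOKEN = "<Azure AD bearer token>"       # placeholder: obtain via MSAL or a service principal
 
    headers = {
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "OData-MaxVersion": "4.0",
        "OData-Version": "4.0",
        "Accept": "application/json",
    }
 
    # Entity set names are typically the pluralised logical name, e.g.
    # msdynmkt_emailengagements for msdynmkt_emailengagement -- verify in your org.
    url = f"{ORG_URL}/api/data/v9.2/msdynmkt_emailengagements"
 
    resp = requests.get(url, headers=headers, params={"$top": "5"}, timeout=30)
    resp.raise_for_status()
 
    for record in resp.json().get("value", []):
        print(record)  # inspect the available columns before choosing what to export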
     
    Extracting Data into Fabric Lakehouse:
    Here's how you can move this data into Fabric Lakehouse:
    1. Azure Synapse Link for Dataverse:
      • This is the recommended approach for near real-time data replication from Dataverse to Azure Data Lake Storage Gen2 (which Fabric Lakehouse uses).
      • Enable Synapse Link:
        • In the Power Platform admin center, enable Azure Synapse Link for Dataverse.
        • Select the Dataverse environment containing your Real-Time Journey data.
        • Choose the tables you want to export (the ones listed above).
        • Configure the Azure Data Lake Storage Gen2 account that will store the data.
      • Fabric Lakehouse:
        • Once the Dataverse data has landed in Azure Data Lake Storage Gen2, you can then connect your Fabric Lakehouse to that storage account.
        • You can then create shortcuts, or use copy activities within data pipelines, to bring the data into Lakehouse tables (see the notebook sketch after this list).
    2. Power Automate (For Incremental Updates):
      • If you only need incremental updates or specific data transformations, you can use Power Automate.
      • Dataverse Connector: Use the Dataverse connector to query the relevant tables.
      • Lakehouse Connector: Use the Lakehouse connector to insert/update data in your Lakehouse tables.
      • Limitations: Power Automate is best for smaller datasets and less frequent updates.
    3. Azure Data Factory or Synapse Pipelines:
      • For more complex data transformations and scheduled batch processing, use Azure Data Factory or Synapse Pipelines.
      • Dataverse Connector: Use the Dataverse connector to extract data.
      • Lakehouse Connector: Use the Lakehouse connector to load data into the Lakehouse.
      • Data Transformation: Use Data Factory/Synapse transformation activities to clean, transform, and enrich the data.
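 
    To make the Synapse Link landing step (option 1) and the Lakehouse loading step more concrete, here is a minimal PySpark sketch for a Fabric notebook. It is a sketch under stated assumptions, not your exact pipeline: the shortcut name, folder layout, file format, and table names are placeholders that depend on how you configured Synapse Link.
 
    # 'spark' is the SparkSession that Fabric notebooks provide automatically.
    # Assumed layout: a OneLake shortcut named "dataverse_export" pointing at the
    # storage container Azure Synapse Link writes to, with one folder per table.
    raw_path = "Files/dataverse_export/msdynmkt_emailengagement/*.csv"
 
    # Synapse Link CSV exports have no header row; column names live in the
    # exported model.json, so map them explicitly for the columns you need.
    df = spark.read.option("header", "false").csv(raw_path)
 
    # If your Synapse Link profile exports in Delta format instead, read it with:
    # df = spark.read.format("delta").load("Files/dataverse_export/msdynmkt_emailengagement")
 
    # Land the raw data as a managed Delta table in the Lakehouse; curated tables
    # (joins, aggregations, enrichment) can then be built on top of it.
    df.write.mode("overwrite").format("delta").saveAsTable("msdynmkt_emailengagement_raw")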
     
    Key Considerations:
    • Data Volumes: Real-Time Journey data can be voluminous, especially for high-traffic journeys. Consider data partitioning and optimization techniques.
    • Data Relationships: Understand the relationships between the Dataverse tables to ensure data integrity in your Lakehouse.
    • Incremental Updates: Implement incremental update strategies to avoid full data refreshes, which can be time-consuming (a merge sketch follows these considerations).
    • Security: Secure your Azure Data Lake Storage Gen2 account and Fabric Lakehouse to protect sensitive customer data.
    • Performance: Monitor the performance of your data pipelines and optimize them as needed.
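 
    On the incremental update and partitioning points above: one common pattern is to keep a watermark (the latest modifiedon value already loaded) and merge only newer rows into the Lakehouse table. The following Delta Lake sketch assumes hypothetical column names (modifiedon, msdynmkt_emailengagementid) and that the raw and curated tables from the earlier sketch already exist.
 
    from delta.tables import DeltaTable
    from pyspark.sql import functions as F
 
    # Watermark: the latest change already present in the curated table.
    # (When first creating the curated table, consider partitioning it,
    # e.g. .partitionBy("createdon_date"), to help with large volumes.)
    last_loaded = (
        spark.table("msdynmkt_emailengagement_curated")
        .agg(F.max("modifiedon"))
        .first()[0]
    )
 
    # Only rows changed since the last load (assumes a 'modifiedon' column was mapped).
    updates = (
        spark.table("msdynmkt_emailengagement_raw")
        .filter(F.col("modifiedon") > F.lit(last_loaded))
    )
 
    # Upsert into the curated table keyed on the record's primary id column.
    target = DeltaTable.forName(spark, "msdynmkt_emailengagement_curated")
    (
        target.alias("t")
        .merge(updates.alias("s"),
               "t.msdynmkt_emailengagementid = s.msdynmkt_emailengagementid")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute()
    )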
     
    By using Azure Synapse Link for Dataverse and Fabric Lakehouse, you can create a robust data pipeline for analyzing Real-Time Journey data.
     
    If my answer was helpful, please click Like, and if it solved your problem, please mark it as verified to help other community members find it more easily.
    If you have further questions, please feel free to contact me.
     
    My response was crafted with AI assistance and tailored to provide detailed and actionable guidance for your Microsoft Dynamics 365 query.
     
    Best Regards,
    Daivat Vartak
