
Data Factory

Needs Votes

Add Dataverse as a Destination for Dataflows and Pipelines


R T on 09 May 2024 17:24:01

Fabric offers a lot of power on the analytical front, but application CRUD/writeback still requires a separate database or data store. Dataverse is the obvious low-code/no-code "database" for the Power Platform, yet Fabric currently has no native ETL/ELT integration with it. For low-code/no-code makers this is incredibly limiting: there are often cases where you want to use Fabric's orchestration capabilities to deposit data in an application-friendly data store so an app can perform writeback operations before the final data is loaded into Fabric.


EXAMPLE SCENARIO:

You have a Power App that manages customer records stored in a Dataverse table. Files flowing into Fabric contain customer data that needs to be inserted into that Dataverse table so users can process it through the Power App. You then want to use the Dataverse data in Fabric for analytics.
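Absent a native destination, the insert step today means calling the Dataverse Web API yourself (for example from a Fabric notebook). A minimal sketch of building such a create request follows; the environment URL, entity set name, and `cr123_*` column names are placeholders, and the authorization token would come from your own Entra ID app registration:

```python
import json

# Placeholders -- substitute your environment's actual values.
ORG_URL = "https://yourorg.crm.dynamics.com"   # Dataverse environment URL
ENTITY_SET = "cr123_customers"                 # entity set name of the custom table

def create_request(org_url, entity_set, record):
    """Build the URL, body, and headers for a Dataverse Web API create (POST)."""
    url = f"{org_url}/api/data/v9.2/{entity_set}"
    body = json.dumps(record).encode("utf-8")
    headers = {
        "Content-Type": "application/json; charset=utf-8",
        "OData-MaxVersion": "4.0",
        "OData-Version": "4.0",
        "Accept": "application/json",
        # "Authorization": f"Bearer {access_token}",  # token from an Entra ID app registration
    }
    return url, body, headers

# Map an incoming file row to the table's logical column names (hypothetical names).
row = {"name": "Contoso Ltd", "email": "ops@contoso.com"}
record = {"cr123_name": row["name"], "cr123_email": row["email"]}
url, body, headers = create_request(ORG_URL, ENTITY_SET, record)
print(url)  # https://yourorg.crm.dynamics.com/api/data/v9.2/cr123_customers
```

Posting `body` to `url` with those headers (plus a valid bearer token) creates one row; doing this per file row is exactly the boilerplate a native dataflow destination would eliminate.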


IDEAL STATE:

Have the ability to configure a dataflow to insert new records into a Dataverse table. Users could then make updates through the Power App, and those updates would surface in Fabric via the existing Analyze in Fabric feature, which links your Dataverse data back into Fabric so that app-driven changes are reflected in your data warehouses and lakehouses.
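The return leg of that round trip, pulling app-updated rows back out, can also be expressed as an OData query against the Dataverse Web API today. A sketch of building such a query URL, again with a placeholder environment URL, entity set, and hypothetical `cr123_*` columns:

```python
from urllib.parse import urlencode, quote

# Placeholders -- substitute your environment's actual values.
ORG_URL = "https://yourorg.crm.dynamics.com"   # Dataverse environment URL
ENTITY_SET = "cr123_customers"                 # entity set name of the custom table

def build_query_url(org_url, entity_set, select_cols, filter_expr=None):
    """Build an OData query URL selecting columns, optionally filtered."""
    params = {"$select": ",".join(select_cols)}
    if filter_expr is not None:
        params["$filter"] = filter_expr
    # Keep "$", "," and ":" literal; encode spaces as %20 rather than "+".
    query = urlencode(params, safe="$,:", quote_via=quote)
    return f"{org_url}/api/data/v9.2/{entity_set}?{query}"

# e.g. fetch rows the app has touched since a given timestamp
url = build_query_url(
    ORG_URL, ENTITY_SET,
    ["cr123_name", "cr123_email"],
    "modifiedon gt 2024-05-01T00:00:00Z",
)
print(url)
# https://yourorg.crm.dynamics.com/api/data/v9.2/cr123_customers?$select=cr123_name,cr123_email&$filter=modifiedon%20gt%202024-05-01T00:00:00Z
```

With the proposed native integration, neither direction would need hand-rolled requests: the dataflow destination handles the insert, and Analyze in Fabric handles the read-back.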