Miguel Escobar on 2023-05-31 20:13:25
Enable an experience that allows the end user to set up incremental refresh for Dataflows Gen2 in Data Factory for Microsoft Fabric
Incremental Refresh support for Dataflows Gen2 is planned for the second half of 2023. Stay tuned!
Miguel Escobar on 2023-06-01 17:44:52
Enable modifying the mashup document of a Dataflow Gen2 by passing a variable from a pipeline before executing the Dataflow
Planned for the first half of calendar year 2024.
Darwin Dat on 2023-06-28 18:34:34
As we have many on-premises data sources (SQL Server), it would be great if Pipelines in Data Factory within Microsoft Fabric could support the on-premises data gateway, so we could easily integrate data from on-premises sources into OneLake. Otherwise, as the Copy activity in Fabric seems much easier to u...
We are working on bringing the on-premises data gateway to Dataflows, which will enable on-premises connectivity for Pipelines.
Alexis Molteni on 2023-07-14 16:06:37
Support UPSERTs and DELETEs when copying data into Lakehouse tables from the Pipeline Copy activity, as opposed to appending new rows.
Be able to do incremental loading from a pipeline.
Upsert to Lakehouse tables via the Pipeline Copy activity is on the roadmap.
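Until native support lands, the upsert semantics requested here (update rows whose key matches, append the rest, rather than always appending) can be sketched in plain Python. The function and field names below are purely illustrative, not a Fabric API:

```python
def upsert(table, incoming, key):
    """Merge incoming rows into table: update rows with a matching key, append the rest."""
    index = {row[key]: i for i, row in enumerate(table)}
    for row in incoming:
        if row[key] in index:
            table[index[row[key]]] = row   # existing key: update in place
        else:
            table.append(row)              # new key: insert
    return table

orders = [{"id": 1, "qty": 5}, {"id": 2, "qty": 3}]
changes = [{"id": 2, "qty": 7}, {"id": 3, "qty": 1}]
upsert(orders, changes, "id")  # id 2 updated, id 3 appended
```

In a Fabric notebook today, the same effect is typically achieved with Delta Lake's `MERGE INTO` against the Lakehouse table, rather than through the Copy activity.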
Takeshi Kagata on 2023-08-21 03:10:43
When using the Dataflow Gen2 feature to ingest data into a Lakehouse or Warehouse (create a new table and ingest), it would be nice to have a UI to check and set "Allow nulls" for columns. With Dataflow Gen1 and the Power Query editor in the desktop environment, I didn't have much concern about t...
Erwin de Kreuk on 2023-07-17 16:06:14
Currently, the ability to dynamically invoke a pipeline is missing from the Invoke Pipeline activity. Adding it would make building metadata-driven frameworks much easier. A similar option is already available for Notebooks in Azure Synapse.
Cyndi Johnson on 2023-07-06 20:16:19
Connections should be parameterized to allow for connections across multiple similar connection types
In ADF, you can parameterize a linked service so that one linked service can communicate with multiple database servers/databases. In Fabric, you would need a separate connection for every database. Please make connections parameterizable so that working with an enterprise-level multi-tenant source dat...
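For context, the ADF capability the author refers to looks roughly like this: a linked service declares parameters and references them in its connection string. This is a sketch based on ADF's documented pattern; the names and server address are placeholders:

```json
{
  "name": "GenericAzureSql",
  "properties": {
    "type": "AzureSqlDatabase",
    "parameters": {
      "dbName": { "type": "String" }
    },
    "typeProperties": {
      "connectionString": "Server=tcp:myserver.database.windows.net,1433;Database=@{linkedService().dbName};"
    }
  }
}
```

A dataset or activity supplies `dbName` at run time, so one definition serves many databases; Fabric connections currently have no equivalent.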
Kevin Doherty on 2023-06-05 15:32:16
Please expand the selection of connectors for the Copy Data activity in Data Factory to match what is available in Azure Data Factory. Currently, the options are very limited.
Connectors supported in ADF today will be available in Fabric Pipelines.
Tentative ETAs for the connectors below:
- FTP and SFTP – shipped
- MongoDB - Nov
- MongoDB Atlas - Nov