Data Factory

Needs Votes

Connections should be parameterized to allow for connections across multiple similar connection types

Votes: 135

Cyndi Johnson on 06 Jul 2023 20:16:19

In ADF, you can parameterize a linked service so that one linked service could communicate with multiple database servers/databases. In Fabric, you would need a connection for every database. Please make connections parameterized so that working with an enterprise-level multi-tenant source database system would be easier.
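For anyone who hasn't used this in ADF: a linked service can declare parameters and reference them in its type properties with `@{linkedService().…}` expressions, so a single definition covers any server/database pair. A minimal sketch (the name and connection-string details here are illustrative, not a specific recommendation):

```json
{
  "name": "LS_Sql_Param",
  "properties": {
    "type": "SqlServer",
    "parameters": {
      "serverName": { "type": "String" },
      "dbName": { "type": "String" }
    },
    "typeProperties": {
      "connectionString": "Server=@{linkedService().serverName};Database=@{linkedService().dbName};Integrated Security=True;"
    }
  }
}
```

This is the pattern that Fabric connections currently cannot replicate.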

Comments (7)

Tommy Croteau on 22 Nov 2024 16:55:48

Really need this feature.

Scott Lu on 19 Nov 2024 00:39:22

Either make connections dynamic (as with Lakehouse and Warehouse) or add parameter settings. At the moment, DevOps is effectively a fake process: we have to edit the Fabric pipeline itself to change the source connection. I found a way to make the destination (Lakehouse or Warehouse) dynamic, but we couldn't work it out for the source connection (when the source is not a Lakehouse or Warehouse).

Kamal Sanguri on 16 Sep 2024 12:50:30

This is a very important feature; even after trying multiple things, I could not find a workaround.

Henry Chan on 15 Jul 2024 21:14:21

I am currently developing a metadata-driven pipeline that needs to load data from multiple databases. However, the Copy Activity does not support using parameters for the connections. Once 'dynamic content' is set in the connection, it restricts the connection type to only 'lakehouse', 'KQL database', or 'data warehouse'. This limitation suggests that the Copy Activity can be associated with only one connection at a time. Consequently, if there are over 100 connections, it necessitates creating over 100 Copy Activities. In contrast, Azure Data Factory allows for dynamic changes to the linked service of the Copy Activity, providing greater flexibility and efficiency.
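To illustrate the ADF behaviour being described: a dataset can forward its own parameters into a parameterized linked service, so one Copy Activity can target any server/database by changing parameter values at runtime. A sketch, with all names illustrative:

```json
{
  "name": "DS_Sql_Dynamic",
  "properties": {
    "type": "SqlServerTable",
    "linkedServiceName": {
      "referenceName": "LS_Sql_Param",
      "type": "LinkedServiceReference",
      "parameters": {
        "serverName": "@dataset().serverName",
        "dbName": "@dataset().dbName"
      }
    },
    "parameters": {
      "serverName": { "type": "String" },
      "dbName": { "type": "String" }
    }
  }
}
```

With this, a single ForEach over a metadata list drives one Copy Activity instead of 100+.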

Rocket Porg on 07 Jun 2024 11:30:42

Definitely needed. As said in some of the other comments, if you've got 100 on-prem SQL servers, you'll need 100 pipelines - believe me, I worked at a place with almost that many! Building a metadata-driven pipeline processing mechanism isn't realistic or practical until you can parameterise the server name and database name.

Update (7th June 2024): a small update to Microsoft Fabric has improved this slightly - you can now add 'dynamic content' for the connection. However, you can only specify the names of Fabric artifacts such as a lakehouse or warehouse. Almost there, guys, but we need to parameterise the server name too :-)

Erwin de Kreuk on 28 May 2024 15:10:31

+100

Jonathan Flint on 24 Apr 2024 11:26:13

Currently we need to create a pipeline for every SQL server we want to simply copy data from. Building and maintaining that many pipelines is a huge and unrealistic task, compared to just parameterizing the server and database in a similar way to Azure Data Factory. This one missing feature necessitates a 100-fold increase in the work needed to ingest data from on-prem SQL servers; adding it would enable true metadata-driven pipeline ingestion in Fabric.