Connections should be parameterized so that one connection can serve multiple similar sources

In ADF, you can parameterize a linked service so that one linked service can communicate with multiple database servers/databases. In Fabric, you need a separate connection for every database. Please make connections parameterized so that working with an enterprise-scale, multi-tenant source database system is easier.
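For reference, this is the ADF pattern the idea asks for: a linked service whose server and database are parameters, resolved at activity run time. A simplified sketch (names and connection-string details are illustrative; a real definition carries more properties):

```json
{
  "name": "ParameterizedSqlServer",
  "properties": {
    "type": "SqlServer",
    "parameters": {
      "serverName": { "type": "String" },
      "databaseName": { "type": "String" }
    },
    "typeProperties": {
      "connectionString": "Server=@{linkedService().serverName};Database=@{linkedService().databaseName};Integrated Security=True;"
    }
  }
}
```

One such definition covers any number of servers, because each dataset or activity passes in its own `serverName`/`databaseName` values.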

Needs Votes
Comments
jonathan_flint
New Member

Currently we need to create a pipeline for every SQL Server we want to simply copy data from. Building and maintaining so many pipelines is a huge and unrealistic task compared with being able to just parameterize the server and database, as in Azure Data Factory.


This one missing feature necessitates a hundredfold increase in work to ingest data from on-prem SQL Servers; adding it would enable true metadata-driven pipeline ingestion in Fabric.

Erwin_de_Kreuk
New Member

+100

ben_riley
New Member

Definitely needed. As said in some of the other comments, if you've got 100 on-prem SQL Servers, you'll need 100 pipelines - believe me, I worked at a place with almost that many! Building a metadata-driven pipeline processing mechanism isn't realistic or practical until you can parameterise the server name and database name. Update: a small update to Microsoft Fabric (as of 7 June 2024) has improved this slightly - you can now add 'dynamic content' for the connection. However, you can only specify the names of Fabric artifacts such as a lakehouse or warehouse. Almost there guys, but we need to parameterise the server name too 🙂

Henry_Chan
New Member

I am currently developing a metadata-driven pipeline that needs to load data from multiple databases. However, the Copy Activity does not support using parameters for the connections. Once 'dynamic content' is set in the connection, it restricts the connection type to only 'lakehouse', 'KQL database', or 'data warehouse'. This limitation suggests that the Copy Activity can be associated with only one connection at a time. Consequently, if there are over 100 connections, it necessitates creating over 100 Copy Activities. In contrast, Azure Data Factory allows for dynamic changes to the linked service of the Copy Activity, providing greater flexibility and efficiency.
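Outside the pipeline designer, the metadata-driven pattern Henry describes is simple: a table of sources drives a loop, and each connection is built from parameters rather than hard-coded. A minimal Python sketch (the metadata rows and the connection-string format are illustrative assumptions, not a Fabric API):

```python
# Metadata table: one row per source database (illustrative values).
SOURCES = [
    {"server": "sql-east-01", "database": "TenantA"},
    {"server": "sql-east-02", "database": "TenantB"},
]

def build_conn_str(server: str, database: str) -> str:
    """Build an ODBC-style connection string from metadata parameters."""
    return (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={server};Database={database};Trusted_Connection=yes;"
    )

def plan_copies(sources):
    """One parameterized copy per metadata row, instead of one pipeline per server."""
    return [build_conn_str(s["server"], s["database"]) for s in sources]
```

This is exactly what a parameterized Copy Activity connection would let the pipeline itself do: adding a 101st source means adding a metadata row, not a 101st Copy Activity.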

Kamal_Sanguri1
New Member

This is a very important feature; even after trying multiple things, I could not find a workaround.

Scott_Lu
New Member

Either make connections dynamic, like Lakehouse & Warehouse, or add parameter settings.

At the moment, our DevOps process is broken: we have to edit the Fabric pipeline for each source connection. I found a way to make the destination (Lakehouse or Warehouse) dynamic, but we couldn't work out how to do the same for the source connection (if the source is not a lakehouse or warehouse).

Tommy_Croteau
New Member

Really need this feature.

Azhar_Maqsood
New Member

A nightmare for the pro Azure data engineers who write metadata-driven dynamic pipelines. This is the only thing stopping my company from migrating from ADF/Synapse to Fabric. Similarly, on the last project I did with my prior company - Azure with Databricks as the pipeline engine - they wanted to move to Fabric and consulted me on whether everything was ready to perform the migration. I told them to wait until these non-lakehouse connections become dynamic. In the earlier project they had 14,000 source servers; my current one has about 98. Since the underlying schema is usually the same in SaaS-based systems, it makes more sense to have dynamic connections to sources outside of the lakehouse, like in ADF. I have not tested Copy data or Lookup using Python, but I assume that may work.

fbcideas_migusr
New Member
Status changed to: Needs Votes
 
willem_oam
Regular Visitor
We're dealing with 100+ on-prem databases and a pipeline for each... We update these pipelines via the API, but that is rate-limited. All round, a very poor experience.
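Since bulk pipeline updates through the API run into rate limits, a retry loop that honours the server's Retry-After hint softens the pain. A hedged Python sketch - the `call` signature is a placeholder for whatever HTTP client you use against the pipeline API; only the retry logic is the point:

```python
import time

def call_with_retry(call, max_attempts=5):
    """Invoke `call()`; on a 429-style result, wait the advised delay and retry.

    `call` must return (status_code, retry_after_seconds, body) - a stand-in
    for an HTTP request to a rate-limited API.
    """
    for attempt in range(max_attempts):
        status, retry_after, body = call()
        if status != 429:
            return status, body
        # Honour the server's Retry-After hint, else fall back to exponential backoff.
        time.sleep(retry_after if retry_after else 2 ** attempt)
    raise RuntimeError("rate limit: exhausted retries")
```

With `requests`, for example, `call` would wrap the request and return `resp.status_code`, `float(resp.headers.get("Retry-After", 0))`, and the body.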