Data Factory
Needs Votes
Connections should be parameterized to allow for connections across multiple similar connection types
Cyndi Johnson on 06 Jul 2023 20:16:19
In ADF, you can parameterize a linked service so that one linked service could communicate with multiple database servers/databases. In Fabric, you would need a connection for every database. Please make connections parameterized so that working with an enterprise-level multi-tenant source database system would be easier.
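For reference, the ADF pattern being requested looks roughly like the following: a single linked service declares parameters and splices them into its connection string, so one definition can reach any number of servers and databases. This is a minimal sketch based on ADF's documented linked-service parameterization; the name LS_Sql_Dynamic and the exact connection-string options are illustrative.

```json
{
    "name": "LS_Sql_Dynamic",
    "properties": {
        "type": "AzureSqlDatabase",
        "parameters": {
            "serverName": { "type": "String" },
            "databaseName": { "type": "String" }
        },
        "typeProperties": {
            "connectionString": "Server=tcp:@{linkedService().serverName},1433;Database=@{linkedService().databaseName};Encrypt=True;Connection Timeout=30"
        }
    }
}
```

The Fabric equivalent would be a single connection whose server and database are supplied at runtime, rather than one connection per database.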
- Comments (8)
RE: Connections should be parameterized to allow for connections across multiple similar connection types
A nightmare for the pro Azure data engineers who write metadata-driven dynamic pipelines. This is the only thing stopping my company from migrating from ADF/Synapse to Fabric. Similarly, on the last project I did with my prior company, on Azure with Databricks as the pipeline engine, they wanted to move to Fabric and consulted me on whether everything in Fabric was ready for the migration. I told them to hold off until these non-lakehouse connections become dynamic. The earlier project had 14,000 source servers; my current one has about 98. Since the underlying schema is usually the same in SaaS-based systems, it makes more sense to have dynamic connections to sources outside the lakehouse, as in ADF. I have not tested Copy data or Lookup using Python, but I assume that may work.
RE: Connections should be parameterized to allow for connections across multiple similar connection types
Really need this feature.
RE: Connections should be parameterized to allow for connections across multiple similar connection types
Either give us dynamic connections like Lakehouse & Warehouse, or give us parameter settings. At the moment, DevOps is a fake process: we have to change the Fabric pipeline by hand for the source connection. I found a way to make the destination (Lakehouse or Warehouse) dynamic, but we couldn't work it out for the source connection (if the source is not a lakehouse or warehouse).
RE: Connections should be parameterized to allow for connections across multiple similar connection types
This is a very important feature; even after trying multiple things, I could not find a workaround.
RE: Connections should be parameterized to allow for connections across multiple similar connection types
I am currently developing a metadata-driven pipeline that needs to load data from multiple databases. However, the Copy Activity does not support using parameters for the connections. Once 'dynamic content' is set in the connection, it restricts the connection type to only 'lakehouse', 'KQL database', or 'data warehouse'. This limitation suggests that the Copy Activity can be associated with only one connection at a time. Consequently, if there are over 100 connections, it necessitates creating over 100 Copy Activities. In contrast, Azure Data Factory allows for dynamic changes to the linked service of the Copy Activity, providing greater flexibility and efficiency.
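To make the contrast concrete: in ADF, the Copy Activity's dataset can forward its own parameters into the linked service at runtime, so one dataset plus one linked service serve every source. A minimal sketch, assuming the parameterized linked service LS_Sql_Dynamic shown under the idea above (the dataset name and table property are illustrative):

```json
{
    "name": "DS_Sql_Dynamic",
    "properties": {
        "type": "AzureSqlTable",
        "parameters": {
            "serverName": { "type": "String" },
            "databaseName": { "type": "String" },
            "tableName": { "type": "String" }
        },
        "linkedServiceName": {
            "referenceName": "LS_Sql_Dynamic",
            "type": "LinkedServiceReference",
            "parameters": {
                "serverName": "@dataset().serverName",
                "databaseName": "@dataset().databaseName"
            }
        },
        "typeProperties": {
            "table": "@dataset().tableName"
        }
    }
}
```

Nothing comparable is currently possible for non-Fabric sources in a Fabric Copy Activity.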
RE: Connections should be parameterized to allow for connections across multiple similar connection types
Definitely needed. As said in some of the other comments, if you've got 100 on-prem SQL servers, you'll need 100 pipelines - believe me, I worked at a place with almost that many! Trying to build a metadata-driven pipeline processing mechanism isn't realistic/practical until you can parameterise the server name and database name. Update: it seems a small update to Microsoft Fabric has improved this slightly (as of 7th June 2024) - you can now add 'dynamic content' for the connection, but you can only specify the names of Fabric artifacts such as a lakehouse or warehouse... almost there guys, but we need to parameterise the server name too :-)
RE: Connections should be parameterized to allow for connections across multiple similar connection types
+100
RE: Connections should be parameterized to allow for connections across multiple similar connection types
Currently we need to create a pipeline for every SQL server we want to simply copy data from. Building and maintaining that many pipelines is a huge and unrealistic task, compared with being able to just parameterize the server and database in a similar way to Azure Data Factory. This one missing feature forces a 100-fold increase in the work needed to ingest data from on-prem SQL servers; adding it would enable true metadata-driven pipeline ingestion in Fabric.
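For comparison, the metadata-driven pattern the commenters describe is straightforward in ADF: a Lookup reads the list of servers/databases from a control table, and a ForEach fans Copy activities out through the parameterized dataset. A condensed sketch under the same assumptions as the examples above; DS_ControlTable, dbo.SourceSystems, and DS_Destination are hypothetical names, and the sink type is illustrative.

```json
{
    "name": "PL_MetadataDrivenCopy",
    "properties": {
        "activities": [
            {
                "name": "GetSourceList",
                "type": "Lookup",
                "typeProperties": {
                    "source": {
                        "type": "AzureSqlSource",
                        "sqlReaderQuery": "SELECT serverName, databaseName, tableName FROM dbo.SourceSystems"
                    },
                    "dataset": { "referenceName": "DS_ControlTable", "type": "DatasetReference" },
                    "firstRowOnly": false
                }
            },
            {
                "name": "ForEachSource",
                "type": "ForEach",
                "dependsOn": [
                    { "activity": "GetSourceList", "dependencyConditions": [ "Succeeded" ] }
                ],
                "typeProperties": {
                    "items": { "value": "@activity('GetSourceList').output.value", "type": "Expression" },
                    "activities": [
                        {
                            "name": "CopyFromSource",
                            "type": "Copy",
                            "inputs": [
                                {
                                    "referenceName": "DS_Sql_Dynamic",
                                    "type": "DatasetReference",
                                    "parameters": {
                                        "serverName": "@item().serverName",
                                        "databaseName": "@item().databaseName",
                                        "tableName": "@item().tableName"
                                    }
                                }
                            ],
                            "outputs": [
                                { "referenceName": "DS_Destination", "type": "DatasetReference" }
                            ],
                            "typeProperties": {
                                "source": { "type": "AzureSqlSource" },
                                "sink": { "type": "ParquetSink" }
                            }
                        }
                    ]
                }
            }
        ]
    }
}
```

One pipeline covers all 100 servers, and adding a new source means adding a row to the control table rather than building a new pipeline. This is what parameterizable connections would unlock in Fabric.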