Open Dataflow Gen2 without taking it over
Please make it possible to open a dataflow gen2 without taking it over.
I wish we could open a dataflow gen2 even if it is "owned" by someone else in the workspace.
I think the user who opens the dataflow gen2 should need to apply their own - or shared - data source connections in order to see any data in the query steps. Otherwise, they would be able to use the owner's connections to interact with the data sources, which isn't a good idea: the owner might not want other workspace members to use their connections.
I think it should be possible to open the item and start editing it without having to formally "take over" until you are ready to publish the changes. Or sometimes you just want to open it for inspection without making any changes.
And if you do start editing the dataflow, but then realize you aren't able to authenticate to all the data sources, it should be possible to cancel so that the dataflow remains unchanged and the previous connections keep working.
Would be nice to have: before a user saves (publishes) changes to the dataflow, there could be a validation step to check that the user has authenticated properly to all of the item's data sources and data destinations, so that they won't cause the dataflow to stop working. If the validation fails, the dataflow should keep running with the previous connections and the changes should be undone.
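The validation step proposed above could work roughly like this. This is a minimal sketch of the idea only; the names (`Connection`, `Dataflow`, `validate_before_publish`) are hypothetical and do not correspond to any real Fabric API:

```python
# Hypothetical sketch of the proposed pre-publish validation.
# All class and function names here are illustrative, not a real Fabric API.
from dataclasses import dataclass


@dataclass
class Connection:
    source: str          # identifier of the data source or destination
    authenticated: bool  # whether this user has valid credentials for it


@dataclass
class Dataflow:
    connections: list    # connections currently in use (the owner's)


def validate_before_publish(dataflow, user_connections):
    """Return the user's connections to publish with, or None if any
    required source lacks an authenticated connection. On None, the
    dataflow would keep its previous connections and the edits would
    be discarded, per the idea above."""
    required = {c.source for c in dataflow.connections}
    usable = {c.source: c for c in user_connections if c.authenticated}
    missing = required - set(usable)
    if missing:
        # Validation fails: do not swap connections, keep the old ones.
        return None
    return [usable[s] for s in required]
```

Under this sketch, publishing only goes through when every source and destination of the item is covered by one of the editing user's own (or shared) authenticated connections, so a partial edit can never break the running dataflow.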