Fabric platform
Show consumed capacity units per notebook/lakehouse/dataflow/pipeline execution in Monitoring Hub
Martin Bode on 07 Mar 2024 20:24:45
I'd love to see how many CUs (capacity-unit-seconds) were consumed by each activity listed in Monitoring Hub.
- Comments (4)
RE: Show consumed capacity units per notebook/lakehouse/dataflow/pipeline execution in Monitoring Hub
Same here. We also need real-time consumed capacity units to identify heavy workloads and move workspaces to a capacity with more CUs available. We have an F64 and an F256 capacity, and if the F64 is at around 90% I want to switch some workspaces to a capacity with more CUs available via automation, to make sure we don't exceed 100% and incur extra charges.
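A minimal sketch of the automation this comment describes: check utilization against a threshold and, if exceeded, reassign a workspace to the larger capacity using the Power BI REST API's Groups - AssignToCapacity endpoint. The 90% threshold, the workspace/capacity IDs, and the `should_move` helper are illustrative assumptions; Fabric does not currently expose a simple real-time per-capacity utilization endpoint, which is part of what this idea asks for.

```python
import json
import urllib.request

THRESHOLD = 0.90  # assumed trigger point; the comment mentions ~90% on the F64


def should_move(utilization: float, threshold: float = THRESHOLD) -> bool:
    """Decide whether a workspace should be moved off an overloaded capacity.

    `utilization` is the capacity's current CU usage as a fraction (0.0-1.0);
    how you obtain it (e.g. from capacity metrics) is outside this sketch.
    """
    return utilization >= threshold


def assign_workspace_to_capacity(token: str, workspace_id: str, capacity_id: str) -> None:
    """Reassign a workspace via the Power BI REST API (Groups - AssignToCapacity)."""
    req = urllib.request.Request(
        f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}/AssignToCapacity",
        data=json.dumps({"capacityId": capacity_id}).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req):  # raises on non-2xx responses
        pass
```

A scheduler (e.g. an Azure Function on a timer) could poll utilization, call `should_move`, and then `assign_workspace_to_capacity` with the F256's capacity ID.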
RE: Show consumed capacity units per notebook/lakehouse/dataflow/pipeline execution in Monitoring Hub
This would be really helpful. In the capacity metrics app you can see how much was used in aggregate, but I have some pipelines with dynamic options that sometimes run long and sometimes short. Right now I can't clearly identify the cost difference between the two.
RE: Show consumed capacity units per notebook/lakehouse/dataflow/pipeline execution in Monitoring Hub
Great idea! This would really help in real-time decision-making when choosing which item to use for a job (data pipeline, dataflow gen2, notebook, etc.). Likewise, it would be helpful for tuning the settings and code within an item.
RE: Show consumed capacity units per notebook/lakehouse/dataflow/pipeline execution in Monitoring Hub
I agree. We must have real-time and per-activity CU consumption (in the Monitoring Hub), and also the estimated burn-down.