Show consumed capacity units per notebook/lakehouse/dataflow/pipeline execution in Monitoring Hub


I'd love to see how many CUs (capacity-unit-seconds) were consumed by each activity listed in Monitoring Hub.

Read more...
0 Comments

STATUS DETAILS
New

Notebook code editor: colorize and highlight parenthesis/bracket pairs


Currently there is no help for developers in highlighting matching opening and closing parenthesis pairs.

PySpark code usually contains lots of parentheses; finding a missing parenthesis can be a struggle.

I'd appreciate an editing experience similar to the one in Databricks, where parenthesis pairs ha...
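To illustrate the kind of nesting meant here, a plain-Python stand-in (no Spark required; the values are made up for the example) shows how every chained call adds a parenthesis pair that pair-highlighting would make easy to track:

```python
# Typical PySpark-style nesting in miniature: four nested calls,
# four parenthesis pairs; dropping any one produces an error whose
# location is far from the missing character.
words = ["spark", "", "fabric"]
result = round(sum(map(len, filter(None, words))) / 2, 1)
print(result)  # 5.5
```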

Read more...
0 Comments

STATUS DETAILS
New

Conditional retry for ADF pipeline activities


ADF already has a feature for retrying pipeline activities on error.

The problem we face is that a retry only makes sense for transient (mostly network) failures.

But if there is an error with the data, we can't simply retry the activity since we need to fix the ...
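The requested behavior can be sketched in plain Python, a minimal sketch only: the exception classes and the `run_with_conditional_retry` helper are hypothetical names for this illustration, not part of the ADF API.

```python
import time

class TransientError(Exception):
    """A failure worth retrying, e.g. a network timeout."""

class DataError(Exception):
    """A failure in the data itself; retrying cannot help."""

def run_with_conditional_retry(activity, max_retries=3, delay=1.0):
    """Run a zero-argument callable, retrying only transient failures.

    Any non-transient exception (such as DataError) propagates
    immediately, because the data must be fixed before a rerun
    can succeed.
    """
    for attempt in range(1, max_retries + 1):
        try:
            return activity()
        except TransientError:
            if attempt == max_retries:
                raise  # exhausted retries; surface the transient error
            time.sleep(delay)  # back off before the next attempt
```

The key design point is the narrow `except` clause: only the error class marked as transient triggers a retry, which is exactly the conditionality the unconditional ADF retry setting lacks.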

Read more...
0 Comments

STATUS DETAILS
Backlog