Manjunatha MC on 15 Jan 2025 12:02:04
I have many dataflows (250) scheduled for refresh at different times, with SharePoint Online Excel/CSV files as the source. In each 30-minute interval, 10 dataflows are scheduled to refresh, and 2 or 3 of those refreshes fail with the Web data source error below.
We plan to stagger the dataflow refreshes at 5-minute intervals for each dataflow, which would reduce the concurrent load on SharePoint Online Excel/CSV so that the refresh failures do not happen. However, the smallest interval currently available for scheduled dataflow refresh is 30 minutes, so we are requesting the features below for the dataflow refresh schedule.
Features
Please add a 5-minute refresh interval option for scheduled dataflow refresh.
Please enable email notifications for on-demand dataflow refresh activity, whether the refresh is triggered through PySpark code or manually from the workspace.
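As a possible stopgap under the current 30-minute limit, an on-demand refresh can be triggered from notebook code through the Power BI REST API's dataflow refresh endpoint, which also accepts a notifyOption for failure emails. This is a minimal sketch, not a tested solution: the workspace ID, dataflow ID, bearer token, and SMTP details are all placeholders you would need to supply.

```python
# Sketch: trigger an on-demand dataflow refresh via the Power BI REST API
# and email a notice if the request is rejected. All IDs, the token, and
# the SMTP details are placeholders (assumptions, not real values).
import json
import smtplib
import urllib.request
from email.message import EmailMessage

API_BASE = "https://api.powerbi.com/v1.0/myorg"


def refresh_url(workspace_id: str, dataflow_id: str) -> str:
    """Build the refresh endpoint for a dataflow in a workspace."""
    return f"{API_BASE}/groups/{workspace_id}/dataflows/{dataflow_id}/refreshes"


def trigger_refresh(workspace_id: str, dataflow_id: str, token: str) -> bool:
    """POST a refresh request; returns True if the service accepted it."""
    req = urllib.request.Request(
        refresh_url(workspace_id, dataflow_id),
        data=json.dumps({"notifyOption": "MailOnFailure"}).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return resp.status in (200, 202)


def send_failure_mail(smtp_host: str, sender: str, recipient: str,
                      dataflow_name: str) -> None:
    """Email a simple failure notice (placeholder SMTP server and addresses)."""
    msg = EmailMessage()
    msg["Subject"] = f"Dataflow refresh failed to start: {dataflow_name}"
    msg["From"] = sender
    msg["To"] = recipient
    msg.set_content(
        f"The on-demand refresh request for '{dataflow_name}' was not accepted."
    )
    with smtplib.SMTP(smtp_host) as server:
        server.send_message(msg)
```

A scheduler (or a notebook run on a cadence) could call trigger_refresh for a different subset of dataflows every 5 minutes, approximating the staggering requested above, and call send_failure_mail when the request is not accepted.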
Error:
custom_dealsinfoissue_WriteToDataDestination: There was a problem refreshing the dataflow: 'Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: DataSource.Error: Web: Request failed: https://microsoft.sharepoint.com/teams/SalesBIDP/SBICustomMap/Maintained by Surface/One Time Load/custom_dealsinfoissue_csv/_api/contextinfo Details: Reason = DataSource.Error;ErrorCode = 10122;DataSourceKind = Web;DataSourcePath = https://microsoft.sharepoint.com/teams/SalesBIDP/SBICustomMap/Maintained%20by%20Surface/One%20Time%20Load/custom_dealsinfoissue_csv/_api/contextinfo
Administrator on 24 Jan 2025 02:09:45
We recommend leveraging Data Pipelines for the orchestration you're aiming for. They give you the most control today over the emails you can receive on failures and successes.
We'll keep this idea open and monitor demand for it, but we highly encourage you to try building this logic, and extending it to fit any future requirements, through the orchestration capabilities that Data Pipelines offer.