Thibauld Croonenborghs on 25 Sep 2024 09:16:16
As is now the case for notebooks in Data Factory pipelines in Fabric, I would like to see Spark session sharing among Spark job definition executions in a pipeline.
Comments (1)
RE: Re-use Spark session across Spark job definition executions in a pipeline
High concurrency mode for Spark job definitions in ADF, same as for notebooks: https://blog.fabric.microsoft.com/en-US/blog/introducing-high-concurrency-mode-for-notebooks-in-pipelines-for-fabric-spark/