Julian Gimbel on 04 Jan 2024 09:15:44
For Synapse there is a tutorial on how to connect to a dedicated SQL pool from Spark in a notebook.
Having this for Fabric would be awesome.
Ideally this would even support running T-SQL via a `%%tsql` notebook cell magic, and assigning the result of that query to a Spark temp table.
It should use the identity of either the user or the managed identity, so that no passwords need to be handled.
Why would you want this? Because complex T-SQL queries have already been written, and rewriting them in Spark SQL or PySpark is not an easy task.
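For comparison, this is roughly how the Synapse tutorial mentioned above reads from a dedicated SQL pool today, using the dedicated SQL pool connector with AAD-based auth. A minimal sketch, assuming a Synapse Spark notebook session (`spark` is the ambient session; the pool, schema, and table names are illustrative only):

```python
# Read a dedicated SQL pool table into a Spark DataFrame.
# Auth uses the notebook user's AAD identity (or the workspace
# managed identity), so no passwords are handled.
df = spark.read.synapsesql("my_pool.dbo.my_table")

# Expose it to Spark SQL as a temp view, as the request describes.
df.createOrReplaceTempView("my_table")
spark.sql("SELECT * FROM my_table LIMIT 100").show()
```

Note the difference from the `%%tsql` idea: here the query runs on the Spark side after the table is pulled over, whereas a T-SQL magic would push the complex query down to the SQL engine and only materialize the result as a temp table.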
- Comments (2)
RE: TSQL in Spark Notebooks
Looking for a notebook cell like:

```sql
%%tsql
select top 100 * from lakehouse.dbo.table
```
RE: TSQL in Spark Notebooks
Link to Synapse Docs: https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/synapse-spark-sql-pool-import-export