Question:
I wanted to know how we can run a stored procedure in a Spark pool (Azure Synapse) which I have created in a dedicated SQL pool. Also, can we run SQL queries in a notebook to access data in the dedicated SQL pool?

Answer:
It is possible to do this (e.g. using an ODBC connection as described here), but you would be better off using a Synapse Pipeline to do the orchestration:

- run a Stored Procedure activity which places the data you want to work with in a relevant table or storage account
- call a Notebook activity which reads that data using the `spark.read.synapsesql` method, as described in detail here.
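The notebook side of that pattern might look like the sketch below. This assumes a Spark 3 pool in the same Synapse workspace as the dedicated SQL pool; the pool, schema, and table names are hypothetical, and inside a Synapse notebook the `spark` session is supplied by the runtime.

```python
# Hedged sketch: how the Notebook activity might read data that the
# Stored Procedure activity staged into a dedicated SQL pool table.

def three_part_name(database: str, schema: str, table: str) -> str:
    # synapsesql expects a <database>.<schema>.<table> identifier
    return f"{database}.{schema}.{table}"

def read_staged_data(spark,
                     database: str = "MyDedicatedPool",
                     schema: str = "dbo",
                     table: str = "StagedData"):
    # The Synapse dedicated SQL pool connector; on Spark 3 pools this
    # is callable from Python as well as Scala.
    return spark.read.synapsesql(three_part_name(database, schema, table))
```

Pass the notebook's built-in `spark` session in: `df = read_staged_data(spark)`.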
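For completeness, the ODBC route mentioned above could be sketched roughly as follows. The connection string, server, and procedure names here are all hypothetical placeholders; authentication details depend on your environment.

```python
# Hedged sketch: calling a dedicated SQL pool stored procedure directly
# from a notebook over ODBC with pyodbc (a sketch, not the recommended
# pipeline approach above).

def exec_statement(proc_name: str) -> str:
    # T-SQL to execute a stored procedure by name
    return f"EXEC {proc_name}"

def run_stored_proc(conn_str: str, proc_name: str) -> None:
    # Deferred import so the helper above works even where pyodbc
    # is not installed.
    import pyodbc

    # autocommit so the procedure's DML is not left in an open transaction
    with pyodbc.connect(conn_str, autocommit=True) as conn:
        conn.execute(exec_statement(proc_name))
```

Usage would be along the lines of `run_stored_proc("DRIVER={ODBC Driver 17 for SQL Server};SERVER=myworkspace.sql.azuresynapse.net;...", "dbo.usp_StageData")`, with the connection string completed for your workspace.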
Is there a particular reason you are copying existing data from the SQL pool into Spark? I use a very similar pattern, but reserve it for things I can't already do in SQL, such as sophisticated transforms, regular expressions, heavy maths, complex string manipulation, etc.