Unable to write from a Spark pool to a SQL pool in Azure Synapse

Problem description

I have a table in a Spark pool that I need to load into a dedicated SQL pool in Azure Synapse. Below is the code I wrote, but the load fails.

%%pyspark
# Arrow speeds up conversions between pandas and Spark DataFrames.
spark.conf.set("spark.sql.execution.arrow.pyspark.enabled", "true")
new_df = spark.createDataFrame(segmentation_output)
# Persist the DataFrame as a managed table in the default database.
new_df.write.mode("overwrite").saveAsTable("default.segmentation_output")
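
As a quick sanity check (a minimal sketch, reusing the table name from the cell above), you can confirm the managed table actually landed before handing it over to the Scala cell:

%%pyspark
# Verification cell: confirm the saved table exists and has rows.
spark.sql("select count(*) as row_count from default.segmentation_output").show()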


%%pyspark
# Expose the DataFrame to other language cells in this session as a temp view.
new_df.createOrReplaceTempView("pysparkdftemptable")


%%spark
// Imports required by the Azure Synapse SQL connector (Spark 2.4 line).
import com.microsoft.spark.sqlanalytics.utils.Constants
import org.apache.spark.sql.SqlAnalyticsConnector._
val scala_df = spark.sqlContext.sql("select * from pysparkdftemptable")
scala_df.write.synapsesql("eana.bi.xim_CustomerSegment", Constants.INTERNAL)



Error:
StructuredStream-spark package version: 2.4.5-1.3.1
StructuredStream-spark package version: 2.4.5-1.3.1
StructuredStream-spark package version: 2.4.5-1.3.1
com.microsoft.sqlserver.jdbc.SQLServerException: The TCP/IP connection to the host eec8e890e9d5--0.tr624.northeurope1-a.worker.database.windows.net (redirected from emna-dv-ibsanalytics-wco-id-euno-sqs.database.windows.net), port 11030 has failed. Error: "connect timed out. Verify the connection properties. Make sure that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure that TCP connections to the port are not blocked by a firewall.".
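
The exception says the Spark pool could not open a TCP connection to the redirect endpoint of the dedicated SQL pool (Azure SQL redirect connections use ports in the 11000-11999 range, port 11030 here), which suggests a firewall or virtual-network rule rather than a bug in the cells above. A minimal diagnostic sketch, using the host and port copied from the error message (substitute your own values), to test reachability from a notebook cell:

%%pyspark
# Diagnostic sketch: probe the redirect host/port reported in the exception.
# Host and port are taken from the error message above; replace with your own.
import socket

host = "eec8e890e9d5--0.tr624.northeurope1-a.worker.database.windows.net"
port = 11030

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.settimeout(10)
try:
    sock.connect((host, port))
    print(f"TCP connection to {host}:{port} succeeded")
except OSError as exc:
    print(f"TCP connection to {host}:{port} failed: {exc}")
finally:
    sock.close()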

Solution

No effective solution to this problem has been found yet.
