pyspark sql jdbc error - Incorrect syntax near the keyword 'SET'

Date: 2018-09-07 11:25:11

Tags: sql sql-server jdbc pyspark connection

Below is a sample of the SQL I am trying to run, but it gives me the error: ***.jdbc.SQLServerException: Incorrect syntax near the keyword 'SET'

tp_temp = """
SET NOCOUNT ON
declare @lp_dt date = (select max(pt_dt) from [DB_TABLE].[db].[DATA] where p_type='WSD')
declare @L_dte table (pt_dt date, pt_type char(1))

insert into @L_dte  SELECT pt_dt, pt_type
  FROM [DB_TABLE].[db].[DATA]
  where (PT_Type = 'Z' and Month(pt_dt) in (1,4,7,10) and pt_dt > '2017 Sep 1') or (PT_Type = 'K' and pt_dt = @lp_dt )
group by pt_dt, PT_Type
order by pt_dt

SELECT z_id_alt, z_id, z_Status_Alt, c_Gp_CCD, Gender, Age, ctk as CTK_Red, CAT_Title, temp_Group, Comp_price, terminator_action_alt AS Terminator_Action, v_pt.PT_Dt,CASE 
    WHEN COALESCE([Acb_Indicator], '') = ''
        THEN ('No')
    ELSE ('Yes')
    END AS  [Is Acb],
                     COALESCE(nullif([Region_Desc_NEWCO_Original], ''), Region_Desc_NEWCO) AS Region_Desc_NEWCO,
                     COALESCE(nullif([POG3_Desc_Original], ''), POG3_Desc) AS  POG3_Desc,
                      COALESCE(nullif([POG9_Desc_Original], ''), POG9_Desc) AS POG9_Desc, COALESCE(nullif(BSD_Function, ''), OrgUnit_Function_L2_NEWCO) as OrgUnit_Function_L2_NEWCO, CASE 
    WHEN COALESCE(nullif([POG3_Desc_Original], ''), POG3_Desc) IN (
            SELECT [POG_Desc]
            FROM [DB_FREE].[dbz].[Free_dom]
            WHERE POG_Type = 'POG3'
            )
        OR COALESCE(nullif([POG9_Desc_Original], ''), POG9_Desc) IN (
            SELECT [POG_Desc]
            FROM [DB_FREE].[dbz].[Free_dom]
            WHERE POG_Type = 'POG9'
            )
        THEN 'Yes'
    ELSE 'No'
    END AS [Restricted_Home], Zero AS Booring
    FROM   (SELECT * FROM [DB_FREE].[dbz].[Free_dom] WHERE noshow = 'N' AND E_Type <> 'SubConsole') as v_pt
    INNER JOIN @L_dte as pt_dt on v_pt.pt_dt = pt_dt.pt_dt and v_pt.pt_type = pt_dt.pt_type
    LEFT OUTER JOIN [DB_TABLE].[db].[DATA] AS o_p ON o_p.[u_number] = v_pt.ete_idg """

df_temp = spark.read.jdbc(url=jdbcUrl, table=tp_temp, properties=connectionProperties)
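A note on why this fails: Spark's JDBC reader treats the `table` argument as a relation, not a script, and embeds it inside a `FROM (...)` wrapper before sending it to the database. The sketch below illustrates the wrapping (the exact wrapper text and alias name are assumptions for illustration, not Spark's literal internals); a multi-statement batch beginning with `SET` or `DECLARE` cannot appear inside a subquery, which is why SQL Server rejects it.

```python
# Illustrative sketch (assumption): the JDBC data source wraps whatever is
# passed as `table` roughly like this before querying the database.
def wrap_as_jdbc_relation(table_expr, alias="SPARK_GEN_SUBQ_0"):
    # The alias name here is hypothetical; the point is that the expression
    # always ends up inside a FROM clause.
    return "SELECT * FROM ({}) {}".format(table_expr, alias)

# A single SELECT is a valid derived table, so it survives the wrapping:
print(wrap_as_jdbc_relation("SELECT 1 AS x"))

# A batch starting with SET is not valid inside FROM (...), so SQL Server
# raises "Incorrect syntax near the keyword 'SET'":
print(wrap_as_jdbc_relation("SET NOCOUNT ON; SELECT 1 AS x"))
```

This is also why passing a bare table name or a parenthesized `(SELECT ...) AS alias` string both work: each is a legal relation inside a `FROM` clause.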

It works for simple SQL that starts with a SELECT command, but for code that starts with SET or WITH it raises this error. Please let me know how I can load the result of this SQL code into a PySpark dataframe.
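One workaround consistent with the constraint above: fold the `DECLARE`/`INSERT` steps into derived tables so the whole thing becomes a single `SELECT`, then pass it as a parenthesized subquery with an alias. The sketch below shows the shape using table and column names from the question (the full column list is elided for brevity, and this has not been run against the real database, so treat it as a template, not a drop-in fix).

```python
# Hypothetical single-statement rewrite of the batch in the question:
# - @lp_dt becomes an inline scalar subquery
# - the @L_dte table variable becomes a derived table in the join
single_select = """
(SELECT v_pt.*
 FROM (SELECT * FROM [DB_FREE].[dbz].[Free_dom]
       WHERE noshow = 'N' AND E_Type <> 'SubConsole') AS v_pt
 INNER JOIN (SELECT pt_dt, pt_type
             FROM [DB_TABLE].[db].[DATA]
             WHERE (PT_Type = 'Z'
                    AND MONTH(pt_dt) IN (1, 4, 7, 10)
                    AND pt_dt > '2017-09-01')
                OR (PT_Type = 'K'
                    AND pt_dt = (SELECT MAX(pt_dt)
                                 FROM [DB_TABLE].[db].[DATA]
                                 WHERE p_type = 'WSD'))
             GROUP BY pt_dt, pt_type) AS pt_dt
     ON v_pt.pt_dt = pt_dt.pt_dt AND v_pt.pt_type = pt_dt.pt_type
) AS tp_temp
"""

# Because the string is already "(query) AS alias", it can go straight into
# the `table` parameter (uncomment in a real Spark session):
# df_temp = spark.read.jdbc(url=jdbcUrl, table=single_select,
#                           properties=connectionProperties)
```

The `SELECT` list and the `LEFT OUTER JOIN` to `[DB_TABLE].[db].[DATA]` from the original query would be restored inside the parentheses the same way; the only hard requirement is that the final string contains exactly one statement.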

0 Answers:

There are no answers.