Joining multiple DataFrames and concatenating columns

Date: 2017-02-03 10:24:25

Tags: scala apache-spark

I have multiple DataFrames that I need to join, and I need to concatenate the address and zip columns based on the join conditions. I actually have a SQL query that I need to convert into DataFrame joins. I wrote a UDF that works fine for concatenating multiple columns into a single one:

val getConcatenated = udf((first: String, second: String, third: String, fourth: String, five: String, six: String) => {
  first + "," + second + "," + third + "," + fourth + "," + five + "," + six
})
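For context, a minimal sketch of how a UDF like this is typically applied; df and the c1 … c6 column names here are hypothetical stand-ins, not from the question:

import org.apache.spark.sql.functions.col

// hypothetical columns c1..c6; note that a null input becomes the literal
// text "null" under String +, a common source of odd concatenated values
val withAddress = df.withColumn("AtAddress",
  getConcatenated(col("c1"), col("c2"), col("c3"), col("c4"), col("c5"), col("c6")))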

The MySQL query:

SELECT
  CONCAT(al.Address1, ',', al.Address2, ',', al.Zip) AS AtAddress,
  CONCAT(rl.Address1, ',', rl.Address2, ',', rl.Zip) AS RtAddress,
  CONCAT(d.Address1, ',', d.Address2, ',', d.Zip) AS DAddress,
  CONCAT(s.Address1, ',', s.Address2, ',', s.Zip) AS SAddress,
  CONCAT(vl.Address1, ',', vl.Address2, ',', vl.Zip) AS VAddress,
  CONCAT(sg.Address1, ',', sg.Address2, ',', sg.Zip) AS SGAddress
FROM si s
INNER JOIN at a ON s.cid = a.cid AND s.sid = a.sid
INNER JOIN De d ON s.cid = d.cid AND d.aid = a.aid
INNER JOIN SGrpM sgm ON s.cid = sgm.cid AND s.sid = sgm.sid AND sgm.status = 1
INNER JOIN SeG sg ON sgm.cid = sg.cid AND sgm.gid = sg.gid
INNER JOIN bd bu ON s.cid = bu.cid AND s.sid = bu.sid
INNER JOIN locas al ON a.ALId = al.lid
INNER JOIN locas rl ON a.RLId = rl.lid
INNER JOIN locas vl ON a.VLId = vl.lid

I am running into a problem when joining the DataFrames: the join gives me null values.

val DS = DS_SI
  .join(at, Seq("cid", "sid"), "inner")
  .join(DS_DE, Seq("cid", "aid"), "inner")
  .join(DS_SGrpM, Seq("cid", "sid"), "inner")
  .join(DS_SG, Seq("cid", "gid"), "inner")
  .join(DS_BD, Seq("cid", "sid"), "inner")
  .join(DS_LOCAS, at("ALId") <=> DS_LOCAS("lid") && at("RLId") <=> DS_LOCAS("lid") && at("VLId") <=> DS_LOCAS("lid"), "inner")
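Worth noting: the SQL above joins locas three times under the aliases al, rl, and vl, whereas this code AND-s all three conditions against a single copy, which can only match rows where ALId, RLId, and VLId all point to the same lid. A minimal sketch of the aliased form (an assumption about the intended logic, not code from the question):

import org.apache.spark.sql.functions.col

// one aliased copy of locas per role, mirroring the SQL's al/rl/vl self-joins
val al = DS_LOCAS.as("al")
val rl = DS_LOCAS.as("rl")
val vl = DS_LOCAS.as("vl")

val withLocations = DS_SI
  .join(at, Seq("cid", "sid"), "inner")
  .join(al, at("ALId") === col("al.lid"), "inner")
  .join(rl, at("RLId") === col("rl.lid"), "inner")
  .join(vl, at("VLId") === col("vl.lid"), "inner")

Resolving the location columns through their alias (col("al.lid")) keeps the three copies of the same DataFrame distinguishable in the join conditions.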

I am trying to join my DataFrames as above, but it does not give the correct result. I then want to build the concatenated address columns with .withColumn("AtAddress", getConcatenated(...)).withColumn("RtAddress", getConcatenated(...)) and so on.

Can anyone tell me how to achieve this efficiently? Am I joining the DataFrames correctly, or is there a better way?

1 Answer:

Answer 0 (score: 0)

You can use concat_ws(separator, columns_to_concat):

Example:

import org.apache.spark.sql.functions._

df.withColumn("title", concat_ws(", ", DS_DE("Address1"), DS_DE("Address2"), DS_DE("Zip")))
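One behaviour worth knowing about concat_ws (general Spark semantics, not specific to this answer): it silently skips null inputs, whereas the + based UDF above turns a null into the literal text "null". A self-contained sketch with made-up data:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, concat_ws}

val spark = SparkSession.builder.master("local[*]").getOrCreate()
import spark.implicits._

// hypothetical data: Address2 is null in the second row
val addr = Seq(("1 Main St", "Apt 4", "10001"), ("2 Oak Ave", null, "10002"))
  .toDF("Address1", "Address2", "Zip")

// null parts are dropped, so the second row yields "2 Oak Ave,10002"
addr.withColumn("AtAddress",
  concat_ws(",", col("Address1"), col("Address2"), col("Zip"))).show(false)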