Computing the standard deviation per row in Spark

Date: 2018-04-28 13:33:23

Tags: python apache-spark

I need to compute the standard deviation for each row, given that I already have a column holding each row's mean. I tried this:

SD= (reduce(sqrt((add, (abs(col(x)-col("mean"))**2 for x in df.columns[3:])) / n))).alias("SD")
dfS = df.withColumn("SD",SD)
dfS.select("stddev").show()

but I got the following error:

AttributeError: 'builtin_function_or_method' object has no attribute '_get_object_id'

1 Answer:

Answer 0 (score: 0)

Your code is thoroughly mixed up (in its current state it wouldn't even raise the exception you describe in the question). `sqrt` should be placed outside the `reduce` call:
from pyspark.sql.functions import col, sqrt
from operator import add
from functools import reduce

df = spark.createDataFrame(
    [("_", "_", 2, 1, 2, 3)], ("_1", "_2", "mean", "_4", "_5", "_6")
)
cols = df.columns[3:]

sd = sqrt(
    reduce(add, ((col(x) - col("mean")) ** 2 for x in cols)) / (len(cols) - 1)
)

sd
# Column<b'SQRT((((POWER((_4 - mean), 2) + POWER((_5 - mean), 2)) + POWER((_6 - mean), 2)) / 2))'>
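The `reduce(add, …)` call simply folds the per-column squared deviations into one sum, and dividing by `len(cols) - 1` gives the sample variance. As a plain-Python sketch of the same arithmetic (no Spark needed; the `row` dict mirrors the single example row above):

```python
from functools import reduce
from operator import add
from math import sqrt

# One row of the example DataFrame, as a plain dict (illustrative only).
row = {"_4": 1, "_5": 2, "_6": 3, "mean": 2}
cols = ["_4", "_5", "_6"]

# Sum of squared deviations from the row mean, folded with reduce/add,
# then divided by (n - 1) for the sample variance, then square-rooted.
ssd = reduce(add, ((row[x] - row["mean"]) ** 2 for x in cols))
sd = sqrt(ssd / (len(cols) - 1))
print(sd)  # 1.0
```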


df.withColumn("sd", sd).show()
# +---+---+----+---+---+---+---+         
# | _1| _2|mean| _4| _5| _6| sd|
# +---+---+----+---+---+---+---+
# |  _|  _|   2|  1|  2|  3|1.0|
# +---+---+----+---+---+---+---+
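As a quick sanity check (plain Python, not Spark), the standard library's sample standard deviation of the row values `1, 2, 3` matches the `sd` column above:

```python
from statistics import stdev

# statistics.stdev computes the sample (n - 1) standard deviation,
# the same estimator used in the Spark expression.
print(stdev([1, 2, 3]))  # 1.0
```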