Group by inside a subquery

Asked: 2017-03-23 16:59:56

Tags: sql group-by apache-spark-sql

I'm using SQL in Databricks. The following query works, but I'd also like to group by a column called sale_id. How can I do that?

%sql
select
  (select count(distinct time)
   from table
   where sign_up > 0)
  /
  (select count(distinct time)
   from table
   where action > 0 or click > 0)
  as cc3

1 Answer:

Answer 0 (score: 2)

Write the query using conditional aggregation:

select (count(distinct case when sign_up > 0 then time end) /
        count(distinct case when action > 0 or click > 0 then time end)
       ) as cc3
from table;

The case expression returns NULL for rows that don't match the condition, and count(distinct ...) ignores NULLs, so each subquery collapses into a single conditional count. Adding a group by is then straightforward:

select col,
       (count(distinct case when sign_up > 0 then time end) /
        count(distinct case when action > 0 or click > 0 then time end)
       ) as cc3
from table
group by col;
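
Applied to the question, col would be the sale_id column. A minimal sketch using the table and column names from the question (table, sale_id, time, sign_up, action, click):

%sql
-- Ratio of distinct sign-up times to distinct action/click times, per sale_id
select sale_id,
       (count(distinct case when sign_up > 0 then time end) /
        count(distinct case when action > 0 or click > 0 then time end)
       ) as cc3
from table
group by sale_id;

Note that in Spark SQL the / operator performs floating-point division even on integer operands, so the ratio is not truncated to an integer.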