Problem
I created a DataFrame with the following code:
df = spark.createDataFrame([("A","20"),("B","30"),("D","80"),("A","120"),("c","20"),("Null","20")],["Let","Num"])
df.show()
+----+---+
| Let|Num|
+----+---+
|   A| 20|
|   B| 30|
|   D| 80|
|   A|120|
|   c| 20|
|Null| 20|
+----+---+
I want to produce the following DataFrame:
+----+------+
| Let|   Num|
+----+------+
|   A|20,120|
|   B|    30|
|   D|    80|
|   c|    20|
|Null|    20|
+----+------+
How can I achieve this?
Solution
You can groupBy on Let and aggregate Num into a list with collect_list:
from pyspark.sql import functions as F
df.groupBy("Let").agg(F.collect_list("Num")).show()
The output as a list:
+----+-----------------+
| Let|collect_list(Num)|
+----+-----------------+
|   B|             [30]|
|   D|             [80]|
|   A|        [20, 120]|
|   c|             [20]|
|Null|             [20]|
+----+-----------------+
df.groupBy("Let").agg(F.concat_ws(",",F.collect_list("Num"))).show()
The output as a string:
+----+-------------------------------+
| Let|concat_ws(,, collect_list(Num))|
+----+-------------------------------+
|   B|                             30|
|   D|                             80|
|   A|                         20,120|
|   c|                             20|
|Null|                             20|
+----+-------------------------------+