Spark DataFrame equivalent of pandas.DataFrame.set_index / drop_duplicates vs. dropDuplicates

Updated: 2024-10-28 14:22:32


The drop-duplicates method of Spark DataFrames is not working, and I think it is because the index column, which was part of my dataset, is being treated as a column of data. There are definitely duplicates in there; I checked by comparing COUNT() and COUNT(DISTINCT()) on all the columns except the index. I'm new to Spark DataFrames, but if I were using pandas, at this point I would call pandas.DataFrame.set_index on that column.

Does anyone know how to handle this situation?

Secondly, there appear to be two methods on a Spark DataFrame, drop_duplicates and dropDuplicates. Are they the same?
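For reference, the pandas workflow the question alludes to would look something like this (the column names here are made up for illustration; 'p_index' stands in for the dataset's index column):

```python
import pandas as pd

# Toy frame with an explicit index column; once 'p_index' is ignored,
# the first two rows are duplicates of each other.
df = pd.DataFrame({
    'p_index': [1, 2, 3],
    'name': ['alice', 'alice', 'bob'],
    'age': [30, 30, 25],
})

# Move the index column out of the data columns, then deduplicate the rest.
# drop_duplicates keeps the first row of each duplicate group.
deduped = df.set_index('p_index').drop_duplicates()
```

After set_index, the index column no longer participates in the duplicate check, which is exactly the behavior the question is trying to reproduce in Spark.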

Accepted answer


If you don't want the index column to be considered when checking for distinct records, you can drop the column with the command below, or select only the columns you need.

df = df.drop('p_index')        # pass the column name to be dropped
df = df.select('name', 'age')  # or pass only the required columns

drop_duplicates() is an alias for dropDuplicates().

https://spark.apache.org/docs/latest/api/python/pyspark.sql.html#pyspark.sql.DataFrame.dropDuplicates
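Note that dropDuplicates also accepts an optional subset of column names, which deduplicates on just those columns without dropping the index column at all. pandas.DataFrame.drop_duplicates takes the same subset parameter, so the idea can be sketched there (column names again hypothetical; the Spark call would be df.dropDuplicates(['name', 'age'])):

```python
import pandas as pd

df = pd.DataFrame({
    'p_index': [1, 2, 3],          # stand-in for the dataset's index column
    'name': ['alice', 'alice', 'bob'],
    'age': [30, 30, 25],
})

# Deduplicate on the data columns only; the index column survives in the result.
deduped = df.drop_duplicates(subset=['name', 'age'])
```

This keeps one row per distinct (name, age) pair while preserving the index values of the retained rows.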

Published: 2023-07-28 23:20:00