Welcome to the OStack Knowledge Sharing Community for programmers and developers: Open, Learn and Share
Welcome to ask questions or share your answers with others


0 votes
934 views
in Technique by (71.8m points)

apache spark - How to use the PySpark CountVectorizer on columns that may be null

I have a column in my Spark DataFrame:

 |-- topics_A: array (nullable = true)
 |    |-- element: string (containsNull = true)

I'm using CountVectorizer on it:

topic_vectorizer_A = CountVectorizer(inputCol="topics_A", outputCol="topics_vec_A")

I get NullPointerExceptions because sometimes the topics_A column contains null.

Is there a way around this? Filling it with a zero-length array would work fine (although it will increase the data size quite a lot), but I can't work out how to do a fillna on an array column in PySpark.


1 Answer

0 votes
by (71.8m points)

Personally, I would drop rows with NULL values because there is no useful information there, but you can replace the nulls with empty arrays. First, some imports:

from pyspark.sql.functions import when, col, coalesce, array

You can define an empty array of a specific type as:

fill = array().cast("array<string>")

and combine it with a when clause:

topics_a = when(col("topics_A").isNull(), fill).otherwise(col("topics_A"))

or coalesce:

topics_a = coalesce(col("topics_A"), fill)

and use it as:

df.withColumn("topics_A", topics_a)

so with example data:

df = sc.parallelize([(1, ["a", "b"]), (2, None)]).toDF(["id", "topics_A"])

df_ = df.withColumn("topics_A", topics_a)
topic_vectorizer_A.fit(df_).transform(df_)

the result will be:

+---+--------+-------------------+
| id|topics_A|       topics_vec_A|
+---+--------+-------------------+
|  1|  [a, b]|(2,[0,1],[1.0,1.0])|
|  2|      []|          (2,[],[])|
+---+--------+-------------------+


...