I have a table with an array-type column named writer, which holds values like array[value1, value2], array[value2, value3], etc. I am doing a self join to get rows whose arrays have common values. I tried:
sqlContext.sql("SELECT R2.writer FROM table R1 JOIN table R2 ON R1.id != R2.id WHERE ARRAY_INTERSECTION(R1.writer, R2.writer)[0] is not null ")
And
sqlContext.sql("SELECT R2.writer FROM table R1 JOIN table R2 ON R1.id != R2.id WHERE ARRAY_INTERSECT(R1.writer, R2.writer)[0] is not null ")
Both fail with the same exception:
Exception in thread "main" org.apache.spark.sql.AnalysisException:
Undefined function: 'ARRAY_INTERSECT'. This function is neither a
registered temporary function nor a permanent function registered in
the database 'default'.; line 1 pos 80
It seems Spark SQL supports neither ARRAY_INTERSECTION nor ARRAY_INTERSECT. How can I achieve my goal in Spark SQL?
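For reference, the check the query needs is plain set intersection, which a UDF could wrap on Spark versions that lack a built-in (this is a sketch under that assumption; `writers_overlap` and `WRITERS_OVERLAP` are hypothetical names, not Spark APIs):

```python
# Core overlap check a UDF would wrap: True when the two
# writer arrays share at least one common value.
def writers_overlap(a, b):
    # Guard against NULL arrays produced by the join.
    if a is None or b is None:
        return False
    return len(set(a) & set(b)) > 0

# With a SQLContext available (PySpark), the function could be
# registered and used in place of ARRAY_INTERSECT, e.g.:
#   from pyspark.sql.types import BooleanType
#   sqlContext.registerFunction("WRITERS_OVERLAP", writers_overlap, BooleanType())
#   sqlContext.sql("SELECT R2.writer FROM table R1 JOIN table R2 "
#                  "ON R1.id != R2.id "
#                  "WHERE WRITERS_OVERLAP(R1.writer, R2.writer)")

print(writers_overlap(["value1", "value2"], ["value2", "value3"]))  # True
print(writers_overlap(["value1"], ["value3"]))                      # False
```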