<p>Alternatively, you can run plain SQL through Spark SQL:</p>
<pre class="lang-py prettyprint-override"><code>from pyspark.sql import SparkSession

spark = SparkSession \
    .builder \
    .master('local[*]') \
    .appName('Test') \
    .getOrCreate()

# This assumes the DataFrame has already been registered as a
# temporary view, e.g. df.createOrReplaceTempView('data_cooccur').
spark.sql("""
    SELECT driver
         , also_item
         , unit_count
         , ROW_NUMBER() OVER (PARTITION BY driver ORDER BY unit_count DESC) AS rowNum
    FROM data_cooccur
""").show()
</code></pre>
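<p>If it helps to see what <code>ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...)</code> actually computes, here is a minimal plain-Python sketch of the same semantics, using made-up sample rows (the column values are illustrative, not from the question's data): rows are grouped by <code>driver</code>, each group is sorted by <code>unit_count</code> descending, and rows are numbered <code>1, 2, ...</code> within each group.</p>
<pre class="lang-py prettyprint-override"><code>from itertools import groupby
from operator import itemgetter

# Hypothetical sample rows standing in for the data_cooccur table.
rows = [
    {"driver": "a", "also_item": "x", "unit_count": 10},
    {"driver": "a", "also_item": "y", "unit_count": 30},
    {"driver": "b", "also_item": "x", "unit_count": 5},
]

# Sort by the partition key, then by unit_count descending (the ORDER BY).
rows.sort(key=lambda r: (r["driver"], -r["unit_count"]))

# Number the rows 1, 2, ... within each driver partition.
ranked = []
for driver, group in groupby(rows, key=itemgetter("driver")):
    for i, row in enumerate(group, start=1):
        ranked.append({**row, "rowNum": i})

for r in ranked:
    print(r)
</code></pre>
<p>With <code>rowNum</code> available, keeping only the top row per driver is just a filter on <code>rowNum = 1</code>, which is the usual follow-up to this window function.</p>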