<p>You could try something like this -</p>
<pre><code>import pyspark.sql.functions as F
# Split "value" on "/" and keep elements 4 through the second-to-last
# (the SQL slice below is 1-based, equivalent to Python's x[3:-1])
df_updated = df.withColumn(
    "new_value",
    F.expr("slice(split(value, '/'), 4, size(split(value, '/')) - 4)")
)
</code></pre>
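<p>If the slicing logic is unclear, here is a plain-Python sketch of the same split-and-slice idea on a hypothetical sample string (the column value <code>"a/b/c/d/e/f"</code> is just an illustration):</p>

```python
# Plain-Python illustration of splitting on "/" and taking x[3:-1]
value = "a/b/c/d/e/f"      # hypothetical sample value
parts = value.split("/")   # ['a', 'b', 'c', 'd', 'e', 'f']
result = parts[3:-1]       # drop the first three elements and the last
print(result)              # ['d', 'e']
```

<p>The Spark expression mirrors this: SQL <code>slice</code> is 1-based, so start 4 with length <code>size - 4</code> reproduces Python's <code>[3:-1]</code>.</p>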
<p>For further reference, see <a href="https://stackoverflow.com/questions/39235704/split-spark-dataframe-string-column-into-multiple-columns">here</a>.</p>