<p>I am using spark-2.1.0-bin-hadoop2.7, Scala 2.11.8 and Python 3.5.</p>
<p>For the Spark Mongo connector, I am using "mongo-spark-connector_2.11-2.0.0.jar" from <a href="http://repo1.maven.org/maven2/org/mongodb/spark/mongo-spark-connector_2.11/2.0.0/" rel="nofollow noreferrer">http://repo1.maven.org/maven2/org/mongodb/spark/mongo-spark-connector_2.11/2.0.0/</a>.</p>
<p>Now, after placing the jar on the ../spark-2.1.0-bin-hadoop2.7/jars path, I start ./pyspark as shown on the official MongoDB docs site <a href="https://docs.mongodb.com/spark-connector/master/python-api/#tutorials" rel="nofollow noreferrer">https://docs.mongodb.com/spark-connector/master/python-api/#tutorials</a>:</p>
<pre><code>./bin/pyspark --conf "spark.mongodb.input.uri=mongodb://127.0.0.1/test.myCollection?readPreference=primaryPreferred"
--conf "spark.mongodb.output.uri=mongodb://127.0.0.1/test.myCollection" \
--packages org.mongodb.spark:mongo-spark-connector_2.11:2.0.0
</code></pre>
<p>Everything starts fine, and my SparkSession object is <code>spark</code>.</p>
<p>Now, when I try to load a collection into a DataFrame:</p>
<pre><code>df = spark.read.format("com.mongodb.spark.sql.DefaultSource").load()
</code></pre>
<p>I get this error:</p>
<pre><code>py4j.protocol.Py4JJavaError: An error occurred while calling o36.load.
: java.lang.NoClassDefFoundError: com/mongodb/ConnectionString
at com.mongodb.spark.config.MongoCompanionConfig$$anonfun$4.apply(MongoCompanionConfig.scala:278)
at com.mongodb.spark.config.MongoCompanionConfig$$anonfun$4.apply(MongoCompanionConfig.scala:278)
at scala.util.Try$.apply(Try.scala:192)
at com.mongodb.spark.config.MongoCompanionConfig$class.connectionString(MongoCompanionConfig.scala:278)
at com.mongodb.spark.config.ReadConfig$.connectionString(ReadConfig.scala:39)
at com.mongodb.spark.config.ReadConfig$.apply(ReadConfig.scala:51)
at com.mongodb.spark.config.ReadConfig$.apply(ReadConfig.scala:39)
at com.mongodb.spark.config.MongoCompanionConfig$class.apply(MongoCompanionConfig.scala:124)
at com.mongodb.spark.config.ReadConfig$.apply(ReadConfig.scala:39)
at com.mongodb.spark.config.MongoCompanionConfig$class.apply(MongoCompanionConfig.scala:113)
at com.mongodb.spark.config.ReadConfig$.apply(ReadConfig.scala:39)
at com.mongodb.spark.sql.DefaultSource.createRelation(DefaultSource.scala:67)
at com.mongodb.spark.sql.DefaultSource.createRelation(DefaultSource.scala:50)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:330)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:152)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:125)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:280)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:214)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: com.mongodb.ConnectionString
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 27 more
</code></pre>
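<p>To double-check the classpath, the missing class can be probed from the pyspark shell through py4j. This is only a diagnostic sketch: it assumes the shell's <code>spark</code> session and uses <code>java.lang.Class.forName</code> via the py4j gateway into the driver JVM; it is not part of the connector API:</p>
<pre><code># diagnostic sketch: is the class from the traceback visible
# to the JVM that pyspark started?
try:
    spark.sparkContext._jvm.java.lang.Class.forName("com.mongodb.ConnectionString")
    print("com.mongodb.ConnectionString is on the classpath")
except Exception as e:
    print("class not found:", e)
</code></pre>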
<p>I just don't understand what I am doing wrong or what I am missing:</p>
<ul>
<li>Do I need to add any dependency jars?</li>
<li>Or does this connector only work with Spark 2.0.x and not with Spark 2.1.0 (I have never tried it on Spark 2.0.x)?</li>
<li>If authorization is enabled on MongoDB, what should my URI look like (see the sketch after this list)?</li>
<li>The error points at the connection string, but I have re-checked it a hundred times (it also works as a pymongo connection URL).</li>
</ul>
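<p>For reference, an authenticated MongoDB connection string generally takes the following shape; the user name, password and <code>authSource</code> below are placeholders, not my actual setup:</p>
<pre><code># hypothetical credentials; authSource names the database holding the user
mongodb://myUser:myPassword@127.0.0.1/test.myCollection?authSource=admin
</code></pre>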
<p>I also tried the same with the session builder object, with no result:</p>
<pre><code>from pyspark.sql import SparkSession

# build a SparkSession with the Mongo input/output URIs configured up front
my_spark = SparkSession \
    .builder \
    .appName("myApp") \
    .config("spark.mongodb.input.uri", "mongodb://127.0.0.1/test.coll") \
    .config("spark.mongodb.output.uri", "mongodb://127.0.0.1/test.coll") \
    .getOrCreate()
</code></pre>
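<p>In case it matters, here is the same builder as a standalone script, asking Spark itself to resolve the connector (and its transitive dependencies) from Maven via <code>spark.jars.packages</code>. This is a sketch of an alternative launch path, not something the docs tutorial prescribes, and it only takes effect when the session is created fresh rather than inside an already-running shell:</p>
<pre><code>from pyspark.sql import SparkSession

# sketch: let Spark pull the connector and its dependencies from Maven
# (same coordinates as in the --packages launch flag above)
my_spark = SparkSession \
    .builder \
    .appName("myApp") \
    .config("spark.jars.packages",
            "org.mongodb.spark:mongo-spark-connector_2.11:2.0.0") \
    .config("spark.mongodb.input.uri", "mongodb://127.0.0.1/test.coll") \
    .config("spark.mongodb.output.uri", "mongodb://127.0.0.1/test.coll") \
    .getOrCreate()

df = my_spark.read.format("com.mongodb.spark.sql.DefaultSource").load()
df.printSchema()
</code></pre>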
<p>Please help me figure this out. Thanks in advance.</p>