一尘不染

ElasticSearch Spark error

elasticsearch

I'm new to ElasticSearch, and I'm trying to write some Apache Spark code to save data into ElasticSearch. I typed the following lines into the Spark shell:

 import org.elasticsearch.spark._
 val myMap = Map("France" -> "FRA", "United States" -> "US")
 val myRDD = sc.makeRDD(Seq(myMap))
 myRDD.saveToEs("Country/Abbrv")

Error:

 org.elasticsearch.hadoop.EsHadoopIllegalArgumentException: Cannot determine write shards for [Country/Abbrv]; likely its format is incorrect (maybe it contains illegal characters?)

Spark 2.0.0, ElasticSearch-Spark 2.3.4

Any ideas?


2020-06-22

1 answer

一尘不染

The problem was that I wasn't setting the --conf variable before launching the Spark shell. It needs to look like this:

 spark-shell --jars {path}/elasticsearch-spark_2.11-2.3.4.jar --conf spark.es.resource=Country/Abbrv
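Alternatively, the elasticsearch-spark connector also accepts per-write settings, so the resource can be passed directly to `saveToEs` without a global `--conf` flag. A minimal sketch, assuming the same connector jar is on the classpath and Elasticsearch is reachable on its defaults:

```scala
// Sketch: write an RDD of maps to Elasticsearch, passing settings per call
// instead of via spark-shell's --conf. Assumes elasticsearch-spark 2.3.4
// is on the classpath and an ES node is reachable (defaults shown below
// are illustrative, not from the original question).
import org.elasticsearch.spark._

val myMap = Map("France" -> "FRA", "United States" -> "US")
val myRDD = sc.makeRDD(Seq(myMap))

// The second argument is a per-operation config map; "es.nodes" here is an
// example override and can be omitted if the defaults apply.
myRDD.saveToEs("Country/Abbrv", Map("es.nodes" -> "localhost:9200"))
```

Passing the resource name in the `saveToEs` call keeps the index/type choice next to the write itself, which helps when one job writes to several indices.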
2020-06-22