InvalidQueryException: Consistency level LOCAL_ONE is not supported for this operation. Supported consistency levels are: LOCAL_QUORUM

vx6bjr1n  posted on 2021-05-27  in Spark
import org.apache.spark._
import org.apache.spark.SparkContext._
import com.datastax.spark.connector._
import com.datastax.spark.connector.cql.CassandraConnector

val conf = new SparkConf()
  .setMaster("local[*]")
  .setAppName("XXXX")
  .set("spark.cassandra.connection.host", "cassandra.us-east-2.amazonaws.com")
  .set("spark.cassandra.connection.port", "9142")
  .set("spark.cassandra.auth.username", "XXXXX")
  .set("spark.cassandra.auth.password", "XXXXX")
  .set("spark.cassandra.connection.ssl.enabled", "true")
  .set("spark.cassandra.connection.ssl.trustStore.path", "/home/nihad/.cassandra/cassandra_truststore.jks")
  .set("spark.cassandra.connection.ssl.trustStore.password", "XXXXX")
  .set("spark.cassandra.output.consistency.level", "LOCAL_QUORUM")

val connector = CassandraConnector(conf)
val session = connector.openSession()
session.execute("""INSERT INTO "covid19".delta_by_states (state_code, state_value, date) VALUES ('kl', 5, '2020-03-03');""")
session.close()

I am trying to write data to AWS Keyspaces (Cassandra) from a Spark application running on my local machine. The problem is that when I execute the code above, I get an exception like this:
"com.datastax.oss.driver.api.core.servererrors.InvalidQueryException: Consistency level LOCAL_ONE is not supported for this operation. Supported consistency levels are: LOCAL_QUORUM"
As you can see in the code above, I have already set cassandra.output.consistency.level to LOCAL_QUORUM in the Spark conf, and I am using the DataStax Cassandra driver.
Reading data from AWS Keyspaces works fine, and the same INSERT statement also works in the AWS Keyspaces CQLSH console, so the query itself is valid.
Can someone help me with how to set the consistency level through the DataStax CassandraConnector?
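For reference, my understanding is that spark.cassandra.output.consistency.level is only picked up by writes that go through the connector itself (for example saveToCassandra), not by plain CQL executed on a session obtained from connector.openSession(). A minimal sketch of that write path, assuming connector 3.x and the same table as in the INSERT above:

import org.apache.spark.SparkContext
import com.datastax.spark.connector._

// Writes issued through saveToCassandra go through the connector's own write path,
// which reads spark.cassandra.output.consistency.level from the SparkConf above.
val sc = new SparkContext(conf)   // `conf` is the SparkConf defined above
sc.parallelize(Seq(("kl", 5, java.time.LocalDate.of(2020, 3, 3))))
  .saveToCassandra("covid19", "delta_by_states",
    SomeColumns("state_code", "state_value", "date"))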


vaj7vani  #1

Cracked it.
Instead of setting the Cassandra consistency level through the Spark configuration, I created an application.conf file in the src/main/resources directory:

datastax-java-driver {

    basic.contact-points = ["cassandra.us-east-2.amazonaws.com:9142"]

    advanced.auth-provider {
        class = PlainTextAuthProvider
        username = "serviceUserName"
        password = "servicePassword"
    }

    basic.load-balancing-policy {
        local-datacenter = "us-east-2"
    }

    advanced.ssl-engine-factory {
        class = DefaultSslEngineFactory
        truststore-path = "yourPath/.cassandra/cassandra_truststore.jks"
        truststore-password = "trustorePassword"
    }

    basic.request.consistency = LOCAL_QUORUM
    basic.request.timeout = 5 seconds
}

Then I created the Cassandra session like this:

import com.datastax.oss.driver.api.core.config.DriverConfigLoader
import com.datastax.oss.driver.api.core.CqlSession

val loader = DriverConfigLoader.fromClassPath("application.conf")
val session = CqlSession.builder().withConfigLoader(loader).build()
session.execute("""INSERT INTO "covid19".delta_by_states (state_code, state_value, date) VALUES ('kl', 5, '2020-03-03');""")

It finally worked. No need to modify the Spark config for the driver configuration.
DriverConfigLoader docs: https://docs.datastax.com/en/drivers/java/4.0/com/datastax/oss/driver/api/core/config/DriverConfigLoader.html#fromClasspath-java.lang.String-
DataStax configuration reference: https://docs.datastax.com/en/developer/java-driver/4.6/manual/core/configuration/reference/
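Alternatively, assuming the same DataStax Java driver 4.x APIs, the consistency level can also be set on an individual statement instead of globally in application.conf, for example:

import com.datastax.oss.driver.api.core.ConsistencyLevel
import com.datastax.oss.driver.api.core.cql.SimpleStatement

// Build the statement with an explicit per-request consistency level instead of
// relying on basic.request.consistency from application.conf.
val insert = SimpleStatement
  .builder("""INSERT INTO "covid19".delta_by_states (state_code, state_value, date) VALUES ('kl', 5, '2020-03-03');""")
  .setConsistencyLevel(ConsistencyLevel.LOCAL_QUORUM)
  .build()

session.execute(insert)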
