Java: How to connect to Hive with JDBC and HikariCP in a Spring Batch project?

z9ju0rcb asked on 2021-06-25, tagged Hive

I am trying to connect to Hive over JDBC in a Spring Batch project, using HikariCP with Kerberos and a keytab.
Below is my JDBC DataSource configuration:

@Bean(name = "hiveJdbcBatchDataSource")
@Qualifier(value = "hiveJdbcBatchDataSource")
    public DataSource hiveJdbcBatchDataSource() throws Exception {

        try {
            HikariConfig config = new HikariConfig();
            config.setDriverClassName(driverClassName);
            config.setJdbcUrl(hiveUrl);

            System.setProperty("java.security.krb5.conf", krb5ConfPath);
            if (StringUtils.isNotBlank(keytabPath)) {
                org.apache.hadoop.conf.Configuration conf = new org.apache.hadoop.conf.Configuration();
                conf.set("hadoop.security.authentication", "kerberos");
                UserGroupInformation.setConfiguration(conf);
                UserGroupInformation.loginUserFromKeytab(principal, keytabPath);
            } else {
                System.setProperty("javax.security.auth.useSubjectCredsOnly", "false");
                config.setUsername(userName);
                config.setPassword(password);
            }

            config.setConnectionTestQuery("show databases");
            config.addDataSourceProperty("zeroDateTimeBehavior", zeroDateTimeBehavior);
            config.addDataSourceProperty("cachePrepStmts", cachePrepStmts);
            config.addDataSourceProperty("prepStmtCacheSize", prepStmtCacheSize);
            config.addDataSourceProperty("prepStmtCacheSqlLimit", prepStmtCacheSqlLimit);
            // connection pooling
            config.setPoolName(poolName);
            config.setMaximumPoolSize(maximumPoolSize);
            config.setIdleTimeout(idleTimeoutMs);
            config.setMaxLifetime(maxLifetimeMs);

            return new HikariDataSource(config);

        } catch (IOException e) {
            throw new BeanInitializationException("IOException Failed to init data souce.", e);
        } catch (Exception e) {
            throw new Exception("Exception Failed to init data souce.", e);
        }
    }

I am getting the following exception:

Caused by: org.springframework.batch.core.configuration.BatchConfigurationException: java.lang.IllegalArgumentException: DatabaseType not found for product name: [Apache Hive]
    at org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer.initialize(DefaultBatchConfigurer.java:119)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleElement.invoke(InitDestroyAnnotationBeanPostProcessor.java:363)
    at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleMetadata.invokeInitMethods(InitDestroyAnnotationBeanPostProcessor.java:307)
    at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor.postProcessBeforeInitialization(InitDestroyAnnotationBeanPostProcessor.java:136)
    ... 16 common frames omitted
Caused by: java.lang.IllegalArgumentException: DatabaseType not found for product name: [Apache Hive]
    at org.springframework.batch.support.DatabaseType.fromProductName(DatabaseType.java:84)
    at org.springframework.batch.support.DatabaseType.fromMetaData(DatabaseType.java:123)
    at org.springframework.batch.core.repository.support.JobRepositoryFactoryBean.afterPropertiesSet(JobRepositoryFactoryBean.java:183)
    at org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer.createJobRepository(DefaultBatchConfigurer.java:134)
    at org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer.initialize(DefaultBatchConfigurer.java:113)
    ... 23 common frames omitted

My pom contains the following dependencies:

<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-jdbc</artifactId>
    <version>1.2.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.7.3</version>
</dependency>

Note: I tried following the answer below (sketched after this note), but I still get the same exception:
Use Spring Batch with auto-configuration and a non-standard database
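
For context, that linked answer boils down to keeping Spring Batch's own job repository away from the Hive DataSource: DefaultBatchConfigurer reads the DataSource metadata to pick a supported DatabaseType, "Apache Hive" is not in that enum, and that is exactly the DatabaseType.fromMetaData() call that fails in the stack trace above. A minimal sketch of that kind of workaround, assuming Spring Batch 4.x (the class name HiveAwareBatchConfigurer is only illustrative), would be:

import javax.sql.DataSource;

import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.stereotype.Component;

@Component
public class HiveAwareBatchConfigurer extends DefaultBatchConfigurer {

    @Override
    public void setDataSource(DataSource dataSource) {
        // Intentionally do not hand the Hive DataSource to Spring Batch.
        // With no DataSource set, DefaultBatchConfigurer falls back to the
        // map-based (in-memory) JobRepository and never calls
        // DatabaseType.fromMetaData(), so the "DatabaseType not found for
        // product name: [Apache Hive]" check is skipped entirely.
    }
}

With this sketch, Spring Batch keeps its job metadata in memory, while the "hiveJdbcBatchDataSource" bean above is injected only into the readers/writers that actually query Hive.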
