I am working on a Spark application that uses Drools internally, and the code runs fine. Now I am writing test cases for the Drools code. Because of this bug, https://issues.redhat.com/browse/drools-329, I had to add the dependency
<dependency>
    <groupId>org.codehaus.janino</groupId>
    <artifactId>janino</artifactId>
    <version>2.5.16</version>
</dependency>
and this JVM argument for the JaCoCo test run:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <configuration>
        <systemPropertyVariables>
            <jacoco-agent.destfile>${project.build.directory}/jacoco.exec</jacoco-agent.destfile>
        </systemPropertyVariables>
        <argLine>-Ddrools.dialect.java.compiler=JANINO</argLine>
    </configuration>
</plugin>
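A side note on the Surefire block above, as a hedged sketch: if the JaCoCo agent is attached via the jacoco-maven-plugin prepare-agent goal (rather than offline instrumentation), hard-coding <argLine> overwrites the argLine property that prepare-agent sets, so the agent is silently dropped. Referencing the late-bound property keeps both the agent and the Drools flag:

<argLine>@{argLine} -Ddrools.dialect.java.compiler=JANINO</argLine>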
After adding Spark, the test cases started failing because Spark depends on
<dependency>
    <groupId>org.codehaus.janino</groupId>
    <artifactId>janino</artifactId>
    <version>3.0.8</version>
</dependency>
So I commented out the Drools tests and the 2.5.16 Janino dependency in the POM, and now the Spark code works fine. How can I make janino 2.5.16 and janino 3.0.8 coexist during the Maven test build so that both the Drools tests and the Spark tests pass?
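For context on why the two versions fight: Maven resolves exactly one version per groupId:artifactId on a given classpath, so declaring janino 2.5.16 directly does not put a second copy next to Spark's Janino; being the nearest declaration, it replaces the transitive 3.0.8, and Spark's code generation can then no longer load org/codehaus/janino/InternalCompilerException, which produces the NoClassDefFoundError shown further down. A minimal illustration of that effect (the comment is my own reading of the dependency tree):

<!-- Spark pulls in janino 3.0.8 transitively. Declaring 2.5.16 directly makes
     it the nearest declaration, so it becomes the only Janino on the test
     classpath and Spark's codegen classes fail to load. -->
<dependency>
    <groupId>org.codehaus.janino</groupId>
    <artifactId>janino</artifactId>
    <version>2.5.16</version>
</dependency>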
Here is the Drools code I am using:
public static void applyRules(ResultObj obj) {
try {
KnowledgeBuilder kbuilder = KnowledgeBuilderFactory.newKnowledgeBuilder();
kbuilder.add( ResourceFactory.newClassPathResource( "rules/rules.drl", DroolsPreprocessor.class),
ResourceType.DRL );
if ( kbuilder.hasErrors() ) {
System.err.println( kbuilder.getErrors().toString() );
}
StatelessKieSession session = kbuilder.newKieBase().newStatelessKieSession();
session.execute(obj);
System.out.println("Reached the end ");
} catch (Exception e) {
e.printStackTrace();
}
}
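Incidentally, a hedged sketch of an alternative way to select the dialect (it does not by itself resolve the version clash, since Janino still has to be on the classpath): the same drools.dialect.java.compiler property that the Surefire <argLine> passes can also be set programmatically on a KnowledgeBuilderConfiguration. Class and resource names are taken from the snippet above.

import org.kie.api.io.ResourceType;
import org.kie.api.runtime.StatelessKieSession;
import org.kie.internal.builder.KnowledgeBuilder;
import org.kie.internal.builder.KnowledgeBuilderConfiguration;
import org.kie.internal.builder.KnowledgeBuilderFactory;
import org.kie.internal.io.ResourceFactory;

public static void applyRules(ResultObj obj) {
    // Select the JANINO dialect in code instead of via -Ddrools.dialect.java.compiler=JANINO.
    KnowledgeBuilderConfiguration config = KnowledgeBuilderFactory.newKnowledgeBuilderConfiguration();
    config.setProperty("drools.dialect.java.compiler", "JANINO");

    KnowledgeBuilder kbuilder = KnowledgeBuilderFactory.newKnowledgeBuilder(config);
    kbuilder.add(ResourceFactory.newClassPathResource("rules/rules.drl", DroolsPreprocessor.class),
                 ResourceType.DRL);
    if (kbuilder.hasErrors()) {
        System.err.println(kbuilder.getErrors().toString());
    }

    // Build the KieBase and fire the rules against the passed-in object.
    StatelessKieSession session = kbuilder.newKieBase().newStatelessKieSession();
    session.execute(obj);
}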
Here are the Drools dependencies:
<dependency>
    <groupId>org.drools</groupId>
    <artifactId>drools-decisiontables</artifactId>
    <version>7.7.0.Final</version>
</dependency>
<dependency>
    <groupId>org.drools</groupId>
    <artifactId>drools-core</artifactId>
    <version>7.7.0.Final</version>
</dependency>
<dependency>
    <groupId>org.drools</groupId>
    <artifactId>drools-compiler</artifactId>
    <version>7.7.0.Final</version>
</dependency>
With janino 2.5.16, Spark is not happy and throws this exception:
Exception in thread "main" java.lang.NoClassDefFoundError: org/codehaus/janino/InternalCompilerException
at org.apache.spark.sql.catalyst.expressions.codegen.JavaCode$.variable(javaCode.scala:63)
at org.apache.spark.sql.catalyst.expressions.codegen.JavaCode$.isNullVariable(javaCode.scala:76)
at org.apache.spark.sql.catalyst.expressions.Expression.$anonfun$genCode$3(Expression.scala:145)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.sql.catalyst.expressions.Expression.genCode(Expression.scala:141)
at org.apache.spark.sql.catalyst.expressions.codegen.CodegenContext.$anonfun$generateExpressions$1(CodeGenerator.scala:1170)
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:238)
at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at scala.collection.TraversableLike.map(TraversableLike.scala:238)
at scala.collection.TraversableLike.map$(TraversableLike.scala:231)
at scala.collection.AbstractTraversable.map(Traversable.scala:108)
at org.apache.spark.sql.catalyst.expressions.codegen.CodegenContext.generateExpressions(CodeGenerator.scala:1170)
at org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection$.createCode(GenerateUnsafeProjection.scala:290)
at org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection$.create(GenerateUnsafeProjection.scala:338)
at org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection$.generate(GenerateUnsafeProjection.scala:327)
at org.apache.spark.sql.catalyst.expressions.UnsafeProjection$.createCodeGeneratedObject(Projection.scala:123)
at org.apache.spark.sql.catalyst.expressions.UnsafeProjection$.createCodeGeneratedObject(Projection.scala:119)
at org.apache.spark.sql.catalyst.expressions.CodeGeneratorWithInterpretedFallback.createObject(CodeGeneratorWithInterpretedFallback.scala:52)
at org.apache.spark.sql.catalyst.expressions.UnsafeProjection$.create(Projection.scala:150)
at org.apache.spark.sql.catalyst.expressions.UnsafeProjection$.create(Projection.scala:143)
at org.apache.spark.sql.catalyst.expressions.UnsafeProjection$.create(Projection.scala:135)
at org.apache.spark.sql.kafka010.KafkaRecordToRowConverter.<init>(KafkaRecordToRowConverter.scala:36)
at org.apache.spark.sql.kafka010.KafkaRelation.<init>(KafkaRelation.scala:52)
at org.apache.spark.sql.kafka010.KafkaSourceProvider.createRelation(KafkaSourceProvider.scala:150)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:339)
at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:279)
at org.apache.spark.sql.DataFrameReader.$anonfun$load$2(DataFrameReader.scala:268)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:268)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:203)
If I do not add the janino 2.5.16 dependency, I get the following test error instead:
java.lang.RuntimeException: wrong class format
at org.drools.compiler.commons.jci.compilers.EclipseJavaCompiler$2.findType(EclipseJavaCompiler.java:302)
at org.drools.compiler.commons.jci.compilers.EclipseJavaCompiler$2.findType(EclipseJavaCompiler.java:255)
at org.eclipse.jdt.internal.compiler.lookup.LookupEnvironment.askForType(LookupEnvironment.java:174)
at org.eclipse.jdt.internal.compiler.lookup.PackageBinding.getTypeOrPackage(PackageBinding.java:214)
at org.eclipse.jdt.internal.compiler.lookup.CompilationUnitScope.findImport(CompilationUnitScope.java:466)
at org.eclipse.jdt.internal.compiler.lookup.CompilationUnitScope.findSingleImport(CompilationUnitScope.java:520)
at org.eclipse.jdt.internal.compiler.lookup.CompilationUnitScope.faultInImports(CompilationUnitScope.java:397)
at org.eclipse.jdt.internal.compiler.lookup.CompilationUnitScope.faultInTypes(CompilationUnitScope.java:445)
at org.eclipse.jdt.internal.compiler.Compiler.process(Compiler.java:860)
at org.eclipse.jdt.internal.compiler.Compiler.processCompiledUnits(Compiler.java:550)
at org.eclipse.jdt.internal.compiler.Compiler.compile(Compiler.java:462)
at org.eclipse.jdt.internal.compiler.Compiler.compile(Compiler.java:417)
at org.drools.compiler.commons.jci.compilers.EclipseJavaCompiler.compile(EclipseJavaCompiler.java:436)
at org.drools.compiler.commons.jci.compilers.AbstractJavaCompiler.compile(AbstractJavaCompiler.java:49)
at org.drools.compiler.rule.builder.dialect.java.JavaDialect.compileAll(JavaDialect.java:420)
at org.drools.compiler.compiler.DialectCompiletimeRegistry.compileAll(DialectCompiletimeRegistry.java:61)
at org.drools.compiler.compiler.PackageRegistry.compileAll(PackageRegistry.java:109)
at org.drools.compiler.builder.impl.KnowledgeBuilderImpl.compileAll(KnowledgeBuilderImpl.java:1471)
at org.drools.compiler.builder.impl.KnowledgeBuilderImpl.wireAllRules(KnowledgeBuilderImpl.java:950)
at org.drools.compiler.builder.impl.KnowledgeBuilderImpl.addPackage(KnowledgeBuilderImpl.java:938)
at org.drools.compiler.builder.impl.KnowledgeBuilderImpl.addPackageFromDrl(KnowledgeBuilderImpl.java:543)
at org.drools.compiler.builder.impl.KnowledgeBuilderImpl.addKnowledgeResource(KnowledgeBuilderImpl.java:743)
at org.drools.compiler.builder.impl.KnowledgeBuilderImpl.add(KnowledgeBuilderImpl.java:2288)
at org.drools.compiler.builder.impl.KnowledgeBuilderImpl.add(KnowledgeBuilderImpl.java:2277)