Assertion failed: unsafe symbol TimeStamp (child of &lt;none&gt;) in runtime reflection universe

von4xj4u · posted 2021-07-13 in Spark
Follow (0) | Answers (1) | Views (347)

We use Scala as the programming language and Spark as the data-processing tool. We have three separate projects.

Project A // this is where the annotation is defined

    package Common {
      import scala.annotation.StaticAnnotation

      // DateFormat is a type defined elsewhere in this project (not shown)
      case class TimeStamp(format: DateFormat) extends StaticAnnotation
    }

Project B // a case class that uses the annotation above

    case class CommonSchema(
      @Common.TimeStamp(DateFormat.DateTime)
      __datetime: Timestamp
    )

Project C // we use the case class above as a Dataset type

    val mapped = events.map(c => CommonSchema(
      c.__datetime
    ))
    mapped.as[CommonSchema]

We hit the following exception in Project C:

    User class threw exception: java.lang.RuntimeException: error reading Scala signature of commonschema: assertion failed: unsafe symbol TimeStamp (child of <none>) in runtime reflection universe
        at scala.reflect.internal.pickling.UnPickler.unpickle(UnPickler.scala:46)
        at scala.reflect.runtime.JavaMirrors$JavaMirror.unpickleClass(JavaMirrors.scala:619)
        at scala.reflect.runtime.SymbolLoaders$TopClassCompleter$$anonfun$complete$1.apply$mcV$sp(SymbolLoaders.scala:28)
        at scala.reflect.runtime.SymbolLoaders$TopClassCompleter$$anonfun$complete$1.apply(SymbolLoaders.scala:25)
        at scala.reflect.runtime.SymbolLoaders$TopClassCompleter$$anonfun$complete$1.apply(SymbolLoaders.scala:25)
        at scala.reflect.internal.SymbolTable.slowButSafeEnteringPhaseNotLaterThan(SymbolTable.scala:263)
        at scala.reflect.runtime.SymbolLoaders$TopClassCompleter.complete(SymbolLoaders.scala:25)
        at scala.reflect.runtime.SymbolLoaders$TopClassCompleter.load(SymbolLoaders.scala:33)
        at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anonfun$typeParams$1.apply(SynchronizedSymbols.scala:140)
        at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anonfun$typeParams$1.apply(SynchronizedSymbols.scala:133)
        at scala.reflect.runtime.Gil$class.gilSynchronized(Gil.scala:19)
        at scala.reflect.runtime.JavaUniverse.gilSynchronized(JavaUniverse.scala:16)
        at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$class.gilSynchronizedIfNotThreadsafe(SynchronizedSymbols.scala:123)
        at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$8.gilSynchronizedIfNotThreadsafe(SynchronizedSymbols.scala:168)
        at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$class.typeParams(SynchronizedSymbols.scala:132)
        at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$8.typeParams(SynchronizedSymbols.scala:168)
        at scala.reflect.internal.Types$NoArgsTypeRef.typeParams(Types.scala:1931)
        at scala.reflect.internal.Types$NoArgsTypeRef.isHigherKinded(Types.scala:1930)
        at scala.reflect.internal.tpe.TypeComparers$class.isSubType2(TypeComparers.scala:377)
        at scala.reflect.internal.tpe.TypeComparers$class.isSubType1(TypeComparers.scala:320)
        at scala.reflect.internal.tpe.TypeComparers$class.isSubType(TypeComparers.scala:278)
        at scala.reflect.internal.SymbolTable.isSubType(SymbolTable.scala:16)
        at scala.reflect.internal.Types$Type.$less$colon$less(Types.scala:784)
        at scala.reflect.internal.Types$Type.$less$colon$less(Types.scala:260)
        at org.apache.spark.sql.catalyst.ScalaReflection$.isSubtype(ScalaReflection.scala:83)
        at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$optionOfProductType$1.apply$mcZ$sp(ScalaReflection.scala:677)
        at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$optionOfProductType$1.apply(ScalaReflection.scala:676)
        at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$optionOfProductType$1.apply(ScalaReflection.scala:676)
        at scala.reflect.internal.tpe.TypeConstraints$UndoLog.undo(TypeConstraints.scala:56)
        at org.apache.spark.sql.catalyst.ScalaReflection$class.cleanUpReflectionObjects(ScalaReflection.scala:926)
        at org.apache.spark.sql.catalyst.ScalaReflection$.cleanUpReflectionObjects(ScalaReflection.scala:49)
        at org.apache.spark.sql.catalyst.ScalaReflection$.optionOfProductType(ScalaReflection.scala:675)
        at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder$.apply(ExpressionEncoder.scala:51)
        at org.apache.spark.sql.Encoders$.product(Encoders.scala:275)

Similar questions have been answered in link1 and link2, but we already build an uber-jar for Project C, so all the classes should be included in the package. Locally, unit tests that exercise this code pass fine, so why do we get this error only at runtime?


qlckcl4x1#

This happened because we use the maven-shade-plugin with jar minimization enabled, which was removing these classes from the jar. We changed the configuration so that the classes are no longer stripped, and that fixed the problem.
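The answer doesn't show the exact change, but a sketch of what such a configuration can look like is below. maven-shade-plugin's `minimizeJar` removes classes it considers unreachable, and an annotation class that is referenced only from pickled Scala signatures can look unreachable to it; adding an include filter for the artifact exempts those classes from minimization. The coordinates `com.example:project-a` are a placeholder for the real artifact that contains the annotation.

```xml
<!-- pom.xml fragment (sketch): keep minimizeJar, but whitelist the artifact
     containing the annotation so minimization does not strip its classes.
     com.example:project-a is a placeholder. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <minimizeJar>true</minimizeJar>
    <filters>
      <filter>
        <artifact>com.example:project-a</artifact>
        <includes>
          <include>**</include>
        </includes>
      </filter>
    </filters>
  </configuration>
</plugin>
```

Alternatively, setting `minimizeJar` to `false` outright also keeps the classes, at the cost of a larger jar.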
