I have a Scala case class:
case class ProbePoint(
var providerId: Int = 0,
//...
var customFields: Map[String, String] = null
)
In addition, there is a parent Java class that holds these points:
public class Trip implements Serializable, Cloneable {
// ...
private Deque<ProbePoint> points = new ArrayDeque<>();
// ...
}
I'm doing some processing with Spark over a Dataset[Trip].
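The Spark side looks roughly like this (a simplified sketch, not the real code: the app name, input path and package import are placeholders/assumptions, and the encoder is built with Encoders.bean because Trip is a plain Java bean):

import org.apache.spark.sql.{Dataset, Encoders, SparkSession}
// ProbePoint is in this package per the error message; I'm assuming Trip is too
import com.inrix.analytics.tapp.data.model.Trip

val spark = SparkSession.builder()
  .appName("trip-processing")      // placeholder app name
  .master("local[*]")
  .getOrCreate()

// Trip is a plain Java bean, so the Dataset is built with a bean encoder
implicit val tripEncoder = Encoders.bean(classOf[Trip])

val trips: Dataset[Trip] = spark.read
  .parquet("/path/to/trips")       // placeholder input path
  .as[Trip]

// The failure only shows up once an action forces the generated deserializer to run
trips.show()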
When the job runs, I get the following error:
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 6.0 failed 1 times, most recent failure: Lost task 0.0 in stage 6.0 (TID 10, localhost, executor driver): java.util.concurrent.ExecutionException:
org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 289, Column 11: failed to compile: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 289, Column 11:
Cannot instantiate "scala.collection.Map"
I also tried converting the Scala case class into a Java class with customFields declared as a HashMap (a sketch of what that conversion looked like is below the stack trace), but then I ran into the next exception:
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 6.0 failed 1 times, most recent failure: Lost task 0.0 in stage 6.0 (TID 10, localhost, executor driver): java.util.concurrent.ExecutionException: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 423, Column 28: failed to compile:
org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 423, Column 28: No applicable constructor/method found for actual parameters "java.util.Map";
candidates are: "public void com.inrix.analytics.tapp.data.model.ProbePoint.setCustomFields(java.util.HashMap)"
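For reference, the converted class had essentially the bean shape sketched below. The real attempt was a plain Java class; I'm showing it here as an equivalent Scala sketch with @BeanProperty, since that produces the same setCustomFields(java.util.HashMap) setter the error message points at:

import java.util.{HashMap => JHashMap}
import scala.beans.BeanProperty

// Equivalent sketch of the conversion attempt: customFields switched from a
// scala.collection.Map to a java.util.HashMap exposed through bean accessors,
// which yields the setCustomFields(java.util.HashMap) setter named in the error.
case class ProbePoint(
  @BeanProperty var providerId: Int = 0,
  // ...
  @BeanProperty var customFields: JHashMap[String, String] = null
)

As the second stack trace shows, the code Spark generates then tries to pass a java.util.Map into that HashMap-typed setter, so compilation of the generated class fails again.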