java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.List #66

Open
ssappmike opened this issue Aug 17, 2019 · 2 comments


@ssappmike

I have launched a spark-yarn cluster running Spark 2.4, Scala 2.11.12, and Hadoop 2.7.
When I try to run the following in amm, I get a ClassCastException:
./coursier launch com.lihaoyi:ammonite_2.11.12:1.6.7 -M ammonite.Main -- --class-based
Loading...
Welcome to the Ammonite Repl 1.6.7
(Scala 2.11.12 Java 1.8.0_222)
If you like Ammonite, please support our development at www.patreon.com/lihaoyi
@ import $ivy.`org.apache.spark::spark-sql:2.4.0`
import $ivy.$

@ import $ivy.`sh.almond::ammonite-spark:0.3.0`
import $ivy.$

@ import org.apache.spark.sql._
import org.apache.spark.sql._

@ val spark = {
AmmoniteSparkSession.builder()
.progressBars()
.master("yarn")
.getOrCreate()
}
Loading spark-yarn, spark-stubs
Getting spark JARs
...
@ def sc = spark.sparkContext
defined function sc

@ val rdd = sc.parallelize(1 to 100, 10)
rdd: org.apache.spark.rdd.RDD[Int] = ParallelCollectionRDD[0] at parallelize at cmd5.sc:1

@ val n = rdd.map(_ + 1).sum()
19/08/17 13:18:59 INFO SparkContext: Starting job: sum at cmd6.sc:1
19/08/17 13:18:59 INFO DAGScheduler: Got job 0 (sum at cmd6.sc:1) with 10 output partitions
19/08/17 13:18:59 INFO DAGScheduler: Final stage: ResultStage 0 (sum at cmd6.sc:1)
19/08/17 13:18:59 INFO DAGScheduler: Parents of final stage: List()
19/08/17 13:18:59 INFO DAGScheduler: Missing parents: List()
19/08/17 13:18:59 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[2] at numericRDDToDoubleRDDFunctions at cmd6.sc:1), which has no missing parents
19/08/17 13:18:59 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 2.4 KB, free 2.4 GB)
19/08/17 13:18:59 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 1621.0 B, free 2.4 GB)
19/08/17 13:18:59 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on lhdev:41433 (size: 1621.0 B, free: 2.4 GB)
19/08/17 13:18:59 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1161
19/08/17 13:18:59 INFO DAGScheduler: Submitting 10 missing tasks from ResultStage 0 (MapPartitionsRDD[2] at numericRDDToDoubleRDDFunctions at cmd6.sc:1) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7, 8, 9))
19/08/17 13:18:59 INFO YarnScheduler: Adding task set 0.0 with 10 tasks
19/08/17 13:18:59 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, lhpc1, executor 1, partition 0, PROCESS_LOCAL, 7877 bytes)
19/08/17 13:18:59 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, lhpc2, executor 2, partition 1, PROCESS_LOCAL, 7877 bytes)
[Stage 0:> (0 + 2) / 10]19/08/17 13:19:02 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on lhpc2:43083 (size: 1621.0 B, free: 366.3 MB)
19/08/17 13:19:02 INFO TaskSetManager: Starting task 2.0 in stage 0.0 (TID 2, lhpc2, executor 2, partition 2, PROCESS_LOCAL, 7877 bytes)
19/08/17 13:19:02 INFO TaskSetManager: Starting task 3.0 in stage 0.0 (TID 3, lhpc2, executor 2, partition 3, PROCESS_LOCAL, 7877 bytes)
19/08/17 13:19:02 WARN TaskSetManager: Lost task 1.0 in stage 0.0 (TID 1, lhpc2, executor 2): java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD
at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2287)
at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1417)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2293)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2287)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:83)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:402)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

19/08/17 13:19:02 INFO TaskSetManager: Lost task 2.0 in stage 0.0 (TID 2) on lhpc2, executor 2: java.lang.ClassCastException (cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD) [duplicate 1]
19/08/17 13:19:02 INFO TaskSetManager: Starting task 2.1 in stage 0.0 (TID 4, lhpc2, executor 2, partition 2, PROCESS_LOCAL, 7877 bytes)
19/08/17 13:19:02 INFO TaskSetManager: Lost task 3.0 in stage 0.0 (TID 3) on lhpc2, executor 2: java.lang.ClassCastException (cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD) [duplicate 2]
19/08/17 13:19:02 INFO TaskSetManager: Starting task 3.1 in stage 0.0 (TID 5, lhpc2, executor 2, partition 3, PROCESS_LOCAL, 7877 bytes)
19/08/17 13:19:02 INFO TaskSetManager: Lost task 2.1 in stage 0.0 (TID 4) on lhpc2, executor 2: java.lang.ClassCastException (cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD) [duplicate 3]
19/08/17 13:19:02 INFO TaskSetManager: Starting task 2.2 in stage 0.0 (TID 6, lhpc2, executor 2, partition 2, PROCESS_LOCAL, 7877 bytes)
19/08/17 13:19:02 INFO TaskSetManager: Lost task 3.1 in stage 0.0 (TID 5) on lhpc2, executor 2: java.lang.ClassCastException (cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD) [duplicate 4]
19/08/17 13:19:02 INFO TaskSetManager: Starting task 3.2 in stage 0.0 (TID 7, lhpc2, executor 2, partition 3, PROCESS_LOCAL, 7877 bytes)
19/08/17 13:19:02 INFO TaskSetManager: Lost task 2.2 in stage 0.0 (TID 6) on lhpc2, executor 2: java.lang.ClassCastException (cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD) [duplicate 5]
19/08/17 13:19:02 INFO TaskSetManager: Starting task 2.3 in stage 0.0 (TID 8, lhpc2, executor 2, partition 2, PROCESS_LOCAL, 7877 bytes)
19/08/17 13:19:02 INFO TaskSetManager: Lost task 3.2 in stage 0.0 (TID 7) on lhpc2, executor 2: java.lang.ClassCastException (cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD) [duplicate 6]
19/08/17 13:19:02 INFO TaskSetManager: Starting task 3.3 in stage 0.0 (TID 9, lhpc2, executor 2, partition 3, PROCESS_LOCAL, 7877 bytes)
19/08/17 13:19:02 INFO TaskSetManager: Lost task 2.3 in stage 0.0 (TID 8) on lhpc2, executor 2: java.lang.ClassCastException (cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD) [duplicate 7]
19/08/17 13:19:02 ERROR TaskSetManager: Task 2 in stage 0.0 failed 4 times; aborting job
19/08/17 13:19:02 INFO YarnScheduler: Cancelling stage 0
19/08/17 13:19:02 INFO YarnScheduler: Killing all running tasks in stage 0: Stage cancelled
19/08/17 13:19:02 INFO YarnScheduler: Stage 0 was cancelled
19/08/17 13:19:02 INFO TaskSetManager: Lost task 3.3 in stage 0.0 (TID 9) on lhpc2, executor 2: java.lang.ClassCastException (cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD) [duplicate 8]
19/08/17 13:19:02 INFO DAGScheduler: ResultStage 0 (sum at cmd6.sc:1) failed in 3.043 s due to Job aborted due to stage failure: Task 2 in stage 0.0 failed 4 times, most recent failure: Lost task 2.3 in stage 0.0 (TID 8, lhpc2, executor 2): java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD
at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2287)
at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1417)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2293)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2287)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:83)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:402)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

Driver stacktrace:
19/08/17 13:19:02 INFO DAGScheduler: Job 0 failed: sum at cmd6.sc:1, took 3.120537 s
org.apache.spark.SparkException: Job aborted due to stage failure: Task 2 in stage 0.0 failed 4 times, most recent failure: Lost task 2.3 in stage 0.0 (TID 8, lhpc2, executor 2): java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD
at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2287)
at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1417)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2293)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2287)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:83)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:402)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

Driver stacktrace:
org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1887)
org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1875)
org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1874)
scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1874)
org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
scala.Option.foreach(Option.scala:257)
org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:926)
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2108)
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2057)
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2046)
org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:737)
org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
org.apache.spark.SparkContext.runJob(SparkContext.scala:2158)
org.apache.spark.rdd.RDD$$anonfun$fold$1.apply(RDD.scala:1098)
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
org.apache.spark.rdd.RDD.fold(RDD.scala:1092)
org.apache.spark.rdd.DoubleRDDFunctions$$anonfun$sum$1.apply$mcD$sp(DoubleRDDFunctions.scala:35)
org.apache.spark.rdd.DoubleRDDFunctions$$anonfun$sum$1.apply(DoubleRDDFunctions.scala:35)
org.apache.spark.rdd.DoubleRDDFunctions$$anonfun$sum$1.apply(DoubleRDDFunctions.scala:35)
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
org.apache.spark.rdd.DoubleRDDFunctions.sum(DoubleRDDFunctions.scala:34)
ammonite.$sess.cmd6$Helper.<init>(cmd6.sc:1)
ammonite.$sess.cmd6$.<init>(cmd6.sc:7)
ammonite.$sess.cmd6$.<clinit>(cmd6.sc)
java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD
java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2287)
java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1417)
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2293)
java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2287)
java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:83)
org.apache.spark.scheduler.Task.run(Task.scala:121)
org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:402)
org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)
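
For convenience, here is the same repro collected into a single Ammonite script (just the commands from the session above; AmmoniteSparkSession comes from the org.apache.spark.sql._ import that ammonite-spark provides):

// Same commands as the REPL session above, Ammonite 1.6.7 / Scala 2.11.12
import $ivy.`org.apache.spark::spark-sql:2.4.0`
import $ivy.`sh.almond::ammonite-spark:0.3.0`

import org.apache.spark.sql._

// Build a YARN-backed session via ammonite-spark
val spark = AmmoniteSparkSession.builder()
  .progressBars()
  .master("yarn")
  .getOrCreate()

def sc = spark.sparkContext

val rdd = sc.parallelize(1 to 100, 10)
val n = rdd.map(_ + 1).sum()   // fails on the executors with the ClassCastException above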

Any idea?

@ssappmike
Author

Besides, I can run the above without problems in spark-shell. It looks like there may be a conflict between the jars supplied by Ammonite and those supplied by Spark? Could that be the case?
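
For comparison, this is roughly what I run in spark-shell (a sketch; there sc is the SparkContext that spark-shell already provides):

// spark-shell --master yarn, Spark 2.4 / Scala 2.11
val rdd = sc.parallelize(1 to 100, 10)
val n = rdd.map(_ + 1).sum()   // completes normally; expected value 5150.0 (sum of 2..101)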

@ssappmike
Author

I found more errors in the application log:

2019-08-17 18:49:18,363 ERROR repl.ExecutorClassLoader: Failed to check existence of class ammonite.$sess.cmd7$Helper on REPL class server at http://192.168.10.30:44821
java.lang.IllegalArgumentException: Can not create a Path from an empty string
at org.apache.hadoop.fs.Path.checkPathArg(Path.java:126)
at org.apache.hadoop.fs.Path.<init>(Path.java:134)
at org.apache.hadoop.fs.Path.<init>(Path.java:88)
at org.apache.spark.repl.ExecutorClassLoader.getClassFileInputStreamFromFileSystem(ExecutorClassLoader.scala:142)
at org.apache.spark.repl.ExecutorClassLoader.$anonfun$fetchFn$2(ExecutorClassLoader.scala:66)
at org.apache.spark.repl.ExecutorClassLoader.findClassLocally(ExecutorClassLoader.scala:155)
at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:109)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1868)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1751)
at java.io.ObjectInputStream.readClass(ObjectInputStream.java:1716)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1556)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2287)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
at java.io.ObjectInputStream.readArray(ObjectInputStream.java:1975)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1567)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2287)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2287)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2287)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
at scala.collection.immutable.List$SerializationProxy.readObject(List.scala:488)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1170)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2178)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2287)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2287)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:83)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:411)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
2019-08-17 18:49:18,379 INFO executor.Executor: Executor interrupted and killed task 3.0 in stage 0.0 (TID 3), reason: Stage cancelled
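
For what it's worth, the REPL class-server settings the executors are pointed at can be inspected from the same session like this (a sketch; I'm assuming the relevant entries live under spark.repl.*, since that is what ExecutorClassLoader reads):

// Print whatever spark.repl.* configuration the driver has set,
// e.g. the class server URI that shows up in the log above.
spark.sparkContext.getConf.getAll
  .filter { case (key, _) => key.startsWith("spark.repl") }
  .foreach { case (key, value) => println(s"$key = $value") }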
