I am using almond in Jupyter, and running Spark on Kubernetes from almond.
At first I built the session with the plain SparkSession builder, but I could not use map functions in almond because of closure serialization problems.
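For reference, a minimal sketch of the failing pattern (the master URL and numbers are placeholders): in a REPL/notebook, a lambda can capture the surrounding REPL wrapper object, which is not serializable, so even a trivial map can throw "Task not serializable" when the session was built with the plain builder.

```scala
import org.apache.spark.sql.SparkSession

// Plain builder, no notebook integration (placeholder master URL)
val spark = SparkSession.builder()
  .master("k8s://https://kubernetes:6443")
  .getOrCreate()

val offset = 1 // captured by the closure below, dragging the REPL wrapper with it
spark.sparkContext.parallelize(1 to 10).map(_ + offset).count() // "Task not serializable"
```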
So I switched to NotebookSparkSession to build the session; then I hit "No FileSystem for scheme: http", which prevented the executors from starting up. To work around that, I put the spark-stubs jar into the executor pod's jars directory and removed the spark-repl jar, because spark-repl conflicts with spark-stubs.
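A sketch of the NotebookSparkSession setup being described (the API server URL and executor count are placeholder assumptions). NotebookSparkSession serves the REPL-generated classes to executors over HTTP, which is why the executors need an http FileSystem implementation (provided by spark-stubs) on their classpath:

```scala
import org.apache.spark.sql.NotebookSparkSession

// almond-spark's notebook-aware builder (placeholder k8s API server URL)
val spark = NotebookSparkSession.builder()
  .master("k8s://https://kubernetes:6443")
  .config("spark.executor.instances", "2")
  .getOrCreate()
```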
The executor pods finally started up, but every action operator still fails with an error.
env:
jupyter-lab
spark: 2.4.8
almond: 0.6.0
scala: 2.11.12
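For completeness, a sketch of the in-notebook dependency setup matching the environment above (the almond-spark coordinate is an assumption based on almond's documented Ammonite-style imports):

```scala
// Pull Spark and the almond notebook integration into the kernel
import $ivy.`org.apache.spark::spark-sql:2.4.8`
import $ivy.`sh.almond::almond-spark:0.6.0`
```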
Then I looked at the Spark source code and found the cause: TaskDescription.addedJars was empty.
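TaskDescription.addedJars is populated from the jars registered with the SparkContext, so one hedged workaround idea is to register the needed jars explicitly; the path below is a hypothetical placeholder, not a path from this setup:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().getOrCreate()

// Explicitly register a jar so it is shipped to executors with each task;
// "/tmp/notebook-classes.jar" is a hypothetical placeholder path.
spark.sparkContext.addJar("/tmp/notebook-classes.jar")

// Equivalently, at build time:
//   .config("spark.jars", "/tmp/notebook-classes.jar")
```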