Hi,
I set up the almond kernel on Windows. It works fine with Scala and Spark in local mode, but now I want to use a remote HDFS and submit jobs to a remote YARN cluster, so I need to override some configs like these (a sketch of how I would expect to pass them by hand is below the list):

Hadoop config:
fs.defaultFS
hadoop.security.authentication
hadoop.http.authentication.type

Spark config:
spark.yarn.stagingDir
spark.history.fs.logDirectory
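
For reference, this is roughly what I would expect to have to write if I passed everything explicitly through the builder. It is just a sketch with placeholder hostnames and values, assuming the almond-spark `NotebookSparkSession` API and using Spark's `spark.hadoop.*` prefix for Hadoop settings; I would much rather have the kernel pick these up from the config dirs instead:

```scala
// Sketch only: passing the overrides by hand in a notebook cell.
// Versions, hostnames and values below are placeholders, not my real cluster.
import $ivy.`org.apache.spark::spark-sql:3.2.2`
import $ivy.`sh.almond::almond-spark:0.13.0` // version is just an example

import org.apache.spark.sql.NotebookSparkSession

val spark = NotebookSparkSession.builder()
  .master("yarn")
  // Hadoop settings can be passed through the spark.hadoop.* prefix
  .config("spark.hadoop.fs.defaultFS", "hdfs://namenode:8020")
  .config("spark.hadoop.hadoop.security.authentication", "kerberos")
  .config("spark.hadoop.hadoop.http.authentication.type", "kerberos")
  // Plain Spark settings
  .config("spark.yarn.stagingDir", "hdfs://namenode:8020/user/me/.sparkStaging")
  .config("spark.history.fs.logDirectory", "hdfs://namenode:8020/spark-logs")
  .getOrCreate()
```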
All of these configs exist in the spark-defaults file and in the Hadoop config directory. I already set the system environment variables, and I also set them in the kernel.json like this, but the almond kernel still does not pick them up:
"HADOOP_HOME": "D:\\DATA\\Environment\\hadoop-2.7.2",
"HADOOP_CONF_DIR": "D:\\DATA\\Environment\\hadoop-2.7.2\\etc\\hadoop",
"SPARK_HOME": "D:\\DATA\\Environment\\spark-3.2.2-bin-hadoop2.7",
"SPARK_CONF_DIR": "D:\\DATA\\Environment\\spark-3.2.2-bin-hadoop2.7\\conf",
When I open the Spark UI and look at the Environment tab, it still shows almond's default config instead of my overrides.
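
From inside the notebook I would expect to be able to confirm the effective values with something like this (assuming `spark` is the session created by NotebookSparkSession; these are standard Spark APIs, nothing almond-specific):

```scala
// Inspect what the running session actually ended up with.
println(spark.conf.getOption("spark.yarn.stagingDir"))
println(spark.conf.getOption("spark.history.fs.logDirectory"))

// And the Hadoop side, as seen by the SparkContext.
val hadoopConf = spark.sparkContext.hadoopConfiguration
println(hadoopConf.get("fs.defaultFS"))
println(hadoopConf.get("hadoop.security.authentication"))
```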
How can I override the configs above so that they are taken from spark-defaults and the Hadoop config dir? Thank you.