How do I set these memory settings when running ./bin/ddf-shell on Spark standalone with master=local?
I found that the Spark parameters are restricted in the code (io.spark.ddf.SparkDDFManager) to:
private static final String[][] SPARK_ENV_VARS = new String[][] {
  // @Formatter:off
  { "SPARK_APPNAME",    "spark.appname" },
  { "SPARK_MASTER",     "spark.master" },
  { "SPARK_HOME",       "spark.home" },
  { "SPARK_SERIALIZER", "spark.kryo.registrator" },
  { "HIVE_HOME",        "hive.home" },
  { "HADOOP_HOME",      "hadoop.home" },
  { "DDFSPARK_JAR",     "ddfspark.jar" }
  // @Formatter:on
};
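If that table is the only mapping consulted, then only the listed environment variables take effect. A minimal sketch of launching ddf-shell this way, assuming the variable names from the table above (the memory-related variable is notably absent, which is the point of the question):

```shell
#!/bin/sh
# Sketch: only the env vars in SPARK_ENV_VARS above appear to be picked up.
# SPARK_MASTER and SPARK_APPNAME come from the table; there is no entry
# for anything like SPARK_MEM / spark.executor.memory to export here.
export SPARK_MASTER=local
export SPARK_APPNAME=my-ddf-app
./bin/ddf-shell
```

So unless SparkDDFManager passes through arbitrary spark.* properties some other way, memory settings apparently cannot be supplied through the environment.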
Does that mean I cannot set any other Spark parameters in DDF? That would be terrible if true!