Hi, I'm using findspark on Jupyter with Spark 2.1.0, which works great. However, I am trying to increase the number of executor cores in standalone mode. I thought that by default it would use all the cores on my OS X machine (8 cores), but it's only using one. This is what I currently have:
import findspark
findspark.init()

from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .master("local[*]")
         .config("spark.sql.warehouse.dir", "target/spark-warehouse")
         .appName("Ad analysis")
         .enableHiveSupport()
         .getOrCreate())

spark.sparkContext.defaultParallelism  # <-- returns 1
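For comparison, this is the kind of explicit configuration I have been experimenting with in a fresh kernel. The core count of 8 and the spark.default.parallelism value are just placeholders for my 8-core machine, and I am not sure whether getOrCreate() silently reusing an already running session could be part of the problem:

import findspark
findspark.init()

from pyspark.sql import SparkSession

# Pin the thread count explicitly instead of relying on local[*],
# and set spark.default.parallelism as a sanity check.
spark = (SparkSession.builder
         .master("local[8]")                        # 8 worker threads
         .config("spark.default.parallelism", "8")  # default partition count for RDD ops
         .appName("Ad analysis")
         .getOrCreate())

print(spark.sparkContext.defaultParallelism)  # expecting 8 here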
Secondly, how do I provide a conf file when running an interactive shell like this with findspark?
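For the conf file part, what I have tried so far is pointing SPARK_CONF_DIR at a directory containing a spark-defaults.conf before initializing findspark, but I am not sure this is the intended approach; the path below is just a placeholder:

import os

# Point Spark at a directory containing spark-defaults.conf BEFORE
# findspark.init() / SparkSession creation. "/path/to/conf" is a placeholder.
os.environ["SPARK_CONF_DIR"] = "/path/to/conf"

import findspark
findspark.init()

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("Ad analysis").getOrCreate()

# Inspect which settings actually made it into the context.
print(spark.sparkContext.getConf().getAll())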