When I run the following command:
spark-submit --name "My app" --master "local[*]" --py-files main.py --driver-memory 12g --executor-memory 12g
With the following code in my main.py:
from pyspark import SparkContext

sc = SparkContext.getOrCreate()
print(sc.getConf().getAll())  # dump all settings as (key, value) pairs
Driver memory and executor memory do not appear in the configuration. Even though I'm in local mode, I would expect at least the driver memory to show up in the configuration.
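To make the absence explicit rather than relying on the full getAll() dump, the two keys can also be looked up one by one with a default value. This is a minimal sketch using the standard SparkConf.get API; spark.driver.memory and spark.executor.memory are the property names behind the --driver-memory and --executor-memory flags:

from pyspark import SparkContext

sc = SparkContext.getOrCreate()
conf = sc.getConf()

# Query each property directly; SparkConf.get accepts a default,
# so a missing key is reported instead of silently being absent
# from the getAll() listing.
for key in ("spark.driver.memory", "spark.executor.memory"):
    print(key, "->", conf.get(key, "<not set>"))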
Any ideas why this is not the case?
question from:
https://stackoverflow.com/questions/65873182/why-driver-memory-is-not-in-my-spark-context-configuration