--num-executors 150 --executor-cores 6 --executor-memory 30720M --driver-memory 36320M --conf spark.yarn.executor.memoryOverhead=4144
Number of executors is the number of distinct YARN containers (think processes/JVMs) that will run your application.
Number of executor-cores is the number of threads you get inside each executor (container), i.e. how many tasks that executor can run concurrently.
So the parallelism (number of concurrent threads/tasks running) of your Spark application is
#executors × #executor-cores. If you have 10 executors and 5 executor-cores each, you will (hopefully) have 50 tasks running at the same time.
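The arithmetic can be sketched as a quick shell check. The numbers below are the hypothetical 10-executor example from the answer, not the settings of the submitted job above:

```shell
# Hypothetical example values: 10 executors, 5 cores each.
NUM_EXECUTORS=10
EXECUTOR_CORES=5

# Maximum number of tasks Spark can run at once with these settings.
PARALLELISM=$((NUM_EXECUTORS * EXECUTOR_CORES))
echo "max concurrent tasks: $PARALLELISM"   # prints 50
```

For the job actually submitted above (150 executors, 6 cores each), the same formula gives 150 × 6 = 900 concurrent task slots.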