What happens when the number of Spark tasks is greater than the number of executor cores? How does Spark handle this scenario?
Recommended answer

Is this related to this question?
Anyway, you can check this Cloudera how-to. In the "Tuning Resource Allocation" section, it explains that a Spark application can request executors by turning on the dynamic allocation property. It is also important to set cluster properties such as num-executors, executor-cores, and executor-memory so that Spark's requests fit into what your resource manager has available.
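As a rough illustration of the properties mentioned above, here is a minimal Scala sketch (the application name, executor counts, and memory/core values are illustrative assumptions, not recommendations from the answer). It enables dynamic allocation and sets per-executor resources, then runs a job whose partition count exceeds the total core count, so the extra tasks simply wait and run in later waves.

```scala
import org.apache.spark.sql.SparkSession

object ResourceAllocationSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("resource-allocation-sketch")
      // Dynamic allocation lets Spark grow/shrink the executor pool with the task backlog.
      .config("spark.dynamicAllocation.enabled", "true")
      .config("spark.dynamicAllocation.minExecutors", "1")
      .config("spark.dynamicAllocation.maxExecutors", "10")
      // Shuffle tracking (or an external shuffle service) is needed for dynamic allocation.
      .config("spark.dynamicAllocation.shuffleTracking.enabled", "true")
      // Per-executor resources; these should fit what the resource manager can offer.
      .config("spark.executor.cores", "4")
      .config("spark.executor.memory", "4g")
      .getOrCreate()

    // 200 partitions means 200 tasks; with fewer total cores, tasks are queued
    // and scheduled in successive waves as cores become free.
    val counts = spark.sparkContext
      .parallelize(1 to 1000000, numSlices = 200)
      .map(_ % 10)
      .countByValue()

    counts.toSeq.sorted.foreach { case (k, v) => println(s"$k -> $v") }

    spark.stop()
  }
}
```

The same properties can equally be passed on the spark-submit command line or in spark-defaults.conf instead of being hard-coded in the application.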