I am running a Spark job on a 16 GB node, but it fails with the error:
Container killed by YARN for exceeding memory limits. 5.6 GB of 5.5 GB physical memory used.
Consider boosting spark.yarn.executor.memoryOverhead or disabling yarn.nodemanager.vmem-check-enabled because of YARN-4714.
How can I increase the container memory limit directly from the console (rather than setting it before launching spark-shell)?
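For reference, here is a minimal sketch of how the property named in the error message is usually raised. This assumes the session can be relaunched; the app name and the memory values below are placeholders, and in newer Spark releases the property is spark.executor.memoryOverhead rather than spark.yarn.executor.memoryOverhead:

```scala
import org.apache.spark.sql.SparkSession

// Sketch only: placeholder app name and memory sizes, adjust for your cluster.
// Equivalently, the same property can be passed when starting the shell:
//   spark-shell --conf spark.yarn.executor.memoryOverhead=1024
val spark = SparkSession.builder()
  .appName("overhead-example")
  // Property named in the error message (older Spark versions);
  // Spark 2.3+ uses spark.executor.memoryOverhead instead.
  .config("spark.yarn.executor.memoryOverhead", "1024")
  .config("spark.executor.memory", "4g")
  .getOrCreate()
```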