YARN mini-cluster container log directory does not contain a syslog file

Updated: 2024-10-27 22:32:08

Problem description


I have set up a YARN MapReduce mini-cluster with 1 node manager, 4 local directories, 4 log directories and so on, based on Hadoop 2.3.0 from CDH 5.1.0. It looks more or less working. What I have failed to achieve is syslog logging from containers. I see the container log directories and the stdout and stderr files, but no syslog with the MapReduce container logging. The corresponding stderr warns that I have no log4j configuration and contains no other lines:

log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.impl.MetricsSystemImpl).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
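For reference, the warning above is log4j 1.2's standard complaint that no configuration file could be found on the classpath. A minimal generic log4j.properties that would silence it looks like the following sketch (this is plain log4j boilerplate for illustration, not the container-specific fix the answer below arrives at):

```properties
# Minimal generic log4j 1.2 configuration (illustrative sketch only):
# route everything at INFO and above to the console.
log4j.rootLogger=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
```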

How can I add normal logging to my containers? Once again, this is a YARN mini-cluster.

Any piece of advice or useful pointer?

Just to narrow down the answers, here is what I have already tried and verified:

• Yes, I am sure the logging directories are correct, and I can see the correspondence between the container log directories and my applications.
• Yes, MapReduce jobs work. At least those that are expected to work.
• The mini-cluster's own logging behaves normally and in accordance with what I have set up. The problem is only with the containers.
• Lower layers such as the DFS cluster work normally. I even have HBase and ZK mini-clusters here and they work fine. I just need logging to debug MapReduce jobs.

Solution

OK, it finally turned out to be about the classpath, client configuration and packaging.

• The client configuration must include a proper classpath for YARN applications. In my case I added the following lines to yarn-site.xml (note the $HADOOP_COMMON_HOME substitution):

  <property>
    <name>yarn.application.classpath</name>
    <value>$HADOOP_COMMON_HOME/*,$HADOOP_COMMON_HOME/lib/*</value>
  </property>

• I added the following variable definition to the mini-cluster start-up script (it is worth noting that I keep all mini-cluster server-side JARs in ./lib relative to the start-up script):

  BASE_PATH=`pwd`
  export HADOOP_COMMON_HOME=${BASE_PATH}
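Since that start-up line depends on shell command substitution (the backticks around pwd), here is a small self-contained sketch of why the quoting matters; the variable names other than HADOOP_COMMON_HOME are illustrative:

```shell
# Illustrative sketch: quoting vs. command substitution in the start-up line.
# BASE_PATH="pwd" would assign the literal three-letter string "pwd";
# backticks (or $(...)) run the command and capture the working directory.
literal="pwd"          # literal string, NOT the current directory
resolved=`pwd`         # command substitution: absolute path of the cwd
export HADOOP_COMMON_HOME="${resolved}"
echo "literal=${literal}"
echo "HADOOP_COMMON_HOME=${HADOOP_COMMON_HOME}"
```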

The root cause of the non-working logging was that a client map-reduce job, started inside a new VM on YARN, had no knowledge of where to locate hadoop-yarn-server-nodemanager.jar, which contains the container-log4j.properties file, which in turn is responsible for the default container logging configuration. Now everything works fine.
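As a sanity check of that root cause, one can confirm that container-log4j.properties really is bundled in the node-manager JAR. A JAR is an ordinary ZIP archive, and ZIP stores entry names as plain bytes, so even a simple grep over the archive detects the resource. The sketch below builds a tiny stand-in JAR so it is self-contained; against a real installation you would instead point the grep at hadoop-yarn-server-nodemanager.jar under ./lib (the demo file name and its contents are assumptions):

```shell
# Build a stand-in JAR containing the resource (for a real check, target
# hadoop-yarn-server-nodemanager.jar from ./lib instead).
python3 -c 'import zipfile; z = zipfile.ZipFile("demo-nodemanager.jar", "w"); z.writestr("container-log4j.properties", "log4j.rootLogger=INFO,CLA\n"); z.close()'
# ZIP entry names are stored uncompressed, so grep can detect the resource.
if grep -q container-log4j.properties demo-nodemanager.jar; then
  echo "container-log4j.properties present"
fi
```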

Published: 2023-11-24 05:02:00