The problem is that the jar file uses Spring ORM to load the persistence configuration, and based on that configuration, files are moved to suitable folders in HDFS. Now if I use 'java -cp' instead of 'hadoop jar', it fails to copy to HDFS with a FileSystem error.
When invoking the jar with the hadoop jar command (with Spring ORM injected), the exception is:
Exception in thread "main" org.springframework.beans.factory.BeanCreationException: Error creating bean with name
'org.springframework.dao.annotation.PersistenceExceptionTranslationPostProcessor#0' defined in class path resource [applicationContext.xml]:
Error creating bean with name 'entityManagerFactory' defined in class path resource [applicationContext.xml]: Invocation of init method failed; nested exception is java.lang.IllegalStateException: Conflicting persistence unit definitions for name 'Persistance': file:/home/user/Desktop/ABC/apnJar.jar, file:/tmp/hadoop-ABC/hadoop-unjar2841422106164401019/
Caused by: java.lang.IllegalStateException: Conflicting persistence unit definitions for name 'Persistance'
It seems Hadoop unpacks the jar file into some tmp folder. Is this really required? Can we skip this step with any configuration change?
Any thoughts on this are welcome.
Best answer
If you use "hadoop jar", Hadoop runs org.apache.hadoop.util.RunJar. RunJar unpacks your jar into a temp folder (in your case, /tmp/hadoop-ABC/hadoop-unjar2841422106164401019/) and loads it into the current class loader. Finally, it invokes your main class to run your MapReduce application.
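RunJar's behavior can be approximated with a simplified sketch (this is an illustration, not Hadoop's actual code): unpack the jar into a temp directory, build a class loader over that directory, and reflectively invoke the main class.

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
import java.util.Enumeration;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;

public class MiniRunJar {
    // Unpacks jarFile into destDir (like RunJar's /tmp/hadoop-*/hadoop-unjar*/
    // step) and returns destDir's URL for use in a class loader.
    static URL unJar(Path jarFile, Path destDir) throws IOException {
        try (JarFile jar = new JarFile(jarFile.toFile())) {
            Enumeration<JarEntry> entries = jar.entries();
            while (entries.hasMoreElements()) {
                JarEntry e = entries.nextElement();
                Path out = destDir.resolve(e.getName());
                if (e.isDirectory()) {
                    Files.createDirectories(out);
                } else {
                    Files.createDirectories(out.getParent());
                    try (InputStream in = jar.getInputStream(e)) {
                        Files.copy(in, out, StandardCopyOption.REPLACE_EXISTING);
                    }
                }
            }
        }
        return destDir.toUri().toURL();
    }

    public static void main(String[] args) throws Exception {
        // args[0] = path to the jar, args[1] = main class name
        Path tmp = Files.createTempDirectory("hadoop-unjar");
        URL root = unJar(Paths.get(args[0]), tmp);
        // The unpacked folder is added to the running class loader...
        ClassLoader loader = new URLClassLoader(new URL[] { root },
                MiniRunJar.class.getClassLoader());
        // ...and the main class is invoked reflectively.
        Class<?> mainClass = Class.forName(args[1], true, loader);
        mainClass.getMethod("main", String[].class)
                 .invoke(null, (Object) new String[0]);
    }
}
```

The key point for this question: everything inside the jar, including META-INF/persistence.xml, becomes visible a second time through the unpacked folder.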
Did you also add your jar to the CLASSPATH? If so, the class loader sees both the jar and the unpacked folder. I think that's why Spring complains.
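The conflict can be reproduced outside Spring: when the same resource path exists under two classpath roots, `ClassLoader.getResources` returns both URLs, which is how Spring's persistence-unit scanner ends up seeing 'Persistance' defined twice. A minimal sketch (the temp directories here are stand-ins for the jar and RunJar's unpacked folder):

```java
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Collections;
import java.util.List;

public class DuplicateResourceDemo {
    // Counts how many copies of META-INF/persistence.xml a class loader
    // sees when the same descriptor exists under two classpath roots.
    static int countPersistenceDescriptors() throws Exception {
        // Simulate the jar and its unpacked copy: two roots, each with
        // its own META-INF/persistence.xml.
        Path rootA = Files.createTempDirectory("jarRoot");
        Path rootB = Files.createTempDirectory("unjarRoot");
        for (Path root : new Path[] { rootA, rootB }) {
            Path metaInf = root.resolve("META-INF");
            Files.createDirectories(metaInf);
            Files.write(metaInf.resolve("persistence.xml"),
                        "<persistence/>".getBytes());
        }

        // A class loader that sees both roots, like when the jar is on the
        // CLASSPATH *and* RunJar's unpacked folder is added at runtime.
        URLClassLoader loader = new URLClassLoader(new URL[] {
            rootA.toUri().toURL(), rootB.toUri().toURL()
        }, null);

        List<URL> found = Collections.list(
            loader.getResources("META-INF/persistence.xml"));
        return found.size();
    }

    public static void main(String[] args) throws Exception {
        // Two hits for the same descriptor is what makes Spring report
        // "Conflicting persistence unit definitions".
        System.out.println("persistence.xml copies found: "
                + countPersistenceDescriptors());  // prints 2
    }
}
```

So the fix is to make sure only one copy is visible: don't put the original jar on the CLASSPATH when launching via `hadoop jar`, since RunJar already supplies its contents through the unpacked folder.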