Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration

Updated: 2024-10-28 00:20:37

Problem description


I am using Hadoop 1.0.3 and HBase 0.94.22. I am trying to run a mapper program to read values from an HBase table and output them to a file. I am getting the following error:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:340)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
    at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

The code is as follows:

import java.io.IOException;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.filter.FirstKeyOnlyFilter;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableMapper;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class Test {

    static class TestMapper extends TableMapper<Text, IntWritable> {
        private static final IntWritable one = new IntWritable(1);

        public void map(ImmutableBytesWritable row, Result value, Context context)
                throws IOException, InterruptedException {
            ImmutableBytesWritable userkey =
                    new ImmutableBytesWritable(row.get(), 0, Bytes.SIZEOF_INT);
            String key = Bytes.toString(userkey.get());
            context.write(new Text(key), one);
        }
    }

    public static void main(String[] args) throws Exception {
        HBaseConfiguration conf = new HBaseConfiguration();
        Job job = new Job(conf, "hbase_freqcounter");
        job.setJarByClass(Test.class);
        Scan scan = new Scan();
        FileOutputFormat.setOutputPath(job, new Path(args[0]));
        String columns = "data";
        scan.addFamily(Bytes.toBytes(columns));
        scan.setFilter(new FirstKeyOnlyFilter());
        TableMapReduceUtil.initTableMapperJob("test", scan, TestMapper.class,
                Text.class, IntWritable.class, job);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

I exported the above code to a jar file, and on the command line I use the command below to run it:

hadoop jar /home/testdb.jar test

where test is the folder to which the mapper results should be written.

I have checked a few other links, such as "Caused by: java.lang.ClassNotFoundException: org.apache.zookeeper.KeeperException", where it was suggested to include the zookeeper jar in the classpath; but while creating the project in Eclipse I already included the zookeeper jar from HBase's lib directory (the file I included is zookeeper-3.4.5.jar). I also visited the link "HBase - java.lang.NoClassDefFoundError in java", but I am using a mapper class to get the values from the HBase table, not any client API. I know I am making a mistake somewhere; could you please help me out?

I have noticed another strange thing: when I remove all of the code in the main function except the first line, "HBaseConfiguration conf = new HBaseConfiguration();", then export the code to a jar file and try to run the jar as "hadoop jar test.jar", I still get the same error. It seems either I am defining the conf variable incorrectly or there is some issue with my environment.
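That observation is consistent with how JVM class loading works: the HBase jars are simply absent from the classpath that `hadoop jar` builds, so even a single reference to HBaseConfiguration fails at link/load time. The mechanism can be shown with plain JDK code; the class `MissingClassDemo` and the helper `isOnClasspath` below are hypothetical names used only for illustration, not part of the question's code.

```java
// Demonstrates why NoClassDefFoundError/ClassNotFoundException appears:
// a class name can only be resolved if some jar on the classpath provides it.
public class MissingClassDemo {

    // Returns true if the named class can be loaded from the current classpath.
    static boolean isOnClasspath(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // A JDK class always resolves.
        System.out.println(isOnClasspath("java.lang.String")); // prints "true"
        // The HBase class resolves only when the HBase jars are on the classpath,
        // which is exactly what is missing in the question's setup.
        System.out.println(isOnClasspath("org.apache.hadoop.hbase.HBaseConfiguration"));
    }
}
```

Running this on a machine without the HBase jars prints `true` then `false`; adding the HBase jars to the classpath flips the second result, which is what the fix below achieves for the `hadoop jar` launcher.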

Solution

I found the fix to the problem: I had not added the HBase classpath in the hadoop-env.sh file. Below is what I added to make the job work.

export HADOOP_CLASSPATH=$HBASE_HOME/hbase-0.94.22.jar:\
$HBASE_HOME/hbase-0.94.22-test.jar:\
$HBASE_HOME/conf:\
${HBASE_HOME}/lib/zookeeper-3.4.5.jar:\
${HBASE_HOME}/lib/protobuf-java-2.4.0a.jar:\
${HBASE_HOME}/lib/guava-11.0.2.jar
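A less brittle variant, for readers with a similar setup: the `hbase` launcher script ships a `classpath` subcommand that prints HBase's full dependency list, so the export does not have to be updated jar-by-jar on every version bump. This is a sketch assuming `$HBASE_HOME/bin/hbase` exists on the machine; it is not from the original answer.

```shell
# Assumption: $HBASE_HOME/bin/hbase is the standard HBase launcher script.
# "hbase classpath" prints every jar HBase needs, colon-separated.
export HADOOP_CLASSPATH="$(${HBASE_HOME}/bin/hbase classpath)"
hadoop jar /home/testdb.jar test
```

This is a configuration fragment rather than a runnable script; it only works where an HBase installation is present.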


Published: 2023-05-31 10:02:14