I'm using the Cloudera QuickStart VM. I started playing around with Google Cloud Platform yesterday. I'm trying to copy data from Cloudera HDFS to 1. Google Cloud Storage (gs://bucket_name/) and 2. a Google Cloud HDFS cluster (using hdfs://google_cluster_namenode:8020/).

I set up service account authentication and configured my Cloudera core-site.xml as instructed in this post.

hadoop fs -cp hdfs://quickstart.cloudera:8020/path_to_copy/ gs://bucket_name/ works fine. However, I'm not able to use distcp to copy to Google Cloud Storage. I get the following error. I know it's not a URI issue. Is there anything else I'm missing?

Error: java.io.IOException: File copy failed: hdfs://quickstart.cloudera:8020/path_to_copy/file --> gs://bucket_name/file
    at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:284)
    at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:252)
    at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:50)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.io.IOException: Couldn't run retriable-command: Copying hdfs://quickstart.cloudera:8020/path_to_copy/file to gs://bucket_name/file
    at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:101)
    at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:280)
    ... 10 more
Caused by: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: gs://bucket_name.distcp.tmp.attempt_1461777569169_0002_m_000001_2
    at org.apache.hadoop.fs.Path.initialize(Path.java:206)
    at org.apache.hadoop.fs.Path.<init>(Path.java:116)
    at org.apache.hadoop.fs.Path.<init>(Path.java:94)
    at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.getTmpFile(RetriableFileCopyCommand.java:233)
    at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doCopy(RetriableFileCopyCommand.java:107)
    at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doExecute(RetriableFileCopyCommand.java:100)
    at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:87)
    ... 11 more

I'm also not able to get distcp to connect to the Google Cloud HDFS namenode; I keep getting "Retrying connect to server". I couldn't find any documentation on configuring the connection between a Cloudera HDFS cluster and a Google Cloud HDFS cluster. I was under the assumption that service account auth should work with Google HDFS too. Is there reference documentation I can use to set up copying between clusters? Is there any other authentication setup I'm missing?
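For context, the service-account setup the question refers to would typically add GCS connector entries to core-site.xml along these lines. This is a minimal sketch, assuming the gcs-connector jar is on the Hadoop classpath; the property names come from the GCS connector, while the project id and key path are placeholders:

```xml
<!-- Sketch of GCS connector settings in core-site.xml.
     Property names are from the gcs-connector; values are placeholders. -->
<property>
  <name>fs.gs.impl</name>
  <value>com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem</value>
</property>
<property>
  <name>fs.gs.project.id</name>
  <value>your-gcp-project-id</value>
</property>
<property>
  <name>google.cloud.auth.service.account.enable</name>
  <value>true</value>
</property>
<property>
  <name>google.cloud.auth.service.account.json.keyfile</name>
  <value>/path/to/service-account-key.json</value>
</property>
```

With settings like these in place, `hadoop fs` commands against gs:// paths resolve through the connector, which matches the behavior the asker reports.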
Best answer
It turned out I had to modify the firewall rules to allow TCP/HTTP traffic from the IP address I was running distcp from. Check the network firewall rules on your GCP Compute Engine instances.
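In gcloud terms, the fix described above might look something like the following sketch. The rule name, network, ports, and source address are placeholders; which ports you actually need depends on what distcp must reach (e.g. 8020 for the namenode, plus the datanode transfer ports):

```shell
# Hypothetical sketch: open the Google Cloud network to the host running distcp.
# Rule name, network, ports, and source range below are all placeholders.
gcloud compute firewall-rules create allow-distcp-from-cloudera \
    --network=default \
    --allow=tcp:8020,tcp:50010,tcp:50020 \
    --source-ranges=203.0.113.5/32
```

Restricting `--source-ranges` to the single /32 of the machine running distcp keeps the cluster from being exposed more broadly than the copy job requires.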