Writing to HDFS: file is being overwritten

Problem description

I am writing to the Hadoop file system, but every time I append something, it overwrites the data instead of adding it to the existing data/file. The code that does this is provided below; it is called again and again for different data. Is opening a new SequenceFile.Writer every time a problem?

Each time, I get the path as new Path("someDir");

public void writeToHDFS(Path path, long uniqueId, String data) throws IOException {
    FileSystem fs = path.getFileSystem(conf);
    // A new Writer is constructed on every call; it truncates any existing file at 'path'.
    SequenceFile.Writer inputWriter = new SequenceFile.Writer(fs, conf, path,
            LongWritable.class, MyWritable.class);
    inputWriter.append(new LongWritable(uniqueId++), new MyWritable(data));
    inputWriter.close();
}

Solution

There is currently no way to append to an existing SequenceFile through the API. When you create a new SequenceFile.Writer object, it will not append to the existing file at that Path, but will instead overwrite it. See my earlier question.
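A minimal illustration of that behavior, assuming the question's writeToHDFS method, its MyWritable class with a no-arg constructor, and in-scope fs, path, and conf (all taken from the question, not verified here): after two calls, reading the file back yields only the second record.

writeToHDFS(path, 1L, "first");   // creates the file with one record
writeToHDFS(path, 2L, "second");  // opens a new Writer, truncating the file

// Reading back shows that only the last record survived.
SequenceFile.Reader reader = new SequenceFile.Reader(fs, path, conf);
LongWritable key = new LongWritable();
MyWritable value = new MyWritable();
while (reader.next(key, value)) {
    System.out.println(key + " -> " + value);  // prints only "2 -> second"
}
reader.close();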

As Thomas points out, if you keep the same SequenceFile.Writer object, you will be able to append to the file until you call close().
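A minimal sketch of that approach, holding one writer open across all writes (the loop and the 'batch' variable are assumptions for illustration, not part of the original answer):

// Keep a single Writer for the lifetime of the writes; append() adds
// records to the same file until close() is called.
SequenceFile.Writer writer = new SequenceFile.Writer(fs, conf, path,
        LongWritable.class, MyWritable.class);
long uniqueId = 0;
for (String data : batch) {  // 'batch' is a placeholder for your data source
    writer.append(new LongWritable(uniqueId++), new MyWritable(data));
}
writer.close();  // close only once every record has been appended

Once close() has been called, constructing a fresh Writer on the same Path starts the file over, so the writer's lifetime has to cover every record that should end up in the file.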
