How to upload a stream to AWS S3 using Python

Updated: 2024-10-28 12:20:05
This article describes how to upload a stream to AWS S3 using Python; it may serve as a useful reference for anyone facing the same problem.

Problem description

I want to create a Lambda that gets a zip file (which may contain a list of CSV files) from S3, unzips it, and uploads the contents back to S3. Since Lambda is limited by memory/disk size, I have to stream the data from S3 and back into it. I use Python (boto3); see my code below:

```python
count = 0
obj = s3.Object(bucket_name, key)
buffer = io.BytesIO(obj.get()["Body"].read())
print(buffer)
z = zipfile.ZipFile(buffer)
for x in z.filelist:
    with z.open(x) as foo2:
        print(sys.getsizeof(foo2))
        line_counter = 0
        out_buffer = io.BytesIO()
        for f in foo2:
            out_buffer.write(f)
            # out_buffer.writelines(f)
            line_counter += 1
        print(line_counter)
        print(foo2.name)
        s3.Object(bucket_name, "output/" + foo2.name + "_output").upload_fileobj(out_buffer)
        out_buffer.close()
z.close()
```

The result is that empty files are created in the bucket. For example, if the file input.zip contains the files 1.csv and 2.csv, I get two empty CSV files with the corresponding names in the bucket. Also, I'm not sure whether it actually streams the files or just downloads the whole zip file. Thanks.
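On the streaming question: `obj.get()["Body"].read()` does download the entire zip into memory; what can be streamed is the decompressed content of each member, since `ZipFile.open()` returns a file-like object that can be read incrementally. A minimal sketch of that behavior, using an in-memory zip as a stand-in for the S3 object (the member name `1.csv` and its contents are illustrative):

```python
import io
import zipfile

# Build a small zip in memory as a stand-in for the object fetched from S3
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("1.csv", "a,b\n1,2\n")

# ZipFile.open() yields a file-like object over one member, so the member
# can be read line by line (or in chunks) without decompressing it all at once
with zipfile.ZipFile(buf) as zf:
    with zf.open("1.csv") as member:
        first_line = next(iter(member))
        print(first_line)  # b'a,b\n'
```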

Recommended answer

You need to seek back to the beginning of the BytesIO buffer before uploading.

```python
out_buffer = io.BytesIO()
for f in foo2:
    out_buffer.write(f)
    # out_buffer.writelines(f)
    line_counter += 1
out_buffer.seek(0)  # Change stream position to beginning of file
s3.Object(bucket_name, "output/" + foo2.name + "_output").upload_fileobj(out_buffer)
out_buffer.close()
```
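Why this produces empty files can be reproduced with a plain BytesIO, without S3 at all: `upload_fileobj` reads from the stream's current position, which sits at the end of the buffer right after writing, so there is nothing left to read until the buffer is rewound.

```python
import io

out_buffer = io.BytesIO()
out_buffer.write(b"col1,col2\n1,2\n")

# The position is now at the end of the buffer, so a read returns nothing --
# this is what upload_fileobj sees, hence the empty files in the bucket
print(out_buffer.read())  # b''

out_buffer.seek(0)        # rewind to the beginning
print(out_buffer.read())  # b'col1,col2\n1,2\n'
```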

Published: 2023-10-26 10:17:30
Link: https://www.elefans.com/category/jswz/34/1529843.html