Faster s3 bucket duplication
I have been trying to find a better command line tool for duplicating buckets than s3cmd. s3cmd can duplicate buckets without having to download and upload each file. The command I normally run to duplicate buckets using s3cmd is:
s3cmd cp -r --acl-public s3://bucket1 s3://bucket2
This works, but it is very slow, as it copies each file one at a time via the API. If s3cmd could run in parallel mode, I'd be very happy.
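In the meantime, a rough workaround is to drive several s3cmd processes at once with GNU xargs. This is a sketch, not a supported s3cmd mode: it assumes bash, GNU xargs, and object keys without whitespace, and the bucket names `bucket1`/`bucket2` are placeholders.

```shell
# List every key in the source bucket, rewrite the bucket name with a bash
# string substitution, and run up to 16 single-object copies in parallel.
# Assumes keys contain no whitespace; tune -P to taste.
s3cmd ls --recursive s3://bucket1 | awk '{print $4}' | \
  xargs -P 16 -I {} bash -c 's3cmd cp --acl-public "$1" "${1/bucket1/bucket2}"' _ {}
```

Each copy still goes through the API one object at a time; the parallelism only hides per-request latency, so it helps most with many small files.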
Are there other options available, as command line tools or code, that people use to duplicate buckets faster than s3cmd?
Looks like s3cmd-modification is exactly what I'm looking for. Too bad it does not work. Are there any other options?
AWS CLI seems to do the job perfectly, and has the bonus of being an officially supported tool.
aws s3 sync s3://mybucket s3://backup-mybucket
http://docs.aws.amazon.com/cli/latest/reference/s3/sync.html
It supports concurrent transfers by default. See http://docs.aws.amazon.com/cli/latest/topic/s3-config.html#max-concurrent-requests
To quickly transfer a huge number of small files, run the command from an EC2 instance to decrease latency, and increase max_concurrent_requests to reduce the impact of latency. E.g.:
aws configure set default.s3.max_concurrent_requests 200
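Putting it together, a minimal sketch of the tuned copy might look like the following. The 200/10000 values and bucket names are illustrative, and --acl public-read (the AWS CLI counterpart of s3cmd's --acl-public) is only needed if the copies should be publicly readable:

```shell
# Illustrative s3 tuning; values are examples, not recommendations.
aws configure set default.s3.max_concurrent_requests 200
aws configure set default.s3.max_queue_size 10000

# Bucket-to-bucket sync; objects are copied server-side, not downloaded.
aws s3 sync s3://mybucket s3://backup-mybucket --acl public-read
```

These settings persist in ~/.aws/config, so they apply to later aws s3 invocations as well.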