I need to send about 2 TB of backup files to S3. I'd guess the most hassle-free option would be the Linux scp command (I've had difficulty with s3cmd and don't want an overkill Java/RoR solution for this).
However, I'm not sure whether that's even possible: how would I use S3's private and public keys with scp, and what would my destination IP/URL/path be?
I appreciate your hints.
As of 2015, SCP/SSH is not supported (and probably never will be for the reasons mentioned in the other answers).
The official AWS CLI command-line tool (pip3 install awscli) handles this. Note that credentials need to be specified; I prefer to do so via environment variables rather than a file: AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY.
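A minimal sketch of exporting those variables and checking that the CLI can authenticate (the key values and region here are placeholders, not anything from the original post):

# Placeholders -- substitute your real IAM access key pair and the region of your bucket.
export AWS_ACCESS_KEY_ID=AKIA...
export AWS_SECRET_ACCESS_KEY=...
export AWS_DEFAULT_REGION=us-east-1

# Sanity checks: confirm the credentials are picked up, then list your buckets.
aws sts get-caller-identity
aws s3 ls

Once that works, the copy and sync commands below run directly against the bucket: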
aws s3 cp /tmp/foo/ s3://bucket/ --recursive --exclude "*" --include "*.jpg"
and an rsync-like command:
aws s3 sync . s3://mybucket
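For the ~2 TB backup in the question, a sketch along these lines could work (the bucket name and local path are hypothetical, not from the original post); the high-level aws s3 commands split large files into multipart uploads automatically, and re-running sync only transfers files that are new or changed:

#!/usr/bin/env bash
set -euo pipefail

BACKUP_DIR=/var/backups              # hypothetical local directory holding the ~2TB of backups
BUCKET=s3://my-backup-bucket         # hypothetical bucket name

# Dry run first: shows what would be uploaded without transferring anything.
aws s3 sync "$BACKUP_DIR" "$BUCKET/backups/" --dryrun

# Real upload. Optionally add --storage-class STANDARD_IA for rarely read backups.
aws s3 sync "$BACKUP_DIR" "$BUCKET/backups/"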
There is also the web interface: the AWS S3 console (https://console.aws.amazon.com/s3/).
Any other solution depends on third-party executables (e.g. botosync, jungledisk, ...), which can be great as long as they are supported. But third-party tools come and go over the years, and scripts built on them will have a shorter shelf life.
EDIT: Actually, the AWS CLI is based on botocore:
https://github.com/boto/botocore
So botosync deserves a bit more respect as an elder statesman than I perhaps gave it.