I am thinking about redeploying my static website to Amazon S3. I need to automate the deployment, so I have been looking for an API or tool for that task. I'm a bit confused about the different options.
Question: What is the difference between s3cmd, the Python library boto and AWS CLI?
s3cmd and AWS CLI are both command line tools. They're well suited if you want to script your deployment through shell scripting (e.g. bash).
AWS CLI gives you simple file-copying abilities through its "s3" commands, which should be enough to deploy a static website to an S3 bucket. It also has some small advantages, such as coming pre-installed on Amazon Linux (if that is where you are working from) and being easily installable through pip.
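If it is not already available on your machine, installing it with pip is typically just the following (the exact invocation may vary with your Python setup):
$ pip install awscli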
One AWS CLI command that may be appropriate for syncing a local directory to an S3 bucket:
$ aws s3 sync . s3://mybucket
Full documentation on this command: http://docs.aws.amazon.com/cli/latest/reference/s3/sync.html
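For a static website you will probably also want sync to remove objects from the bucket that no longer exist locally, and (assuming you are not handling this through a bucket policy) to make the uploaded objects publicly readable. A sketch of that, using the --delete and --acl options:
$ aws s3 sync . s3://mybucket --delete --acl public-read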
Edit: As mentioned by @simon-buchan in a comment, the aws s3api
command gives you access to the complete S3 API, but its interface is more "raw".
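For example, uploading a single object through s3api with an explicit content type might look like this (the bucket and key names are just placeholders):
$ aws s3api put-object --bucket mybucket --key index.html --body index.html --content-type text/html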
s3cmd supports everything the AWS CLI does and adds some extended functionality on top, although I'm not sure you would need any of it for your purposes. You can see all its commands here: http://s3tools.org/usage
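Its sync command looks much the same as the AWS CLI's. A sketch (the bucket name is a placeholder, and the --delete-removed and --acl-public options are optional):
$ s3cmd sync --delete-removed --acl-public ./ s3://mybucket/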
Installation of s3cmd may be a bit more involved, because there don't seem to be packages for it in most distributions' main repositories.
boto is a Python library, and in fact the official AWS Python SDK. The AWS CLI, also being written in Python, actually uses part of the boto library (botocore). It would be well suited only if you were writing your deployment scripts in Python. There are official SDKs for other popular languages (Java, PHP, etc.) should you prefer: http://aws.amazon.com/tools/
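If you did go the Python route, a minimal boto sketch for uploading a local directory might look like the following (the bucket name and local path are placeholders, and credentials are assumed to come from environment variables or a ~/.boto config):

import os
import boto
from boto.s3.key import Key

conn = boto.connect_s3()                      # credentials from env vars or ~/.boto
bucket = conn.get_bucket('mybucket')          # placeholder bucket name

for root, dirs, files in os.walk('site'):     # placeholder local directory
    for name in files:
        path = os.path.join(root, name)
        key = Key(bucket)
        key.key = os.path.relpath(path, 'site')   # object key mirrors the local path
        key.set_contents_from_filename(path)      # add policy='public-read' if the objects must be public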
The rawest form of access to S3 is through AWS's REST API. Everything else is built upon it at some point. If you feel adventurous, here's the S3 REST API documentation: http://docs.aws.amazon.com/AmazonS3/latest/API/APIRest.html