I have had an S3 bucket for a while but only now turned versioning on. I am experimenting with it a bit, trying to figure out what sort of deletion protection I get with versioning alone, without activating the "MFA delete" option.
I uploaded a test file, then deleted it, then re-uploaded it twice. Now, using the S3 browser tool, I see 4 versions of the file: #1, #2 (deleted), #3 and #4 (current). If I use boto to get the latest version, I can extract its version_id:
import boto
c = boto.connect_s3()
b = c.get_bucket('my-bucket')
k = b.get_key('test2/dw.txt')
print(k.version_id)
But how do I get a full list of version_ids for a given key? And if I want to retrieve version #1 of the key (now deleted), do I need to do something first with the version #2 id to "undelete" it?
Finally, does this deletion protection (creation of a delete marker) work with legacy files that had been uploaded before versioning was turned on?
Thx
You can get a list of all available versions by using the list_versions method of the bucket object.
import boto
c = boto.connect_s3()
bucket = c.get_bucket('my-bucket')
for version in bucket.list_versions():
    print(version)
This will return a list of Key objects which have specific version_ids associated with them. You can retrieve any of the versions by using the normal methods on the Key object. If you want to make an older version the current version, you would have to re-upload it or copy it on the server.
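Putting that together, here is a minimal sketch (the bucket name, key name, and the OLD_VERSION_ID placeholder are made up for illustration) of listing the version_ids for one key, reading an old version, and making it current again with a server-side copy:

import boto

c = boto.connect_s3()
bucket = c.get_bucket('my-bucket')

# list_versions() accepts a prefix, so you can restrict it to one key;
# the listing includes both real versions and delete markers
for v in bucket.list_versions(prefix='test2/dw.txt'):
    print(v.name + ' ' + v.version_id)

# fetch a particular old version by passing its version_id to get_key(),
# then read it with the normal Key methods
old = bucket.get_key('test2/dw.txt', version_id='OLD_VERSION_ID')
print(old.get_contents_as_string())

# copying that version over itself on the server writes a new, current
# version of the key containing the old contents
bucket.copy_key('test2/dw.txt', 'my-bucket', 'test2/dw.txt',
                src_version_id='OLD_VERSION_ID')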
Once you enable versioning on a bucket, all delete operations from that point on, on any object in the bucket (including objects that were uploaded before versioning was turned on), will result in a delete marker being written to the bucket rather than the object actually being deleted.
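As for "undeleting": you do not need to remove the delete marker just to read an old version (you can always fetch it directly by its version_id, as above), but if you want the key to show up again as a normal, current object, you delete the delete marker itself by passing its version_id to delete_key. A rough sketch, again with made-up names:

import boto
from boto.s3.deletemarker import DeleteMarker

c = boto.connect_s3()
bucket = c.get_bucket('my-bucket')

# a plain delete on a versioned bucket only adds a delete marker
bucket.delete_key('test2/dw.txt')

# removing the marker's own version makes the previous version of the
# key current again
for v in bucket.list_versions(prefix='test2/dw.txt'):
    if isinstance(v, DeleteMarker) and v.is_latest:
        bucket.delete_key('test2/dw.txt', version_id=v.version_id)
        break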