I have AWS S3 access, and the bucket contains nearly 300 files. I need to download a single file from this bucket by pattern matching or search, because I do not know the exact filename (say, the files end with the .csv extension).
Here is my sample code, which lists all the files inside the bucket:
import os
import xml.etree.ElementTree as ET

from boto.s3.connection import S3Connection
from boto.exception import S3ResponseError


def s3connection(credentialsdict):
    """
    :param credentialsdict: dict holding the AWS access_key, secret_key and
        bucket_name of the bucket that contains the billing (csv) files
    :return: status, billing_bucket, billing_key
    """
    # Force Signature Version 4 for the connection
    os.environ['S3_USE_SIGV4'] = 'True'
    conn = S3Connection(credentialsdict["access_key"], credentialsdict["secret_key"], host='s3.amazonaws.com')
    billing_bucket = conn.get_bucket(credentialsdict["bucket_name"], validate=False)
    try:
        billing_bucket.get_location()
    except S3ResponseError as e:
        # Retry with the region reported back in the error response
        if e.status == 400 and e.error_code == 'AuthorizationHeaderMalformed':
            conn.auth_region_name = ET.fromstring(e.body).find('./Region').text
            billing_bucket = conn.get_bucket(credentialsdict["bucket_name"])
    print billing_bucket

    if not billing_bucket:
        raise Exception("Please enter a valid bucket name. Bucket %s does not exist"
                        % credentialsdict.get("bucket_name"))

    # List every key in the bucket
    for key in billing_bucket.list():
        print key.name

    del os.environ['S3_USE_SIGV4']
Can I pass a search string to retrieve only the matching filenames?
There is no way to do this, because there is no native support for regex in S3. You have to get the entire list and apply the search/regex on the client side. The only filtering option available in list_objects is by prefix:

    Prefix (string) -- Limits the response to keys that begin with the specified prefix.
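For completeness, here is a minimal sketch of that prefix filtering using boto3's list_objects_v2. The bucket name and the billing/ prefix are placeholders, not values from the question, and it assumes credentials are already configured (e.g. via environment variables):

import boto3

s3 = boto3.client('s3')  # assumes credentials are configured outside the code
# Only keys beginning with 'billing/' are returned by the service itself
resp = s3.list_objects_v2(Bucket='my-billing-bucket', Prefix='billing/')  # placeholder names
for obj in resp.get('Contents', []):
    print(obj['Key'])

Prefix filtering only helps if the keys share a known leading path, so it does not solve the "ends with .csv" case by itself.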
One option is to use the Python re module and apply it to the list of objects:
import re

pattern = re.compile(<file_pattern_you_are_looking_for>)
for key in billing_bucket.list():
    if pattern.match(key.name):
        print key.name
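If the goal is to actually download the single matching file rather than just print its name, a short follow-up sketch with the same boto bucket object could look like this. The r'.*\.csv$' pattern and the local filename are assumptions based on the question, not part of the listing above:

import re

csv_pattern = re.compile(r'.*\.csv$')  # assumed pattern: keys ending in .csv
for key in billing_bucket.list():
    if csv_pattern.match(key.name):
        # Download the first matching object to a local file and stop
        key.get_contents_to_filename('billing.csv')  # local filename is a placeholder
        break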