I use a data processing pipeline built from
S3 + SNS + Lambda
Because S3 cannot send notifications outside of its storage region, I used SNS to relay the S3 notification to a Lambda function in another region.
The Lambda function is coded as follows:

from __future__ import print_function
import boto3

def lambda_handler(event, context):
    input_file_bucket = event["Records"][0]["s3"]["bucket"]["name"]
    input_file_key = event["Records"][0]["s3"]["object"]["key"]
    input_file_name = input_file_bucket + "/" + input_file_key
    s3 = boto3.resource("s3")
    obj = s3.Object(bucket_name=input_file_bucket, key=input_file_key)
    response = obj.get()
    return event  # echo the event back for now
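One thing worth checking: when the event reaches Lambda via SNS rather than directly from S3, the original S3 notification is nested as a JSON string inside the SNS record, so indexing `event["Records"][0]["s3"]` would fail with a KeyError. A minimal sketch of unwrapping either shape (the sample event below is illustrative, not a real notification):

```python
import json

def extract_s3_record(event):
    """Return the first S3 record, unwrapping an SNS envelope if present."""
    record = event["Records"][0]
    if "Sns" in record:
        # SNS delivers the original S3 notification as a JSON string
        # in the message body.
        s3_event = json.loads(record["Sns"]["Message"])
        record = s3_event["Records"][0]
    return record["s3"]

# Illustrative SNS-wrapped event (structure only; names are made up)
sns_event = {
    "Records": [{
        "Sns": {
            "Message": json.dumps({
                "Records": [{
                    "s3": {
                        "bucket": {"name": "my-bucket"},
                        "object": {"key": "data/input.csv"},
                    }
                }]
            })
        }
    }]
}

s3 = extract_s3_record(sns_event)
print(s3["bucket"]["name"], s3["object"]["key"])
```

The same helper also handles a direct S3 invocation, since in that case `"Sns"` is absent and the record is used as-is.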
When I clicked Save and Test, I got the following error:
{
  "stackTrace": [
    [
      "/var/task/lambda_function.py",
      20,
      "lambda_handler",
      "response = obj.get()"
    ],
    [
      "/var/runtime/boto3/resources/factory.py",
      394,
      "do_action",
      "response = action(self, *args, **kwargs)"
    ],
    [
      "/var/runtime/boto3/resources/action.py",
      77,
      "__call__",
      "response = getattr(parent.meta.client, operation_name)(**params)"
    ],
    [
      "/var/runtime/botocore/client.py",
      310,
      "_api_call",
      "return self._make_api_call(operation_name, kwargs)"
    ],
    [
      "/var/runtime/botocore/client.py",
      395,
      "_make_api_call",
      "raise ClientError(parsed_response, operation_name)"
    ]
  ],
  "errorType": "ClientError",
  "errorMessage": "An error occurred (AccessDenied) when calling the GetObject operation: Access Denied"
}
I configured the Lambda role with
full S3 access
and set the bucket policy on my target bucket to
everyone can do anything (list, delete, etc.)
It seems that I haven't set the policies up correctly.
I had a similar problem and solved it by attaching the appropriate policy to my user:
IAM -> Users -> Username -> Permissions -> Attach policy.
Also make sure you use the correct access key and secret access key; you can configure them with the AWS CLI.
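For reference, an identity policy granting just the read access the handler needs might look something like this (the bucket name is a placeholder; attach it to the Lambda execution role or the user, as appropriate):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject"],
      "Resource": "arn:aws:s3:::my-target-bucket/*"
    }
  ]
}
```

Note that `s3:GetObject` must be allowed on the object ARN (`.../*`), not just the bucket ARN itself; a deny anywhere (bucket policy, IAM, or SCP) will still produce AccessDenied even with this allow in place.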