Access Denied using boto3 through aws Lambda


I use a data processing pipeline built from

S3 + SNS + Lambda

Because S3 cannot send notifications outside its storage region, I use SNS to forward the S3 event notifications to a Lambda function in another region.

The Lambda function's code:

from __future__ import print_function
import boto3


def lambda_handler(event, context):
    input_file_bucket = event["Records"][0]["s3"]["bucket"]["name"]
    input_file_key = event["Records"][0]["s3"]["object"]["key"]

    input_file_name = input_file_bucket + "/" + input_file_key

    s3 = boto3.resource("s3")
    obj = s3.Object(bucket_name=input_file_bucket, key=input_file_key)
    response = obj.get()

    return event  # echo the event back for now

When I ran Save and Test, I got the following error:

{
  "stackTrace": [
    [
      "/var/task/lambda_function.py",
      20,
      "lambda_handler",
      "response = obj.get()"
    ],
    [
      "/var/runtime/boto3/resources/factory.py",
      394,
      "do_action",
      "response = action(self, *args, **kwargs)"
    ],
    [
      "/var/runtime/boto3/resources/action.py",
      77,
      "__call__",
      "response = getattr(parent.meta.client, operation_name)(**params)"
    ],
    [
      "/var/runtime/botocore/client.py",
      310,
      "_api_call",
      "return self._make_api_call(operation_name, kwargs)"
    ],
    [
      "/var/runtime/botocore/client.py",
      395,
      "_make_api_call",
      "raise ClientError(parsed_response, operation_name)"
    ]
  ],
  "errorType": "ClientError",
  "errorMessage": "An error occurred (AccessDenied) when calling the GetObject operation: Access Denied"
}

I configured the Lambda role with

full S3 access

and set the bucket policy on my target bucket to

everyone can do anything (list, delete, etc.)

It seems that I haven't set the policies up correctly.
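Separately from the permissions question: because the notification arrives via SNS rather than directly from S3, the S3 event is nested inside the SNS message body, so `event["Records"][0]["s3"]` may not exist in the handler above. A sketch of unwrapping either shape (field names follow the standard S3 and SNS event formats; the example event is synthetic):

```python
import json


def extract_s3_records(event):
    """Return the S3 notification records, whether the event came
    directly from S3 or was forwarded through SNS."""
    records = event.get("Records", [])
    if records and "Sns" in records[0]:
        # SNS wraps the original S3 notification JSON in the Message field
        inner = json.loads(records[0]["Sns"]["Message"])
        return inner.get("Records", [])
    return records


# Example: an SNS-wrapped S3 put notification
sns_event = {
    "Records": [{
        "Sns": {
            "Message": json.dumps({
                "Records": [{
                    "s3": {
                        "bucket": {"name": "my-bucket"},
                        "object": {"key": "path/to/file.txt"},
                    }
                }]
            })
        }
    }]
}

records = extract_s3_records(sns_event)
print(records[0]["s3"]["object"]["key"])  # path/to/file.txt
```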


It's possible that the specific S3 object you are trying to read has more restrictive permissions of its own.



I had a similar problem; I solved it by attaching the appropriate policy to my user:

IAM -> Users -> Username -> Permissions -> Attach policy.

Also make sure you have configured the correct access key and secret access key; you can do so with the AWS CLI (`aws configure`).

Note, though, that you never need AWS access keys when one AWS resource is accessing another. Just allow the Lambda function's execution role to access the S3 bucket and whatever actions you want to take (e.g. s3:PutObject).


Omuthu's answer correctly identified my problem, but it didn't provide a solution, so I thought I'd do that.

It's possible that when you set up your permissions in IAM, you created a policy like this:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:DeleteObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::test"
            ]
        }
    ]
}

Unfortunately, that's not correct. You need to apply the object-level permissions to the objects in the bucket, not to the bucket itself. So it has to look like this:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::test"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:DeleteObject"
            ],
            "Resource": [
                "arn:aws:s3:::test/*"
            ]
        }
    ]
}

Note the second ARN with the /* at the end of it: bucket-level actions like s3:ListBucket apply to the bucket ARN, while object-level actions like s3:GetObject apply to the objects inside it.
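To see why the /* matters, here is a rough illustration using simple glob matching (IAM's actual policy evaluation is more involved, but the resource-pattern idea is the same): the plain bucket ARN never matches an object ARN, and the /* pattern never matches the bucket itself.

```python
from fnmatch import fnmatchcase

bucket_arn = "arn:aws:s3:::test"
object_arn = "arn:aws:s3:::test/path/to/file.txt"

# A Resource of just the bucket ARN never matches object ARNs,
# so s3:GetObject on any key is denied.
print(fnmatchcase(object_arn, "arn:aws:s3:::test"))    # False
print(fnmatchcase(object_arn, "arn:aws:s3:::test/*"))  # True

# Conversely, the /* pattern never matches the bucket itself,
# which is why s3:ListBucket needs the plain bucket ARN.
print(fnmatchcase(bucket_arn, "arn:aws:s3:::test/*"))  # False
```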



Adding to Amri's answer: if your bucket is private and you have credentials to access it, you can use boto3.client:

import boto3

s3 = boto3.client('s3', aws_access_key_id='ACCESS_KEY', aws_secret_access_key='SECRET_KEY')
response = s3.get_object(Bucket='BUCKET', Key='KEY')

*For this file: s3://bucket/a/b/c/some.text, Bucket is 'bucket' and Key is 'a/b/c/some.text'

---EDIT---

You can easily change the script to read the keys from environment variables, for instance, so they are not hardcoded. I left it like this for simplicity.
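For example, a sketch of reading the keys from environment variables rather than hardcoding them (the helper name is just my choice, not anything boto3 requires):

```python
import os


def credentials_from_env():
    """Read AWS credentials from environment variables instead of
    hardcoding them in the script."""
    return {
        "aws_access_key_id": os.environ["AWS_ACCESS_KEY_ID"],
        "aws_secret_access_key": os.environ["AWS_SECRET_ACCESS_KEY"],
    }

# Then build the client with:
#   s3 = boto3.client('s3', **credentials_from_env())
```

Note that boto3 already looks for AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY in the environment by itself, so when those variables are set, a bare `boto3.client('s3')` also works.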



I had a similar problem; the difference was that the bucket was encrypted with a KMS key.

Fixed with: IAM -> Encryption keys -> YOUR_AWS_KMS_KEY -> add your role or account to the key policy.
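Alternatively, you can grant the Lambda execution role decrypt access on the key in an IAM policy. A rough sketch (the key ARN below is a placeholder you'd replace with your own):

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "kms:Decrypt"
            ],
            "Resource": "arn:aws:kms:us-east-1:111122334455:key/YOUR_KEY_ID"
        }
    ]
}
```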
