AWS Lambda function write to S3

I have a Node 4.3 Lambda function in AWS. I want to be able to write a text file to S3, and I have read many tutorials about how to integrate with S3. However, all of them are about how to call Lambda functions after writing to S3.

How can I create a text file in S3 from Lambda using Node? Is this possible? Amazon's documentation doesn't seem to cover it.

You can upload a file to S3 using the

aws-sdk

package. If you are using an IAM user, you have to provide an access key and secret key, and make sure you have granted the necessary permissions to that IAM user.

var AWS = require('aws-sdk');
AWS.config.update({ accessKeyId: 'ACCESS_KEY', secretAccessKey: 'SECRET_KEY' });
var s3bucket = new AWS.S3({ params: { Bucket: 'BUCKET_NAME' } });

function uploadFileOnS3(fileName, fileData) {
    var params = {
        Key: fileName,
        Body: fileData,
    };
    s3bucket.upload(params, function (err, res) {
        if (err)
            console.log('Error in uploading file to S3: ' + err);
        else
            console.log('File successfully uploaded.');
    });
}

Here I have hard-coded the AWS access key and secret key temporarily, for testing purposes only. For best practices, refer to the documentation.
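As an alternative to hard-coded keys, the same upload flow can be written with the S3 client passed in, so that in production the SDK reads credentials from the Lambda execution role, and locally you can exercise the flow with a stub. This is a minimal sketch; the stub only imitates the `upload().promise()` shape of the aws-sdk v2 client, and all names are placeholders.

```javascript
// Upload helper with the S3 client injected, so no credentials are
// hard-coded. In Lambda you would pass `new AWS.S3()` and let the SDK
// pick up credentials from the execution role.
function uploadFile(s3Client, bucket, key, body) {
  return s3Client.upload({ Bucket: bucket, Key: key, Body: body }).promise();
}

// Local stand-in for the aws-sdk v2 client (assumption: only the
// upload().promise() shape is imitated here).
const stubS3 = {
  upload(params) {
    return {
      promise: () =>
        Promise.resolve({ Location: `s3://${params.Bucket}/${params.Key}` }),
    };
  },
};

uploadFile(stubS3, 'my-bucket', 'hello.txt', 'Hello from Lambda').then(
  (res) => console.log('Uploaded to', res.Location)
);
```

The injected client also makes the permission failure modes easier to test, since you can substitute a stub that rejects.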

IAM statement for the Serverless Framework (serverless.com) - write access to a specific S3 bucket

service: YOURSERVICENAME

provider:
  name: aws
  runtime: nodejs8.10
  stage: dev
  region: eu-west-1
  timeout: 60
  iamRoleStatements:
    - Effect: "Allow"
      Action:
        - s3:PutObject
      Resource: "arn:aws:s3:::**BUCKETARN**/*"
    - Effect: "Deny"
      Action:
        - s3:DeleteObject
      Resource: "arn:aws:s3:::**BUCKETARN**/*"

Yes, it is absolutely possible!

var AWS = require('aws-sdk');

function putObjectToS3(bucket, key, data) {
    var s3 = new AWS.S3();
    var params = {
        Bucket: bucket,
        Key: key,
        Body: data
    };
    s3.putObject(params, function (err, data) {
        if (err) console.log(err, err.stack); // an error occurred
        else console.log(data);               // successful response
    });
}
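In case it isn't obvious how a helper like putObjectToS3 gets called: bucket is just the bucket name (often read from an environment variable), key is the object path inside the bucket, and data is the body to write. A hedged sketch of the calling convention; all names below (BUCKET_NAME, the key, the contents) are placeholders for illustration:

```javascript
// Building the same params shape putObjectToS3 uses internally.
function buildPutParams(bucket, key, data) {
  return { Bucket: bucket, Key: key, Body: data };
}

const params = buildPutParams(
  process.env.BUCKET_NAME || 'my-bucket', // bucket name, often from env
  'logs/output.txt',                      // key = path inside the bucket
  'text file contents'                    // data = the body to write
);
console.log(params.Key); // logs/output.txt

// In a real handler you would then call:
// putObjectToS3(params.Bucket, params.Key, params.Body);
```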

After a long time of silent failures with 'Task timed out after X' and no useful error message, I went back to the beginning, to Amazon's default template example, and that worked!

> Lambda > Functions > Create function > Use a blueprint > filter: s3.

Here is my tweaked version of the Amazon example:

const aws = require('aws-sdk');
const s3 = new aws.S3({ apiVersion: '2006-03-01' });

const uploadFileOnS3 = async (fileData, fileName) => {
    const params = {
        Bucket:  "The-bucket-name-you-want-to-save-the-file-to",
        Key: fileName,
        Body: JSON.stringify(fileData),
    };

    try {
        const response = await s3.upload(params).promise();
        console.log('Response: ', response);
        return response;

    } catch (err) {
        console.log(err);
    }
};
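One thing to note about the snippet above: because the body goes through JSON.stringify, the stored object is JSON text, so a plain string ends up wrapped in quotes in S3. Reading the object back needs the inverse step:

```javascript
// What JSON.stringify actually writes as the object body:
const fileData = { message: 'hello', count: 2 };
const body = JSON.stringify(fileData);
console.log(body); // {"message":"hello","count":2}

// After fetching the object (e.g. with s3.getObject), parse it back:
const restored = JSON.parse(body);
console.log(restored.message); // hello
```

If you want the file stored verbatim (e.g. a plain .txt file), pass fileData as the Body directly instead of serialising it.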

Comments
  • If your Lambda function runs inside a VPC, you will have to create a VPC endpoint for S3. Before finding this out, the callback of s3.putObject was never called. See this article about the S3 VPC endpoint: aws.amazon.com/blogs/aws/new-vpc-endpoint-for-amazon-s3 and this one about accessing resources from Lambda: aws.amazon.com/blogs/aws/…
  • Could you specify how you would call this function? I'm guessing bucket comes from process.env.BUCKET_NAME, but what exactly are the key path and the data? What if you're writing the file in the Lambda first, and then putting it in a bucket?
  • As a late response to bildungsroman, key is anything that goes after your bucket name. For example, for https://mybucket.s3.amazonaws.com/path/to/image.jpg, the key is path/to/image.jpg.
  • Perfect thanks!
  • Nothing in the question specifically requests that access keys and secrets be used for authentication. It is extremely bad practice to authenticate this way, and as such I don't feel recommending it as a solution is wise. Lambda functions should be provisioned with an IAM role that is sufficient for the access the Lambda function requires to perform its function.
  • I know I hard-coded the secret key and access key, but this method is only for small personal scripts or for testing purposes.
  • Thanks. I am using serverless framework. This worked for me.