VOIDKAT

Using AWS Lambda to save files to AWS S3 using Node.js

September 10, 2019

AWS Lambda functions are great for writing serverless APIs that use AWS services such as S3 or RDS. Lambda can be summed up as “functions as a service”. In this post we will write a simple example that saves some string data to an S3 bucket; we will build on this later to send data to Amazon RDS, but it is a good starting point. This post assumes you already have AWS set up.

Setting up AWS Lambda Function

On the AWS Console, search for AWS Lambda and click Create Function. While you can write a function from scratch, we will use the hello-world starter Node.js function.

Enter a name for your function and, for the execution role, pick Create a new role with basic Lambda permissions. You won’t be able to change the Node.js code for now, so just click Create Function. You will be taken to the configuration screen for your Lambda function.

Using the Configuration screen

The screen is separated into several sections: Designer, Function code, Execution role, and Test, among others. These four are the most important ones.

The Designer shows a graphical representation of the execution flow, from trigger to function layers, and which services are being used. By default all functions have Amazon CloudWatch Logs.

Function code shows your function’s code. Note that you can write code in the inline editor, upload a .zip code bundle, or upload a file from AWS S3. While this tutorial uses Node.js (8.10), you can pick from other runtimes, from .NET to Python and Ruby.

Finally, the important thing to note is the Handler. This is the entry point Lambda looks for to execute; by default it is index.handler, because the file is index.js and the exported function is exports.handler. So if you write your own code in, say, app.js with exports.lambda, you should change the handler to app.lambda.

However, it is recommended that you keep the default shape, exports.handler = async (event, context) => { };, unless you know what you are doing.
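For illustration, here is a minimal sketch of such a custom handler; app.js and the export name lambda are hypothetical choices, which would pair with a handler setting of app.lambda:

```javascript
// app.js (hypothetical file name) — with this file and export, the
// Lambda handler setting would be "app.lambda" instead of "index.handler".
const lambda = async (event, context) => {
    return {
        statusCode: 200,
        body: JSON.stringify({ received: event })
    };
};

exports.lambda = lambda;
```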

The next important section is Execution Role; this section is important enough that we will cover it later.

Finally, there are sections for setting up Environment Variables, which are useful for adding API keys for external services. Other settings include descriptors, tags, and networking sections.
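Environment variables set on this screen are exposed to the function through process.env; a minimal sketch, where MY_API_KEY is a hypothetical variable name:

```javascript
// Read a value configured under Environment Variables in the Lambda console.
// MY_API_KEY is a hypothetical name used for illustration.
function getApiKey() {
    return process.env.MY_API_KEY;
}
```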

Adding Function code

Let’s add our code to the inline editor. We will save some incoming data to a text file and store it in a bucket. To do that we need to include the AWS SDK, build our payload, set up the parameters for the bucket action, and finally await the file upload.

Note that we are using event; event contains any variables sent into the Lambda function. More on this later.

For this function we will receive an id, an email, and an arbitrary path, as seen in the payload.

var AWS = require('aws-sdk');

exports.handler = async (event, context) => {

    // S3 client; AWS_REGION is provided by the Lambda runtime
    let S3 = new AWS.S3({ region: process.env.AWS_REGION });

    // Data coming in on the event
    var payload = {
        "id": event.id,
        "email": event.email,
        "path": event.path
    };

    // Parameters for the S3 upload
    var params = {
        Bucket: '<s3-bucket-name>',
        Key: event.id + '.txt',
        Body: JSON.stringify(payload),
        ContentType: 'text/plain',
    };

    try {
        let s3Response = await S3.upload(params).promise();

        let res = {
            'statusCode': 200,
            'headers': { 'Content-Type': 'application/json' },
            'body': JSON.stringify({
                "id": event.id,
                "email": event.email,
                "path": event.path,
                "s3Path": s3Response.Location
            })
        };

        return res;

    } catch (error) {

        let fail = {
            'statusCode': 500,
            'headers': { 'Content-Type': 'application/json' },
            'body': JSON.stringify({
                "error": error.message
            })
        };

        return fail;
    }
};

Don’t forget to Save your function.

Adding a test

Now it’s time to set up a test and see if our function executes. Click Configure test event at the top right, make up any name for your test, and enter the following:

{
    "id": "id_test",
    "email": "test@test.com",
    "path": "path/test"
}

You can now run the test; the execution results are rendered at the top and within the inline code window. You will see Execution result: succeeded, but the returned body will contain Access Denied. Now let’s talk about execution roles and permissions.

Create S3 Bucket

Create an S3 bucket. Under Permissions, turn off Block all public access, then click into Bucket Policy and add:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AddPerm",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::<your-bucket-name>/*"
        }
    ]
}

This policy allows public read access to objects saved to the S3 bucket, so we can access them afterwards. Back in Lambda, place the bucket name in your function code. Now let’s talk about execution roles:
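With public reads enabled, an uploaded object is reachable at a predictable URL; a small sketch building that URL (the bucket name and region below are placeholders):

```javascript
// Build the public URL of an object in a bucket that allows public GetObject.
function publicObjectUrl(bucket, region, key) {
    return `https://${bucket}.s3.${region}.amazonaws.com/${encodeURIComponent(key)}`;
}

// e.g. the file our Lambda function writes for the test event:
console.log(publicObjectUrl('<your-bucket-name>', 'us-west-2', 'id_test.txt'));
```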

Execution roles and permissions

Let’s talk about what we are trying to achieve: an incoming request with some data gets processed by the function and saved as a text file in an AWS S3 bucket.

This is where execution roles come into play. AWS Lambda executes actions using a role defined in IAM, where you set up which services the function has access to. Click View the XXXX role on the IAM console, just below the existing role. You will be taken to the IAM screen.

You will see Permission Policies with only the AWSLambdaBasicExecutionRole policy attached. We will now define our own policy for the role. Click Add inline policy:

Since we are writing to S3 we need to add S3 write permissions, so select S3 as the service, then in Actions pick the appropriate permissions: under Write, pick PutObject and PutObjectAcl (the latter allows putting an object on S3 with new ACL permissions). Next you will be asked which resources and objects the role may act on; leave this on Any, though technically this is where you could restrict writes to a specific S3 bucket. Finally, save your policy. Your policy will look like this:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:PutObjectAcl"
            ],
            "Resource": "*"
        }
    ]
}
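If you prefer to lock the role down to a single bucket rather than Any, the Resource field can name that bucket’s objects explicitly; a sketch, with a placeholder bucket name:

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:PutObjectAcl"
            ],
            "Resource": "arn:aws:s3:::<your-bucket-name>/*"
        }
    ]
}
```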

Finally, let’s add an API Gateway to our Lambda function.

Adding an API Gateway

Now that we are satisfied with our function and tests, we need to add a trigger (endpoint); we will accomplish this using API Gateway. Go back to the console and search for API Gateway. This is where we can create API triggers; I find it best to create APIs within API Gateway rather than through the graphical configuration on the AWS Lambda screen. We want to achieve the following:

  • POST data via api/endpoint?id=test&email=test&path=test
  • Create request validation (all requests should have the id,email,path parameters)
  • Pass through any submitted data to the Lambda function.

But first let’s create the API itself. On the API Gateway screen click Create API; on the next screen pick REST as the API type, pick New API, and give it a name. Leave the rest of the options as they are and click Create API. The API will be created and listed in the left-hand pane.

Let’s create a method now and connect it to our Lambda function. Click on Resources, click Actions, and pick Create Method. We will create a POST endpoint since we are submitting data to be saved.

On the setup screen, pick Lambda Function as the integration type and leave Use Lambda Proxy integration unchecked, since we want API Gateway to transform the request with a mapping template rather than pass the raw request straight through to our Lambda function. Enter your function's name and pick the region where the Lambda function resides. Leave the rest of the options at their defaults.

You will be presented with a graphical representation of the request journey: Client to Method Request to Integration Request to Lambda function, and so on, eventually back to the client. Let’s add some request validation; click on Method Request.

Adding Request validation

In Settings, leave everything as is, but for Request Validator pick Validate body, query string parameters, and headers. Click on URL Query String Parameters, add email, id, and path, and set them all to Required. Now any incoming request that does not have these parameters will be rejected.
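Conceptually, the validator performs a check like the following sketch; this illustrates the behaviour, not API Gateway's actual implementation:

```javascript
// Mirror the gateway's required-parameter check: reject any request
// whose query string is missing id, email, or path.
function validateParams(query) {
    const required = ['path', 'id', 'email'];
    const missing = required.filter(name => !(name in query));
    if (missing.length > 0) {
        return { message: `Missing required request parameters: [${missing.join(', ')}]` };
    }
    return null; // request passes validation
}
```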

Adding Mapping templates

Exit the screen and click on Integration Request. Here we will map the parameters so the data is passed to our Lambda function, which exposes them on event inside exports.handler = async (event, context) => { };.

Leave everything at its default and scroll down to Mapping Templates. Pick When there are no templates defined (recommended), click Add mapping template, and enter application/json. In the template paste the following:

{
    "id":"$input.params('id')",
    "email":"$input.params('email')",
    "path":"$input.params('path')"
}

The "$input.params('id')" string means the id query parameter will be passed to our Lambda function. Notice we are wrapping the statement in quotes, which means id will be passed as a string to our Lambda function. The quotes can be removed if you want, say, a number. Click Save to save your mapping template.
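In effect, the mapping template lifts the query-string parameters into the event object the handler receives; a sketch of that transformation in plain Node.js (the example URL is hypothetical):

```javascript
// Simulate what the mapping template does: extract the named query-string
// parameters from a request URL and assemble the Lambda event object.
function buildEvent(requestUrl) {
    const params = new URL(requestUrl).searchParams;
    return {
        id: params.get('id'),
        email: params.get('email'),
        path: params.get('path'),
    };
}

console.log(buildEvent('https://api.example.com/production?id=id_test&email=test@test.com&path=path/test'));
```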

Testing the API

Let’s test that everything is working as expected. Exit again and click Test on the Client. First let’s check that our request validation works: send nothing and simply click Test. You should see the following in the response body:

{
  "message": "Missing required request parameters: [path, id, email]"
}

Now, within Query Strings, add id=test&email=test&path=test and click Test. You should see:

{
  "statusCode": 200,
  "headers": {
    "Content-Type": "application/json"
  },
  "body": "{
      \"id\":\"test\",
      \"email\":\"test\",
      \"path\":\"test\",
      \"s3Path\":\"https://<your-s3-bucket>.s3.us-west-2.amazonaws.com/test.txt\"
      }"
}

Deploy the API

Now we are ready to deploy our API endpoint. Click out, click Actions, and pick Deploy API. You can pick any deployment stage name, be it development or production (it doesn’t matter), and any description. You will then be navigated to Stages under your API. I picked production as my stage name.

You don’t need to change anything there. Open your named stage, find your POST method, and copy the Invoke URL. Let’s test the endpoint using curl with the following command; be sure to update the URL to your endpoint:

curl -v -X POST 'https://<your-invocation-url>.us-west-2.amazonaws.com/production/?id=test&email=test&path=test'

You should see:

* Connection #0 to host <your-invocation-url>.us-west-2.amazonaws.com left intact
{"statusCode":200,
    "headers":
        {"Content-Type":"application/json"},
    "body":"{
        \"id\":\"test\",
        \"email\":\"test\",
        \"path\":\"test\",
        \"s3Path\":\"https://<your-bucket-name>.s3.us-west-2.amazonaws.com/test.txt\"
    }"}
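The same request can also be made from Node.js; a sketch assuming Node 18+ (which ships a global fetch) and a placeholder invoke URL:

```javascript
// Build the stage URL with the required query parameters, then POST to it.
function buildRequestUrl(invokeUrl, params) {
    const url = new URL(invokeUrl);
    url.search = new URLSearchParams(params).toString();
    return url.toString();
}

async function postToEndpoint(invokeUrl) {
    const url = buildRequestUrl(invokeUrl, { id: 'test', email: 'test', path: 'test' });
    const res = await fetch(url, { method: 'POST' }); // requires Node 18+
    return res.json();
}
```

Calling postToEndpoint with your own invoke URL should return the same JSON body as the curl call above.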

End

That concludes using AWS Lambda to create an API. There is a lot more functionality I have not covered; for that, please check out the AWS documentation for Lambda and API Gateway.


Farhad Agzamov

Written by Farhad Agzamov, who lives and works in London building things. You can follow him on Twitter and check out his GitHub here.