Building a Node.js Service with AWS Lambda, DynamoDB, and Serverless Framework

Matthew Brown

As a developer, my favorite new technology is serverless computing. The convenience and cost make it a compelling choice for running applications in the cloud, especially for proofs-of-concept or quick ways to prove out ideas. Getting up and running with serverless takes very little effort, and the cost of running an application in the cloud is minimal. Serverless really empowers developers to act on ideas as quickly as possible.

In this post, I’m going to briefly touch on what serverless computing is and the pros and cons of using it. Then I will build a Node.js service to do CRUD operations using AWS Lambda, DynamoDB, and the Serverless Framework. You can view the finished product on Github.

What is Serverless Computing?

First off, the term "serverless" is something of a misnomer. If you are running code, of course, there is a server somewhere. But if you're running your code on AWS Lambda or as an Azure Function, then the server/machine management part is completely abstracted away.

You don’t have to provision a VM or set up automatic scaling. All of those infrastructure concerns are handled for you. All you have to focus on is your application’s code. That is a huge benefit if you are just looking to get an application deployed on a server in a hurry for others to see.

The other big win is cost. If you deploy to a VM, you pay for the entire capacity of that VM for as long as it is running. With serverless, you pay only per request and for the amount of time your code actually runs. This can result in significant cost savings.
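To make the pricing model concrete, here is a rough back-of-the-envelope sketch. The rates are assumptions based on AWS's published Lambda pricing at the time of writing, and the free tier is ignored, so treat this as an illustration rather than a quote:

```javascript
// Assumed rates (check the current AWS Lambda pricing page):
const REQUEST_RATE = 0.20 / 1e6;      // USD per request
const GB_SECOND_RATE = 0.0000166667;  // USD per GB-second of compute

// Estimate monthly cost from traffic, average duration, and memory size.
function monthlyLambdaCost(requestsPerMonth, avgDurationMs, memoryMb) {
  const gbSeconds =
    requestsPerMonth * (avgDurationMs / 1000) * (memoryMb / 1024);
  return requestsPerMonth * REQUEST_RATE + gbSeconds * GB_SECOND_RATE;
}

// e.g. 1M requests/month at 200 ms average on a 128 MB function:
console.log(monthlyLambdaCost(1e6, 200, 128).toFixed(2)); // prints "0.62"
```

Well under a dollar a month for a million requests; even a small always-on VM costs more than that, which is why serverless is so attractive for low-traffic services and prototypes.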


The benefits of serverless applications are as follows:

  • Cost – only pay for what you use
  • Simplicity – no managing infrastructure
  • Support for many popular programming languages
  • Use any cloud provider
  • Variety of options for triggers for running serverless code (API endpoint, message queue, timer, etc.)

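As a sketch of that last point, the Serverless Framework lets you wire several trigger types to a single function in its config file. The function name, handler, and queue ARN below are hypothetical:

```yaml
functions:
  example:
    handler: handler.run
    events:
      - http: GET /items            # HTTP API endpoint
      - schedule: rate(10 minutes)  # timer
      - sqs:                        # message queue
          arn: arn:aws:sqs:us-east-1:123456789012:example-queue
```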

There are also drawbacks to consider when building serverless applications:

  • Managing/organizing code can be difficult
  • Debugging locally can be a challenge
  • Application code must be stateless (no access to a file system)
  • Changing cloud vendors will likely require re-writing code
  • Cold starts

In the next section, I will discuss the Serverless Framework and address the first two bullet points on the list. If you need to store files, you can use S3 buckets or other cloud storage, which works quite nicely.

I have found that cold starts are not really an issue with JavaScript code. But if you are using Java, it may take a little time to start the JVM. To mitigate this, you can figure out how long your cloud provider leaves idle resources up before shutting them down, then set a timer that triggers the function within that window so cold starts never affect the application's responsiveness.
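One minimal way to set up such a keep-warm timer is the Serverless Framework's built-in schedule event. This is a sketch, assuming a function definition like the one used later in this post; the 5-minute rate is an assumption you would tune to your provider's idle timeout:

```yaml
functions:
  todo-app:
    handler: index.handler
    events:
      - http: ANY /
      # Keep-warm ping: fires every 5 minutes so the container is
      # never idle long enough to be reclaimed
      - schedule: rate(5 minutes)
```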

The Serverless Framework

The tool I use to make developing serverless Node.js applications easy is the Serverless Framework. Serverless is an open source CLI that will take care of deploying your code to a variety of cloud providers.

There are also awesome plugins that improve development velocity by allowing you to run serverless applications locally. I will show later in this article how you can also use a plugin to run DynamoDB locally.

I have found this to be a very valuable tool that really simplifies the development process. Check out the Serverless Framework website and documentation.

Now We Build

Now that we’ve breezed over some background on serverless and the tools we will be using, let’s start building! I am going to demonstrate how to build a REST API Node.js service using Express and the Serverless Framework.


To get started you will need to install the Serverless CLI and log in.

npm install -g serverless
serverless login

It will also be helpful to check out the documentation for adding your AWS credentials.

Then we’ll use npm to initialize a new project in a new folder and then install a few npm packages as dependencies.

npm init -y
npm i --save aws-sdk body-parser express node-uuid serverless-http

The next step is to create a serverless.yml file to tell Serverless what resources we need and how to deploy our code.


service: lambda-rest-api

custom:
  tableName: 'todos-${self:provider.stage}'

provider:
  name: aws
  runtime: nodejs8.10
  stage: dev
  region: us-east-1
  iamRoleStatements:
    - Effect: Allow
      Action:
        - dynamodb:Query
        - dynamodb:Scan
        - dynamodb:GetItem
        - dynamodb:PutItem
        - dynamodb:UpdateItem
        - dynamodb:DeleteItem
      Resource:
        - { "Fn::GetAtt": ["TodosDynamoDBTable", "Arn"] }
  environment:
    TODOS_TABLE: ${self:custom.tableName}

functions:
  todo-app:
    handler: index.handler
    events:
      - http: ANY /
      - http: 'ANY {proxy+}'

resources:
  Resources:
    TodosDynamoDBTable:
      Type: 'AWS::DynamoDB::Table'
      Properties:
        AttributeDefinitions:
          - AttributeName: todoId
            AttributeType: S
        KeySchema:
          - AttributeName: todoId
            KeyType: HASH
        ProvisionedThroughput:
          ReadCapacityUnits: 1
          WriteCapacityUnits: 1
        TableName: ${self:custom.tableName}

There are a few things to call out in our setup. Under custom, we set a variable for our DynamoDB table name. A little below that, in provider.environment, that value is assigned to TODOS_TABLE. As we'll see shortly, this value is exposed to our code through process.env.

The rest is mainly setting up specifics for using AWS as the cloud provider and using a DynamoDB table as a resource. The Serverless Framework provides templates for a variety of cloud providers and you can set up just about anything you would need infrastructure-wise in this setup file.

Now let’s add a couple of plugins to run the code locally.

npm i --save serverless-dynamodb-local serverless-offline

We also need to add these plugins to the serverless.yml file.


plugins:
  - serverless-dynamodb-local
  - serverless-offline

The order is important here. serverless-dynamodb-local has to be before serverless-offline.

That takes care of setting the project up. Next, we’ll add an index.js and start writing our service while we test locally.

Create the file and add this code to get started:


const serverless = require('serverless-http');
const bodyParser = require('body-parser');
const express = require('express');
const app = express();
const AWS = require('aws-sdk');
const uuid = require('node-uuid');

const { TODOS_TABLE, IS_OFFLINE } = process.env;

// Point the DocumentClient at the local DynamoDB when running offline
const dynamoDb = IS_OFFLINE === 'true' ?
  new AWS.DynamoDB.DocumentClient({
    region: 'localhost',
    endpoint: 'http://localhost:8000',
  }) :
  new AWS.DynamoDB.DocumentClient();

app.use(bodyParser.json({ strict: false }));

app.get('/todos', (req, res) => {
  const params = {
    TableName: TODOS_TABLE,
  };

  dynamoDb.scan(params, (error, result) => {
    if (error) {
      return res.status(400).json({ error: 'Error retrieving Todos' });
    }

    const { Items: todos } = result;

    res.json({ todos });
  });
});

module.exports.handler = serverless(app);

If you’ve used Express to build Node.js services before, then this should look familiar. The plugins to run locally expose the IS_OFFLINE flag to indicate it is running locally rather than on AWS.

Our code now has a single endpoint for getting all of the todos out of the DynamoDB table.

Run the code using this command: sls offline start --migrate.

Now if you visit http://localhost:3000/todos, you should get a response with an empty array: {"todos":[]}

Let’s add a way to add todos to the database.

app.post('/todos', (req, res) => {
  const { title, done = false } = req.body;

  const todoId = uuid.v4();

  const params = {
    TableName: TODOS_TABLE,
    Item: { todoId, title, done },
  };

  dynamoDb.put(params, (error) => {
    if (error) {
      console.log('Error creating Todo: ', error);
      return res.status(400).json({ error: 'Could not create Todo' });
    }

    res.json({ todoId, title, done });
  });
});
Now that the code is in place, we can use curl to add a new item.

curl -H "Content-Type: application/json" -X POST http://localhost:3000/todos -d '{"title": "Finish bug tickets"}'

This will create an ID for our new todo and give us back the newly created item:

{"todoId":"5c30e169-26e3-44de-9564-d23a403ddf1b","title":"Finish bug tickets","done":false}

If we go back to our first end-point, we will get back an array with our newly created item in it. Let’s add an end-point to get back a single todo using the ID.



app.get('/todos/:todoId', (req, res) => {
  const { todoId } = req.params;

  const params = {
    TableName: TODOS_TABLE,
    Key: { todoId },
  };

  dynamoDb.get(params, (error, result) => {
    if (error) {
      return res.status(400).json({ error: 'Error retrieving Todo' });
    }

    if (result.Item) {
      const { todoId, title, done } = result.Item;
      res.json({ todoId, title, done });
    } else {
      res.status(404).json({ error: `Todo with id: ${todoId} not found` });
    }
  });
});
Now if we visit this new end-point (http://localhost:3000/todos/5c30e169-26e3-44de-9564-d23a403ddf1b) using the ID from before, we should get this todo back as a result:

{"todoId":"5c30e169-26e3-44de-9564-d23a403ddf1b","title":"Finish bug tickets","done":false}

The next thing to do is give a way to update existing todos and mark them as done. Let’s add that PUT end-point now.


app.put('/todos', (req, res) => {
  const { todoId, title, done } = req.body;

  const params = {
    TableName: TODOS_TABLE,
    Key: { todoId },
    UpdateExpression: 'set #a = :title, #b = :done',
    ExpressionAttributeNames: { '#a': 'title', '#b': 'done' },
    ExpressionAttributeValues: { ':title': title, ':done': done },
  };

  dynamoDb.update(params, (error) => {
    if (error) {
      console.log(`Error updating Todo with id ${todoId}: `, error);
      return res.status(400).json({ error: 'Could not update Todo' });
    }

    res.json({ todoId, title, done });
  });
});
We can test this out with curl.

curl -H "Content-Type: application/json" -X PUT http://localhost:3000/todos -d '{"todoId": "5c30e169-26e3-44de-9564-d23a403ddf1b", "title": "Finish bug tickets", "done": true}'

And that takes care of the steps for create, read, and update. The last thing we need to add is the ability to delete.


app.delete('/todos/:todoId', (req, res) => {
  const { todoId } = req.params;

  const params = {
    TableName: TODOS_TABLE,
    Key: { todoId },
  };

  dynamoDb.delete(params, (error) => {
    if (error) {
      console.log(`Error deleting Todo with id ${todoId}`, error);
      return res.status(400).json({ error: 'Could not delete Todo' });
    }

    res.json({ success: true });
  });
});
And once again we test our new code with curl.

curl -H "Content-Type: application/json" -X DELETE http://localhost:3000/todos/5c30e169-26e3-44de-9564-d23a403ddf1b

That takes care of our code. Now that we have developed and tested everything locally, we are ready to deploy to AWS.

This can be done with a single command: sls deploy. It will take a little time, but at the end, Serverless will print the endpoint where your code is now running.

Matthews-Air:lambda-rest-api matthewbrown$ sls deploy
Serverless: Packaging service...
Serverless: Excluding development dependencies...
Serverless: Creating Stack...
Serverless: Checking Stack create progress...
Serverless: Stack create finished...
Serverless: Uploading CloudFormation file to S3...
Serverless: Uploading artifacts...
Serverless: Uploading service .zip file to S3 (28.58 MB)...
Serverless: Validating template...
Serverless: Updating Stack...
Serverless: Checking Stack update progress...
Serverless: Stack update finished...
Service Information
service: lambda-rest-api
stage: dev
region: us-east-1
stack: lambda-rest-api-dev
api keys:
  None
endpoints:
  ANY - https://<api-id>.execute-api.us-east-1.amazonaws.com/dev
  ANY - https://<api-id>.execute-api.us-east-1.amazonaws.com/dev/{proxy+}
functions:
  todo-app: lambda-rest-api-dev-todo-app

We have already tested all this code. But, just to confirm that everything is working on AWS, we can re-run all of our curl commands using this newly-created URL.

Also, for tracking down errors, you can tail the logs with this command: serverless logs -f todo-app -t

To clean up all the resources we've created in this tutorial, simply run: serverless remove.

Wrap Up

We’ve accomplished quite a bit here. We built and deployed a simple CRUD service to the cloud and it’s now publicly available. I hope this has demonstrated what a powerful tool serverless and the Serverless Framework are for developers. This is really just scratching the surface of what is possible.

Thanks for reading and check out the source code on Github.
