DynamoDB basic CRUD with Node.js on AWS Lambda

I think serverless is a big deal in web development today, so here I will explain how to build a simple Node.js AWS Lambda function with basic CRUD capabilities on a DynamoDB table (Amazon DynamoDB is a fully managed NoSQL database), plus an S3 bucket for file uploads. If you want to learn DynamoDB in depth, visit this great guide on DynamoDB.
First, we need to log in to our AWS Console and select DynamoDB from the list of services. Next, we create a new DynamoDB table. The table name will be “myposts” and the primary key will be userId; we will also set a sort key, which in our example is called postId.
Notice: Each DynamoDB table has a primary key, which cannot be changed once set. DynamoDB supports two different kinds of primary keys:
- Partition key
- Partition key and sort key (composite)
We are going to use the composite primary key, which gives us additional flexibility when querying the data. For example, if you provide only the value for userId, DynamoDB would retrieve all of the posts by that user. Or you could provide a value for userId and a value for postId to retrieve a particular post.
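To make the difference concrete, here is a minimal sketch (not part of the app code we build below) of the two access patterns using the AWS SDK’s DocumentClient; the table name matches ours, but the ids and the wrapper function are placeholders, and it assumes AWS credentials and region are already configured.

import AWS from "aws-sdk";

const dynamoDb = new AWS.DynamoDB.DocumentClient();

async function keyExamples() {
  // Partition key only: query returns every post belonging to this user
  const allPostsForUser = await dynamoDb
    .query({
      TableName: "myposts",
      KeyConditionExpression: "userId = :userId",
      ExpressionAttributeValues: { ":userId": "some-user-id" }
    })
    .promise();

  // Partition key + sort key: get returns one particular post
  const singlePost = await dynamoDb
    .get({
      TableName: "myposts",
      Key: { userId: "some-user-id", postId: "some-post-id" }
    })
    .promise();

  return { allPostsForUser, singlePost };
}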
To further your understanding of how indexes work in DynamoDB, you can read more here: DynamoDB Core Components
Now click Create and your table “myposts” is created.
Next we will create an S3 bucket for file uploads. We need to handle file uploads because each post in our example can have an uploaded file as an attachment. Creating a bucket from the AWS console is quick and simple; just make sure you pick a unique bucket name (in my case I picked chrisvzpostsexample) and the region where the files will be stored. Step through the next screens, leaving the defaults by clicking Next, and then click Create Bucket on the last step. After the bucket is created we need to enable CORS (by default, S3 does not allow its resources to be accessed from a different domain).
In the bucket, select the Permissions tab, then select CORS configuration. Add the following to the CORS configuration and click Save. In this example I allow all domains to access my bucket (<AllowedOrigin>*</AllowedOrigin>), but you can restrict this to your domain or a list of domains.
<CORSConfiguration>
  <CORSRule>
    <AllowedOrigin>*</AllowedOrigin>
    <AllowedMethod>GET</AllowedMethod>
    <AllowedMethod>PUT</AllowedMethod>
    <AllowedMethod>POST</AllowedMethod>
    <AllowedMethod>HEAD</AllowedMethod>
    <AllowedMethod>DELETE</AllowedMethod>
    <MaxAgeSeconds>3000</MaxAgeSeconds>
    <AllowedHeader>*</AllowedHeader>
  </CORSRule>
</CORSConfiguration>
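With CORS enabled, a browser client will later be able to upload a post’s attachment directly to this bucket. As a rough sketch of what that upload could look like with the AWS JavaScript SDK (the bucket name is the one created above, while the key format and the uploadAttachment helper are my own assumptions; credential setup, for example via Cognito, is omitted):

import AWS from "aws-sdk";

const s3 = new AWS.S3();

// Hypothetical helper: uploads a file and returns the S3 object key,
// which could then be stored as the post's "attachment" attribute.
export async function uploadAttachment(file, fileName) {
  const result = await s3
    .upload({
      Bucket: "chrisvzpostsexample", // assumed: the bucket created above
      Key: `${Date.now()}-${fileName}`, // assumed key format
      Body: file
    })
    .promise();

  return result.Key;
}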
Now that our S3 bucket is ready, let’s get set up to handle user authentication.
Our myposts app needs to handle user accounts and authentication in a secure and reliable way. We don’t want anonymous users to create, read, update or delete posts. To do this we are going to use Amazon Cognito. We need to create a new user pool inside Amazon Cognito. The pool name is myposts-user-pool. In the username attributes step, select Email address or phone numbers and Allow email addresses. This tells the Cognito user pool that we want our users to be able to sign up and log in with their email as their username. Then hit Create Pool at the bottom.
When the user pool is created we need to add an app client to it. In the left sidebar menu select Add an app client.
Enter the app client name (in my case, myposts-app), un-select Generate client secret, select Enable sign-in API for server-based authentication, then select Create app client.
- Generate client secret: user pool app clients with a client secret are not supported by the JavaScript SDK, so we need to un-select this option.
- Enable sign-in API for server-based authentication: required by the AWS CLI when managing pool users via the command line interface. We will be creating a test user through the CLI shortly.
Save everything with Create app client. Finally, select Domain name from the left sidebar menu, enter a unique domain name, and select Save changes. In my case I use chrisvzmyposts-app.
Next, let’s check that we can create a user inside the user pool. I will use the AWS CLI to sign up a new user and then confirm that user. Run the following command in your terminal, replacing YOUR_COGNITO_REGION and YOUR_COGNITO_APP_CLIENT_ID with the values from your AWS account. You can also change the username and password to those of a real user if you like.
$ aws cognito-idp sign-up \
  --region YOUR_COGNITO_REGION \
  --client-id YOUR_COGNITO_APP_CLIENT_ID \
  --username admin@mypostsexample.com \
  --password Passw0rd!
Now, if everything went OK, the account still needs to be confirmed:
$ aws cognito-idp admin-confirm-sign-up \
  --region YOUR_COGNITO_REGION \
  --user-pool-id YOUR_COGNITO_USER_POOL_ID \
  --username admin@mypostsexample.com
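Optionally, you can double-check the result before moving on. For example (an extra step of my own, not required for the rest of the tutorial), fetching the user should show a UserStatus of CONFIRMED:

$ aws cognito-idp admin-get-user \
  --region YOUR_COGNITO_REGION \
  --user-pool-id YOUR_COGNITO_USER_POOL_ID \
  --username admin@mypostsexample.com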
Now our test user is ready. Next, let’s set up the Serverless Framework to create our backend APIs. In this example we are going to use AWS Lambda and Amazon API Gateway for the backend, and the Serverless Framework will help us wire it together. (The Serverless Framework enables developers to deploy backend applications as independent functions that are deployed to AWS Lambda. It also configures AWS Lambda to run your code in response to HTTP requests via Amazon API Gateway.)
If you don’t have it yet, you need to set up the Serverless Framework on your local development environment, so run this in your terminal:
$ npm install serverless -g
In your working directory, create a project using a Node.js starter. We will be using the starter project from https://github.com/AnomalyInnovations
$ serverless install --url https://github.com/AnomalyInnovations/serverless-nodejs-starter --name myposts-app-api
Go into the directory for our backend API project.
$ cd myposts-app-api
The directory should now contain a few files, including handler.js and serverless.yml.
- handler.js contains the actual code for the services/functions that will be deployed to AWS Lambda.
- serverless.yml contains the configuration describing which AWS services Serverless will provision and how to configure them.
We also have a tests/ directory where we can add our unit tests. Next, run these commands in the project root:
$ npm install
$ npm install aws-sdk --save-dev
$ npm install uuid --save
- aws-sdk allows us to talk to the various AWS services.
- uuid generates unique IDs, which we need for the items we store in DynamoDB.
Let’s examine what I put in our serverless.yml file. In my example the content of this file is as follows; the comments in the file explain the individual parts.
# NOTE: update this with your service name
service: myposts-app-api

# Use the serverless-webpack plugin to transpile ES6
plugins:
  - serverless-webpack
  - serverless-offline

# serverless-webpack configuration
# Enable auto-packing of external modules
custom:
  webpack:
    webpackConfig: ./webpack.config.js
    includeModules: true

provider:
  name: aws
  runtime: nodejs8.10
  stage: prod
  region: eu-central-1

  # To load environment variables externally
  # rename env.example to env.yml and uncomment
  # the following line. Also, make sure to not
  # commit your env.yml.
  #
  #environment: ${file(env.yml):${self:provider.stage}}

  # 'iamRoleStatements' defines the permission policy for the Lambda function.
  # In this case Lambda functions are granted with permissions to access DynamoDB.
  iamRoleStatements:
    - Effect: Allow
      Action:
        - dynamodb:DescribeTable
        - dynamodb:Query
        - dynamodb:Scan
        - dynamodb:GetItem
        - dynamodb:PutItem
        - dynamodb:UpdateItem
        - dynamodb:DeleteItem
      Resource: "arn:aws:dynamodb:eu-central-1:*:*"

functions:
  # Defines an HTTP API endpoint that calls the main function in create.js
  # - path: url path is /myposts
  # - method: POST request
  # - cors: enabled CORS (Cross-Origin Resource Sharing) for browser cross
  #   domain api call
  # - authorizer: authenticate using the AWS IAM role
  create:
    handler: create.main
    events:
      - http:
          path: myposts
          method: post
          cors: true
          authorizer: aws_iam

  get:
    # Defines an HTTP API endpoint that calls the main function in get.js
    # - path: url path is /myposts/{id}
    # - method: GET request
    handler: get.main
    events:
      - http:
          path: myposts/{id}
          method: get
          cors: true
          authorizer: aws_iam

  list:
    # Defines an HTTP API endpoint that calls the main function in list.js
    # - path: url path is /myposts
    # - method: GET request
    handler: list.main
    events:
      - http:
          path: myposts
          method: get
          cors: true
          authorizer: aws_iam

  update:
    # Defines an HTTP API endpoint that calls the main function in update.js
    # - path: url path is /myposts/{id}
    # - method: PUT request
    handler: update.main
    events:
      - http:
          path: myposts/{id}
          method: put
          cors: true
          authorizer: aws_iam

  delete:
    # Defines an HTTP API endpoint that calls the main function in delete.js
    # - path: url path is /myposts/{id}
    # - method: DELETE request
    handler: delete.main
    events:
      - http:
          path: myposts/{id}
          method: delete
          cors: true
          authorizer: aws_iam
The service name is myposts-app-api. The Serverless Framework creates your stack on AWS using this as the name. You’ll notice the serverless-webpack plugin that is included. I also have a webpack.config.js that configures the plugin. This is the code inside webpack.config.js:
const slsw = require("serverless-webpack");
const nodeExternals = require("webpack-node-externals");

module.exports = {
  entry: slsw.lib.entries,
  target: "node",
  // Generate sourcemaps for proper error messages
  devtool: "source-map",
  // Since 'aws-sdk' is not compatible with webpack,
  // we exclude all node dependencies
  externals: [nodeExternals()],
  mode: slsw.lib.webpack.isLocal ? "development" : "production",
  optimization: {
    // We do not want to minimize our code.
    minimize: false
  },
  performance: {
    // Turn off size warnings for entry points
    hints: false
  },
  // Run babel on all .js files and skip those in node_modules
  module: {
    rules: [
      {
        test: /\.js$/,
        loader: "babel-loader",
        include: __dirname,
        exclude: /node_modules/
      }
    ]
  }
};
The main part of this config is the entry attribute, which is generated automatically using slsw.lib.entries from the serverless-webpack plugin. This automatically picks up all my handler functions and packages them. I also run babel-loader on each of them to transpile my code. One other thing to note here is that I am using nodeExternals because I do not want Webpack to bundle my node modules, in particular the aws-sdk module, since it is not compatible with Webpack.
Finally, our .babelrc file looks like this:
{ "plugins": ["source-map-support", "transform-runtime"], "presets": [ ["env", { "node": "8.10" }], "stage-3" ] }
Here we are telling Babel to transpile our code to target Node v8.10.
Now let’s move on to creating our API. In the project root, create a libs/ directory:
$ mkdir libs
$ cd libs
And then create a libs/response-lib.js file:
export function success(body) {
  return buildResponse(200, body);
}

export function failure(body) {
  return buildResponse(500, body);
}

function buildResponse(statusCode, body) {
  return {
    statusCode: statusCode,
    headers: {
      "Access-Control-Allow-Origin": "*",
      "Access-Control-Allow-Credentials": true
    },
    body: JSON.stringify(body)
  };
}
This will manage building the response objects for both success and failure cases with the proper HTTP status code and headers.
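For example, calling success with a plain object returns the response shape that API Gateway’s Lambda proxy integration expects (a quick illustration, assuming the file above):

import { success } from "./libs/response-lib";

const response = success({ status: true });
// response now looks like:
// {
//   statusCode: 200,
//   headers: {
//     "Access-Control-Allow-Origin": "*",
//     "Access-Control-Allow-Credentials": true
//   },
//   body: '{"status":true}'
// }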
Also, inside libs/, create a dynamodb-lib.js file:
import AWS from "aws-sdk"; export function call(action, params) { const dynamoDb = new AWS.DynamoDB.DocumentClient(); return dynamoDb[action](params).promise(); }
Here we are using the promise form of the DynamoDB methods. Promises are a way of managing asynchronous code that serves as an alternative to the standard callback function syntax.
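To illustrate the difference, here is a small side-by-side sketch of the same get call written with a callback and with the .promise() form that our helper uses; the key values are placeholders:

import AWS from "aws-sdk";

const dynamoDb = new AWS.DynamoDB.DocumentClient();
const params = {
  TableName: "myposts",
  Key: { userId: "some-user-id", postId: "some-post-id" }
};

// Callback style
dynamoDb.get(params, (err, data) => {
  if (err) {
    console.error(err);
    return;
  }
  console.log(data.Item);
});

// Promise style (what dynamodb-lib.js returns), which also works with async/await
dynamoDb
  .get(params)
  .promise()
  .then(data => console.log(data.Item))
  .catch(err => console.error(err));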
The first CRUD function I create is create.js, in the root of the app.
import uuid from "uuid"; import * as dynamoDbLib from "./libs/dynamodb-lib"; import { success, failure } from "./libs/response-lib"; export async function main(event, context) { const data = JSON.parse(event.body); const params = { TableName: "myposts", Item: { userId: event.requestContext.identity.cognitoIdentityId, noteId: uuid.v1(), content: data.content, attachment: data.attachment, createdAt: Date.now() } }; try { await dynamoDbLib.call("put", params); return success(params.Item); } catch (e) { return failure({ status: false }); } }
We use the async/await pattern here. This allows us to return once the request is finished, instead of using callback functions.
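For comparison, here is a rough sketch of how the same create handler would look without async/await, passing its result to Lambda’s callback argument instead of returning it (for illustration only, not part of the project):

import uuid from "uuid";
import * as dynamoDbLib from "./libs/dynamodb-lib";
import { success, failure } from "./libs/response-lib";

// Callback-style version of create.js, shown only for comparison
export function main(event, context, callback) {
  const data = JSON.parse(event.body);
  const params = {
    TableName: "myposts",
    Item: {
      userId: event.requestContext.identity.cognitoIdentityId,
      postId: uuid.v1(),
      content: data.content,
      attachment: data.attachment,
      createdAt: Date.now()
    }
  };

  dynamoDbLib
    .call("put", params)
    .then(() => callback(null, success(params.Item)))
    .catch(() => callback(null, failure({ status: false })));
}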
Now we have to add four more files: get.js, delete.js, update.js and list.js.
get.js – returns an individual post for a user
import * as dynamoDbLib from "./libs/dynamodb-lib";
import { success, failure } from "./libs/response-lib";

export async function main(event, context) {
  const params = {
    TableName: "myposts",
    // 'Key' defines the partition key and sort key of the item to be retrieved
    // - 'userId': Identity Pool identity id of the authenticated user
    // - 'postId': path parameter
    Key: {
      userId: event.requestContext.identity.cognitoIdentityId,
      postId: event.pathParameters.id
    }
  };

  try {
    const result = await dynamoDbLib.call("get", params);
    if (result.Item) {
      // Return the retrieved item
      return success(result.Item);
    } else {
      return failure({ status: false, error: "Post not found." });
    }
  } catch (e) {
    console.log(e);
    return failure({ status: false });
  }
}
delete.js – deletes a post
import * as dynamoDbLib from "./libs/dynamodb-lib";
import { success, failure } from "./libs/response-lib";

export async function main(event, context) {
  const params = {
    TableName: "myposts",
    // 'Key' defines the partition key and sort key of the item to be removed
    // - 'userId': Identity Pool identity id of the authenticated user
    // - 'postId': path parameter
    Key: {
      userId: event.requestContext.identity.cognitoIdentityId,
      postId: event.pathParameters.id
    }
  };

  try {
    await dynamoDbLib.call("delete", params);
    return success({ status: true });
  } catch (e) {
    return failure({ status: false });
  }
}
list.js – lists all posts for a user
import * as dynamoDbLib from "./libs/dynamodb-lib";
import { success, failure } from "./libs/response-lib";

export async function main(event, context) {
  const params = {
    TableName: "myposts",
    // 'KeyConditionExpression' defines the condition for the query
    // - 'userId = :userId': only return items with matching 'userId'
    //   partition key
    // 'ExpressionAttributeValues' defines the value in the condition
    // - ':userId': defines 'userId' to be Identity Pool identity id
    //   of the authenticated user
    KeyConditionExpression: "userId = :userId",
    ExpressionAttributeValues: {
      ":userId": event.requestContext.identity.cognitoIdentityId
    }
  };

  try {
    const result = await dynamoDbLib.call("query", params);
    // Return the matching list of items in response body
    return success(result.Items);
  } catch (e) {
    return failure({ status: false });
  }
}
update.js – updates a post
import * as dynamoDbLib from "./libs/dynamodb-lib";
import { success, failure } from "./libs/response-lib";

export async function main(event, context) {
  const data = JSON.parse(event.body);
  const params = {
    TableName: "myposts",
    // 'Key' defines the partition key and sort key of the item to be updated
    // - 'userId': Identity Pool identity id of the authenticated user
    // - 'postId': path parameter
    Key: {
      userId: event.requestContext.identity.cognitoIdentityId,
      postId: event.pathParameters.id
    },
    // 'UpdateExpression' defines the attributes to be updated
    // 'ExpressionAttributeValues' defines the value in the update expression
    UpdateExpression: "SET content = :content, attachment = :attachment",
    ExpressionAttributeValues: {
      ":attachment": data.attachment || null,
      ":content": data.content || null
    },
    // 'ReturnValues' specifies if and how to return the item's attributes,
    // where ALL_NEW returns all attributes of the item after the update; you
    // can inspect 'result' below to see how it works with different settings
    ReturnValues: "ALL_NEW"
  };

  try {
    const result = await dynamoDbLib.call("update", params);
    return success({ status: true });
  } catch (e) {
    return failure({ status: false });
  }
}
And now our simple app is finished and we can deploy it to AWS. Run:
$ serverless deploy
If you have multiple profiles for your AWS SDK credentials, you will need to explicitly pick one. Use the following command instead:
$ serverless deploy --aws-profile myProfile
Where myProfile is the name of the AWS profile you want to use.
After a few minutes you will see something similar to this in your terminal:
Kristijans-iMac:myposts-app-api kristijanklepac$ serverless deploy
Serverless: Bundling with Webpack...
Time: 1977ms
Built at: 11/02/2018 1:05:08 PM
        Asset      Size  Chunks             Chunk Names
    create.js  7.88 KiB       0  [emitted]  create
       get.js  8.16 KiB       1  [emitted]  get
      list.js  8.04 KiB       2  [emitted]  list
    update.js  8.58 KiB       3  [emitted]  update
    delete.js  7.85 KiB       4  [emitted]  delete
create.js.map  7.04 KiB       0  [emitted]  create
   get.js.map   7.2 KiB       1  [emitted]  get
  list.js.map  7.12 KiB       2  [emitted]  list
update.js.map  7.83 KiB       3  [emitted]  update
delete.js.map  6.88 KiB       4  [emitted]  delete
Entrypoint create = create.js create.js.map
Entrypoint get = get.js get.js.map
Entrypoint list = list.js list.js.map
Entrypoint update = update.js update.js.map
Entrypoint delete = delete.js delete.js.map
[0] external "source-map-support/register" 42 bytes {0} {1} {2} {3} {4} [built]
[1] external "babel-runtime/regenerator" 42 bytes {0} {1} {2} {3} {4} [built]
[2] external "babel-runtime/helpers/asyncToGenerator" 42 bytes {0} {1} {2} {3} {4} [built]
[3] ./libs/dynamodb-lib.js 468 bytes {0} {1} {2} {3} {4} [built]
[4] external "aws-sdk" 42 bytes {0} {1} {2} {3} {4} [built]
[5] ./libs/response-lib.js 762 bytes {0} {1} {2} {3} {4} [built]
[6] external "babel-runtime/core-js/json/stringify" 42 bytes {0} {1} {2} {3} {4} [built]
[7] ./create.js 2.32 KiB {0} [built]
[8] external "uuid" 42 bytes {0} [built]
[9] ./get.js 2.68 KiB {1} [built]
[10] ./list.js 2.55 KiB {2} [built]
[11] ./update.js 3.08 KiB {3} [built]
[12] ./delete.js 2.34 KiB {4} [built]
Serverless: Package lock found - Using locked versions
Serverless: Packing external modules: source-map-support@^0.4.18, babel-runtime@^6.26.0, uuid@^3.3.2
Serverless: Packaging service...
Serverless: Creating Stack...
Serverless: Checking Stack create progress...
.....
Serverless: Stack create finished...
Serverless: Uploading CloudFormation file to S3...
Serverless: Uploading artifacts...
Serverless: Uploading service .zip file to S3 (1.41 MB)...
Serverless: Validating template...
Serverless: Updating Stack...
Serverless: Checking Stack update progress...
...................................................................................................
Serverless: Stack update finished...
Service Information
service: myposts-app-api
stage: prod
region: eu-central-1
stack: myposts-app-api-prod
api keys:
  None
endpoints:
  POST - https://********.execute-api.eu-central-1.amazonaws.com/prod/myposts
  GET - https://********.execute-api.eu-central-1.amazonaws.com/prod/myposts/{id}
  GET - https://********.execute-api.eu-central-1.amazonaws.com/prod/myposts
  PUT - https://********.execute-api.eu-central-1.amazonaws.com/prod/myposts/{id}
  DELETE - https://********.execute-api.eu-central-1.amazonaws.com/prod/myposts/{id}
functions:
  create: myposts-app-api-prod-create
  get: myposts-app-api-prod-get
  list: myposts-app-api-prod-list
  update: myposts-app-api-prod-update
  delete: myposts-app-api-prod-delete
Kristijans-iMac:myposts-app-api kristijanklepac$
After a successful deploy we have our POST, GET, PUT and DELETE API endpoints available, and we can perform CRUD operations on our table in the DynamoDB database.
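If you want to smoke-test a function without going through API Gateway, one option (my own addition, not covered above) is serverless invoke local with a mock event. Assuming you create a file such as mocks/create-event.json (a hypothetical path) containing a fake Cognito identity id, note that this still writes to the real DynamoDB table using your local AWS credentials.

{
  "body": "{\"content\":\"hello world\",\"attachment\":\"hello.jpg\"}",
  "requestContext": {
    "identity": {
      "cognitoIdentityId": "USER-SUB-1234"
    }
  }
}

$ serverless invoke local --function create --path mocks/create-event.json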
Thanks for reading…
Tags: Node.js, AWS, AWS Lambda, DynamoDB