Project Node.js Lambda & DynamoDB
Purpose
The purpose of this project is to show how your AWS Lambda (Node.js) code can communicate with a DynamoDB table, including how to run multiple DynamoDB queries on an event-driven or synchronized basis. This page also presents a series of form-validation functions that work in Lambda Node.js, along with a function to generate a random representation of a UUID; in SQL-speak, a UUID is typically used as the unique primary-key value for a row of data, since DynamoDB has no facility with which to generate a UUID itself.


(Screenshot)
  1. If you have not already established a role and policy for your Lambda function to operate under, create a policy and a role under the IAM interface of AWS. In this example we have a fairly common policy that allows Lambda to perform some functions associated with DynamoDB. The logs entries allow the Lambda function to write log entries to CloudWatch (an area you can go to examine output, errors, and so on).

(Screenshot)
  1. When you create a new Lambda function you will have the opportunity to specify the role you just created, which the Lambda function will use for its permissions.

(Screenshot)
  1. This shows a simple block of Lambda Node.js code demonstrating how to make a call to DynamoDB and wait for DynamoDB to respond before exiting (in Node.js everything is asynchronous, so getting "synchronous" behavior is challenging when starting out). The key, rather than having any DynamoDB function outside your main function, is to embed it INSIDE your main function as shown (there is a way to recursively call functions outside the main function, which I'll show later, but if you are just starting out, nesting may be the easiest). This code block also shows, with comments, some of the wrong ways of trying to "force" Node.js to "wait" for a response. A minimal sketch of the nesting approach appears after this list.
  2. You can take a closer look at the code block as a simple text file here
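Here is a minimal sketch of the nesting idea (the table name "MyTable" and key "ID" are placeholders for this sketch, not part of the linked code):
var AWS = require("aws-sdk");
var docClient = new AWS.DynamoDB.DocumentClient();
exports.handler = function(event, context) {
	var params = {
		TableName: "MyTable",
		Key: { ID: 1 }
	};
	/* The DynamoDB call is nested INSIDE the handler; the handler only */
	/* completes when the callback fires, giving "synchronous" behavior. */
	docClient.get(params, function(err, data) {
		if (err) {
			context.done(null, { "Error": JSON.stringify(err, null, 2) });
		}
		else {
			context.done(null, { "Item": data.Item });
		}
	});
};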

(Screenshot)
  1. As you can see, there is not too much to making a "synchronous" DynamoDB call in your Lambda Node.js code.
  2. This somewhat larger example shows a way to "nest" multiple DynamoDB calls in your main function when you need some sort of conditional logic flow (I'll show later how to lighten up the nesting by recursively calling external functions).
  3. IMPORTANT NOTE: With the "scan" command, the maximum volume of data crawled per call is limited to a rather paltry 1MB, whether or not there are any matches to the scan and whether or not the whole content of the table has been traversed. That means you have to make multiple calls if your table is larger than 1MB (strange to have such a HUGE draconian limitation on a database, especially since we are no longer in the 1990s). So you'll need to plan accordingly: when the table is larger than 1MB you'll have to track additional parameters and make repeated scan calls to DynamoDB to traverse the table.
  4. Speaking of the 1MB limit, that reminds me of the old claim that users would never need more than 640KB. So grab your 80's sneakers when you need to work with scan.
  5. NOTE: If you are looking to avoid deep nesting, a more recent alternative is the "InvokeAsync" function, which is now simply "invoke". In that case you would instantiate with "var lambda = new AWS.Lambda({ apiVersion: '2015-03-31' });" and launch your function with a call such as "lambda.invoke(params, function...". A hedged sketch follows below.
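A minimal sketch of using "invoke" (the function name and payload are placeholders; note that your execution role also needs permission to invoke the target function):
var AWS = require("aws-sdk");
var lambda = new AWS.Lambda({ apiVersion: '2015-03-31' });
var params = {
	FunctionName: "myHelperFunction", /* placeholder name */
	InvocationType: "Event", /* fire-and-forget; use "RequestResponse" to wait for a reply */
	Payload: JSON.stringify({ myparameter: 1 })
};
lambda.invoke(params, function(err, data) {
	if (err) { console.log(JSON.stringify(err, null, 2)); }
	else { console.log(data); }
});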
 
OTHER USEFUL FUNCTIONS
  1. At some point incoming data may have to be validated before further action is taken; you may also need a function to create a timestamp in MySQL format, or to generate a random representation of a UUID (commonly used as a unique primary key in DynamoDB, since DynamoDB has no ability to do that simple task itself).
  2. Here is a set of example functions for validation, timestamping, and UUID creation. It is also provided as a simple text file. A condensed sketch appears below.
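A condensed sketch along those lines (illustrative only, not the exact contents of the linked file):
/* Validation: check that a value is numeric */
exports.isNumeric = function(value) {
	return !isNaN(parseFloat(value)) && isFinite(value);
};
/* Timestamp in MySQL format: "YYYY-MM-DD HH:MM:SS" */
exports.mysqlTimestamp = function() {
	function pad(n) { return (n < 10 ? "0" : "") + n; }
	var d = new Date();
	return d.getUTCFullYear() + "-" + pad(d.getUTCMonth() + 1) + "-" + pad(d.getUTCDate()) +
		" " + pad(d.getUTCHours()) + ":" + pad(d.getUTCMinutes()) + ":" + pad(d.getUTCSeconds());
};
/* Random representation of a UUID (random hex, not a true RFC-4122 UUID; */
/* good enough as a unique DynamoDB primary key in most cases) */
exports.makeUUID = function() {
	return "xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx".replace(/[xy]/g, function(c) {
		var r = Math.random() * 16 | 0;
		var v = (c === "x") ? r : (r & 0x3 | 0x8);
		return v.toString(16);
	});
};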


Notable Typeof/Eval Behavior with "getItem" of AWS.DynamoDB.DocumentClient()
 
  1. Let's say you are looking for a DynamoDB row of data with a unique key value of 1, and that row of data does not exist (perhaps it was deleted). The query request with "getItem" will not fail, which is normal behavior for the request itself. But if you attempt to request a specific column (which does not exist) you will be greeted by the message "cannot read property of undefined". That response is to be expected. However, where it gets interesting is when you realize you cannot test the column for undefined, because the attempt to test via "typeof" results in the same undefined error itself: "typeof" only protects the final property lookup, not the missing parent object.
  2. For example, the following statement would generate an error by merely testing for undefined via "typeof":
    if (typeof data.Item.myid.N === 'undefined') { ...error before getting here... }
    The following does not return an error but does not catch the problem at hand:
    if (typeof data.Item.myid === 'undefined') { ...we never get here but does not error out... }
  3. How do you test whether something exists when the mere act of testing (using long-standing JavaScript typeof, or evaluating for null) results in an error?
    The ONLY solution seems to be that of going OLD SCHOOL and using a try/catch block. Impressive, isn't it? Here's the solution:
    try { data = data.Item.myid.N; } catch(error) { data = 0; }


Convert RDS/AuroraDB Dates From UTC to LocalTime
 
  1. Toward the end of 2016 I discovered a rather interesting issue with AWS Lambda/Node.js. This issue pertained to pulling a date/timestamp OUT of an RDS table, where it was stored as local time (as in "2015-03-18T11:02:46.000Z"), and having Lambda/Node.js automatically convert the local time to GMT/UTC time (as in "Wed Mar 18 2015 11:02:46 GMT+0000 (UTC)"). I was impressed with how Lambda/Node.js knew what I wanted better than I did *sarcasm*. What is more interesting is that, depending on which AWS Lambda/Node release you are using, sometimes the date auto-conversion does not take place.
  2. What was unfortunate about that "auto-conversion" was that it BROKE other functions because, wouldn't you know it, the date/timestamp was no longer in the standard MySQL format they were expecting. After some failed searches for a converter back to local time (keep in mind, using AWS Lambda/Node.js means you do NOT have the ability to change assorted command-line options, like auto-converting dates), it was evident that a converter for the auto-converted date would have to be written.
  3. Here is a function to convert the auto-converted GMT/UTC timestamp back to RDS/AuroraDB (MySQL), should you find yourself in such a situation:
    /* Conversion: Convert GMT/UTC to MySQL Localtime */
    exports.convertUTCtoLocal = function(convdate) {
    	/* Example conversion: "Wed Mar 18 2015 11:02:46 GMT+0000 (UTC)" TO "2015-03-18T11:02:46.000Z" */
    	var result = "";
    	var tmp_rawdate = convdate.toString();
    	if (tmp_rawdate.indexOf("UTC") == -1) {
    		/* The date does not appear to be in GMT/UTC time, so return it unchanged */
    		result = tmp_rawdate;
    	}
    	else {
    		/* Convert GMT/UTC to Localtime */
    		var dtchunks = tmp_rawdate.split(" ");
    		var dt_year = dtchunks[3];
    		/* Pad the day to two digits without double-padding an already-padded value */
    		var dt_day = ("0" + dtchunks[2]).slice(-2);
    		var dt_stamp = dtchunks[4].toString();
    		/* Map the month abbreviation to its two-digit number */
    		var months = { jan: "01", feb: "02", mar: "03", apr: "04", may: "05", jun: "06",
    		               jul: "07", aug: "08", sep: "09", oct: "10", nov: "11", dec: "12" };
    		var dt_month = months[dtchunks[1].toLowerCase()];
    		result = dt_year + "-" + dt_month + "-" + dt_day + "T" + dt_stamp + ".000Z";
    	}
    	return result;
    };
    


*UPDATE* Dealing With Imposed UI Limitations as of 7/2016

(Screenshot)
  1. Through the "grapevine" I heard that Amazon AWS / Lambda had apparently "gimped" the Lambda UI. As you may be aware, the Lambda UI allows you to edit code inline, a pretty handy feature when you are not doing anything really complex.
  2. Apparently (and I managed to verify it) someone decided that the maximum number of characters that you could "edit" or "add" in the editor was approximately 51,000 characters. I'd guess around the equivalent size of a blog post.
  3. In keeping with the tack of whoever imposed that new limitation, I went looking for updated documentation that would let clients know of the new limit but could not find any as of 7/12/2016. In fact, in fiddling around with their inline editor there was no visual cue to tell someone they were reaching that cheesy character limit. If you are going to treat programming code as if it were a blog post, wouldn't you want to, at least, display a remaining-character counter rather than surprise the client when they try to submit the code they were working on? Evidently that's a tough decision.
  4. Perhaps Amazon AWS had adopted a new design and release paradigm where it is their mission to surprise and place limitations on paying clients in the production environment?
  5. That kinda makes me leery of what they may do in the future with other products / services they offer. For example, if you have a bunch of inline code that happens to exceed approximately 51,000 characters in production and you need to edit something, you are effectively "effed" unless you want to go the packaging route, which makes the inline editor a useless appendage for Lambda. Who knows, maybe the inline editor is too complex for AWS Lambda to support, so throwing out fresh limitations will aggro clients sufficiently that they stop using the "feature" or find a different cloud solution... and, once that happens, it no longer needs to be supported. Brilliant? I can't say; I've always been more of the camp that works for clients, not the other way around.
  6. Something else you may want to plan for: when you have code that exceeds approximately 51,000 characters (which you've uploaded instead of working "inline", because it won't save "inline") and you need to change the operational characteristics of the Lambda function (such as allocating more memory to it), the UI will NOT allow you to save that configuration change until you delete all of the code shown in the "inline" window (perhaps replacing it with a colorful comment). Once you have successfully updated the function, you can then turn around and re-upload the code you originally had in place to make the Lambda function "function" again.


Ghosting with DynamoDB and BatchWriteItem
INTERESTING SOLUTION
  1. As of this writing DynamoDB has no support for a "BatchUpdateItem" (to update multiple items in a single request), yet it seems like quite a useful function to have at your disposal, alongside "BatchWriteItem", which will delete or insert up to 25 items in a single request. Without using 3rd-party libraries, and sticking to existing DynamoDB functions along with standard Node.js, what is a solution to having your own "BatchUpdateItem"? That is what I refer to as "Ghosting with DynamoDB and BatchWriteItem", and you can view a working example here; a rough sketch of the core idea follows.
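The linked example has the full details; as a rough sketch, the core idea is that BatchWriteItem's PutRequest replaces whole items, so you can read the items, modify them in memory, and write them all back in a single request (table and attribute names here are placeholders):
var AWS = require("aws-sdk");
var docClient = new AWS.DynamoDB.DocumentClient();
/* "items" is an array of full item objects already read and modified in memory */
function batchUpdate(items, callback) {
	var requests = [];
	for (var x = 0; x < items.length; x++) {
		requests[requests.length] = { PutRequest: { Item: items[x] } };
	}
	/* A maximum of 25 requests are allowed per BatchWriteItem call */
	var params = { RequestItems: { "MyTable": requests } };
	docClient.batchWrite(params, callback);
}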


Delete Multiple DynamoDB Items at the Same Time

(Screenshot)
  1. Another good thing to know is how to delete multiple items from a DynamoDB table at once. BatchWriteItem will delete up to 25 items per call. The details of doing a batch item deletion are shown in the screenshot to the left.
  2. This example shows how to do a batch delete in a more dynamic fashion, which is good to know when you don't know in advance how many items you may be deleting in a call, or when you may be deleting items out of multiple tables. A minimal sketch appears below.
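A minimal sketch of a batch delete (the table "MyTable" keyed on numeric "ID" is a placeholder; the downloadable example builds the request list dynamically):
var AWS = require("aws-sdk");
var docClient = new AWS.DynamoDB.DocumentClient();
var params = {
	RequestItems: {
		"MyTable": [
			{ DeleteRequest: { Key: { ID: 101 } } },
			{ DeleteRequest: { Key: { ID: 102 } } },
			{ DeleteRequest: { Key: { ID: 103 } } }
		]
	}
};
/* Inside your handler, so context is available */
docClient.batchWrite(params, function(err, data) {
	if (err) { context.done(null, { "Error": JSON.stringify(err, null, 2) }); }
	else { context.done(null, { "Success": "Items deleted." }); }
});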


Insert Multiple DynamoDB Items at the Same Time

(Screenshot)
  1. Inserting multiple items into a DynamoDB table at once can be accomplished with BatchWriteItem (like the delete action). While you can insert multiple items into a single table or into multiple tables, there are some limitations, as there are with the delete option. See the example to the left, the code here, and the sketch below.
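A minimal sketch of a batch insert (names are placeholders again; each PutRequest item must carry the table's full key):
var AWS = require("aws-sdk");
var docClient = new AWS.DynamoDB.DocumentClient();
var params = {
	RequestItems: {
		"MyTable": [
			{ PutRequest: { Item: { ID: 201, AddedBy: "pancakes", Info: "First item." } } },
			{ PutRequest: { Item: { ID: 202, AddedBy: "pancakes", Info: "Second item." } } }
		]
	}
};
docClient.batchWrite(params, function(err, data) {
	if (err) { context.done(null, { "Error": JSON.stringify(err, null, 2) }); }
	else { context.done(null, { "Success": "Items inserted." }); }
});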


Inserting Single Data Items into DynamoDB
 
  1. When adding a new row of data into DynamoDB, or replacing an existing row (with put you have to resubmit the ENTIRE row, since put overwrites the whole item rather than changing a single column), you need to use put. The syntax below provides an example of putting a row of data:
var AWS = require("aws-sdk");
var docClient = new AWS.DynamoDB.DocumentClient();
var params = {
	TableName: "NameOfTable",
	Item: {
		ID: 123,
		AddedBy: "pancakes",
		Info: "Some sort of text to be added."
	}
};
docClient.put(params, function(err, data) {
	if (err) {
		/* Generate Response */
		context.done(
			null,
			{
				"Error": JSON.stringify(err, null, 2)
			}
		);
	}
	else {
		/* Generate Response */
		context.done(
			null,
			{
				"Success": "You have putted, but not a golf put."
			}
		);
	}
});


Saving a list of strings (one dimensional array of values) to DynamoDB with DocumentClient
With AWS.DynamoDB.DocumentClient() in AWS Lambda you can save a list (aka array) of values (such as strings) without specifying anything unique about them. For example, assume you need to save an array of values like some_array[0] = "stringa", some_array[1] = "stringb". In AWS-speak that is a list; in this case a list of strings that takes the form [{"s": "stringa"}, {"s": "stringb"}] and is saved in the DynamoDB column labeled "ServiceSummary":
 
  1. STEP 1: Read a row in DynamoDB which contains a list (aka array) of strings of the form [{"s": "stringa"}, {"s": "stringb"}]:
var ServiceSummary_raw = [];
data.Items.forEach(function(row) {
	/* Assumes the query/scan matched a single row; keep that row's list */
	ServiceSummary_raw = row.ServiceSummary;
});
 
  1. STEP 2: You perform some sort of update to one of the strings in the list (in this case editing the second string):
var ServiceSummary_new = [];
for (var x = 0; x < ServiceSummary_raw.length; x++) {
	var blurbtext = ServiceSummary_raw[x];
	if (x === 1) {
		blurbtext = "[NOTE] " + blurbtext;
	}
	/* Save the strings to the new list (aka array) which contains the change */
	/* It is implied each string value would take the form of {"s": "string text"}.  I've added toString() for clarity. */
	ServiceSummary_new[ServiceSummary_new.length] = blurbtext.toString();
}
 
  1. STEP 3: Save the changed string in the list (aka array) of strings. Notice that nothing unique is specified:
params = {
	TableName: "MyDynamoTable",
	Item: {
		"RowID": Some-row-id-in-table,
		"ServiceSummary": ServiceSummary_new
	}
};
docClient.put(params, function(err, data) {
	if (err) {
		/* Error occurred */
	}
	else {
		/* Execution completed okay */
	}
});


Query Multiple DynamoDB Tables at the Same Time with BatchGetItem

(Screenshot)
  1. Under traditional SQL you have the ability to JOIN tables and perform a variety of evaluative logic at the same time. Under DynamoDB you don't have the ability to perform the same activity per se... what you can do, however, is fetch items from multiple DynamoDB tables in a single request by supplying key values for each table (and the key column does not have to exist in all the tables). This "join" is accomplished by the function BatchGetItem, as detailed here. It is not as powerful as the traditional SQL join capability, but should be adequate for many activities surrounding getting data from multiple tables.
  2. The example to the left demonstrates calling BatchGetItem under Node.js and Lambda for a single table using multiple keys (match criteria); a sketch follows after this list.
  3. You can download the example lambda code here.
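A minimal sketch of a single-table BatchGetItem via DocumentClient (table and key names are placeholders):
var AWS = require("aws-sdk");
var docClient = new AWS.DynamoDB.DocumentClient();
var params = {
	RequestItems: {
		"MyTable": {
			Keys: [ { ID: 1 }, { ID: 2 }, { ID: 3 } ]
		}
	}
};
docClient.batchGet(params, function(err, data) {
	if (err) { context.done(null, { "Error": JSON.stringify(err, null, 2) }); }
	/* data.Responses.MyTable holds an array of the matched items */
	else { context.done(null, { "Items": data.Responses.MyTable }); }
});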

(Screenshot)
  1. The example to the left demonstrates calling BatchGetItem under Node.js and Lambda for multiple tables, using keys (match criteria) for each table; the sketch below shows the shape of the request.
  2. You can download the example lambda code here.
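The request shape for multiple tables looks like this (table and key names are placeholders); data.Responses then carries one array of matched items per table:
var AWS = require("aws-sdk");
var docClient = new AWS.DynamoDB.DocumentClient();
var params = {
	RequestItems: {
		"TableA": { Keys: [ { ID: 1 }, { ID: 2 } ] },
		"TableB": { Keys: [ { UserName: "pancakes" } ] }
	}
};
docClient.batchGet(params, function(err, data) {
	/* data.Responses.TableA and data.Responses.TableB hold the results */
});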


Dealing With an Incoming Parameter To a Lambda function (Node variant) Whose Integer Value is 0
 
  1. When you have a parameter being passed to your AWS Lambda function running Node.js in inline mode and you try to check if it has a value you could do something like this:
    if (event.myparameter) ...
  2. If "myparameter" has a negative or positive value (like -1, 1) the if statement will evaluate as TRUE. If "myparameter", however, has a neutral value of 0 (remember from math 101 an integer is any whole number that is positive, negative or zero), the if statement will instantly have amnesia. That is, the following would return FALSE when "myparameter" is 0:
    if (event.myparameter) ...
  3. Interestingly, should you do a dump of event.myparameter it will show a value of 0. It's all shenanigans, I tell you. So how do you overcome such a quandary? The remedy is to evaluate "myparameter" like this:
    if (typeof event.myparameter != 'undefined') ...
  4. At that point you'll know you definitely have "myparameter" being passed in, and it will have some sort of value. Even 0.


Simple Recursion with Limit, LastEvaluatedKey, and ExclusiveStartKey
 
  1. With DynamoDB's 1MB scan limit, you cannot scan a table larger than 1MB in a single call (as you can in SQL); given the size of data nowadays, such a limit seems like a handicap. The limitation requires that you add complexity to your code: a mechanism that repeatedly calls a function to scan through the table in 1MB chunks. This example focuses on the use of Limit, LastEvaluatedKey, and ExclusiveStartKey; a minimal sketch follows after this list.
  2. You can download the example lambda code here.
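A minimal sketch of the recursion (table name and Limit value are placeholders; the downloadable example is more complete):
var AWS = require("aws-sdk");
var docClient = new AWS.DynamoDB.DocumentClient();
exports.handler = function(event, context) {
	var collected = [];
	function scanChunk(startKey) {
		var params = { TableName: "MyTable", Limit: 100 };
		if (startKey) { params.ExclusiveStartKey = startKey; }
		docClient.scan(params, function(err, data) {
			if (err) {
				context.done(null, { "Error": JSON.stringify(err, null, 2) });
				return;
			}
			collected = collected.concat(data.Items);
			if (data.LastEvaluatedKey) {
				/* More data remains: recurse, resuming where DynamoDB left off */
				scanChunk(data.LastEvaluatedKey);
			}
			else {
				context.done(null, { "Count": collected.length });
			}
		});
	}
	scanChunk(null);
};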


"Complex" Recursion to scan an entire DynamoDB table in 1MB chunks
 
  1. As described in the previous section, DynamoDB's 1MB scan limit forces you to make repeated calls to scan a table larger than 1MB. This example focuses on showing exactly how to set up a Node.js Lambda function that recursively calls another function in order to scan an entire DynamoDB table using a keyword search.
  2. You can download the example lambda code here.

