Serverless software architecture – the golden goose to many and just a fad to others. This controversial architecture hasn’t been around for long, and, like many software engineers, I stumbled upon it when Amazon Web Services (AWS) announced Lambda about five years ago… and it was love at first sight.
What’s This Serverless You Speak Of?
In part 1 of this blog series, I cover what serverless is, what it isn’t, and some of my lover’s spats with serverless over the years. In part 2 I’ll dig into an example serverless front end, and in part 3 I’ll explore an example serverless back end.
First things first, let me address the misnomer and elephant in the room: serverless does not mean computing without servers. That would be silly, but the name is provocative and that’s likely why it stuck.
As it’s used today, serverless computing means that the service provider manages the computing resources (e.g. keeps the firmware up to date) and the resources scale up and down automagically (horizontally and/or vertically to meet demand). No more; no less. But alas, the devil is in the details.
The Golden Child
In a perfect world, the decision to go serverless would be a no-brainer. Serverless would mean computing resources that:
- Scale down to zero
- Scale up to infinity
- Scale instantly
- Start instantly
- Work for all languages, frameworks and technologies
- Run anywhere
- Cost money only when in use
This is the promise of AWS Lambda, and on the surface it is fairly accurate. With AWS Lambda you can implement many aspects of a serverless full stack architecture, including:
- Server-side rendering (SSR) of single page applications (SPAs) at locations close to the end user via AWS’s edge network (CloudFront)
- Gluing miscellaneous components together via triggers and step functions
- Scalable, on-demand backends via API Gateway
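To make the API Gateway piece concrete, here is a minimal sketch of a Lambda handler behind an API Gateway proxy integration (the greeting endpoint and its `name` query parameter are hypothetical, purely for illustration):

```python
import json

def handler(event, context):
    """Minimal handler for an API Gateway (Lambda proxy) integration.

    API Gateway delivers the HTTP request as `event`; the returned dict
    must carry a statusCode and a string body.
    """
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

API Gateway invokes one of these per request, Lambda scales the concurrency up and down, and you pay per invocation — no server to patch or provision.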
With regard to data management, AWS offers multiple serverless services, including an object storage solution (S3), a SQL solution (Aurora Serverless), and a NoSQL solution (DynamoDB). Each of these solutions provides scalable, on-demand data storage.
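To give a feel for the DynamoDB key-value model, here is a minimal sketch using the boto3 Table interface (the table layout, `pk` key, and user attributes are assumptions; in practice `table` would be `boto3.resource("dynamodb").Table("users")`):

```python
def put_user(table, user_id, email):
    # DynamoDB writes items by primary key; beyond the key attributes
    # there is no schema, so the rest of the item is free-form.
    table.put_item(Item={"pk": f"USER#{user_id}", "email": email})

def get_user(table, user_id):
    # Reads are by exact key as well. This is what lets DynamoDB scale
    # on demand -- and, as noted later, why ad-hoc search is awkward.
    return table.get_item(Key={"pk": f"USER#{user_id}"}).get("Item")
```

Passing `table` in as a parameter keeps these functions testable without AWS credentials.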
The grass is always green, it’s always sunny and serverless is perfect … or is it?
The Red-Headed Stepchild
As I stated above, I’ve fallen in love with serverless, but that doesn’t mean I’ve kept my blinders on. We do not live in a perfect world and serverless is most certainly not perfect. The following are just a couple of the lover’s spats I’ve had with serverless over the years:
My ongoing friction with serverless is with the data management offerings. Lambda doesn’t work well in VPCs (so you can’t efficiently query an RDS database), DynamoDB has very limited search functionality (it’s a key-value store that barely supports search, and only in a non-scalable, costly manner), and Aurora Serverless can’t scale up from zero in a timely fashion.
There are many workarounds to the aforementioned problems, but most are costly and/or no longer serverless. My preferred workaround is to use Aurora Serverless (via the data API and AppSync) with pausing disabled in production environments. It isn’t free when the servers aren’t in use, but otherwise it still affords all of the other benefits of the serverless architecture. Part 3 of this series will cover this approach in detail.
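As a taste of what part 3 will cover, the shape of a Data API call looks roughly like this (the cluster ARN, secret ARN, and database name are placeholders, and this sketch only handles string parameters; in practice `client` would be `boto3.client("rds-data")`):

```python
def run_query(client, sql, params, cluster_arn, secret_arn, database):
    """Run SQL over the Aurora Serverless Data API.

    The Data API speaks HTTPS, so a Lambda function can query Aurora
    without joining a VPC -- sidestepping the VPC friction noted above.
    """
    return client.execute_statement(
        resourceArn=cluster_arn,
        secretArn=secret_arn,
        database=database,
        sql=sql,
        parameters=[
            {"name": name, "value": {"stringValue": value}}
            for name, value in params.items()
        ],
    )
```

A call like `run_query(client, "SELECT * FROM users WHERE id = :id", {"id": "42"}, …)` then runs over plain HTTPS with credentials pulled from Secrets Manager, rather than over a pooled VPC connection.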
Although AWS doesn’t like to talk about it, Lambda does not cold start instantly. In fact, depending on the language and the memory allocated to the function, the cold start time can be substantial. This matters most for a Lambda@Edge function, which directly impacts request and response times. As it stands, Lambda@Edge presently only supports Node.js and Python. It would be a huge win for serverless if AWS would open Lambda@Edge up to all languages (as it has done with non-Edge Lambda), including those that don’t have to load a heavy engine or runtime (like shell).
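For context, an edge function doesn’t need to be heavy to be useful; even a viewer-request handler as small as this SPA-routing sketch (the routing rule is hypothetical) still pays the runtime’s cold start penalty on first invocation:

```python
def handler(event, context):
    """Lambda@Edge viewer-request sketch for SPA routing.

    CloudFront hands in the request at event["Records"][0]["cf"]["request"].
    Paths without a file extension are rewritten to /index.html so the
    single page application can handle routing client-side.
    """
    request = event["Records"][0]["cf"]["request"]
    last_segment = request["uri"].rsplit("/", 1)[-1]
    if "." not in last_segment:
        request["uri"] = "/index.html"
    return request  # returning the request lets it continue to the origin
```

The logic here is trivial; the latency cost is almost entirely the runtime spinning up, which is why lighter-weight runtimes at the edge would matter.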
Another nit: Lambda@Edge functions are deployed and destroyed extremely slowly and throw quite a wrench into CI/CD pipelines.
Serverless is awesome and is production ready across the entire full stack architecture, but it is not without its faults. That said, I’m head over heels for it and would strongly encourage everyone to give it a test drive when you have a chance. Stay tuned for part 2 of this serverless blog series where we’ll dive into an example serverless front end and provide you with everything you need to spin up an environment of your own.