Written by: Damir Durmo, Senior Software Developer, FWI
According to Martin Fowler:
"Serverless architectures are application designs that incorporate third-party “Backend as a Service” (BaaS) services, and/or that include custom code run in managed, ephemeral containers on a “Functions as a Service” (FaaS) platform. By using these ideas, and related ones like single-page applications, such architectures remove much of the need for a traditional always-on server component. Serverless architectures may benefit from significantly reduced operational cost, complexity, and engineering lead time, at a cost of increased reliance on vendor dependencies and comparatively immature supporting services."
As we began work on the Device Management module of FWI Cloud, we decided to adopt a serverless architecture (specifically focused on AWS Lambda, AWS API Gateway and functions written in both NodeJS and Python). Having developed previous Cloud modules using traditional microservices, we knew we would have to grapple with some growing pains. I want to share some of those challenges in the hope that it will make your journey into serverless smoother.
The reason for our move to a serverless architecture deserves a blog post of its own, one I hope to write soon.
AWS Tooling is not ready
We started with several AWS tools, including AWS CodeStar, CodeBuild, CodePipeline, CodeCommit and Cloud. Along the way, we ran into several challenges:
- CodeCommit did not allow PR reviews and approvals
- CodePipeline builds were slow
- We were unable to package endpoints/functions independently without manual scripting
After a call with AWS sometime in April 2018, we realized we needed to consider other options, which is the next lesson.
Automate serverless application deployment with Serverless Framework
One and a half months into our project, we switched to Serverless Framework for the reasons mentioned above, and we were able to completely remove all of our custom scripts.
Some of the things we liked about Serverless Framework:
- Robust CLI
- Provider agnostic
- Individual packaging of Lambda functions with shared code
- Extensible with plugins
- Vibrant and helpful community
- Very good documentation
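For context, a Serverless Framework service is described in a single serverless.yml file. A minimal sketch of what that looks like (the service, stage, handler and path names here are illustrative, not our actual configuration):

```yaml
# serverless.yml - minimal sketch; all names are illustrative
service: device-management

provider:
  name: aws
  runtime: nodejs8.10
  stage: ${opt:stage, 'dev'}   # pass --stage on the CLI, default to dev
  region: us-east-1

functions:
  getDevices:
    handler: handlers/devices.getDevices
    events:
      - http:                  # wires the function to API Gateway
          path: devices
          method: get
```

A single `serverless deploy` turns this into a CloudFormation stack with the Lambda function, API Gateway endpoint and IAM roles wired together.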
Leverage Serverless Framework plugins
Do not write any custom scripts unless you absolutely have to. Leverage plugins for Serverless Framework. If you do have to write custom scripts, write them as Serverless Framework plugins, which are easy to develop.
Some of our favorite plugins:
- Serverless plugin to bundle Python packages
- Use CloudFormation Pseudo Parameters in your Serverless project
- Serverless plugin for managing custom domains with API Gateways
- Serverless plugin to add documentation and models to the Serverless generated API Gateway
- Serverless-plugin-tracing to enable AWS X-Ray tracing
- Serverless-plugin-Lambda-dead-letter to configure a Lambda with a dead letter queue or topic
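Plugins are installed from npm and declared in serverless.yml. The package names below are our best mapping to the descriptions above; verify them on npm before adopting:

```yaml
# Declared in serverless.yml; each entry is an npm dev dependency
plugins:
  - serverless-python-requirements     # bundle Python packages
  - serverless-pseudo-parameters       # CloudFormation pseudo parameters
  - serverless-domain-manager          # custom domains for API Gateway
  - serverless-aws-documentation       # API Gateway docs and models
  - serverless-plugin-tracing          # AWS X-Ray tracing
  - serverless-plugin-lambda-dead-letter  # DLQ queue or topic per Lambda
```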
Package your Lambdas individually with shared code
Instead of the default packaging, which bundles all Lambda functions together, we decided to package each function individually with its shared code as a nanoservice: a function with a single action/responsibility. This results in better application performance because each deployment package is smaller, which helps your Lambda containers allocate faster and improves initial load time.
This approach allows you to deploy your endpoints independently. Serverless Framework keeps all these individual functions nicely organized and properly named (e.g., Dev-Device-Management-GetDevices).
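In serverless.yml, individual packaging is a one-line switch, optionally combined with per-function include/exclude patterns (the paths below are illustrative):

```yaml
# Sketch: one deployment package per function instead of one big zip
package:
  individually: true
  exclude:
    - "**"                 # start empty for every function...

functions:
  getDevices:
    handler: src/devices/get.handler
    package:
      include:             # ...then pull in only this function's code
        - src/devices/get.js
        - src/shared/**    # plus the shared code it depends on
```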
Use Lambda Integration in AWS API Gateway
Lambda Integration fit all of our business needs and allowed us to integrate fully with Serverless Framework.
While Lambda Integration provides more control than the default Lambda Proxy Integration, it involves a lot of work setting up the integration request template, response template and status code pattern. This required us to learn Velocity Template Language, which we didn't enjoy.
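In Serverless Framework, switching an endpoint from the default lambda-proxy to Lambda Integration is a per-event setting. A hedged sketch of the shape, not our actual templates (names and patterns are illustrative):

```yaml
functions:
  getDevices:
    handler: handlers/devices.getDevices
    events:
      - http:
          path: devices
          method: get
          integration: lambda          # default is lambda-proxy
          response:
            statusCodes:
              200:
                pattern: ''            # default response
              404:
                pattern: '.*"NotFound".*'  # regex matched against the error
```

The status code patterns are where Velocity Template Language and error-message conventions come into play, and where most of the setup effort went.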
NodeJS serverless functions are faster to develop (because we have node developers)
We found that developing NodeJS functions for web application backends was a lot easier and faster than developing with Python, probably because we have really good JS developers on our team. Python packaging disappointed us compared to npm. We love that npm lets us gauge a package's popularity by its download count, and that package.json makes it easy to separate dependencies from dev dependencies. Splitting local packages from global packages was straightforward as well. Python packaging for serverless applications is done through the serverless plugin mentioned above.
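As an illustration of the dependency split we liked, package.json keeps runtime and development dependencies apart (the package choices and versions below are placeholders; aws-sdk sits in devDependencies because the Lambda runtime already provides it):

```json
{
  "name": "device-management",
  "dependencies": {
    "uuid": "^3.0.0"
  },
  "devDependencies": {
    "aws-sdk": "^2.0.0",
    "serverless": "^1.0.0",
    "mocha": "^5.0.0"
  }
}
```

Only `dependencies` need to ship in the Lambda package, which also keeps the individually packaged functions small.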
Cloud providers still lack "warm" Lambda starts as an out-of-the-box solution
When you run a serverless function, its container stays active (a.k.a. hot) for some time after execution, alive and waiting for the next invocation. This blog post explains it well. After a period of inactivity, your cloud provider drops the container, and your function becomes inactive (a.k.a. cold). A cold start happens when you execute an inactive function: the delay comes from your cloud provider provisioning your selected runtime container and then running your function. There's an extra delay on top of that if your Lambdas are VPC enabled.
Consider serverless-plugin-warmup as a solution. This approach is at best "hacky," if not complicated. The cloud providers need to come up with a longer-term, out-of-the-box solution.
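The plugin works by invoking your functions on a schedule, so your handlers need to detect those pings and return early instead of running business logic. A minimal sketch (the handler name is illustrative; the `source` value matches the plugin's default payload at the time of writing):

```javascript
// Hypothetical GetDevices handler; only the warm-up check is the point here.
const getDevices = async (event) => {
  // serverless-plugin-warmup invokes the function with this source field
  // by default; return early so warm-up pings skip the real work.
  if (event && event.source === 'serverless-plugin-warmup') {
    return 'Lambda is warm!';
  }

  // Normal request path (placeholder response body).
  return { statusCode: 200, body: JSON.stringify({ devices: [] }) };
};

module.exports = { getDevices };
```

Skipping real work on warm-up pings also keeps the pings cheap and avoids polluting your metrics with synthetic requests.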
Be aware of concurrency limits
According to AWS:
By default, AWS Lambda limits the total concurrent executions across all functions within a given region to 1000
You can request an increase, but this limit is something you should be aware of. Make sure your load testing results account for these limits.
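Because the pool is shared across all functions in a region, one bursty function can starve the rest. Serverless Framework lets you cap a single function's share (function name and value below are illustrative):

```yaml
# Sketch: reserve concurrency so one burst cannot consume the whole
# per-region pool (1000 by default)
functions:
  processEvents:
    handler: handlers/events.process
    reservedConcurrency: 50   # this function never runs more than 50 at once
```

Note the trade-off: reserved concurrency both caps this function and subtracts that capacity from the shared pool available to everything else.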
We've learned valuable lessons along the way, and we are still iterating. At FWI, we encourage the "fail fast" approach that allows us to pivot quickly and efficiently. After a few months of getting the right deployment/development stack in place, we are delivering projects in a 100% serverless fashion. The initial struggles were worth the journey.
The serverless ecosystem is not perfect and has room for growth. While it is still a green field in some areas, we are enthusiastic about the future and happy with the current payoffs.