Cost savings with AWS Fargate On-Demand
A service company we consult for faces the challenge of managing hundreds of container-based application environments: tens of different tenants, each with multiple Dev, Stage, and Production environments.
They are looking for a cost-optimized, container-based infrastructure that lets them iterate fast and improves the testing and development experience, while keeping maintenance costs low.
Our initial recommendation was AWS App Runner - a fully managed container application service that lets you build, deploy, and run containerized web applications and API services without prior infrastructure or container experience. While our customer loved App Runner, they require WebSocket support, which AWS has not yet released for the service.
Instead, we built a custom solution that keeps the core benefits of App Runner - ease of use, automated deployment, low maintenance, and cost optimization - while supporting WebSockets. It uses AWS Lambda and Amazon CloudFront to implement an on-demand start and stop mechanism for container deployments on AWS Fargate. Starting and stopping the environment is fully automated and invisible to the client.
Services in use
Amazon CloudFront - a high-performance, secure, developer-friendly content delivery network (CDN) service.
Lambda@Edge - a feature of Amazon CloudFront that runs code closer to the application's users, improving performance and reducing latency.
AWS Certificate Manager - used to issue and manage SSL/TLS certificates at no additional cost.
Start on demand
The scale-up mechanism begins with a client request to the application.
The request is handled by Amazon CloudFront with Lambda@Edge, where the Lambda function runs on each request and uses the AWS API to scale the AWS Fargate service to one container.
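The scale-up step can be sketched as a Lambda@Edge viewer-request handler. This is a minimal illustration, not the authors' actual code: the cluster name `tenant-cluster`, service name `app-service`, and region are hypothetical, and a real handler would avoid calling ECS when the service is already running.

```python
# Hypothetical names for illustration only.
ECS_CLUSTER = "tenant-cluster"
ECS_SERVICE = "app-service"

def scale_to(ecs_client, cluster, service, desired_count):
    """Set the desired task count on an ECS/Fargate service."""
    return ecs_client.update_service(
        cluster=cluster, service=service, desiredCount=desired_count
    )

def handler(event, context):
    # boto3 is bundled with the Lambda Python runtime. Lambda@Edge has no
    # environment variables, so the home region is hard-coded here
    # (hypothetical region).
    import boto3
    scale_to(boto3.client("ecs", region_name="eu-west-1"),
             ECS_CLUSTER, ECS_SERVICE, 1)
    # Pass the viewer request through to the origin unchanged.
    return event["Records"][0]["cf"]["request"]
```

Keeping `scale_to` separate from the handler makes the scaling call easy to test with a stubbed client.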
When the container starts, an event is published to Amazon EventBridge, which triggers a Lambda function. This Lambda function updates a low-TTL DNS record used as the origin by Amazon CloudFront.
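The DNS-update step might look like the sketch below, assuming the trigger is an ECS task-state-change event from EventBridge. The hosted zone ID, record name, and the 10-second TTL are hypothetical, as is the exact flow of resolving the task's public IP from its network interface.

```python
def build_change_batch(record_name, task_ip, ttl=10):
    """Route 53 change batch that upserts the low-TTL origin record."""
    return {
        "Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": record_name,
                "Type": "A",
                "TTL": ttl,
                "ResourceRecords": [{"Value": task_ip}],
            },
        }]
    }

def handler(event, context):
    import boto3  # bundled with the Lambda Python runtime
    # The ECS task-state-change event carries the attached ENI id; the
    # task's public IP is then looked up on that network interface.
    details = event["detail"]["attachments"][0]["details"]
    eni_id = next(d["value"] for d in details if d["name"] == "networkInterfaceId")
    eni = boto3.client("ec2").describe_network_interfaces(
        NetworkInterfaceIds=[eni_id]
    )
    public_ip = eni["NetworkInterfaces"][0]["Association"]["PublicIp"]
    boto3.client("route53").change_resource_record_sets(
        HostedZoneId="Z0000000EXAMPLE",  # hypothetical hosted zone
        ChangeBatch=build_change_batch("origin.example.com", public_ip),
    )
```

The low TTL matters: it lets CloudFront pick up the new task's address within seconds of a cold start.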
Once the container is running, CloudFront resolves the updated DNS record and the application is fully functional.
Stop on demand
To keep the cost as low as possible, we need a mechanism to scale down when the application is not in use.
Amazon CloudFront publishes metrics to Amazon CloudWatch. The Requests metric for the distribution can be used: if there are no requests for a period, the application can be scaled down to 0. An Amazon CloudWatch alarm is configured to trigger a Lambda function that uses the AWS API to stop the environment.
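A minimal sketch of the scale-down function, assuming the alarm is routed to Lambda via an EventBridge "CloudWatch Alarm State Change" rule (cluster and service names are again hypothetical):

```python
IDLE_ALARM_STATE = "ALARM"

def should_scale_down(alarm_event):
    """Act only when the alarm actually entered the ALARM state,
    i.e. the distribution saw zero requests for the evaluation period."""
    return (
        alarm_event.get("detail", {})
        .get("state", {})
        .get("value") == IDLE_ALARM_STATE
    )

def handler(event, context):
    import boto3  # bundled with the Lambda Python runtime
    if should_scale_down(event):
        # Hypothetical names; scaling to zero stops billing for the task.
        boto3.client("ecs").update_service(
            cluster="tenant-cluster", service="app-service", desiredCount=0
        )
```

Guarding on the alarm state prevents the function from stopping the environment when the alarm merely transitions back to OK.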
The solution is fully automated and uses the Infrastructure as Code approach with AWS CDK, ensuring consistent deployment across multiple environments.
The full source code can be found here - https://github.com/SeveralClouds/fargate-on-demand
Future enhancements and improvements
Consider using Seekable OCI (SOCI) indexes for faster container startup on AWS Fargate - in AWS's benchmark, a sample application with SOCI indexes started approximately 50% faster than without them. https://aws.amazon.com/blogs/aws/aws-fargate-enables-faster-container-startup-using-seekable-oci/
An AWS Application Load Balancer with dynamic targets could be used instead of Amazon CloudFront and Amazon Route 53. This would improve the security posture by keeping the container deployments in a private network.
Considering the upcoming charges for public IPv4 addresses, IPv6 adoption could be explored as a forward-looking measure to optimize costs and further enhance the solution's effectiveness in AWS Fargate deployments. At the moment, AWS Fargate supports only dual-stack (IPv4 and IPv6) networking.
The solution is fully event-driven and cost-effective. The on-demand strategy of using AWS Fargate containers provides the added benefits of cost optimization and efficient resource utilization, making it an ideal choice for testing and QA environments where temporary infrastructure is needed.
By utilizing AWS-managed services, the serverless approach requires minimal maintenance and support, reducing long-term operational overhead. This empowers our clients to adopt a secure, cost-effective, and effortlessly scalable architecture.