By Yoanna Krasteva

Cost saving with AWS Fargate On-Demand

Customer Challenge

A service company we consult for faces the challenge of managing hundreds of container-based application environments. They have tens of different tenants, each with multiple Dev, Stage, and Production environments.

They are looking for a cost-optimized, container-based infrastructure solution that enables fast iteration, improves the testing and development experience, and keeps maintenance costs low.


Our initial recommendation was AWS App Runner - a fully managed container application service that lets one build, deploy, and run containerized web applications and API services without prior infrastructure or container experience. While our customer loved App Runner, they require WebSocket support, which is still under development by AWS.

We built a custom solution that keeps the core benefits of App Runner - ease of use, automated deployment, low maintenance cost, and cost optimization - and adds WebSocket support. The solution uses AWS Lambda and Amazon CloudFront to implement an on-demand start and stop mechanism for container-based deployments on AWS Fargate. Starting and stopping an environment is fully automated and invisible to the client.

Services in use

  • AWS Fargate - a technology that one can use with Amazon ECS to run containers without having to manage servers or clusters of Amazon EC2 instances.

  • Amazon CloudFront - a high-performance, secure, developer-friendly content delivery network (CDN) service.

  • Lambda@Edge - a feature of Amazon CloudFront that lets one run code closer to users of the application, improving performance and reducing latency.

  • AWS Certificate Manager - used to issue and manage public SSL/TLS certificates at no additional cost.

Start on demand

  1. The scale-up mechanism begins with a client request to the application.

  2. The request is handled by Amazon CloudFront with Lambda@Edge, where the Lambda function is executed on each request. The function uses the AWS API to scale the AWS Fargate service to 1 container.

  3. When the container starts, an event is published to Amazon EventBridge and triggers a Lambda function. This Lambda function updates a low-TTL DNS record used as the origin by Amazon CloudFront.

    1. While the container is starting, Amazon CloudFront applies its native timeout (10 seconds by default) and retry mechanism (3 attempts by default). If the container has not started within that window, Amazon CloudFront returns an HTTP 504 (Gateway Timeout) status code. Because we want the container start to be transparent to the client, we implemented a Lambda@Edge function that rewrites the response to HTTP 200 OK with custom JavaScript that tells the browser to refresh the page.

  4. Once the container is running, CloudFront uses the updated DNS record, and the application is fully functional.
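Step 2 above can be sketched as a Lambda@Edge viewer-request handler. This is a minimal sketch, not the published solution: the cluster and service names, the region, and the `scale_up` helper are our own illustrative assumptions.

```python
def scale_up(ecs_client, cluster, service, count=1):
    """Idempotently set the Fargate service's desired task count.
    Calling this on every request is safe: if the service is already
    at the desired count, ECS treats the update as a no-op."""
    ecs_client.update_service(cluster=cluster, service=service,
                              desiredCount=count)

def handler(event, context):
    import boto3  # available in the Lambda runtime by default
    # Lambda@Edge functions cannot use environment variables, so the
    # cluster and service names (placeholders here) are baked into the
    # deployment bundle.
    scale_up(boto3.client("ecs", region_name="eu-west-1"),
             "tenant-cluster", "tenant-app-service")
    # Pass the viewer request through to the origin unchanged.
    return event["Records"][0]["cf"]["request"]
```

Because the ECS call is idempotent, the handler needs no state of its own; the cost is one extra API call per request at the edge.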
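The EventBridge-triggered function from step 3 might look like the sketch below. The hosted-zone ID and record name are placeholders, and the event-field names are assumptions based on the "ECS Task State Change" event shape.

```python
def upsert_origin_record(route53_client, zone_id, record_name, ip, ttl=10):
    """UPSERT the low-TTL A record that CloudFront uses as its origin,
    pointing it at the freshly started container's IP address."""
    route53_client.change_resource_record_sets(
        HostedZoneId=zone_id,
        ChangeBatch={"Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": record_name,
                "Type": "A",
                "TTL": ttl,  # low TTL so CloudFront re-resolves quickly
                "ResourceRecords": [{"Value": ip}],
            },
        }]},
    )

def extract_task_ip(event):
    """Pull the task's IP out of an 'ECS Task State Change' event."""
    details = event["detail"]["attachments"][0]["details"]
    return next(d["value"] for d in details
                if d["name"] == "privateIPv4Address")

def handler(event, context):
    import boto3
    upsert_origin_record(boto3.client("route53"), "Z0000EXAMPLE",
                         "origin.example.com", extract_task_ip(event))
```

Keeping the TTL low matters here: CloudFront resolves the origin domain on a cache miss, so a stale record would send requests to a stopped container.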
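The response rewrite from step 3.1 can be sketched as a Lambda@Edge origin-response handler. The HTML and the 5-second reload delay are illustrative assumptions, not the exact page shipped in the solution.

```python
REFRESH_HTML = (
    "<!doctype html><html><body>"
    "<p>Your environment is starting, please wait&hellip;</p>"
    "<script>setTimeout(function () { location.reload(); }, 5000);</script>"
    "</body></html>"
)

def rewrite_gateway_timeout(response):
    """Replace a 504 from the origin (container still starting) with a
    200 page whose JavaScript reloads itself after a short delay."""
    if response["status"] != "504":
        return response  # any other status passes through untouched
    return {
        "status": "200",
        "statusDescription": "OK",
        "headers": {"content-type": [{"key": "Content-Type",
                                      "value": "text/html"}]},
        "body": REFRESH_HTML,
    }

def handler(event, context):
    # Lambda@Edge origin-response event: the origin's response sits
    # under Records[0].cf.response.
    return rewrite_gateway_timeout(event["Records"][0]["cf"]["response"])
```

From the browser's point of view the page simply takes a little longer to load; it never sees the 504 that CloudFront received from the origin.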

Stop on demand

To keep the cost as low as possible, we need a mechanism to scale down when the application is not in use.

Amazon CloudFront publishes metrics to Amazon CloudWatch. We use the metric for requests to the distribution: if there are no requests for a defined period, the application can be scaled down to 0. A CloudWatch alarm is configured to trigger a Lambda function, which uses the AWS API to stop the environment.
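The scale-down Lambda can be sketched the same way as the scale-up path. The cluster, service, and region names are again placeholders of our own; the function is fired by the CloudWatch alarm once the distribution's request metric has been zero for the evaluation period.

```python
def scale_down(ecs_client, cluster, service):
    """Set desiredCount to 0; Fargate stops the running task, and
    billing for that task ends."""
    ecs_client.update_service(cluster=cluster, service=service,
                              desiredCount=0)

def handler(event, context):
    import boto3
    # Triggered by the CloudWatch alarm on the CloudFront request metric.
    scale_down(boto3.client("ecs", region_name="eu-west-1"),
               "tenant-cluster", "tenant-app-service")
```

Scaling to 0 rather than deleting the service keeps the task definition and networking in place, so the next request simply scales it back to 1.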

Source code

The solution is fully automated and uses the Infrastructure as Code approach with AWS CDK, ensuring consistent deployment across multiple environments.

The full source code can be found here -

Future improvements


  • Considering the upcoming cost of public IPv4 addresses, IPv6 adoption could be explored as a forward-looking measure to optimize costs and further enhance the solution's effectiveness in AWS Fargate deployments. At this point, AWS Fargate only supports dual-stack (IPv4 and IPv6) networking.


The solution is fully event-driven and cost-effective. The on-demand strategy of using AWS Fargate containers provides the added benefits of cost optimization and efficient resource utilization, making it an ideal choice for testing and QA environments where temporary infrastructure is needed.

By utilizing AWS-managed services, the serverless approach requires minimal maintenance and support, reducing long-term operational overhead. This empowers our clients to adopt a secure, cost-effective, and effortlessly scalable architecture.


