With a 33% market share, Amazon Web Services remains the biggest cloud provider on the planet. Organizations across industries use AWS to build secure and scalable digital environments.
However, as businesses scale cloud adoption, operating costs become a natural concern. Fortunately, there are several popular strategies for AWS cost optimization that allow your business to manage cloud spending in a responsible way.
In this post, we’ll share popular strategies for reducing your AWS cost without affecting application performance.
Let’s start by exploring the resources that typically create a significant cost burden. One such service is Amazon EC2 (Elastic Compute Cloud): redundant instances should be stopped, and oversized instances moved to a cheaper instance type. RDS, EBS volumes, and AI/ML services like SageMaker can also pile up your AWS costs.
Here are a few techniques to help you tackle the usage of the above resources.
Native Tools to Monitor and Regulate Consumption
Before getting into the nitty-gritty of the strategies, it’s important to understand the existing state of your infrastructure. You can analyze your spending and immediately limit further costs by accessing billing information and usage records.
AWS provides a few native tools that can help you gather cost data and system metrics to identify cost-related inefficiencies in your setup: AWS Cost Explorer, AWS Budgets, AWS Trusted Advisor, Amazon CloudWatch, and AWS Compute Optimizer.
These tools evaluate your bill to forecast future spending and provide a thorough report on where your money went. This information is especially useful for budget planning.
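As a concrete illustration, here is a minimal sketch of the kind of query the Cost Explorer API accepts. To keep it runnable without AWS credentials, it only builds the request parameters; the boto3 call it would feed is noted in a comment.

```python
from datetime import date, timedelta

def monthly_cost_query(days_back=30):
    """Build request parameters for Cost Explorer's GetCostAndUsage API,
    grouping unblended cost by service over the last `days_back` days."""
    end = date.today()
    start = end - timedelta(days=days_back)
    return {
        "TimePeriod": {"Start": start.isoformat(), "End": end.isoformat()},
        "Granularity": "MONTHLY",
        "Metrics": ["UnblendedCost"],
        "GroupBy": [{"Type": "DIMENSION", "Key": "SERVICE"}],
    }

# With credentials configured, you would pass these parameters to
# boto3.client("ce").get_cost_and_usage(**monthly_cost_query()).
params = monthly_cost_query()
print(params["Granularity"])
```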
Tagging Compliance to Track Your Spending Better
Tagging is a pre-emptive step toward cost optimization. A tag is a label that you or AWS apply to an AWS resource. By organizing resources with tags, you can see who is using which AWS service and how.
A strong IT governance policy that requires all cloud-based resources to be tagged is another way to prevent the creation of rogue infrastructure and to make unauthorized resources easy to detect.
Tags also enable cost allocation, which lets you categorize and track your AWS expenditures and associate costs with technical or security dimensions, such as specific applications, environments, or compliance programs.
If your S3 Lifecycle configuration currently has tens or hundreds of rules filtered by prefix, object tags can simplify it considerably; we recommend consolidating those rules using object-tag filters.
AWS Cost Explorer supports breaking down AWS charges by tag. Check out this post by AWS for more information on using cost allocation tags.
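A tagging-compliance check can be as simple as comparing each resource’s tags against a required set. The sketch below is illustrative: the required tag keys and the inventory are made-up examples, and in practice the tag data would come from the Resource Groups Tagging API.

```python
# Example governance policy; adjust the required keys to your own rules.
REQUIRED_TAGS = {"Owner", "Environment", "CostCenter"}

def untagged_resources(resources):
    """Return IDs of resources missing any required cost-allocation tag.
    `resources` maps a resource ID to its tag dictionary."""
    return [rid for rid, tags in resources.items()
            if not REQUIRED_TAGS.issubset(tags)]

inventory = {
    "i-0abc": {"Owner": "data-team", "Environment": "prod", "CostCenter": "42"},
    "i-0def": {"Owner": "web-team"},  # missing Environment and CostCenter
}
print(untagged_resources(inventory))  # → ['i-0def']
```

Running a report like this regularly makes rogue, unattributable infrastructure easy to spot.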
Rightsizing the Infrastructure Setup
The pricing of most cloud providers is complex. It appears straightforward at first (typically based on basic parameters like $/GB/month, $/hour, or, more recently, $/second), but when your cloud infrastructure grows and includes multiple regions and services, you’ll find it difficult to keep track of the ever-increasing cost.
For example, a development environment might run two “t2.medium” instances while production runs four “c4.xlarge” instances. On top of that, the main VPC configuration may deploy an extra NAT Gateway per Availability Zone, and data transfer across Availability Zones is not free. So, what are the cost-cutting options?
Consider the following pointers to evaluate your configurations against some rightsizing options:
- Evaluate whether you need the same VPC setup everywhere. For smaller environments, would a single AZ be enough? Do production and performance environments really require multiple AZs?
- Evaluate whether you are using the right instance type. In the example above, changing the generation from t2 to t3 can cut costs by around 20%, choosing the t3a series can save up to 40%, and third-generation AMD instance types can reduce costs by about 35%. AWS Compute Optimizer provides automated rightsizing recommendations; always verify current pricing for your Region before switching.
- Use Auto Scaling to scale your application with demand. Review the output of describe-scaling-activities to see whether the scaling policy can be tuned to add instances less aggressively; Auto Scaling is one of the most effective ways to control AWS costs. Schedule automated stop/start of instances when they are not in use with Amazon EventBridge rules (formerly CloudWatch Events) or AWS Instance Scheduler.
- Use the Trusted Advisor Idle Load Balancers check to get a report of load balancers with a request count below 100 over the past seven days; deleting these load balancers reduces costs. You can also review data transfer costs in Cost Explorer. For better insight, DevOps engineers should confirm with the development and business teams whether these resources can be safely decommissioned.
- Evaluate whether you are using the latest EBS volume type. For example, upgrading from gp2 to gp3 volumes can save up to 20%. gp2 volumes are simple to use, but their performance is coupled to provisioned size and scales linearly with volume size. gp3, on the other hand, provides a predictable baseline of 3,000 IOPS and 125 MiB/s regardless of volume size, and lets you provision IOPS and throughput independently without increasing storage. Conversion to gp3 therefore helps optimize volume costs. EBS volumes with very low activity (less than 1 IOPS per day) over a 7-day period are probably not in use; identify them with the Trusted Advisor Underutilized Amazon EBS Volumes check. To reduce costs, snapshot each volume first (in case you need it later), then delete it. You can automate snapshot creation with Amazon Data Lifecycle Manager.
- Evaluate whether you really need a NAT Gateway. If your workload uses a NAT Gateway only to reach AWS services such as S3, DynamoDB, SES, and others, use VPC endpoints instead; this avoids the NAT Gateway’s hourly and data-processing charges. If you need many interface-type VPC endpoints, run a thorough cost analysis first to make sure you are not overpaying for the endpoints themselves.
- Evaluate your EC2 pricing model. With an EC2 Instance Savings Plan, businesses can reduce costs by up to 72% compared to On-Demand pricing, and Spot Instances can save up to 90%. Compute Savings Plans automatically apply to EC2 instance usage regardless of instance family, size, AZ, Region, OS, or tenancy, and also cover Fargate and Lambda usage.
- Terminate orphaned EC2 instances and volumes. Review the CloudWatch dashboard to estimate past usage, and use AWS Cost Explorer Resource Optimization to get a report of EC2 instances that are idle or have low utilization. Reduce costs by using AWS Instance Scheduler to stop instances automatically, or by scaling Auto Scaling groups down to their minimum. AWS Operations Conductor can automatically resize EC2 instances based on that report.
- To reduce costs on Amazon SageMaker, stop notebook instances that you don’t currently need for modeling jobs, either manually or with a scheduled compute job. Similarly, you can write custom Lambda functions that find RDS instances sitting in the ‘available’ state and stop, on a schedule, databases that are not attached to any other resources.
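To make the gp2-to-gp3 point concrete, here is a rough back-of-the-envelope calculation. The per-GB prices are example us-east-1 figures at the time of writing; check current pricing for your Region.

```python
# Example us-east-1 storage prices ($/GB-month); verify for your Region.
GP2_PER_GB = 0.10
GP3_PER_GB = 0.08

def gp3_savings(size_gib):
    """Monthly savings from converting a gp2 volume to gp3 (storage cost only;
    gp3 includes a 3,000 IOPS / 125 MiB/s baseline at no extra charge)."""
    gp2_cost = size_gib * GP2_PER_GB
    gp3_cost = size_gib * GP3_PER_GB
    return gp2_cost - gp3_cost, (gp2_cost - gp3_cost) / gp2_cost * 100

saved, pct = gp3_savings(500)
print(f"${saved:.2f}/month saved ({pct:.0f}%)")  # → $10.00/month saved (20%)
```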
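The idle-resource checks above boil down to filtering on a utilization metric. A minimal sketch, assuming you have already fetched each instance’s 7-day average CPU (for example, via CloudWatch), might look like this; the instance IDs and threshold are made-up examples.

```python
def idle_instances(avg_cpu_by_instance, cpu_threshold=5.0):
    """Flag instances whose 7-day average CPU utilization falls below the
    threshold; these are candidates to stop, schedule, or terminate."""
    return [iid for iid, avg_cpu in avg_cpu_by_instance.items()
            if avg_cpu < cpu_threshold]

weekly_avg_cpu = {"i-web1": 41.2, "i-batch": 2.3, "i-dev": 0.8}
print(idle_instances(weekly_avg_cpu))  # → ['i-batch', 'i-dev']
```

In practice you would confirm with the owning team before stopping anything, as the post recommends.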
Using AWS S3 Storage the Right Way
S3, the Simple Storage Service, is one of AWS’s most well-known and widely used services. However, it can inflate your bill if not configured properly. While we’re not going to focus on security here, there are still a few factors to keep in mind when storing objects in S3:
- Should I version my objects? Object versions consume storage and so contribute to the overall cost, so if versioning isn’t necessary, don’t enable it. If it is enabled, use appropriate lifecycle rules to automate the removal of expired versions.
- Choose the right storage class for your objects, and where possible use S3 Intelligent-Tiering. Analyze object usage with S3 Storage Class Analysis and decide on the storage class (Standard, Standard-IA, and so on), since each has its own pricing.
- Also, remember to use S3 Lifecycle rules with object-tag filters. Transitioning objects from one storage class to the next, and finally to Glacier, will surely save money.
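Putting these pointers together, a tag-filtered lifecycle configuration might look like the following sketch. The rule ID, tag key and value, and transition days are placeholders to adapt to your data.

```python
# Hypothetical lifecycle configuration filtered by an object tag. With boto3,
# you would pass this dict as the LifecycleConfiguration argument to
# s3.put_bucket_lifecycle_configuration(Bucket=..., LifecycleConfiguration=...).
lifecycle = {
    "Rules": [{
        "ID": "archive-logs",            # placeholder rule name
        "Status": "Enabled",
        "Filter": {"Tag": {"Key": "class", "Value": "log"}},  # placeholder tag
        "Transitions": [
            {"Days": 30, "StorageClass": "STANDARD_IA"},  # move after 30 days
            {"Days": 90, "StorageClass": "GLACIER"},      # archive after 90 days
        ],
        # If versioning is enabled, expire old versions too.
        "NoncurrentVersionExpiration": {"NoncurrentDays": 30},
    }]
}
print(len(lifecycle["Rules"]))  # → 1
```

One tag-filtered rule like this can replace many prefix-filtered rules, as suggested in the tagging section above.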
For static websites or frontends, prefer S3 over EC2 to optimize cost: S3 offers higher availability than EC2 at a lower price.
Shifting Toward a Serverless Stack
AWS offers technologies for running code, managing data, and integrating applications, all without managing servers. Serverless technologies feature automatic scaling, built-in high availability, and a pay-for-use billing model to increase agility and optimize costs. This also eliminates infrastructure management tasks like capacity provisioning and patching; so you can focus on writing code that serves your customers. Serverless applications start with AWS Lambda, an event-driven compute service natively integrated with over 200 AWS services and software as a service (SaaS) applications.
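To give a feel for the model, here is a minimal Lambda-style handler. The function itself is a toy example; in AWS, an event source such as API Gateway, S3, or EventBridge supplies the event and context, and you are billed only for the milliseconds the handler runs.

```python
import json

def handler(event, context):
    """Minimal Lambda handler: no servers to provision or patch,
    and no idle capacity to pay for."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello {name}"}),
    }

# Local invocation for illustration; in AWS, the runtime calls handler().
resp = handler({"name": "AWS"}, None)
print(resp["statusCode"])  # → 200
```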
In addition to cost savings, serverless architecture also helps you create a more secure and scalable infrastructure, improving the overall developer experience.
There is a wide variety of solutions for AWS cost optimization. AWS tags are handy for building custom filters over cost reports and play a significant role in making resource utilization transparent. Furthermore, serverless techniques can help you save money through rightsizing and re-architecting your application. You can also pay less for licensing by migrating to open-source databases.
But this isn’t all.
AWS cost optimization is a continuous process. Keep monitoring your resource usage and status to make sure you only pay for the assets you need. It’s useful to set up a budget using AWS Budgets, so that you get alerted when your cost and usage change. It is equally important to invest in a cost-conscious culture and understand the AWS Shared Responsibility Model.
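As a sketch of what such a budget might contain (the budget name, limit, and email address below are placeholders):

```python
# Hypothetical monthly cost budget with an 80% actual-spend alert. With boto3,
# you would pass these to budgets.create_budget(AccountId=..., Budget=budget,
# NotificationsWithSubscribers=[notification]).
budget = {
    "BudgetName": "monthly-cloud-spend",                 # placeholder name
    "BudgetType": "COST",
    "TimeUnit": "MONTHLY",
    "BudgetLimit": {"Amount": "1000", "Unit": "USD"},    # placeholder limit
}
notification = {
    "Notification": {
        "NotificationType": "ACTUAL",
        "ComparisonOperator": "GREATER_THAN",
        "Threshold": 80.0,                # alert at 80% of the budget
        "ThresholdType": "PERCENTAGE",
    },
    "Subscribers": [
        {"SubscriptionType": "EMAIL", "Address": "team@example.com"},  # placeholder
    ],
}
print(budget["TimeUnit"])  # → MONTHLY
```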