In this tutorial, we will see what Amazon Elastic Compute Cloud (EC2) is and look at its use in Amazon Web Services in detail.
Before diving into what it is, a little history: EC2 was launched by Amazon in 2006 as a premier service for solving computing needs.
It has since become the foundational building block for most applications running on AWS, as well as the basis for many other services.
Amazon has added many features to Elastic Compute Cloud over the years, steadily expanding its reliability, usefulness, and robustness into a tool that fits an impressive number of use cases.
What is Elastic Compute Cloud?
Elastic Compute Cloud (EC2) is a cloud computing service that can be used to run collections of virtual servers all over the world, also known as "elastic" compute capacity.
This elastic capacity is useful for web applications, data processing, and hosting services. An EC2 instance is essentially a virtual server launched from an Amazon Machine Image (AMI); users can launch one in minutes and pay only for the time they use it.
Elastic Compute Cloud is one of the core services of Amazon Web Services (AWS), providing compute power on demand.
EC2 instances may be configured with different operating systems, out-of-the-box software, and preinstalled language runtimes such as Java or PHP.
Because they can handle such varied workloads, EC2 instances are suitable for almost any type of application deployment.
Functionality of Elastic Compute Cloud
First, consider the name Elastic Compute Cloud, because it offers real insight into how the service works.
EC2 is a service that can be used for computing purposes, like running an application, and it's located in the cloud; in other words, not in your closet.
As mentioned earlier when we discussed scalability, EC2 is very convenient in this respect: rules can be set so that scaling up or scaling down happens automatically, without any manual work.
1. Image for the EC2 instance.
This is the fundamental code that will be running on an instance. This is not your application code.
Let me give some examples to further explain. Here are the Amazon machine images currently available when you create a new EC2 instance through the web console.
At the top is Amazon’s own flavor of Linux, followed by Red Hat Linux, SUSE Linux, and Ubuntu. Microsoft Windows is also available, and different versions of everything mentioned are readily available.
So you can see that an image is a combination of an operating system and then some applications preinstalled, typically things like Java or Python or the AWS CLI tools.
Amazon provides, manages, and updates a selection of available images. One caveat, however: once an instance is created from an image, Amazon won’t be updating that instance’s software.
Amazon only updates the available image. If you need to update your instance for security patches, you’ll need to either make the updates manually or create a new instance and migrate your application code.
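For those scripting instance creation rather than using the web console, a rough sketch of looking up an Amazon-provided image with the boto3 SDK might look like the following. The name pattern is an illustrative assumption, and AWS credentials plus a default region are presumed to be configured:

```python
# Sketch: finding the newest Amazon-owned image matching a name pattern.
# The pattern "al2023-ami-*-x86_64" is an assumed example, not a requirement.

def image_filters(name_pattern):
    """Build a DescribeImages filter list for available images."""
    return [
        {"Name": "name", "Values": [name_pattern]},
        {"Name": "state", "Values": ["available"]},
    ]

def find_latest_ami(name_pattern="al2023-ami-*-x86_64"):
    # boto3 is imported here so the pure helper above can be used
    # (and tested) even without the AWS SDK installed.
    import boto3
    ec2 = boto3.client("ec2")
    resp = ec2.describe_images(Owners=["amazon"],
                               Filters=image_filters(name_pattern))
    # Sort newest first by creation date and take the top image ID.
    images = sorted(resp["Images"], key=lambda i: i["CreationDate"],
                    reverse=True)
    return images[0]["ImageId"] if images else None
```

The returned image ID is what you would then pass as the `ImageId` when launching an instance.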
2. Compute Optimized Instance
A compute-optimized instance gives you the same number of vCPUs as a general-purpose EC2 instance of the same size, but with roughly half the memory.
You might think that makes compute-optimized the lesser of the two, but with a difference in features comes a difference in price, and the compute-optimized instances are the cheapest of them all.
As you can imagine, increasing the type from large to extra-large or down to medium will give you more or less of each feature at an equivalent ratio.
It’s important to consider the best instance type for your application. It can not only help your application perform better, but it can also save you a lot of money.
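To make that trade-off concrete, here is a small pure-Python sketch of picking the cheapest instance type that still meets an application's requirements. The instance names, specs, and per-hour figures below are illustrative placeholders, not current AWS prices:

```python
# Hypothetical catalog: general-purpose (m), compute-optimized (c),
# and memory-optimized (r) types of the same "large" size.
INSTANCE_TYPES = {
    "m5.large": {"vcpus": 2, "memory_gib": 8,  "usd_per_hour": 0.096},
    "c5.large": {"vcpus": 2, "memory_gib": 4,  "usd_per_hour": 0.085},
    "r5.large": {"vcpus": 2, "memory_gib": 16, "usd_per_hour": 0.126},
}

def cheapest_with(min_vcpus, min_memory_gib):
    """Return the cheapest type meeting the CPU and memory minimums."""
    candidates = [
        (spec["usd_per_hour"], name)
        for name, spec in INSTANCE_TYPES.items()
        if spec["vcpus"] >= min_vcpus and spec["memory_gib"] >= min_memory_gib
    ]
    return min(candidates)[1] if candidates else None
```

With these placeholder numbers, an application that only needs 4 GiB of memory lands on the compute-optimized type, while one needing 16 GiB is pushed to the pricier memory-optimized type.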
Once the image and instance type have been decided, you’re presented with a page of instance details covering security roles and further fine-grained configuration.
This is where you can specify how many instances to launch with this same image and type.
3. Auto Scaling in EC2
AWS will create an auto-scaling group that will increase or decrease the number of instances according to the rules that you set yourself. Once the auto-scaling group has been created, you can modify it further.
If you're working with a third-party image or one you made yourself, this is the perfect way to set up scaling.
If you’re curious about how to have an application that you’ve installed on an instance configured to auto-scale, this is actually easiest to accomplish with the application service Elastic Beanstalk, and there’ll be more on that in the next module.
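As a conceptual illustration (not the actual AWS implementation), the kind of rule an auto-scaling group evaluates can be sketched as a simple threshold check. The thresholds, step size, and limits here are made-up example values:

```python
# Toy scaling rule: add an instance when average CPU is high,
# remove one when it is low, and always stay within the group's
# configured minimum and maximum size.
def desired_capacity(current, avg_cpu, minimum=1, maximum=10,
                     high=70.0, low=25.0, step=1):
    """Return the new instance count after applying the scaling rules."""
    if avg_cpu > high:
        current += step      # scale out under heavy load
    elif avg_cpu < low:
        current -= step      # scale in when mostly idle
    return max(minimum, min(maximum, current))
```

In real EC2 auto scaling you would express these rules as scaling policies attached to the group, and AWS evaluates them against CloudWatch metrics for you.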
4. Elastic Block Storage
Elastic Block Storage (EBS) is a block storage service designed to work with EC2; you are billed for the storage capacity you provision.
Note that Elastic Block Storage is not the same as Simple Storage Service (S3): EBS is specifically for use with EC2, whereas S3 is for storing and serving independent files.
Here you can add and adjust the root volume size for the Elastic Block Storage volumes attached to this EC2 instance.
We will skip over tags, which are just metadata key-value pairs you can add to the instance for your own purposes, and move to the next section.
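For readers scripting this step instead of using the console, the root volume size is expressed as a block device mapping when launching the instance. The helper below is a hedged sketch of that structure; the device name and volume type are typical values, not requirements:

```python
# Sketch of the BlockDeviceMappings structure used to enlarge the
# root EBS volume at launch time (e.g. passed to an SDK launch call).
def root_volume_mapping(size_gib=30, volume_type="gp3",
                        device_name="/dev/xvda"):
    """Build a block device mapping entry for the root volume."""
    return [{
        "DeviceName": device_name,
        "Ebs": {
            "VolumeSize": size_gib,        # size in GiB
            "VolumeType": volume_type,     # e.g. gp3 general-purpose SSD
            "DeleteOnTermination": True,   # delete volume with the instance
        },
    }]
```

Setting `DeleteOnTermination` to `False` instead would keep the volume (and its data) around after the instance is terminated.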
5. Security Group Configuration
If you remember from the last module, security groups can be thought of as little firewalls configured on a per‑instance basis.
Put simply, a security group controls which IPs your instance can talk to and which IPs can talk to it; try to avoid leaving instances open to SSH sessions from anywhere.
You could also allow your instances to talk to each other or, say, a database or accept incoming requests on port 80.
Security groups can be used in several ways, and after creation you can edit them and add more rules.
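As an illustration, an ingress rule is described with an IpPermissions-style structure (the shape used by the EC2 API when authorizing security group rules). The CIDR blocks below are example addresses; opening port 22 to 0.0.0.0/0 is exactly what the advice above warns against:

```python
# Build one TCP ingress rule in EC2's IpPermissions format.
def ingress_rule(port, cidr):
    """Allow inbound TCP traffic on `port` from the given CIDR block."""
    return {
        "IpProtocol": "tcp",
        "FromPort": port,
        "ToPort": port,
        "IpRanges": [{"CidrIp": cidr}],
    }

# Example: HTTP open to the world, SSH restricted to one office range.
rules = [
    ingress_rule(80, "0.0.0.0/0"),        # port 80 from anywhere
    ingress_rule(22, "203.0.113.0/24"),   # SSH only from this network
]
```

These rule dictionaries are the kind of payload you would hand to an SDK call that authorizes security group ingress.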
The final step is to create the instance with an existing key pair, which will allow you to SSH into the instance and make whatever modifications you want.
You can also choose not to use a key pair if there is an existing application on the instance image that will prompt you for a password.
What is the pricing structure for EC2?
Amazon charges for EC2 instances by the hour, and the amount per hour differs based on the instance type and image that you've selected.
Some images, such as Windows images, cost more per hour than a Linux image because the software they include is commercially licensed.
Let’s look at the pricing of a large memory-optimized instance running Amazon’s Linux image. The per-hour rate is 12.6 cents, which works out to around $3 per day.
$3 per day might be a little steep for your case, so let’s look at one of the most basic offerings, a general-purpose t2.micro instance.
This instance type comes with 1 vCPU and 1 GB of RAM, and costs 1.3 cents per hour, which adds up to around 31 cents a day. Now we’re talking.
For a month, that’s around $10, which is fairly reasonable. Amazon has many, many options for pricing; so far I’ve only been talking about on-demand pricing.
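The arithmetic behind those on-demand figures is just an hourly rate multiplied out, which can be checked in a couple of lines (using a 30-day month):

```python
# Convert an on-demand hourly rate into daily and monthly cost.
def daily_cost(usd_per_hour):
    return usd_per_hour * 24

def monthly_cost(usd_per_hour, days=30):
    return daily_cost(usd_per_hour) * days

# Memory-optimized example at 12.6 cents/hour:
assert round(daily_cost(0.126), 2) == 3.02    # about $3 a day
# t2.micro at 1.3 cents/hour:
assert round(daily_cost(0.013), 3) == 0.312   # about 31 cents a day
assert round(monthly_cost(0.013), 2) == 9.36  # roughly $10 a month
```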
There are other models like reserved instance and spot instance pricing that both offer lower prices than on‑demand pricing.
It can all get a little confusing, I know, which is why the AWS calculator is so useful. Also, keep in mind that Amazon offers a free tier of usage for many services.
Elastic Compute Cloud is a scalable and instantly available computing resource. It can be used to process large datasets, run intensive calculations, and host web applications.
EC2 has been designed to scale quickly, so you don’t have to worry about your infrastructure growing too fast or too slow.
It is a utility-style service that gives you as much or as little computing power as your project needs.
It’s flexible because it scales from a single instance to thousands of servers, and you can choose a hardware configuration and OS that suit your needs and provision a server in seconds.
Written by the Data Engineer Team, contributing authors at analyticslearn.com.