AWS Elastic Compute Cloud (EC2): Top Tutorial

Amazon Elastic Compute Cloud (Amazon EC2) is a web service that provides resizable compute capacity in the cloud. It is designed to make web-scale cloud computing easier for developers. Amazon EC2's simple web service interface allows you to obtain and configure capacity with minimal friction.
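That "simple web service interface" is typically reached through an SDK call such as `RunInstances`. As a minimal sketch (the AMI ID and key-pair name below are hypothetical placeholders, and the dict is only built, not sent), these are the core parameters you would pass with boto3's `run_instances`:

```python
# Minimal sketch of the parameters for EC2's RunInstances API.
# The AMI ID and key-pair name are hypothetical placeholders; with
# boto3 installed and AWS credentials configured, you would pass this
# dict to boto3.client("ec2").run_instances(**params).
params = {
    "ImageId": "ami-0123456789abcdef0",  # hypothetical AMI ID
    "InstanceType": "t2.micro",          # instance type to launch
    "MinCount": 1,                       # launch at least one instance
    "MaxCount": 1,                       # and at most one
    "KeyName": "my-key-pair",            # hypothetical SSH key pair
}
print(params["InstanceType"])
```

Obtaining capacity is just this one call; configuring or releasing it is equally programmatic, which is what "minimal friction" means in practice.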

Image: Elastic Cloud (Photo Credit: Srini)


Functions of EC2

  • It provides you with complete control of your computing resources and lets you run on Amazon's proven computing environment. 
  • Amazon EC2 reduces the time required to obtain and boot new server instances to minutes, allowing you to quickly scale capacity, both up and down, as your computing requirements change. 
  • Amazon EC2 changes the economics of computing: you pay only for the capacity you actually use.
  • Amazon EC2 provides developers the tools to build failure-resilient applications and isolate themselves from common failure scenarios.
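To make the pay-for-what-you-use point concrete, here is a toy cost calculation. The hourly rate and the fixed-server cost are hypothetical numbers, not real AWS prices, which vary by instance type and region:

```python
# Toy illustration of EC2's pay-per-use economics.
# Both prices below are hypothetical, not actual AWS rates.
HOURLY_RATE = 0.10          # hypothetical on-demand $/hour
FIXED_SERVER_MONTHLY = 200  # hypothetical always-on server $/month

def on_demand_cost(hours_used: float) -> float:
    """Cost when you pay only for the hours an instance actually runs."""
    return hours_used * HOURLY_RATE

# An instance run 8 hours a day for 20 working days:
usage_hours = 8 * 20
print(round(on_demand_cost(usage_hours), 2))  # far below the fixed cost
```

Under these assumed prices, running only during working hours costs a fraction of keeping a fixed server on all month, which is the economic shift the bullet describes.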

Key Points for Interviews

Learn these basic points on EC2 for your next interview.
  • EC2 is the basic fundamental block in AWS.
  • EC2 provides remote operation of virtual machines (VMs) on Amazon's infrastructure.
  • A single VM is called an instance; EC2 offers multiple instance types.
  • A micro instance is the only EC2 instance type that is free, and it is also the most underpowered instance (613 MB of memory).
  • The EC2 micro instance type is the least reliably provisioned; when demand on Amazon's infrastructure is high, the micro instance gets the lowest priority.
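The 613 MB figure refers to the classic t1.micro type. As a small sketch comparing it with other instance types (the micro value is from the article; the other figures are illustrative assumptions for the classic m1 generation, not authoritative specs):

```python
# Memory comparison across a few classic EC2 instance types.
# The micro figure (613 MB) comes from the article; the m1 values are
# illustrative assumptions, not authoritative AWS specifications.
instance_memory_mb = {
    "t1.micro": 613,    # free-tier eligible, most underpowered
    "m1.small": 1700,   # illustrative
    "m1.large": 7500,   # illustrative
}

# The micro instance has the least memory of the types listed:
smallest = min(instance_memory_mb, key=instance_memory_mb.get)
print(smallest)
```

This makes the "most underpowered" bullet concrete: the micro instance sits at the bottom of the memory range by a wide margin.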
