Here's a Quick Guide on Hadoop Security

This post takes up the topic of security and security tools in Hadoop: the essentials everyone needs to take care of while working with a Hadoop cluster.


A quick guide to Hadoop security and its features, with top references.

Hadoop Security

Security

  • We live in a very insecure world. Everything needs to be secured, from your home's front door to all-important virtual keys such as your passwords. Big data systems process, transform, and store humongous amounts of data, so that data needs security too.
  • Imagine that your company spent a couple of million dollars installing a Hadoop cluster to gather and analyze your customers' spending habits for a product category using a Big Data solution. A lack of data security there leads to customer apprehension.

Security Concerns

  • Because that solution was not secure, your competitor got access to that data, and your sales dropped 20% for that product category.
  • How did the system allow unauthorized access to data? Wasn't there any authentication mechanism in place? Why were there no alerts?
  • This scenario should make you think about the importance of security, especially where sensitive data is involved.
  • Hadoop has inherent security concerns due to its distributed architecture. An installation with clearly defined user roles and multiple levels of authentication (and encryption) for sensitive data will not let unauthorized access go through (a sketch of such role-based restrictions follows this list).
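To make "clearly defined user roles" concrete, here is a minimal sketch using Hadoop's standard FileSystem API to lock down a directory and grant one extra user read-only access through an HDFS ACL. The path /data/sensitive and the names etl, analysts, and alice are hypothetical placeholders, and ACLs require dfs.namenode.acls.enabled=true on the NameNode.

    import java.util.Collections;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.fs.permission.AclEntry;
    import org.apache.hadoop.fs.permission.AclEntryScope;
    import org.apache.hadoop.fs.permission.AclEntryType;
    import org.apache.hadoop.fs.permission.FsAction;
    import org.apache.hadoop.fs.permission.FsPermission;

    public class RestrictSensitiveData {
        public static void main(String[] args) throws Exception {
            FileSystem fs = FileSystem.get(new Configuration());
            Path sensitive = new Path("/data/sensitive"); // hypothetical path

            // Owner gets full access, group read/execute, others nothing (750).
            fs.setOwner(sensitive, "etl", "analysts");    // hypothetical user/group
            fs.setPermission(sensitive,
                new FsPermission(FsAction.ALL, FsAction.READ_EXECUTE, FsAction.NONE));

            // Grant a single extra user read-only access via an HDFS ACL entry.
            AclEntry aliceReadOnly = new AclEntry.Builder()
                .setScope(AclEntryScope.ACCESS)
                .setType(AclEntryType.USER)
                .setName("alice")                         // hypothetical user
                .setPermission(FsAction.READ)
                .build();
            fs.modifyAclEntries(sensitive, Collections.singletonList(aliceReadOnly));
        }
    }

Note that permissions and ACLs alone do not help much under Hadoop's default simple authentication, for the impersonation reason discussed in the next section.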

Hadoop Security

  • When talking about Hadoop security, you have to consider how Hadoop was conceptualized. When Doug Cutting and Mike Cafarella started developing Hadoop, security was not the priority.
  • Hadoop was meant to process large amounts of web data in the public domain, and hence security was not the focus of development. That's why it lacked a security model and only provided basic authentication for HDFS, which was not very useful since it was easy to impersonate another user (the first sketch after this list shows just how easy).
  • Another issue is that Hadoop was not designed and developed as a cohesive system with pre-defined programs, but rather as a collage of modules that either correspond to various open-source projects or to (proprietary) extensions developed by different vendors to supplement functionality lacking in the Hadoop ecosystem.
  • Therefore, Hadoop expects a secure environment for data processing. In practice, there are some glitches in achieving secure processing; you can read more about it in the references. The sketches below show the impersonation weakness and the standard Kerberos-based remedy.
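To see why simple authentication is so weak, here is a hedged sketch: with hadoop.security.authentication left at its default of "simple", Hadoop trusts whatever username the client claims, so setting the HADOOP_USER_NAME property (or environment variable) is generally enough to act as another user. The /data/sensitive path is again a hypothetical placeholder.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ImpersonationDemo {
        public static void main(String[] args) throws Exception {
            // Under simple auth the cluster takes this claim at face value:
            // no password, no ticket, no proof of identity is required.
            // Must be set before the Hadoop login happens in this JVM.
            System.setProperty("HADOOP_USER_NAME", "hdfs"); // claim the superuser

            FileSystem fs = FileSystem.get(new Configuration());
            // Reads and writes now run as "hdfs", sidestepping the
            // permissions and ACLs set up in the earlier sketch.
            fs.listStatus(new Path("/data/sensitive")); // hypothetical path
        }
    }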
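And a minimal sketch of the remedy: switching the client to Kerberos authentication and proving identity with a keytab. The principal etl@EXAMPLE.COM and the keytab path are hypothetical; the configuration keys are Hadoop's standard ones, and a real deployment also needs matching settings in core-site.xml on every node plus a working Kerberos KDC.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.security.UserGroupInformation;

    public class KerberizedClient {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Standard Hadoop security switches; "simple" is the insecure default.
            conf.set("hadoop.security.authentication", "kerberos");
            conf.set("hadoop.security.authorization", "true");
            UserGroupInformation.setConfiguration(conf);

            // Authenticate with a keytab instead of a bare username claim.
            // Principal and keytab path are hypothetical placeholders.
            UserGroupInformation.loginUserFromKeytab(
                "etl@EXAMPLE.COM", "/etc/security/keytabs/etl.keytab");

            FileSystem fs = FileSystem.get(conf);
            fs.listStatus(new Path("/data/sensitive")); // runs as the etl principal
        }
    }

With Kerberos in place, the HADoop_USER_NAME trick above no longer works: the NameNode accepts only identities backed by a valid Kerberos ticket.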
