
The Growth of Machine Learning up to TensorFlow

The Internet, and the vast amounts of data it generates, inspired the CEOs of big corporations to start using machine learning in order to provide a better experience to users.

How TensorFlow Started

Take Amazon, an online retailer that uses machine learning. The purpose of its algorithms is to generate revenue: based on user search data, the ML application provides recommendations and insights.

Another example is advertising, where Google is the leader: it shows ads based on users' movements while surfing the web. These are just a few examples; in reality, there are many more.

TensorFlow is a new-generation framework for machine learning developers. Here is how it came about.
[Image: Machine Learning Evolution]

[Image: Evolution of TensorFlow]

Top ML Frameworks

Torch

  • Torch was the first of these frameworks, developed in 2002 by Ronan Collobert. IBM and Facebook showed strong early interest in it.
  • Its interface language is Lua.
  • Its primary focus is matrix computation, which makes it well suited to building neural networks.

Theano

  • Theano was developed in 2010 at the University of Montreal. It can compile computations to run efficiently on GPUs.
  • Theano stores operations in a data structure called a graph, which it compiles into high-performance code. Its interface language is Python (see the sketch after this list).
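
To make the graph-then-compile idea concrete, here is a minimal sketch of Theano's workflow. The variable names x, y, and f are illustrative, not taken from the post.

```python
import theano
import theano.tensor as T

x = T.dscalar('x')            # symbolic scalar: a node in the graph
y = x ** 2 + 3 * x            # builds more graph nodes; nothing runs yet
f = theano.function([x], y)   # compile the graph into fast native code

print(f(2.0))                 # 10.0 -- the compiled code executes here
```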

Caffe

  • This framework is especially popular for image recognition.
  • Caffe is written in C++.
  • It is widely used for machine learning and neural networks (see the pycaffe sketch after this list).
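
Caffe also ships a Python wrapper, pycaffe. The sketch below is illustrative only: it assumes a model definition file (deploy.prototxt) and trained weights (weights.caffemodel) already exist on disk, and that the network's input blob is named data.

```python
import numpy as np
import caffe

# Load a trained network in inference mode (file names are assumptions).
net = caffe.Net('deploy.prototxt', 'weights.caffemodel', caffe.TEST)

# Feed a dummy image batch into the assumed 'data' input blob.
image = np.random.rand(1, 3, 227, 227).astype(np.float32)
net.blobs['data'].data[...] = image

output = net.forward()   # run the C++ forward pass
print(output.keys())     # names of the network's output blobs
```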

Keras

  • It is well known for developing neural networks.
  • Its real advantages are simplicity and ease of development.
  • François Chollet created Keras as an interface to other machine learning frameworks, and many developers access Theano through Keras to combine Keras's simplicity with Theano's performance (a minimal sketch follows this list).
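
As a sketch of that simplicity, the snippet below defines a tiny network with the Keras Sequential API. The layer sizes and the choice of optimizer and loss are illustrative assumptions.

```python
from keras.models import Sequential
from keras.layers import Dense

# A tiny feed-forward network; with the Theano backend configured,
# Keras compiles this model down to Theano graph operations.
model = Sequential([
    Dense(32, activation='relu', input_dim=8),   # hidden layer
    Dense(1, activation='sigmoid'),              # binary output
])
model.compile(optimizer='sgd', loss='binary_crossentropy')
model.summary()   # prints the layer-by-layer structure
```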

TensorFlow

TensorFlow was developed by Google in 2015. You can use it on Google Cloud. Python is its primary interface language, and the core of the framework is written in C++.
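
For a quick taste of that Python interface, here is a minimal sketch; it assumes TensorFlow 2.x with eager execution, and the tensor values are illustrative.

```python
import tensorflow as tf

# Python drives the computation; the numerical kernels run in the C++ core.
a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
b = tf.constant([[5.0, 6.0], [7.0, 8.0]])

c = tf.matmul(a, b)   # matrix multiply, executed by the C++ backend
print(c)              # tf.Tensor of shape (2, 2)
```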

Takeaways

  1. The story of machine learning stretches back to the 18th century.
  2. Python is the top interface language across the major ML frameworks.
  3. Python is the prime language you need for 21st-century data science projects.
