Machine Learning Fundamentals

Machine learning is about motivating computing machines (relax, I am talking about computers) to program themselves. It is a type of artificial intelligence. If writing a program is a form of automation, then ML is automating the process of automation itself. Writing programs is the bottleneck: we simply don't have enough great engineers.

Let the data do the work instead of people. Machine learning is the best way to make software adaptable. It is like farming or gardening: algorithms are the seeds, data provides the nutrients, you are the gardener, and the programs that grow are the plants.

The differences between conventional programming and machine learning are:
1. Conventional programming: data and a hand-written program are run on the computer to generate results.
2. Machine learning: data and the desired results are run on the computer to produce a program. That generated program can then be used just like conventional code, as the short sketch below illustrates.
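
A minimal sketch of this contrast, assuming scikit-learn and a tiny made-up height/label dataset (the numbers and the LogisticRegression choice are purely illustrative):

```python
# Conventional programming: a human writes the program (the rule) by hand.
def is_tall(height_cm):
    return height_cm > 180   # the rule itself is hand-coded

# Machine learning: we hand the computer data plus the desired results,
# and it produces the program (a fitted model) for us.
from sklearn.linear_model import LogisticRegression

heights = [[150], [160], [170], [185], [190], [200]]  # data (made-up values)
labels = [0, 0, 0, 1, 1, 1]                           # desired results (0 = not tall, 1 = tall)

model = LogisticRegression().fit(heights, labels)     # the "generated program"

# The generated program can now be used like any conventional one.
print(is_tall(178), model.predict([[178], [195]]))
```

The point is that the fitted model plays the same role as the hand-written rule; only the way it was produced differs.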

Variants:
There are four variants of machine learning:
1. Supervised learning / inductive learning / classification: the training data includes the desired outputs as labels, i.e., names of categories. For example, in spam email detection, 'this is spam' and 'this is not spam' are two categories already defined in the training data.
2. Unsupervised learning / clustering: there are no labels; the task is simply to put similar data items into the same partition (cluster).
3. Semi-supervised learning: the training data contains only a few of the desired results or category labels.
4. Reinforcement learning: the learner gets rewards from a sequence of actions. It is one of the most ambitious types of learning.

Among these four types, supervised learning is the most established, the most studied, and the most widely used by machine learning practitioners. Unsurprisingly, learning with supervision is much simpler than learning without it.
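
To make the supervised case concrete, here is a small sketch of the spam example above; the messages, labels, and the choice of CountVectorizer with MultinomialNB are assumptions made only for illustration:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Training data: every message already carries its category label.
messages = [
    "win a free prize now",          # 'this is spam'
    "cheap loans click here now",    # 'this is spam'
    "meeting moved to 3 pm",         # 'this is not spam'
    "lunch with the team tomorrow",  # 'this is not spam'
]
labels = ["spam", "spam", "not spam", "not spam"]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(messages)       # turn text into word counts

classifier = MultinomialNB().fit(X, labels)  # learn from the labelled examples

# Classify a new, unlabelled message.
new_message = vectorizer.transform(["claim your free prize"])
print(classifier.predict(new_message))
```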

Key Elements:
Thousands of machine learning algorithms have already been published, verified, and validated in the literature, and hundreds of new ones appear every year.

Each machine learning algorithm has three major components:

1. Representation: the form in which candidate programs (hypotheses) and the knowledge they encode are expressed. Examples include decision trees, probabilistic methods, instances, graphical models, entropy-based methods, neural networks, support vector machines, model ensembles, etc.

2. Evaluation: the ways of scoring candidate programs (hypotheses), i.e., the measures used to judge how well the representation from the first component performs. Examples include accuracy, the Kappa statistic, silhouette score, precision and recall, squared error, likelihood, posterior probability, cost, margin, entropy, K-L divergence, F-score, etc.

3. Optimization: the way candidate programs are generated, also known as the search process. Examples include combinatorial optimization, convex optimization, and constrained optimization.
Every machine learning algorithm is a combination of these three components, fitted together within a single framework.
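
A small sketch of how the three components can line up in practice, assuming scikit-learn and its built-in iris dataset purely for illustration: a decision tree as the representation, accuracy as the evaluation measure, and a grid search over tree depths as the optimization/search step.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# 1. Representation: the hypothesis is expressed as a decision tree.
tree = DecisionTreeClassifier(random_state=0)

# 3. Optimization: search over candidate trees (here, different depths).
search = GridSearchCV(tree, {"max_depth": [1, 2, 3, 4, 5]}, cv=3)
search.fit(X_train, y_train)

# 2. Evaluation: judge the chosen hypothesis with an accuracy score.
print(accuracy_score(y_test, search.predict(X_test)))
```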

This is how the machine learning world is structured.

It includes statistics, predictive modelling and lots of other beautiful things.

To read about implementing machine learning in Python, please have a look at the scikit-learn library.
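
As a first taste of it, the core scikit-learn pattern is simply fit, predict, and score; the sketch below assumes the built-in digits dataset and a k-nearest-neighbours classifier, chosen only as examples:

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = KNeighborsClassifier()
model.fit(X_train, y_train)          # learn the "program" from data and results
print(model.score(X_test, y_test))   # evaluate it on unseen data
```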

Machine learning is an integral part of the data science and data analytics domain.
