Naive Bayes Classifier Machine Learning Algorithm


The Naive Bayes Classifier is a supervised learning algorithm used for solving classification problems. It is based on Bayes' Theorem and is one of the simplest and most effective classification algorithms, helping to build fast machine learning models. Because the Naive Bayes Classifier predicts on the basis of the probability of an object, it is called a probabilistic classifier.

The name of the Naïve Bayes Classifier algorithm is made up of two words, "Naïve" and "Bayes", which are described below:

Naive: It is called naïve because it assumes that all the features used in the model are independent of each other. That means changing the value of one feature does not influence or change the value of any other feature.

Bayes: It is called Bayes because it depends on Bayes' Theorem.

Bayes Theorem

Bayes' Theorem is a mathematical formula for calculating conditional probabilities. Bayes' Theorem is also called "Bayes' rule" or "Bayes' law".

Conditional Probability: The probability of one event occurring given that one or more other events have already occurred is called conditional probability.
For example:
● One event, A, is that it is raining outside; there is a 30% chance of rain today.
● Another event, B, is that you will need to go outside, which on its own has a 50% chance. If the chance that you go outside depends on whether it is raining, then the probability of B given A is an example of conditional probability.

Read More: Linear regression Machine Learning Algorithm scikit-learn

Bayes' Theorem: Formula

P(A|B) = P(B|A) * P(A) / P(B)

P(A|B) is read as the probability of 'A' given 'B'.
P(A|B) is the Posterior Probability: the probability of hypothesis 'A' given that evidence 'B' has already occurred.
P(B|A) is the Likelihood: the probability of observing evidence 'B' given that hypothesis 'A' is true.
P(A) is the Prior Probability: the probability of hypothesis 'A' before observing the evidence.
P(B) is the Marginal Probability: the probability of the evidence.
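As a quick numeric sketch of the formula, we can plug in the 30% rain prior from the example above; the likelihood and marginal values below are made-up figures chosen purely for illustration:

```python
# Worked Bayes' theorem example with illustrative numbers.
# A = "it is raining", B = "you see people carrying umbrellas".
p_a = 0.3          # prior P(A): 30% chance of rain, as in the example above
p_b_given_a = 0.9  # likelihood P(B|A): assumed 90% (hypothetical value)
p_b = 0.5          # marginal P(B): assumed 50% (hypothetical value)

# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
p_a_given_b = p_b_given_a * p_a / p_b
print(p_a_given_b)  # posterior P(A|B), here 0.54
```

Observing the evidence raised the probability of rain from the 30% prior to a 54% posterior.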

Advantages of Naive Bayes Classifier

● Naive Bayes is one of the fastest and easiest machine learning algorithms for predicting the class of an object in a dataset.
● It can be used for binary as well as multiclass classification.
● In multi-class prediction, it performs very well compared to many other algorithms.
● It is one of the most popular choices for text/document classification problems.
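To see the multiclass case in action, here is a minimal sketch using scikit-learn's GaussianNB (the tiny one-feature, three-class dataset below is invented for illustration):

```python
# Multiclass classification with Gaussian Naive Bayes.
# The toy dataset is made up: one numeric feature, three classes.
from sklearn.naive_bayes import GaussianNB

X = [[1.0], [1.2], [0.9],   # class 0 samples
     [5.0], [5.2], [4.8],   # class 1 samples
     [9.0], [9.1], [8.9]]   # class 2 samples
y = [0, 0, 0, 1, 1, 1, 2, 2, 2]

clf = GaussianNB()
clf.fit(X, y)

# New points near each cluster are assigned to the nearest class.
print(clf.predict([[1.1], [5.1], [9.0]]))
```

The same `fit`/`predict` interface works unchanged whether `y` contains two classes or many.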

Disadvantages of Naive Bayes Classifier

It cannot learn relationships between features, because it assumes all the features are independent of (unrelated to) each other.

Applications of the Naive Bayes Classifier

Document Classification: Suppose you have a document and you don't know which topic it is about. You want to find the topic without reading the document. This problem can be solved with machine learning document classification using the Naïve Bayes Classifier algorithm.
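A common way to sketch this with scikit-learn is to turn each document into word counts and feed them to a multinomial Naive Bayes model; the documents and topic labels below are invented for illustration:

```python
# Topic classification sketch: bag-of-words counts + Multinomial Naive Bayes.
# The documents and labels are made up purely for demonstration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

docs = [
    "the striker scored a late goal",            # sports
    "the team won the championship match",       # sports
    "the senate passed the new budget bill",     # politics
    "voters went to the polls on election day",  # politics
]
labels = ["sports", "sports", "politics", "politics"]

# CountVectorizer builds the word-count features; MultinomialNB classifies them.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(docs, labels)

print(model.predict(["the goalkeeper saved the match"]))
```

Even though "goalkeeper" never appears in the training documents, the overlap with sports vocabulary such as "match" is enough for the model to pick the sports topic.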

Spam Filtration: You can easily detect whether an email, a piece of Facebook content, etc. is spam or ham using the Naïve Bayes Classifier algorithm. You can see the spam email detector project here.
Real-time Prediction: Naive Bayes is an eager learning classifier and is certainly fast, so it can be used for making predictions in real time.
Multi-class Prediction: This algorithm is also well known for its multi-class prediction capability. Here we can predict the probability of multiple classes of the target variable.

Recommendation System: The Naive Bayes Classifier and Collaborative Filtering together can build a recommendation system that uses machine learning and data mining techniques to filter unseen information and predict whether a user would like a given resource.


About Anisur Rahman Shahin

Hello, my name is Shahin. I'm a tech enthusiast. Personally, I'm an optimistic and always-in-a-hurry kind of person. I'm a freelance web developer, working at Zakir Soft as a Laravel Developer. My portfolio website:
