Machine Learning: Naive Bayes, Part 1

My major area of work is text analytics and machine learning. I always get excited to solve problems in this area, so I thought I would share some of my knowledge here too :).

We will start with the Naive Bayes algorithm. It is a supervised classification algorithm.

Supervised learning means that before running the algorithm on actual data, we have to train it on a training set, telling it which records belong to which class. For example, before trying NB (the Naive Bayes algorithm) on test data, I will show the algorithm some examples of what a spam message looks like and what a non-spam message looks like. Once it is trained, we can go ahead and try it on real-world data.

Before trying NB, we need to know some basics of probability. Let's go through them.
Assume we have a die. A die has six faces, and each face is marked with a distinct number from 1 to 6.
If the die is not biased, what is the probability that we will get a 6?
                         Answer: 1/6
Because there are 6 possible outcomes in total, and only one of them satisfies our condition.

Now, what is the probability of getting an even number when we roll the die? Again we have 6 outcomes, but this time 3 of them satisfy our condition, so
                             Answer: 3/6 = 1/2

Well, that was easy. So probability is simply the number of outcomes that satisfy your criteria divided by the total number of outcomes.


probability of x = number of outcomes that satisfy condition x / total number of outcomes
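This counting definition can be sketched directly in Python. This is my own minimal illustration (the function and variable names are just for this example), using `Fraction` to keep the answers exact:

```python
from fractions import Fraction

# Sample space for one roll of a fair die
outcomes = [1, 2, 3, 4, 5, 6]

def probability(event, outcomes):
    """P(event) = outcomes that satisfy the condition / total outcomes."""
    favorable = [o for o in outcomes if event(o)]
    return Fraction(len(favorable), len(outcomes))

p_six = probability(lambda o: o == 6, outcomes)       # 1/6
p_even = probability(lambda o: o % 2 == 0, outcomes)  # 3/6 = 1/2
```

Both answers match the ones we worked out by hand above.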


Now let's discuss conditional probability.

Let A and B be two events. If N is the total number of possible outcomes and NA and NB are the number of outcomes favorable to A and B, respectively, then the probabilities of A and B, denoted P[A] and P[B], respectively are:
P[A] = NA/N

and

P[B] = NB/N

These are the unconditional probabilities of A and B occurring (also called the prior probabilities of A and B).

Then the probability of A occurring, given that B has occurred, is the conditional probability of A, denoted P[A|B].

The probability of B occurring, given that A has occurred, is the conditional probability of B, denoted P[B|A].
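Conditional probability can also be computed by counting: restrict the sample space to the outcomes where B holds, then count how many of those also satisfy A. A small sketch in the same style as before (names are my own):

```python
from fractions import Fraction

outcomes = [1, 2, 3, 4, 5, 6]

def conditional_probability(a, b, outcomes):
    """P(A|B): shrink the sample space to outcomes where B holds,
    then count how many of those also satisfy A."""
    b_outcomes = [o for o in outcomes if b(o)]
    a_and_b = [o for o in b_outcomes if a(o)]
    return Fraction(len(a_and_b), len(b_outcomes))

# P(roll is 6 | roll is even): of the 3 even outcomes {2, 4, 6},
# only one is a 6, so the answer should be 1/3.
p = conditional_probability(lambda o: o == 6,
                            lambda o: o % 2 == 0,
                            outcomes)
```

Note how conditioning on "the roll is even" raised the probability of a 6 from 1/6 to 1/3: knowing B changes what we believe about A.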

Bayes Theorem

Reverend Thomas Bayes came up with a theorem that helps us calculate these conditional probabilities. (A friend told me that Bayes wanted to predict the probability of the existence of God. I don't know how true that is, but it is definitely interesting :).)
So let's see the formula:

               P[A|B] = P[B|A] P[A]  /   P[B]

That's it. If we know P[B|A], P[A] and P[B], we can calculate P[A|B].
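The formula is a one-liner in code. As a quick sanity check (my own sketch), we can plug in the die example: A = "roll is 6", B = "roll is even", where P[B|A] = 1 (a 6 is always even), P[A] = 1/6 and P[B] = 1/2, so P[A|B] should come out to 1/3:

```python
from fractions import Fraction

def bayes(p_b_given_a, p_a, p_b):
    """Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

# P(6 | even) = 1 * (1/6) / (1/2) = 1/3
p_six_given_even = bayes(Fraction(1), Fraction(1, 6), Fraction(1, 2))
```

This agrees with the answer we got by directly counting outcomes, which is exactly the point: Bayes' theorem lets us get P[A|B] from the reverse conditional P[B|A] without enumerating the sample space.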

Let's start with an example. Suppose we have this data:

P(cancer) = 0.008
P(¬cancer) = 0.992
P(POS|cancer) = 0.98
P(POS|¬cancer) = 0.03
P(NEG|cancer) = 0.02
P(NEG|¬cancer) = 0.97

P(cancer) is the probability that a person has cancer.
P(¬cancer) is the probability that a person does not have cancer.

P(POS|cancer) = 0.98: if a person has cancer, the test is positive 98% of the time.
P(NEG|cancer) = 0.02: even if a person has cancer, the test is negative 2% of the time (the test gives a wrong result).

P(POS|¬cancer) = 0.03: if a person does not have cancer, the test is positive 3% of the time (the test gives a wrong result).
P(NEG|¬cancer) = 0.97: if a person does not have cancer, the test is negative 97% of the time.
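Notice that these six numbers are not independent: each pair of complementary probabilities must sum to 1. A small sketch (variable names are my own) that encodes the table and checks this:

```python
# The table above, as plain variables
p_cancer = 0.008
p_not_cancer = 0.992
p_pos_given_cancer = 0.98
p_neg_given_cancer = 0.02
p_pos_given_not_cancer = 0.03
p_neg_given_not_cancer = 0.97

# Each pair of complementary probabilities must sum to 1
assert abs(p_cancer + p_not_cancer - 1.0) < 1e-9
assert abs(p_pos_given_cancer + p_neg_given_cancer - 1.0) < 1e-9
assert abs(p_pos_given_not_cancer + p_neg_given_not_cancer - 1.0) < 1e-9
```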

This is a very common example used in probability courses to explain Bayes' theorem, and we will follow the same path.

Suppose Harry goes to the doctor, is given a blood test for cancer, and the result is POS. This does not look good for Harry; after all, the test is 98% accurate. Using Bayes' theorem, determine whether it is more likely that Harry has cancer or does not.
You might think there is a 98% chance that Harry has cancer. Wrong. The test is right 98% of the time only among people who actually have cancer, and in the real world only 0.8% of people have cancer. Our conclusion must take both facts into account: given that the test is positive, what is the probability that Harry has cancer?

                                             we should calculate     P(cancer | POS)
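Using only the numbers in the table above, the calculation can already be sketched in Python (my own sketch; P(POS) comes from the law of total probability, since a positive test can come either from a person with cancer or from a person without):

```python
p_cancer = 0.008
p_not_cancer = 0.992
p_pos_given_cancer = 0.98
p_pos_given_not_cancer = 0.03

# Law of total probability:
# P(POS) = P(POS|cancer) P(cancer) + P(POS|¬cancer) P(¬cancer)
p_pos = (p_pos_given_cancer * p_cancer
         + p_pos_given_not_cancer * p_not_cancer)
# = 0.98 * 0.008 + 0.03 * 0.992 = 0.00784 + 0.02976 = 0.0376

# Bayes' theorem: P(cancer|POS) = P(POS|cancer) P(cancer) / P(POS)
p_cancer_given_pos = p_pos_given_cancer * p_cancer / p_pos
# ≈ 0.21, i.e. about a 21% chance
```

So even with a positive result from a "98% accurate" test, it is more likely that Harry does not have cancer, because cancer is so rare in the population to begin with.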


