
Great explanation!

For me, in probability the event (the data) varies while the condition (the distribution) is fixed, whereas in likelihood it is the opposite: the event is fixed and the condition varies.

Wow, thanks for sharing this, Mohammed :)

Loved reading your explanation.

Can you please explain likelihood with a live example.

Of course, Shikha :)

Imagine this.

Someone gave you a bag of balls, and told you that there are three red and three black balls in it. Now, if they ask you: "What is the probability that a random ball drawn from the bag is red?", then the answer to this is pretty straightforward, right? It's 1/2.

Now imagine another situation.

The same guy gave you a bag of balls, but this time, didn't tell you how many red and black balls it contains. Instead, he asked you another question: "What is the likelihood that this bag has an equal number of red and black balls?".

Thus, you start by assuming that there is an equal number of red and black balls. So in contrast to the probability example above, where you knew beforehand the individual probabilities of drawing a red and a black ball, now you hypothesise that drawing a red or a black ball from the bag is equally likely.

Next, say you draw 9 balls from the bag, one at a time with replacement, and the outcome is "RRRRRBRRR". Based on this result, you may be inclined to conclude that this bag does not have an equal number of red and black balls. The way you quantify this is through "likelihood".

So essentially, likelihood measures how well a given set of parameters explains the data we already have.

So for our outcome "RRRRRBRRR" and the assumed parameter (p_red = 0.5), the likelihood will be very low.

However, for the same outcome, p_red = 0.8 (say) would have given a much higher likelihood.
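To make this concrete, here's a small sketch of the computation (assuming each draw is an independent Bernoulli trial with probability p_red of being red, which is why we drew with replacement):

```python
def likelihood(outcome: str, p_red: float) -> float:
    """Probability of observing this exact sequence, given p_red."""
    result = 1.0
    for ball in outcome:
        # Each independent draw contributes one factor to the product.
        result *= p_red if ball == "R" else (1 - p_red)
    return result

outcome = "RRRRRBRRR"  # 8 red, 1 black

print(likelihood(outcome, 0.5))  # 0.5**9       ~ 0.00195
print(likelihood(outcome, 0.8))  # 0.8**8 * 0.2 ~ 0.0336
```

The second value is roughly 17 times larger, which is exactly the sense in which p_red = 0.8 is "more likely" than p_red = 0.5 given this data.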

Let me know if you need more help :)

Thank you, Avi. Sorry for the late reply.

Very helpful. I have used a log-likelihood ratio to explain whether the known speaker in one recording was the same speaker as in another recording. It was hard to tell someone that a 50% log-likelihood ratio was better than flipping a coin given these recordings.

Hilarious :D. I can understand that and it's totally relatable. I remember being in a somewhat similar situation where I interpreted my result with LLR and ended up with confused faces :P
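If it helps to make the coin-flip point concrete for other readers, here's a toy sketch (the likelihood values below are made up for illustration, not from any real speaker-verification model). An LLR of 0 means both hypotheses explain the recordings equally well, i.e. the evidence is genuinely a coin flip; a positive LLR favours the same-speaker hypothesis:

```python
import math

# Hypothetical likelihoods of the observed recordings under each hypothesis:
l_same = 0.033  # likelihood if both recordings are the same speaker
l_diff = 0.002  # likelihood if they are different speakers

llr = math.log(l_same / l_diff)
print(llr)  # positive -> evidence favours "same speaker"; 0 would be the coin-flip case
```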