## Probability and likelihood learned from On-Base Percentage

The previous post dealt with likelihood. Let's take a look at some interesting examples introduced in "Major League Baseball Statistics" (a book in Korean). The R Markdown can be found on GitHub, and this post partially quotes chapter 4 of the book mentioned above.   1. The probability of getting on base twice …
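As a quick sketch of the kind of computation this post sets up (the OBP value here is an illustrative assumption, not a number from the book), the probability of reaching base in two consecutive plate appearances, treating each appearance as an independent Bernoulli trial:

```python
# Probability of reaching base in both of two plate appearances,
# modeling each appearance as an independent Bernoulli trial.
# The OBP value 0.35 is an illustrative assumption, not from the book.
obp = 0.35
p_twice = obp ** 2  # P(on base) * P(on base) under independence
print(round(p_twice, 4))  # 0.1225
```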

## Likelihood

Likelihood? It's a very confusing concept to me, very difficult to understand at a glance. I looked up the definition on Wikipedia and read some blogs that explained it. First of all, I referred to sw4r's blog, which contains a huge amount of numerical statistics. Then I found the definition of 'likelihood' on Wikipedia, which was really …
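To make the idea concrete, here is a minimal sketch (the numbers are my own illustrative assumptions, not from the post): the likelihood holds the data fixed and lets the parameter vary, e.g. the binomial likelihood of an on-base probability p given 7 times on base in 20 plate appearances.

```python
from math import comb

# Binomial likelihood L(p) = C(n, k) * p^k * (1-p)^(n-k),
# with the data (k successes in n trials) held fixed and p varying.
# n = 20, k = 7 are illustrative numbers.
n, k = 20, 7

def likelihood(p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

# The likelihood is maximized at p = k/n = 0.35.
candidates = [0.2, 0.35, 0.5]
best = max(candidates, key=likelihood)
print(best)  # 0.35
```

Comparing L(p) across candidate values of p is exactly the move that distinguishes likelihood from probability: the same formula, read as a function of p instead of the data.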

## Independence of Events and Conditional Probability

We go over the definitions of independence of events and conditional probability. In addition, we will deal with sampling with/without replacement.   1. Independence of events: The necessary and sufficient condition for two events to be independent of each other is P(A ∩ B) = P(A) * P(B) ⇔ P(B ∩ A) = …
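The condition above can be checked numerically by exact counting; a minimal sketch using two fair-dice events (my own example, not from the post):

```python
from fractions import Fraction
from itertools import product

# Sample space: two fair dice. Event A: first die is even.
# Event B: the sum is 7. Check P(A ∩ B) == P(A) * P(B).
omega = list(product(range(1, 7), repeat=2))

def prob(event):
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

A = lambda w: w[0] % 2 == 0
B = lambda w: w[0] + w[1] == 7

p_a, p_b = prob(A), prob(B)
p_ab = prob(lambda w: A(w) and B(w))
print(p_ab == p_a * p_b)  # True: A and B are independent
```

Using `Fraction` keeps the probabilities exact, so the equality test is a genuine check of the definition rather than a floating-point approximation.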

## The reason why the red-ball example is not suitable for explaining independence of events

There were some writing mistakes in this blog. I am very sorry about that. I have corrected them; please let me know if you find anything wrong in later posts. In the previous post I explained independent trials with two red balls in sampling. If we assume that there are seven black balls …
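The point of the correction — that draws without replacement are not independent — can be checked with exact probabilities. A minimal sketch; the split of 3 red balls completing the 7 black is my assumption for illustration:

```python
from fractions import Fraction

# Box with 7 black and 3 red balls (the red count is an assumed
# completion of the truncated setup). Compare P(2nd red | 1st red)
# with and without replacement.
black, red = 7, 3
total = black + red

# With replacement: the first draw does not change the box.
p_with = Fraction(red, total)

# Without replacement: one red ball is removed before the second draw.
p_without = Fraction(red - 1, total - 1)

print(p_with, p_without)   # 3/10 2/9
print(p_with == p_without) # False: the draws are dependent
```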

## Terms in Probabilistic Theory

I was a bit confused about terms such as events, outcomes, and trials. I've gathered the Wikipedia results here, and now I am somewhat clearer.   1. Probability: Probability is the measure of the likelihood that an event will occur. See the glossary of probability and statistics. Probability is quantified as a number between 0 and 1, where, loosely speaking, 0 indicates impossibility and 1 indicates …

## Conditional Probability and the Independence of Trials

Conditional probability is defined as below. It implies that the probability space should be restricted to A to get P(B | A) = P(A ∩ B) / P(A), as shown in the picture below. Let's consider the independence of events. Assume that there are 10 balls in a box (# of black balls = 7, # of red balls …
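The restriction of the probability space to A is easy to see by exact counting; a minimal sketch on two fair dice (my own example, not the post's ball box):

```python
from fractions import Fraction
from itertools import product

# P(B | A) = P(A ∩ B) / P(A), computed on two fair dice.
# A: the sum is at least 9. B: the first die shows 6.
omega = list(product(range(1, 7), repeat=2))

def prob(event):
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

A = lambda w: w[0] + w[1] >= 9
B = lambda w: w[0] == 6

p_b_given_a = prob(lambda w: A(w) and B(w)) / prob(A)
print(p_b_given_a)  # 2/5
```

Note that P(B) alone is 1/6; conditioning on A (restricting the sample space to the 10 high-sum outcomes) raises it to 2/5, which is the whole content of the definition.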

## Joint Probability and Chain Rule

1. Joint Probability Function. Joint probability can be classified into the cases of discrete random variables and continuous random variables. In this post I want to deal with just discrete random variables and their joint probability function. Let p(x, y) be given by p(x, y) = P(X = x, Y = y). The function p(x, y) will be referred to as the joint probability function. [quoted from Mathematical Statistics with …
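A minimal sketch of a joint probability function for two discrete random variables (my own example, built on two fair dice):

```python
from fractions import Fraction
from itertools import product

# Joint probability function p(x, y) = P(X = x, Y = y) for
# X = value of the first die, Y = max of the two dice.
omega = list(product(range(1, 7), repeat=2))
joint = {}
for w in omega:
    x, y = w[0], max(w)
    joint[(x, y)] = joint.get((x, y), Fraction(0)) + Fraction(1, 36)

print(joint[(2, 5)])        # 1/36: only the outcome (2, 5) gives X=2, Y=5
print(sum(joint.values()))  # 1: a valid joint pmf sums to one
```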

## Conditional Independence

Let's consider conditional independence. I would recommend the example below, which is one of the best examples to explain conditional independence intuitively. I took an online machine learning lecture by Prof. Il-Chul Moon, a professor at KAIST, and I would like to quote a picture from that lecture (3.2). There is a commander who …
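The commander example can be sketched numerically: given the commander's order C, two soldiers A and B act independently, yet observing one soldier tells you about the other when C is hidden. A minimal sketch with made-up probabilities (the 0.9 obedience rate is my assumption, not from the lecture):

```python
# Conditional independence sketch for the commander example:
# C is the commander's order (0 or 1); soldiers A and B each
# follow the order with probability 0.9, independently given C.
p_c = {0: 0.5, 1: 0.5}
p_given = lambda a, c: 0.9 if a == c else 0.1  # P(A=a | C=c), same for B

# Marginal joint: P(A=a, B=b) = sum_c P(c) * P(a|c) * P(b|c)
def joint(a, b):
    return sum(p_c[c] * p_given(a, c) * p_given(b, c) for c in (0, 1))

def marg(a):
    return sum(joint(a, b) for b in (0, 1))

# Given C, the joint factorizes by construction (conditional independence),
# but marginally P(A=1, B=1) != P(A=1) * P(B=1):
print(round(joint(1, 1), 2), round(marg(1) * marg(1), 2))  # 0.41 0.25
```

Summing out C couples the two soldiers: seeing A advance makes order C=1 likely, which in turn makes B advancing likely. That gap between 0.41 and 0.25 is exactly what "independent given C, but not independent" means.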