# Bayes's Theorem
> [!summary] Bayes's Theorem
>
> $$
> P(H \mid X) = \frac{P(X \mid H)\,P(H)}{P(X)}
> $$
Updating beliefs with this rule is known as _Bayesian inference_.
- $P(H \mid X)$ - _posterior probability_
- $P(X \mid H)$ - _likelihood_
- $P(H)$ - _prior probability_
- $P(X)$ - _evidence_
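
A quick numeric check of the formula, as a minimal Python sketch (the test scenario and its numbers are made up for illustration):

```python
# Worked example with made-up numbers: a diagnostic test.
# H = "has condition", X = "test is positive".
p_h = 0.01              # prior P(H): 1% base rate
p_x_given_h = 0.95      # likelihood P(X|H): test sensitivity
p_x_given_not_h = 0.05  # false-positive rate P(X|not H)

# Evidence P(X) via the law of total probability.
p_x = p_x_given_h * p_h + p_x_given_not_h * (1 - p_h)

# Posterior P(H|X) by Bayes's theorem.
p_h_given_x = p_x_given_h * p_h / p_x
print(f"P(H|X) = {p_h_given_x:.3f}")  # ~0.161
```

Note how the posterior stays low despite the high sensitivity: the small prior dominates.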
## Naive Bayes
- Assume the features are conditionally independent given the class (hence
  "naïve") to obtain the likelihood, then apply Bayes's theorem; see the
  sketch after this list.
- Advantages
- Easy to implement
  - Good results in most cases
- Disadvantages
  - If a feature value in the test case never appears with some class in the
    training data, that class's estimated likelihood (and hence its posterior)
    collapses to zero. Laplace (add-one) smoothing is the usual fix.
- Assumes conditional independence.
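
A minimal sketch of a categorical Naive Bayes classifier, assuming toy weather data and hypothetical feature values; it uses Laplace (add-one) smoothing so unseen feature values do not zero out the posterior:

```python
from collections import Counter
import math

def train(X, y):
    """X: list of feature tuples, y: list of class labels."""
    n_features = len(X[0])
    class_counts = Counter(y)
    # feature_counts[c][i][v] = occurrences of value v for feature i in class c
    feature_counts = {c: [Counter() for _ in range(n_features)]
                      for c in class_counts}
    # vocab[i] = all values seen for feature i across all classes
    vocab = [set() for _ in range(n_features)]
    for features, label in zip(X, y):
        for i, v in enumerate(features):
            feature_counts[label][i][v] += 1
            vocab[i].add(v)
    return class_counts, feature_counts, vocab

def predict(class_counts, feature_counts, vocab, x):
    total = sum(class_counts.values())
    best, best_log_post = None, -math.inf
    for c, n_c in class_counts.items():
        # log P(c) + sum_i log P(x_i | c), with Laplace (add-one) smoothing;
        # Counter returns 0 for unseen values, so the +1 keeps P nonzero.
        log_post = math.log(n_c / total)
        for i, v in enumerate(x):
            log_post += math.log(
                (feature_counts[c][i][v] + 1) / (n_c + len(vocab[i])))
        if log_post > best_log_post:
            best, best_log_post = c, log_post
    return best

# Toy usage: weather -> play?
X = [("sunny", "hot"), ("sunny", "mild"), ("rainy", "mild"), ("rainy", "cool")]
y = ["no", "no", "rainy" and "yes", "yes"]
model = train(X, y)
print(predict(*model, ("rainy", "hot")))  # "hot" never seen with "yes";
                                          # smoothing still yields "yes"
```

Log-probabilities are summed instead of multiplying raw probabilities to avoid floating-point underflow when there are many features.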