
MVE137 - Chalmers University Cheat Sheet (DRAFT) by

MVE137 - Probability and statistical learning using Python

This is a draft cheat sheet. It is a work in progress and is not finished yet.

Basic Probability Definitions

Sample Space (Ω)
Set of all possible outcomes of a random experiment.
Event
A subset of the sample space Ω, i.e. a set of outcomes of the random experiment
σ-field
The allowable events constitute a family of sets F, usually referred to as a σ-field. Each set in F is a subset of the sample space Ω.
Probability measure (P)
A probability measure on (Ω, F) is a function P : F → [0, 1] that satisfies the following two properties:
1. P[Ω] = 1
2. The probability of the union of a countable collection of disjoint events is the sum of their probabilities
Probability space
(Ω, F, P)
Basic properties of probability measures
P[∅] = 0
P[A¯] = 1 − P[A]
If A ⊂ B, then P[B] = P[A] + P[B \ A] ≥ P[A]
P[A ∪ B] = P[A] + P[B] − P[A ∩ B]
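These properties can be checked on a small finite sample space. A minimal Python sketch (illustrative only; the fair-die example and the helper P are assumptions, not course material):

```python
from fractions import Fraction

# Uniform probability measure on a fair six-sided die: Omega = {1, ..., 6}
omega = set(range(1, 7))

def P(event):
    """Probability of an event (a subset of omega) under the uniform measure."""
    return Fraction(len(event & omega), len(omega))

A = {1, 2, 3}   # "roll at most 3"
B = {2, 4, 6}   # "roll an even number"

assert P(set()) == 0                           # P[empty set] = 0
assert P(omega - A) == 1 - P(A)                # complement rule
assert P(A | B) == P(A) + P(B) - P(A & B)      # addition rule
```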
Inclusion Exclusion Principle (union bound)
(follows from the last basic property of probability measures)
Given events A1, A2, ...
P[∪i Ai] ≤ ∑i P[Ai]
When the events are disjoint, the inequality holds with equality, since disjoint events share no outcomes: P[Ai ∩ Aj] = 0 for all i ≠ j
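A quick check of the bound on a fair die (an illustrative sketch; the events are arbitrary assumptions chosen for the demonstration):

```python
from fractions import Fraction

omega = set(range(1, 7))   # fair six-sided die

def P(event):
    return Fraction(len(event & omega), len(omega))

# Overlapping events: the bound is strict
A1, A2, A3 = {1, 2}, {2, 3}, {3, 4}
assert P(A1 | A2 | A3) < P(A1) + P(A2) + P(A3)

# Disjoint events: the bound holds with equality
D1, D2 = {1, 2}, {5, 6}
assert P(D1 | D2) == P(D1) + P(D2)
```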
Sampling strategy
Repeatedly choose a random element of Ω
Sampling with replacement
Select random elements of Ω without taking into account which ones you have already tested; some elements may therefore be tested multiple times
Sampling without replacement
Select random elements of Ω taking into account which ones you have already used; the algorithm is therefore never run on the same element more than once
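The two strategies map directly onto Python's standard library (a minimal sketch; the set Ω and the sample size are arbitrary assumptions):

```python
import random

rng = random.Random(0)      # seeded for reproducibility
omega = list(range(10))     # a small finite sample space

# With replacement: earlier picks are ignored, so values can repeat
with_repl = [rng.choice(omega) for _ in range(5)]

# Without replacement: each element is used at most once
without_repl = rng.sample(omega, 5)

assert len(set(without_repl)) == len(without_repl)
```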
Independent (events or family)
Two events A and B are independent if:
P[A ∩ B] = P[A] P[B]
It also applies to families {Ai, i ∈ I}
Pairwise
To form all possible pairs (two items at a time) from a set
Pairwise independent (family of events)
A family of events is pairwise independent if:
P[Ai ∩ Aj] = P[Ai] P[Aj] for all i ≠ j
In plain English, a family of events is pairwise independent if every pair of events in it is independent. For example, for events A, B, C:
P(A ∩ B) = P(A)P(B)
P(A ∩ C) = P(A)P(C)
P(B ∩ C) = P(B)P(C)
Mutually independent (events)
More than two events (e.g. A, B, C) are mutually independent if:
1. They are pairwise independent
2. They meet the condition:
P(A ∩ B ∩ C) = P(A) × P(B) × P(C)
In plain English, events are mutually independent if the product rule holds for every subcollection of them; this is strictly stronger than pairwise independence
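The classic two-coin example shows that pairwise independence does not imply mutual independence (an illustrative sketch; the events A, B, C are assumptions chosen for the demonstration):

```python
from fractions import Fraction
from itertools import product

# Sample space: two fair coin flips, all four outcomes equally likely
omega = list(product("HT", repeat=2))

def P(event):
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

A = lambda w: w[0] == "H"      # first flip is heads
B = lambda w: w[1] == "H"      # second flip is heads
C = lambda w: w[0] == w[1]     # the two flips agree

def AND(*events):
    return lambda w: all(e(w) for e in events)

# Pairwise independent ...
assert P(AND(A, B)) == P(A) * P(B)
assert P(AND(A, C)) == P(A) * P(C)
assert P(AND(B, C)) == P(B) * P(C)
# ... but not mutually independent
assert P(AND(A, B, C)) != P(A) * P(B) * P(C)
```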
Conditional Probability
If P[B] > 0, the conditional probability that A occurs given that B occurs is: P[A|B] = P[A ∩ B] / P[B]
Conditional Probability (independent events)
If A and B are independent events, then:
P[A|B] = P[A ∩ B] / P[B] = P[A]P[B] / P[B] = P[A]
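Conditional probability on a fair die (a minimal sketch; the events are assumptions chosen for illustration):

```python
from fractions import Fraction

omega = set(range(1, 7))   # fair six-sided die

def P(event):
    return Fraction(len(event & omega), len(omega))

A = {2, 4, 6}   # even roll
B = {1, 2, 3}   # roll at most 3

# P[A|B] = P[A ∩ B] / P[B] = (1/6) / (1/2)
P_A_given_B = P(A & B) / P(B)
assert P_A_given_B == Fraction(1, 3)
```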
Law of Total Probability
Let E1, ..., En be a partition of Ω
(a collection of disjoint events whose union is Ω), with P[Ei] > 0 for all i. The probability of A can be written as:
P[A] = ∑(i=1..n) P[A|Ei] P[Ei]
In plain English, it is the sum over all the disjoint scenarios in which A can occur, each weighted by the probability of that scenario
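A worked sketch with a hypothetical two-urn setup (the urns, their contents, and the 1/2–1/2 choice are all assumptions made for illustration): pick urn 1 or urn 2 with probability 1/2 each, then draw a ball; urn 1 holds 2 red / 1 blue, urn 2 holds 1 red / 3 blue.

```python
from fractions import Fraction

# Partition of Omega: which urn was chosen
P_E = [Fraction(1, 2), Fraction(1, 2)]

# Conditional probability of drawing red given each urn
P_red_given_E = [Fraction(2, 3), Fraction(1, 4)]

# Law of total probability: P[red] = sum_i P[red|Ei] * P[Ei]
P_red = sum(p_cond * p_e for p_cond, p_e in zip(P_red_given_E, P_E))
assert P_red == Fraction(11, 24)
```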
Bayes Theorem
Assuming E1, ..., En is a partition of Ω:
P[Ej|B] = P[Ej ∩ B] / P[B] = P[B|Ej] P[Ej] / ∑(i=1..n) P[B|Ei] P[Ei]
It follows from applying the definition of conditional probability twice in the numerator and the law of total probability in the denominator
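Continuing the hypothetical two-urn setup (urn 1: 2 red / 1 blue; urn 2: 1 red / 3 blue; each urn chosen with probability 1/2; all numbers are illustrative assumptions), Bayes' theorem gives the posterior probability that urn 1 was chosen given that a red ball was drawn:

```python
from fractions import Fraction

P_E = [Fraction(1, 2), Fraction(1, 2)]            # prior over urns (partition)
P_B_given_E = [Fraction(2, 3), Fraction(1, 4)]    # P[red | urn i]

# Denominator: law of total probability
P_B = sum(pb * pe for pb, pe in zip(P_B_given_E, P_E))

# Bayes: P[E1 | red] = P[red | E1] P[E1] / P[red]
posterior_urn1 = (P_B_given_E[0] * P_E[0]) / P_B
assert posterior_urn1 == Fraction(8, 11)
```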
 

Discrete Random Variables and Expectation

Random Variable
A random variable X on a sample space Ω is a real-valued (measurable) function on Ω; that is, X : Ω → R.
Random variables are denoted by upper-case letters in this course; the real values they take, by lower-case letters
Discrete Random Variable
A discrete random variable is a random variable that takes only a finite or countably infinite number of values
(e.g. the number of kids in a family)
Probability that X = a
P[X = a] is the sum of P[ω] over all outcomes ω in Ω for which X(ω) = a
Independence of random variables
Two random variables X and Y are independent if and only if:
P[(X = x) ∩ (Y = y)] = P[X = x] P[Y = y]
for all values x and y
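The definition can be checked exhaustively for two fair dice (a minimal sketch; the dice model is an assumed example):

```python
from fractions import Fraction
from itertools import product

# Sample space: ordered outcomes of two fair dice
omega = list(product(range(1, 7), repeat=2))

def P(pred):
    return Fraction(sum(1 for w in omega if pred(w)), len(omega))

# X = first die, Y = second die; check the factorization for all (x, y)
for x in range(1, 7):
    for y in range(1, 7):
        joint = P(lambda w: w[0] == x and w[1] == y)
        assert joint == P(lambda w: w[0] == x) * P(lambda w: w[1] == y)
```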
Mutually independent random variables
Defined like mutually independent events: the factorization P[X1 = x1, ..., Xn = xn] = ∏i P[Xi = xi] must hold for every subset of the variables and all values xi
Expectation (mean)
It is a weighted average of the values assumed by the random variable, taking into account the probability of getting each value.
The expectation of a discrete random variable X, denoted by E[X], is given by
E[X] = ∑x (x · P[X = x]), summing over all values x in the range of X
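Computing E[X] for a fair six-sided die (an illustrative sketch; the pmf is an assumed example):

```python
from fractions import Fraction

# pmf of a fair six-sided die: P[X = x] = 1/6 for x in 1..6
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

# E[X] = sum over x of x * P[X = x]
EX = sum(x * p for x, p in pmf.items())
assert EX == Fraction(7, 2)   # 3.5
```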