Probability Theory
Probability theory is the mathematical language for reasoning about uncertainty and chance. It provides the foundation for statistical inference and underlies many machine learning algorithms.
This section covers:
- Basic Concepts: Core definitions such as experiments, outcomes, sample spaces, events, and the axioms of probability. Includes fundamental counting techniques (Combinatorics), Conditional Probability and Independence, and the crucial Bayes' Theorem.
- Random Variables: Mapping outcomes to numbers (discrete vs. continuous), described by PMFs (discrete) or PDFs (continuous), and the unifying CDF. Also covers Joint, Marginal, and Conditional Distributions for multiple variables and theoretical tools like Moment Generating Functions.
- Common Distributions: Detailed notes on essential distributions used in modeling, including Bernoulli, Binomial, Poisson, Geometric, Negative Binomial, Uniform, Exponential, Gamma, Normal (Gaussian), and Beta.
- Expectation, Variance, Covariance: Key measures summarizing distributions, including Expected Value (mean), Variance/Standard Deviation (spread), and Covariance/Correlation (relationships).
- Important Theorems: Foundational results like the Law of Large Numbers (LLN) and the Central Limit Theorem (CLT).
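As a small taste of the Basic Concepts material, Bayes' theorem updates a prior probability with new evidence. The sketch below works through a hypothetical diagnostic-test scenario; the disease prevalence, sensitivity, and false-positive rate are illustrative numbers, not values from these notes.

```python
# Hypothetical numbers for illustration only:
p_d = 0.01                # prior: P(disease)
p_pos_given_d = 0.99      # sensitivity: P(positive | disease)
p_pos_given_not_d = 0.05  # false-positive rate: P(positive | no disease)

# Law of total probability: P(positive) over both cases.
p_pos = p_pos_given_d * p_d + p_pos_given_not_d * (1 - p_d)

# Bayes' theorem: P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
p_d_given_pos = p_pos_given_d * p_d / p_pos
print(round(p_d_given_pos, 4))  # roughly 0.1667 despite the 99% sensitivity
```

The counterintuitive result, a posterior of only about 1/6, is exactly the kind of reasoning the Conditional Probability and Bayes' Theorem pages develop in detail.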
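The LLN and CLT from the theorems above can also be seen empirically. This sketch, using only the standard library, simulates fair-die rolls (mean 3.5); the sample sizes and seed are arbitrary choices for the demonstration.

```python
import random
import statistics

random.seed(0)  # fixed seed so the simulation is reproducible

# Law of Large Numbers: the running mean of i.i.d. rolls approaches E[X] = 3.5.
rolls = [random.randint(1, 6) for _ in range(100_000)]
sample_mean = statistics.fmean(rolls)

# Central Limit Theorem: means of many independent samples of size n cluster
# around 3.5 in an approximately normal shape, with spread ~ sigma / sqrt(n).
n, reps = 50, 2000
means = [statistics.fmean(random.randint(1, 6) for _ in range(n)) for _ in range(reps)]

print(round(sample_mean, 2), round(statistics.fmean(means), 2))
```

Plotting a histogram of `means` would show the familiar bell shape emerging even though a single die roll is uniform, which is the essence of the CLT.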
Use the sidebar navigation to explore specific topics within Probability Theory.