Joint Probability Mass Function

Joint Probability Mass Function (PMF) is a fundamental concept in probability theory and statistics: it describes the probability that two discrete random variables simultaneously take on specific values. It provides a way to calculate the probability of multiple events occurring together.

What is a Joint Probability Mass Function?

A Joint Probability Mass Function, denoted as P(X = x, Y = y) or f(x, y), is a function that gives the probability that discrete random variables X and Y simultaneously take on specific values x and y, respectively.

Characteristics of Joint Probability Mass Function

  • Domain: Function is defined for all possible combinations of x and y in the sample spaces of X and Y.
  • Range: 0 ≤ P(X = x, Y = y) ≤ 1 for all x and y.
  • Sum: Sum of the joint PMF over all possible values of x and y must equal 1.
  • Non-Negativity: P(X = x, Y = y) ≥ 0 for all x and y.
  • Marginal Distributions: Joint PMF can be used to find the marginal PMFs of X and Y by summing over the other variable: P(X = x) = ∑y P(X = x, Y = y) and P(Y = y) = ∑x P(X = x, Y = y) (see the Python sketch after this list).
  • Conditional Probability: Joint PMF can be used to calculate conditional probabilities: P(Y = y | X = x) = P(X = x, Y = y) / P(X = x)
  • Independence: Random variables X and Y are independent if and only if: P(X = x, Y = y) = P(X = x) × P(Y = y) for all x and y
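
These operations are easy to check numerically. Below is a minimal Python sketch, assuming a small illustrative joint PMF stored as a dictionary (the table values are made up for demonstration): it computes both marginals, a conditional probability, and tests independence.

```python
from itertools import product

# Joint PMF P(X = x, Y = y), keyed by (x, y). Values are illustrative only.
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

xs = sorted({x for x, _ in joint})
ys = sorted({y for _, y in joint})

# Marginals: sum the joint PMF over the other variable.
p_x = {x: sum(joint[(x, y)] for y in ys) for x in xs}
p_y = {y: sum(joint[(x, y)] for x in xs) for y in ys}

# Conditional probability P(Y = y | X = x) = P(X = x, Y = y) / P(X = x).
def cond_y_given_x(y, x):
    return joint[(x, y)] / p_x[x]

# Independence: the joint PMF must factor into the marginals everywhere.
independent = all(
    abs(joint[(x, y)] - p_x[x] * p_y[y]) < 1e-12
    for x, y in product(xs, ys)
)

print(p_x)                   # ≈ {0: 0.3, 1: 0.7}
print(p_y)                   # ≈ {0: 0.4, 1: 0.6}
print(cond_y_given_x(1, 0))  # 0.2 / 0.3 ≈ 0.667
print(independent)           # False
```

For this particular table the check prints False, since P(X = 0, Y = 0) = 0.10 differs from P(X = 0) × P(Y = 0) = 0.3 × 0.4 = 0.12, so X and Y are not independent.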

What is the Difference between PMF and PDF?

| Characteristic | Probability Mass Function (PMF) | Probability Density Function (PDF) |
|---|---|---|
| Type of Random Variable | Discrete | Continuous |
| Definition | A function that gives the probability that a discrete random variable is exactly equal to some value. | A function that describes the relative likelihood of the random variable taking on a given value. |
| Range of Values | Defined only for specific discrete values. | Defined over a continuous range of values. |
| Probability Calculation | P(X = x) = f(x), where f(x) is the PMF | P(a ≤ X ≤ b) = ∫[a to b] f(x) dx, where f(x) is the PDF |
| Properties | ∑f(x) = 1 over all possible x, and 0 ≤ f(x) ≤ 1 | ∫f(x) dx = 1 over all x, and f(x) ≥ 0 for all x |
| Graphical Representation | Discrete points or bars in a histogram. | A continuous curve. |
| Example | Binomial or Poisson distribution. | Normal or Exponential distribution. |
| Value at a Point | Gives the actual probability. | Gives a density, not an actual probability. |
| Units | Unitless (pure probability). | Probability per unit of measurement. |
| Cumulative Distribution | Summing the PMF values up to a point. | Integrating the PDF up to a point. |
| Expectation Calculation | E[X] = ∑ x × f(x) | E[X] = ∫ x × f(x) dx |
| Interpretation | Direct probability interpretation. | Area under the curve gives probability. |
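
The sum-versus-integral distinction in the table can be verified directly. Here is a short pure-Python sketch (the fair die and the standard normal are just convenient examples): the PMF values sum to 1 exactly, while the PDF must be integrated, approximated below with a midpoint Riemann sum.

```python
import math

# PMF of a fair six-sided die: discrete probabilities that sum to 1.
die_pmf = {x: 1 / 6 for x in range(1, 7)}
print(sum(die_pmf.values()))  # ≈ 1.0

# PDF of the standard normal: f(x) is a density, not a probability;
# only the area under the curve equals 1.
def normal_pdf(x):
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

# Midpoint Riemann sum approximating the integral of f over [-10, 10].
n, lo, hi = 100_000, -10.0, 10.0
dx = (hi - lo) / n
area = sum(normal_pdf(lo + (i + 0.5) * dx) * dx for i in range(n))
print(area)  # ≈ 1.0
```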

PMF of Binomial Distribution

Binomial Distribution is a discrete probability distribution that models the number of successes in a fixed number of independent Bernoulli trials. Here’s a detailed explanation of its PMF:

Definition: A random variable X follows a Binomial Distribution with parameters n and p, denoted as X ~ B(n, p), if and only if:

  • There are a fixed number n of independent trials
  • Each trial has only two possible outcomes: success (with probability p) or failure (with probability 1-p)
  • Probability of success p remains constant for all trials

Probability Mass Function: For a random variable X that follows a Binomial Distribution B(n, p), the PMF is given by: 

P(X = k) = C(n,k) × p^k × (1-p)^(n-k)

where:

  • k is Number of Successes (0 ≤ k ≤ n)
  • n is Total Number of Trials
  • p is Probability of Success on an Individual Trial
  • C(n,k) is Binomial Coefficient, also written as (n choose k)

Binomial Coefficient: Binomial coefficient C(n,k) represents the number of ways to choose k items from a set of n items, without replacement and regardless of order. It is calculated as: 

C(n,k) = n! / (k! × (n-k)!)

where “!” denotes the factorial operation.

Properties:

  • Mean (expected value) of X is E(X) = np
  • Variance of X is Var(X) = np(1-p)
  • PMF is symmetric if and only if p = 0.5

Example: Consider a fair coin tossed 5 times. Let X be the number of heads obtained. Then X ~ B(5, 0.5). To find P(X = 3), that is, the probability of getting exactly 3 heads, we use the PMF:

P(X = 3) = C(5,3) × (0.5)^3 × (0.5)^(5-3)

= (5! / (3! × 2!)) × (0.5)^3 × (0.5)^2

= 10 × 0.125 × 0.25

= 0.3125
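
The same calculation can be done in Python; here is a minimal sketch using math.comb (Python 3.8+) for the binomial coefficient:

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(X = k) for X ~ B(n, p): C(n, k) * p^k * (1 - p)^(n - k)."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

# Fair coin tossed 5 times: probability of exactly 3 heads.
print(binomial_pmf(3, 5, 0.5))  # 0.3125

# Sanity check: the PMF sums to 1 over k = 0, ..., n.
print(sum(binomial_pmf(k, 5, 0.5) for k in range(6)))  # ≈ 1.0
```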

Applications of PMF of Binomial Distribution

Binomial Distribution can model various real-world scenarios, such as:

  • Number of defective items in a batch of products
  • Number of successful sales calls made by a salesperson
  • Number of patients that respond to a treatment in a clinical trial

PMF of Poisson Distribution

Poisson distribution is a discrete probability distribution that expresses the probability of a given number of events occurring in a fixed interval of time or space, assuming these events occur with a known constant mean rate and independently of the time since the last event.

Let X be a random variable that follows a Poisson distribution with parameter λ (lambda). The PMF of X is given by:

P(X = k) = (e^(-λ) × λ^k) / k!

where:

  • e is Base of Natural Logarithms (approximately 2.71828)
  • λ is Average Number of Events in an Interval
  • k is Number of Events (k = 0, 1, 2, …)
  • k! denotes the factorial of k
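
A minimal Python sketch of this formula (the cutoff at k = 100 is an arbitrary truncation of the infinite sum): it verifies that the PMF sums to 1 and that the mean and variance both come out to λ.

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """P(X = k) = e^(-λ) × λ^k / k! for k = 0, 1, 2, ..."""
    return exp(-lam) * lam**k / factorial(k)

lam = 2.0

# The PMF sums to 1 over k = 0, 1, 2, ... (infinite sum truncated at 100).
print(sum(poisson_pmf(k, lam) for k in range(100)))  # ≈ 1.0

# Mean and variance both equal λ, as computed from the (truncated) PMF.
mean = sum(k * poisson_pmf(k, lam) for k in range(100))
var = sum((k - mean) ** 2 * poisson_pmf(k, lam) for k in range(100))
print(mean, var)  # ≈ 2.0 2.0
```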

Properties and Characteristics

  • PMF gives the probability that an event occurs exactly k times in an interval.
  • λ represents both the mean and the variance of the distribution, such that: E(X) = Var(X) = λ
  • Poisson distribution is often used to model rare events, and it can be applied in various fields such as physics, biology, or business.
  • As λ increases, the Poisson distribution becomes more symmetric and approaches a normal distribution.
  • Sum of two or more independent Poisson random variables is also a Poisson random variable, with a mean that is the sum of the individual means.

Applications of PMF of Poisson Distribution

Poisson distribution can be used to model various phenomena, such as:

  • Number of customers arriving at a store in an hour
  • Number of defects in a manufactured product
  • Number of phone calls received by a call center in a day
  • Number of radioactive particle emissions detected in a fixed time interval

Relationship with Other Distributions:

  • Poisson distribution is closely related to the exponential distribution, which models the time between Poisson events.
  • As λ becomes large, the Poisson distribution can be approximated by a normal distribution with mean λ and variance λ.
  • Poisson distribution can be derived as a limiting case of the binomial distribution under certain conditions (see the numeric check after this list).
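
That limiting behaviour is easy to check numerically: hold λ = np fixed while n grows, and the B(n, λ/n) probabilities approach the Poisson PMF. A small sketch (the specific n values below are arbitrary):

```python
from math import comb, exp, factorial

def binomial_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p) ** (n - k)

def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

# Hold λ = n × p fixed at 2 and let n grow: B(n, λ/n) → Poisson(λ).
lam, k = 2.0, 3
for n in (10, 100, 1000, 10_000):
    print(n, binomial_pmf(k, n, lam / n))  # approaches the Poisson value
print("Poisson:", poisson_pmf(k, lam))     # ≈ 0.1804
```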

Example: Suppose we want to find the probability of exactly 3 events occurring in an interval where the average number of events is 2 (λ = 2).

P(X = 3) = (e^(-2) × 2^3) / 3!

= (0.1353 × 8) / 6

≈ 0.1804

Therefore, the probability of exactly 3 events occurring is approximately 0.1804 or 18.04%.

Applications of Probability Mass Functions

Various applications of Probability Mass Functions are:

  • Discrete Random Variables: PMFs are used to describe the probability distribution of discrete random variables. For example, a PMF can model the number of heads that occur in a series of coin flips or the number of defective items in a batch of products.
  • Statistical Inference: PMFs play a crucial role in statistical inference. They are used to calculate probabilities and expectations that are essential for hypothesis testing and parameter estimation.
  • Bayesian Statistics: In Bayesian statistics, PMFs are used to represent prior and posterior distributions of discrete random variables. This application is particularly useful in fields such as machine learning and data science.
  • Queueing Theory: PMFs are applied in queueing theory to model the number of customers in a system or the time between arrivals. This has applications in operations research and computer network modeling.
  • Reliability Engineering: In reliability engineering, PMFs can model the number of failures that occur in a system over time. This is useful for predicting maintenance needs and system longevity.
  • Genetics and Biology: PMFs are used to model genetic inheritance patterns or the distribution of species in an ecosystem. For instance, they can describe the probability of inheriting certain traits or the likelihood of observing a specific number of organisms in a habitat.
  • Finance and Economics: In finance, PMFs can model discrete investment returns or the number of trades in a given time period. They are also used in economics to model things like household sizes or income distributions.
  • Information Theory: PMFs are fundamental in information theory, where they are used to calculate entropy and mutual information. These concepts are crucial in data compression and communication systems.
  • Game Theory: In game theory, PMFs can represent the distribution of strategies that players might choose. This application is relevant in economics, political science, and artificial intelligence.
  • Quality Control: PMFs are used in quality control to model the number of defects in a product or process. This helps in setting quality standards and implementing control measures.
  • Actuarial Science: In insurance and actuarial science, PMFs model discrete events such as the number of claims filed in a given period or the number of accidents that occur.

Examples on Joint Probability Mass Function

Example 1: Consider an experiment where a fair six-sided die is rolled. The PMF table for this experiment is:

| x (outcome) | 1 | 2 | 3 | 4 | 5 | 6 |
|---|---|---|---|---|---|---|
| P(X = x) | 1/6 | 1/6 | 1/6 | 1/6 | 1/6 | 1/6 |

Find the probability of rolling an even number.

Solution:

P(even) = P(X = 2) + P(X = 4) + P(X = 6)

= 1/6 + 1/6 + 1/6

= 3/6 = 1/2

Example 2: A bag contains 3 red, 4 blue, and 5 green marbles. The PMF table for randomly drawing a marble is:

| x (color) | Red | Blue | Green |
|---|---|---|---|
| P(X = x) | 1/4 | 1/3 | 5/12 |

Calculate the probability of drawing a red or blue marble.

Solution:

P(red or blue) = P(X = red) + P(X = blue)

= 1/4 + 1/3 = 3/12 + 4/12

= 7/12

Example 3: Consider a discrete random variable X with the following PMF:

| x | 0 | 1 | 2 | 3 |
|---|---|---|---|---|
| P(X = x) | 0.1 | 0.3 | 0.4 | 0.2 |

Find P(X ≤ 2) and the expected value E(X).

Solution:

P(X ≤ 2) = P(X = 0) + P(X = 1) + P(X = 2)

= 0.1 + 0.3 + 0.4

= 0.8

E(X) = 0 × 0.1 + 1 × 0.3 + 2 × 0.4 + 3 × 0.2

= 0 + 0.3 + 0.8 + 0.6

= 1.7

Example 4: An experiment involves flipping a fair coin twice. The PMF table for the number of heads is:

| x (number of heads) | 0 | 1 | 2 |
|---|---|---|---|
| P(X = x) | 1/4 | 1/2 | 1/4 |

Calculate the probability of getting at least one head and the variance of X.

Solution:

P(at least one head) = P(X = 1) + P(X = 2)

= 1/2 + 1/4 = 3/4

E(X) = 0 × 1/4 + 1 × 1/2 + 2 × 1/4 = 1

E(X²) = 0² × 1/4 + 1² × 1/2 + 2² × 1/4 = 1.5

Var(X) = E(X²) – [E(X)]²

= 1.5 – 1² = 0.5
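
Examples 3 and 4 use the same recipe: expectation and variance are weighted sums over the PMF table. A small generic sketch in Python, applied here to the Example 4 table:

```python
# Expectation of f(X) for a PMF given as {value: probability}.
def expectation(pmf, f=lambda x: x):
    return sum(f(x) * p for x, p in pmf.items())

def variance(pmf):
    mu = expectation(pmf)
    return expectation(pmf, lambda x: (x - mu) ** 2)

# Example 4: number of heads in two fair coin flips.
pmf = {0: 0.25, 1: 0.50, 2: 0.25}
print(expectation(pmf))                  # 1.0   (E[X])
print(expectation(pmf, lambda x: x**2))  # 1.5   (E[X²])
print(variance(pmf))                     # 0.5   (Var(X))
```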

Example 5: Consider an experiment where two fair six-sided dice are rolled, and the sum of the numbers is recorded. The PMF table for this experiment is:

| x (sum) | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 |
|---|---|---|---|---|---|---|---|---|---|---|---|
| P(X = x) | 1/36 | 2/36 | 3/36 | 4/36 | 5/36 | 6/36 | 5/36 | 4/36 | 3/36 | 2/36 | 1/36 |

Find the probability that the sum is greater than 9 or equal to 7.

Solution:

P(X > 9 or X = 7) = P(X = 10) + P(X = 11) + P(X = 12) + P(X = 7)

= 3/36 + 2/36 + 1/36 + 6/36

= 12/36

= 1/3
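
For tables like this one, the PMF itself can be built by enumerating outcomes, after which a compound event is a one-line sum. A small sketch (using fractions for exact arithmetic):

```python
from collections import Counter
from fractions import Fraction
from itertools import product

# PMF of the sum of two fair dice, built by enumerating all 36 outcomes.
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
pmf = {s: Fraction(c, 36) for s, c in counts.items()}

# P(X > 9 or X = 7): sum the PMF over the values in the event.
p = sum(prob for s, prob in pmf.items() if s > 9 or s == 7)
print(p)  # 1/3
```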

Conclusion

Understanding Joint PMFs is crucial for analyzing discrete multivariate probability distributions. They provide a powerful tool for modeling complex systems and making predictions in fields ranging from statistics and engineering to economics and biology.

FAQs on Joint Probability Mass Function

What is the main difference between a PMF and a PDF?

  • PMFs are for discrete random variables.
  • PDFs are for continuous random variables.

How do you calculate marginal distributions from a joint PMF?

To calculate marginal distributions from a joint PMF, sum the joint PMF over all values of the other variable.

What condition must be met for two random variables to be independent?

For two random variables to be independent, their joint PMF must equal the product of their individual PMFs for all values.

What are some common applications of PMFs?

Some common applications of PMFs include: Statistical Inference, Reliability Engineering, Finance, and Quality Control.




