The Law of Large Numbers (LLN) is a theorem in probability and statistics that describes how averages behave over many repetitions. It says that if you repeat an experiment many times, the average result gets closer and closer to the expected value. For example, if you flip a fair coin many times, the proportion of heads approaches 50%. This principle matters because it tells us that, with enough trials, the average outcome of random events becomes predictable. In this article, we explain what the Law of Large Numbers is, its limitations, and why it is useful, with simple examples.

## What is Law of Large Numbers?

The Law of Large Numbers states that as the number of trials or observations increases, the average of the results obtained converges to the expected (theoretical) value.
Example: If you flip a fair coin many times, the proportion of heads gets closer to 50% as you increase the number of flips.

[Figure: Law of Large Numbers]

## Types of Law of Large Numbers

The two main forms of the Law of Large Numbers are:

### 1. Weak Law of Large Numbers (WLLN)

Imagine you have a fair coin and flip it many times. The Weak Law of Large Numbers says that, as the number of flips grows, the observed proportion of heads (or tails) gets closer and closer to 0.5, since the coin is fair and the probability of each outcome is 0.5. More generally, the WLLN states that for a sequence of independent and identically distributed (i.i.d.) random variables, the sample average converges in probability to the theoretical mean as the number of observations increases: for any fixed margin of error, the probability that the sample average misses the mean by more than that margin tends to zero. (A simulation sketch appears after the limitations list below.)

### 2. Strong Law of Large Numbers (SLLN)

The Strong Law of Large Numbers is a stronger statement than the WLLN. It says that the sample average converges to the expected value almost surely, that is, with probability 1. Returning to the coin-flipping example, the SLLN says that in an infinitely long sequence of flips the running proportion of heads will eventually settle at 0.5 and stay arbitrarily close to it, not merely be unlikely to deviate at any fixed large number of flips.

Both versions essentially mean that with enough trials, the results stabilize around the expected average.

## Limitations of Law of Large Numbers

Some limitations of the Law of Large Numbers are:

- It describes only long-run behavior; it says nothing about the outcome of any single trial or of a small number of trials.
- It assumes the trials are independent and identically distributed with a finite expected value; if these assumptions fail, the sample average need not converge.
- It does not mean that past deviations will be "corrected" in the short run; believing otherwise is the gambler's fallacy.
- Convergence can be slow, so a moderate sample may still be noticeably far from the expected value.
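To see the convergence concretely, here is a minimal sketch in Python using NumPy (the variable names and the seed are illustrative choices, not from the original article). It simulates a long run of fair-coin flips and prints the running proportion of heads at a few checkpoints; the proportion should drift toward 0.5 as the number of flips grows.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

n_flips = 100_000
# 1 represents heads, 0 represents tails; each flip is fair and independent.
flips = rng.integers(0, 2, size=n_flips)

# Running proportion of heads after each flip.
running_proportion = np.cumsum(flips) / np.arange(1, n_flips + 1)

for n in (10, 100, 1_000, 10_000, 100_000):
    print(f"After {n:>7} flips: proportion of heads = {running_proportion[n - 1]:.4f}")
```

In a typical run, the early proportions can be far from 0.5 (for example 0.3 or 0.7 after 10 flips), while the later ones cluster tightly around 0.5, which is exactly the behavior the WLLN and SLLN describe.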
## Law of Iterated Logarithm (LIL)

A refinement of the Law of Large Numbers is the Law of the Iterated Logarithm. It gives a precise description of how the sample average fluctuates around the expected value: the fluctuations shrink as the number of observations grows, and the LIL pins down exactly how large the largest fluctuations can be. In the coin-flipping example, the LIL says that as the number of flips rises, the sample average not only converges to 0.5 but does so within a sharply defined band around 0.5 (a precise statement is given after this section).

## Why is Law of Large Numbers Important?

The Law of Large Numbers is important for the following reasons:

- It justifies estimating a population mean or a probability by a sample average, which is the basis of statistical inference.
- It explains why businesses that pool many independent risks, such as casinos and insurers, can rely on stable average outcomes.
- It underpins Monte Carlo simulation, where quantities are estimated by averaging many random samples.
- It gives empirical meaning to the notion of expected value: the long-run average of repeated trials.
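For reference, here is the standard (Hartman-Wintner) form of the Law of the Iterated Logarithm, stated more formally than in the article above: for i.i.d. random variables $X_1, X_2, \ldots$ with mean $\mu$ and finite variance $\sigma^2$, and $S_n = X_1 + \cdots + X_n$,

```latex
\limsup_{n \to \infty} \frac{S_n - n\mu}{\sqrt{2 \sigma^2 n \log \log n}} = 1
\quad \text{almost surely,}
\qquad
\liminf_{n \to \infty} \frac{S_n - n\mu}{\sqrt{2 \sigma^2 n \log \log n}} = -1
\quad \text{almost surely.}
```

For the fair coin ($\mu = 0.5$, $\sigma^2 = 0.25$), this says the running proportion of heads stays within a band of width on the order of $\sqrt{\log\log n / n}$ around 0.5, and that band shrinks to zero as $n$ grows.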
## Law of Large Numbers (LLN) and Central Limit Theorem (CLT)

The Law of Large Numbers (LLN) and the Central Limit Theorem (CLT) are two fundamental results in probability and statistics that describe the behavior of large samples. Their statements are:

### Law of Large Numbers (LLN)

The Law of Large Numbers states that as the number of trials or observations increases, the average of the results obtained converges to the expected value.

### Central Limit Theorem (CLT)

The Central Limit Theorem states that the distribution of the sample mean of a sufficiently large number of independent, identically distributed (i.i.d.) random variables approaches a normal distribution, regardless of the original distribution of the variables.
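In symbols (a compact summary using standard notation, not taken from the original article): if $X_1, X_2, \ldots$ are i.i.d. with mean $\mu$ and finite variance $\sigma^2$, and $\bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i$ is the sample mean, then

```latex
\text{LLN:}\quad \bar{X}_n \xrightarrow[n \to \infty]{} \mu
\qquad\qquad
\text{CLT:}\quad \sqrt{n}\,\frac{\bar{X}_n - \mu}{\sigma} \xrightarrow[n \to \infty]{d} \mathcal{N}(0, 1)
```

So the LLN says where the sample mean ends up, while the CLT describes the shape and scale of its fluctuations around that limit for large $n$.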
The Law of Large Numbers and the Central Limit Theorem are foundational principles in probability and statistics. The LLN ensures that averages of large samples are reliable estimates of the population mean, while the CLT justifies using the normal distribution to make inferences about sample means.

## Examples of Law of Large Numbers

An example explaining the Law of Large Numbers is given below:

Imagine a bag containing blue and red balls, with 50% blue balls and 50% red balls. Drawing just one ball from the bag gives either a red or a blue ball, and it is impossible to predict which color you will get.

Now imagine you take 10 balls from the bag one at a time, noting the colors. You might get six red balls and four blue, or perhaps seven red and three blue. The red-to-blue ratio in this small sample would probably be close to 50:50, but usually not exactly 50:50.

The Law of Large Numbers, however, tells us that if you kept drawing balls from the bag and recording the colors hundreds or even thousands of times, the ratio of red balls to blue balls in your overall sample would get closer and closer to the theoretical ratio of 50:50.

## Law of Large Numbers in Finance

The Law of Large Numbers is a fundamental concept in probability theory and statistics with significant applications in finance. In simple terms, it states that as the sample size (or number of observations) increases, the average of the observed results becomes closer and closer to the expected or theoretical average.

Let's break this down with an example: imagine you are flipping a fair coin. The theoretical probability of getting heads is 0.5 (or 50%). If you flip the coin only a few times, say 10 times, the observed proportion of heads may deviate significantly from 0.5 due to random chance. However, if you flip the coin thousands or millions of times, the observed proportion of heads will be very close to 0.5.

In finance, the Law of Large Numbers is particularly relevant in portfolio management and risk analysis. Here are a few examples:

- Portfolio diversification: a portfolio spread across many assets with independent risks has a more predictable average return than a single asset, because the individual fluctuations tend to average out.
- Insurance: an insurer covering a large number of independent policyholders can predict its average claim cost per policy far more accurately than the cost of any single claim, which is the basis of premium pricing.
- Risk analysis and simulation: Monte Carlo methods estimate expected portfolio losses or returns by averaging many simulated scenarios, relying on the LLN for those averages to converge.

A small simulation illustrating the diversification point follows below.
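The following sketch (Python with NumPy; the return model and the numbers are illustrative assumptions, not taken from the original article) compares the spread of a single asset's return with the spread of the average return of a portfolio of many independent assets with the same distribution. By the Law of Large Numbers, the portfolio average clusters much more tightly around the expected return.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

n_scenarios = 10_000   # number of simulated market scenarios
mean_return = 0.05     # assumed expected annual return per asset (5%)
volatility = 0.20      # assumed standard deviation per asset (20%)

for n_assets in (1, 10, 100):
    # Independent normal returns for each asset in each scenario (illustrative model).
    returns = rng.normal(mean_return, volatility, size=(n_scenarios, n_assets))
    portfolio_return = returns.mean(axis=1)  # equally weighted portfolio
    print(f"{n_assets:>3} assets: mean = {portfolio_return.mean():.4f}, "
          f"std of portfolio return = {portfolio_return.std():.4f}")
```

The mean stays near 5% in every case, while the standard deviation of the portfolio's return falls roughly like $1/\sqrt{n}$ as the number of independent assets grows, which is the diversification effect the LLN (together with the CLT) explains.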
## Solved Examples on Law of Large Numbers

Example 1: A fair six-sided die is rolled repeatedly. What is the expected average value of the outcomes as the number of rolls increases?

Solution:

Each face 1 through 6 is equally likely, so the expected value of a single roll is

E(X) = (1 + 2 + 3 + 4 + 5 + 6)/6 = 21/6 = 3.5

By the Law of Large Numbers, as the number of rolls increases, the average of the observed outcomes converges to 3.5.
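A quick way to check this numerically (a Python/NumPy sketch added for illustration, not part of the original solution) is to simulate many rolls and watch the running average approach 3.5:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
rolls = rng.integers(1, 7, size=100_000)  # fair die: integers 1..6

for n in (10, 100, 10_000, 100_000):
    print(f"Average after {n:>6} rolls: {rolls[:n].mean():.4f}")
```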
Example 2: In a game, a player flips a fair coin. If it lands heads, the player wins $1; if it lands tails, the player loses $1. What is the expected average profit/loss for the player as the number of flips increases?

Solution:

The expected profit from a single flip is

E(X) = (0.5)(+$1) + (0.5)(-$1) = $0

By the Law of Large Numbers, as the number of flips increases, the player's average profit per flip converges to $0, so the game is fair in the long run.
Example 3: A bag contains 20 red balls and 30 blue balls. A ball is drawn from the bag, and the color is noted. The ball is then returned to the bag, and the process is repeated. What is the expected proportion of red balls drawn as the number of draws increases?

Solution:

The probability of drawing a red ball on any single draw is 20/(20 + 30) = 20/50 = 0.4.

Because the ball is replaced each time, the draws are independent and identically distributed, so by the Law of Large Numbers the observed proportion of red balls converges to 0.4 (40%) as the number of draws increases.
Example 4: A factory produces light bulbs, and historical data show that 5% of the bulbs are defective. If a random sample of bulbs is taken from the production line, what is the expected proportion of defective bulbs as the sample size increases?

Solution:

Each bulb is defective with probability 0.05, so by the Law of Large Numbers the observed proportion of defective bulbs in the sample converges to 0.05 (5%) as the sample size increases.
Example 5: In a game, a player rolls two fair six-sided dice and wins $10 if the sum of the dice is 7, and loses $5 otherwise. What is the expected average profit/loss for the player as the number of rolls increases?

Solution:

The probability that the sum of two fair dice is 7 is 6/36 = 1/6, so the probability of any other sum is 5/6. The expected profit from a single roll is

E(X) = (1/6)(+$10) + (5/6)(-$5) = $10/6 - $25/6 = -$15/6 = -$2.50

By the Law of Large Numbers, as the number of rolls increases, the player's average result per roll converges to a loss of $2.50.
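To verify this with a simulation (a Python/NumPy sketch added here for illustration; it is not part of the original solution):

```python
import numpy as np

rng = np.random.default_rng(seed=2)
n_rolls = 1_000_000

# Roll two fair dice per game.
dice = rng.integers(1, 7, size=(n_rolls, 2))
sums = dice.sum(axis=1)

# +$10 when the sum is 7, -$5 otherwise.
payoff = np.where(sums == 7, 10.0, -5.0)
print(f"Average payoff per roll over {n_rolls} games: {payoff.mean():.3f}")  # close to -2.5
```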
Example 6: A student takes multiple-choice quizzes with 5 questions, each with 4 answer choices. If the student randomly guesses the answers to all questions, what is the expected average score as the number of quizzes increases?

Solution:

Each question is answered correctly with probability 1/4, so the expected number of correct answers on a 5-question quiz is

E(X) = 5 × 1/4 = 1.25

By the Law of Large Numbers, as the number of quizzes increases, the student's average score converges to 1.25 correct answers per quiz (assuming each question is worth one point).
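The same check can be done by simulating many quizzes (a Python/NumPy sketch with illustrative parameters, not from the original article); each quiz score is a Binomial(5, 0.25) draw:

```python
import numpy as np

rng = np.random.default_rng(seed=3)
n_quizzes = 200_000

# Number of correct guesses on each 5-question quiz with success probability 1/4.
scores = rng.binomial(n=5, p=0.25, size=n_quizzes)
print(f"Average score over {n_quizzes} quizzes: {scores.mean():.4f}")  # close to 1.25
```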
## FAQs on Law of Large Numbers

### What is the law of large numbers?

The Law of Large Numbers is a theorem stating that as the number of independent trials of a random experiment increases, the average of the observed results converges to the expected value.
### What is an example of the law of large numbers?

A classic example is flipping a fair coin: after only a few flips the proportion of heads can be far from 50%, but as the number of flips grows into the hundreds or thousands, the proportion of heads gets closer and closer to 50%.
### What is the statement of the weak law of large numbers?

The Weak Law of Large Numbers states that for a sequence of independent and identically distributed random variables with finite mean, the sample average converges in probability to that mean: for any fixed margin of error, the probability that the sample average differs from the mean by more than that margin tends to zero as the number of observations grows.
### Is the law of large numbers a limit theorem?

Yes. The Law of Large Numbers is one of the fundamental limit theorems of probability: it describes the limiting behavior of sample averages as the number of observations tends to infinity. The Central Limit Theorem is its companion limit theorem, describing the distribution of the fluctuations of the sample average around that limit.
Referred: https://www.geeksforgeeks.org