# Normal Distribution

Pages: 5 (1225 words) Published: May 27, 2014

The normal distribution is one of the most widely applied mathematical concepts in statistics. Abraham de Moivre, an eighteenth-century mathematician and consultant to gamblers, noticed that as the number of events (N) increased, the distribution of outcomes approached a very smooth curve.

He reasoned that a mathematical expression for this curve would make it much easier to compute probabilities such as “60 or more heads out of 100 coin flips.” Pursuing this idea, de Moivre developed a model in which a smooth curve is drawn through the midpoints of the tops of the bars in a histogram of normally distributed data; this curve is called the “normal curve.”

One of the first applications of the normal distribution was to errors of measurement in astronomical observations. In the seventeenth century, Galileo analyzed such measurements and proposed that small errors are more likely to occur than large errors, that random errors are distributed symmetrically about zero, and that observations tend to cluster around the true value. These properties were later recognized as characteristics of the normal distribution, and the formula for the distribution, found independently by Adrain and Gauss, fit such errors well. In 1778 Laplace, a mathematician and astronomer, arrived at the same distribution. His Central Limit Theorem showed that even when the underlying distribution is not normal, the means of repeated samples drawn from it are approximately normally distributed, and the larger the sample size, the closer the distribution of the sample means is to a normal distribution. Quetelet, an astronomer, mathematician, statistician, and sociologist, was among the earliest to apply the normal distribution to human characteristics such as weight, height, and strength.

The normal distribution, also known as the Gaussian distribution, is a function that represents the distribution of a set of data as a symmetric, bell-shaped graph, often called the “bell curve.” The normal curve is the curve drawn through the midpoints of the tops of the bars in a histogram of normally distributed data; it shows the shape of a normally distributed histogram. The graph of the normal distribution depends on two parameters: the mean (µ), which locates the center, and the standard deviation (σ), which determines the height and width of the curve. The distribution is continuous for all values of x between −∞ and ∞ (−∞ < x < ∞), so no interval of real numbers has probability zero. Its probability density function is:

f(x) = (1 / (σ√(2π))) · e^(−(x − µ)² / (2σ²))

where x is a normal random variable, µ is the mean, σ is the standard deviation, π is approximately 3.1416, and e is approximately 2.7183. The value of the function is greater than zero for all possible values of x, and the total area under the curve always equals 1. The notation N(µ, σ²) means normally distributed with mean µ and variance σ².
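As a minimal sketch, the density function above can be evaluated directly with Python’s standard library (the function name `normal_pdf` is my own choice for illustration, not something from the text):

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of N(mu, sigma^2) at x, per the formula above."""
    coeff = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
    return coeff * math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

# The curve peaks at the mean; for the standard normal the peak
# height is 1/sqrt(2*pi) ≈ 0.3989.
print(normal_pdf(0.0))

# Symmetry about the mean: points equally far from mu have equal density.
print(normal_pdf(1.0) == normal_pdf(-1.0))
```

Note that this returns a density, not a probability: probabilities come from areas under the curve, which is why the whole curve integrates to 1.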

The standard normal distribution is the normal distribution that occurs when a normal random variable has a mean of zero and a standard deviation of one. A normal random variable expressed on this scale is called a standard score (z-score). The z-score represents the number of standard deviations a data value lies from the mean. To convert a value from a normal distribution to standard normal form, we use the z-score formula:

z = (x − µ) / σ

x = the value being standardized (the normal random variable).
µ = the mean of the distribution (the mean of x).
σ = the standard deviation of the distribution (the standard deviation of x).

In order to calculate the standard deviation, we have to find the variance first. The variance is the average of the squared differences from the mean; the standard deviation is the square root of the variance.
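The steps above can be sketched in Python. The data set below is a hypothetical example of my own, chosen only so the arithmetic comes out cleanly:

```python
import math

def variance(data):
    """Population variance: the average squared difference from the mean."""
    mean = sum(data) / len(data)
    return sum((v - mean) ** 2 for v in data) / len(data)

def z_score(x, mu, sigma):
    """Number of standard deviations x lies from the mean mu."""
    return (x - mu) / sigma

# Hypothetical measurements (illustration only).
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
mu = sum(data) / len(data)           # mean = 5.0
sigma = math.sqrt(variance(data))    # variance = 4.0, so sigma = 2.0

# 9 is two standard deviations above the mean, so its z-score is 2.0.
print(z_score(9.0, mu, sigma))
```

This follows the order described in the text: compute the variance first, take its square root to get the standard deviation, and only then standardize a value with the z-score formula.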