# Computational Efficiency of the Polar and Box-Muller Methods: A Monte Carlo Application

Published: July 18, 2012
CATEGORY: APPLIED MATHEMATICS

Computational Efficiency of Box-Muller and Polar Method
Using Monte-Carlo Application

by: Joy V. Lorin-Picar
Mathematics Department
Davao del Norte State College, New Visayas, Panabo City
picar_joy@yahoo.com

ABSTRACT
The mean square error (MSE) of the random normal variables generated by the Marsaglia polar method and the Box-Muller method was examined for small and large n, using a Monte Carlo application in MATLAB. The empirical results showed that the MSE of the random normal variables generated by the Marsaglia polar method approaches zero as n becomes larger.

Moreover, when run in MATLAB, the Box-Muller method encountered several problems: a) it is slow in generating its MSE because of the many calls it makes to the math library; b) it has numerical stability problems when x1 is very close to zero; and, as a consequence of b, serious problems arise as n becomes large, for instance in stochastic modeling that generates millions of numbers. The polar method computes the MSE faster even when n is large, since it does the equivalent of the sine and cosine geometrically, without calls to the trigonometric function library.

Keywords: Mean Square Error (MSE), Marsaglia Polar Method, Box-Muller Method, Monte-Carlo application

A. INTRODUCTION
The problem of generating Gaussian pseudo-random numbers from a source of uniform pseudo-random numbers comes up frequently. There are many ways of solving it, but this paper focuses on the Box-Muller and Marsaglia polar methods.

If we have an equation that describes our desired distribution function, then it is possible to use some mathematical manipulations, based on the fundamental transformation law of probabilities, to obtain a transformation function between distributions. This transformation takes random variables from one distribution as inputs and outputs random variables in a new distribution. One of the most important transformation functions is the Box-Muller transformation, which allows us to transform uniformly distributed random variables into a new set of random variables with a Gaussian (or normal) distribution. The other is the Marsaglia polar method, which produces a pair of independent standard normal variates by radially projecting a random point on the unit circumference to a distance given by the square root of a chi-square-2 variate.
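The rejection step behind the polar method can be sketched as follows. This is a minimal Python illustration (the study itself used MATLAB, and the function name is ours): a point is drawn uniformly in the square, rejected unless it falls inside the unit circle, and then rescaled so that x/√s and y/√s play the roles of cos θ and sin θ without any trigonometric calls.

```python
import math
import random

def marsaglia_polar():
    """Return a pair of independent standard normal variates
    via the Marsaglia polar method (no trigonometric calls)."""
    while True:
        # Uniform point in the square [-1, 1] x [-1, 1]
        x = random.uniform(-1.0, 1.0)
        y = random.uniform(-1.0, 1.0)
        s = x * x + y * y
        # Accept only points strictly inside the unit circle,
        # excluding the origin (log(0) and division by 0)
        if 0.0 < s < 1.0:
            # sqrt(-2 ln s / s) combines the chi-square-2 radius with
            # the normalization x/sqrt(s), y/sqrt(s) = cos, sin
            factor = math.sqrt(-2.0 * math.log(s) / s)
            return x * factor, y * factor
```

On average about 1 − π/4 ≈ 21% of candidate points are rejected, the price paid for avoiding the sine and cosine calls.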

This paper looks into the efficiency of both methods in generating independent standard normal variates by evaluating their mean square errors as n gets large. It makes use of the Monte Carlo method, implemented in MATLAB.

B. BOX-MULLER METHOD

The Box–Muller transform (George Edward Pelham Box and Mervin Edgar Muller, 1958)[3] is a method of generating pairs of independent, standard normally distributed (zero expectation, unit variance) random numbers, given a source of uniformly distributed random numbers.

The Box–Muller transform was developed to be computationally efficient.[4] It is commonly expressed in two forms. The basic form, as given by Box and Muller, takes two samples from the uniform distribution on the interval (0, 1] and maps them to two normally distributed samples. The basic form requires three multiplications, one logarithm, one square root, and one trigonometric function for each normal variate.[5] The most basic form of the transformation is

$$ z_1 = R\cos\theta = \sqrt{-2\ln x_1}\,\cos(2\pi x_2), \qquad z_2 = R\sin\theta = \sqrt{-2\ln x_1}\,\sin(2\pi x_2), $$

which is called the Box–Muller transform, in which the chi variate is generated as

$$ R = \sqrt{-2\ln x_1}, \qquad \theta = 2\pi x_2, $$
but that transform requires logarithm, square root, sine, and cosine functions. It starts with two independent random numbers, x1 and x2, drawn from a uniform distribution on (0, 1]. Applying the transformations above yields two new independent random numbers with a Gaussian distribution of zero mean and a standard deviation of one.
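The basic transform can be sketched in a few lines of Python (again an illustrative sketch rather than the paper's MATLAB code; the function name is ours). Note the care taken to keep x1 away from zero, where the logarithm blows up, the stability issue mentioned in the abstract.

```python
import math
import random

def box_muller():
    """Return a pair of independent standard normal variates
    via the basic Box-Muller transform."""
    # random() lies in [0, 1), so 1 - random() lies in (0, 1]:
    # this keeps x1 away from 0, where log(x1) is undefined
    x1 = 1.0 - random.random()
    x2 = random.random()
    r = math.sqrt(-2.0 * math.log(x1))  # chi variate R
    theta = 2.0 * math.pi * x2
    # One cosine and one sine call per pair of variates
    return r * math.cos(theta), r * math.sin(theta)
```

Each call consumes two uniforms and makes one logarithm, one square root, and two trigonometric library calls, which is exactly the cost the polar method avoids.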

Suppose U1 and U2 are independent random variables...