Re: Assign a generic name to a function based on user decision

On Jun 3, 4:02 am, n...@xxxxxxxxx wrote:
In article <20090603005538.22093.31086....@xxxxxxxxxxxxxxxxxxxxxxxxx>,

Lurkos  <lurkos.use...@xxxxxxxxx> wrote:

Anyone who attempts to generate random numbers with a Gaussian
distribution by using ERF on ones with a uniform one lacks Clue.

That ain't how you do it ....

I think you are right, but this is the first time I have written a Monte
Carlo code, and it is also the first time I have written a large program.
That's why I thought of generating a Gaussian distribution using ERF: it
seemed the simplest way when I wrote the equations down on paper...

Ah!  Yes, it does.  It isn't.

The standard reference is Knuth "The Art of Computer Programming",
volume II, Seminumerical algorithms.  The best one I know of is
Devroye, but you (as a beginner) will find it overkill.  If you
can't get Knuth, look at:

Use the polar form.  Start with two U(-1,1) variates, X and Y,
and reject them (i.e. try again) if X^2+Y^2 >= 1.  Then use the
formulae given.  You get two pseudo-independent Gaussian
random numbers for the cost of generating two uniform ones,
one square root and one natural logarithm.

There are dozens of other methods (many unpublished - I have
invented several, as have most people in this area), but the
Box-Muller one is the simplest and needs only very basic
elementary functions.

Nick Maclaren.

Other than a lack of computing efficiency, have you actually seen
problems with the SIN and COS form of the Box-Muller transform? I
remember two contemporary papers, one by Neave and one slightly later
by Chay ('73?) that discussed problems in the theoretical distribution
of the trig form, but in practice it really boiled down to using an
LCG with too small a multiplier.

[For the record I do use the non-trig polar form. It's easy to
remember, easy to code and it requires no extra tables.]

-- Elliot

P.S. Back when I had my first personal computer, I was overly
concerned with micro-efficiency. I spent quite a bit of time writing
and testing a 32 bit LCG that ran as quickly as possible on an 8 bit
processor with no hardware multiply. That did me very little good as
software floating point multiply and divide each ran at 1,000 floating
point operations per second. In the end I discovered that my program
spent most of its time doing things other than generating random
numbers.