How to make a precalculated lookup table for the normal CDF?

  • Thread starter: Naucle
Hello,

I have a program that runs the normal CDF function up to 240 000 000 times, which makes it a little bit slow. I have been advised to precalculate a lookup table and to interpolate from it, but I'm just a novice in C++ and I have no idea how to do that (Google couldn't help me either). Do you people have any useful links?

Thank you.
 
You can just use a straight C++ map.

(Pseudocode)
map<double, double> n_map;

double min = -10.0;
double max = 10.0;
double granularity = 0.05;

for (double x = min; x < max; x += granularity) {
    n_map[x] = normal_cdf(x);
}

(Note the loop variable must be a floating-point type: with an int, `x += 0.05` truncates back to the same integer and the loop never advances.)

If you want to avoid using a map & the memory for it, you will probably need something cuter.
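For the "something cuter" part: since the grid is uniformly spaced, a flat std::vector is easier than a map, because the index can be computed directly and linear interpolation between neighbours falls out naturally. A minimal sketch, assuming a C++11 compiler (the `CdfTable` name is illustrative; `std::erfc` is the standard complementary error function, used here as the exact reference CDF):

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Exact standard normal CDF via the C++11 error function.
double normal_cdf(double x) {
    return 0.5 * std::erfc(-x / std::sqrt(2.0));
}

// Uniformly spaced table over [min, max]; index arithmetic replaces map lookup.
struct CdfTable {
    double min, step;
    std::vector<double> values;

    CdfTable(double min_, double max_, double step_) : min(min_), step(step_) {
        std::size_t n = static_cast<std::size_t>((max_ - min_) / step_) + 1;
        values.reserve(n);
        for (std::size_t i = 0; i < n; ++i)
            values.push_back(normal_cdf(min + i * step));
    }

    // Linear interpolation between the two nearest precomputed points;
    // inputs outside the table clamp to the end values.
    double operator()(double x) const {
        if (x <= min) return values.front();
        double pos = (x - min) / step;
        std::size_t i = static_cast<std::size_t>(pos);
        if (i + 1 >= values.size()) return values.back();
        double frac = pos - i;
        return values[i] * (1.0 - frac) + values[i + 1] * frac;
    }
};
```

With step 0.05 the table is only 401 doubles, and each query costs one division, two multiplies, and two array reads, with no tree traversal or hashing.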
 
I do not know what "a precalculated lookup table for normal CDF" is — could you please post a link that describes what you are talking about?
Thanks!
 

That's what I did actually, but it's still tricky to interpolate and the precision is not that good.
 
Well, here are a few other options:

1) Find a really fast implementation of the CDF function.
2) Create your own CDF function with memoization / approximation, etc. (so if the input is within some tolerance, 0.001 for example, of one already computed, do not recalculate).

You will need some notion of tolerance either way, unless you plan on recalculating every time...
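To make option 2 concrete, one possible memoization sketch with a tolerance bucket: inputs are rounded to the nearest multiple of the tolerance, so all inputs within half a bucket of each other share one cached value. The `MemoCdf` name is made up for illustration; `std::erfc` and `std::unordered_map` are standard C++11:

```cpp
#include <cmath>
#include <unordered_map>

// Exact standard normal CDF via the C++11 error function.
double normal_cdf(double x) {
    return 0.5 * std::erfc(-x / std::sqrt(2.0));
}

// Memoized wrapper: inputs within the same tolerance bucket share one cached value.
class MemoCdf {
    double tol;
    std::unordered_map<long long, double> cache;
public:
    explicit MemoCdf(double tol_) : tol(tol_) {}

    double operator()(double x) {
        // Round x to the nearest bucket; this is where the tolerance comes in.
        long long key = std::llround(x / tol);
        auto it = cache.find(key);
        if (it != cache.end()) return it->second;  // cache hit: no recalculation
        double v = normal_cdf(key * tol);          // evaluate at the bucket centre
        cache.emplace(key, v);
        return v;
    }
};
```

Whether this beats the plain table depends on the input distribution: if the 240 million calls cluster around a few values, the cache stays small and hit rates are high; if they are spread over the whole range, the precomputed table above is the better fit.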
 