I'm trying to calculate the cumulative binomial probability for 'n' trials, each with success probability 'p', of observing up to and including 'r' successes. I have written the following code, which works sometimes, but not always:
Console.WriteLine ();
Console.WriteLine ("B~(n, p)");

incorrectN:
Console.WriteLine ("Enter value of 'n': ");
int n = Convert.ToInt32 (Console.ReadLine ());
if (n < 0)
{
    Console.WriteLine ("ERROR: 'n' must be non-negative");
    goto incorrectN;
}

incorrectP:
Console.WriteLine ();
Console.WriteLine ("Enter value of 'p': ");
double p = Convert.ToDouble (Console.ReadLine ());
if (p < 0 || p > 1)
{
    Console.WriteLine ();
    Console.WriteLine ("ERROR: 'p' must be between 0 and 1");
    goto incorrectP;
}

Console.WriteLine ();

incorrectS:
int r = GetR();
int k = r;
double binomTotal = 0;

// Sum P(X = 0) up to P(X = r); (r - k) runs from 0 to r as k counts down.
for (int j = r + 1; j > 0; j--)
{
    int nCr = Factorial(n) / (Factorial(n - (r - k)) * Factorial(r - k));
    binomTotal = binomTotal + nCr * Math.Pow(p, (r - k)) * Math.Pow(1 - p, (n - (r - k)));
    k--;
}

Console.WriteLine();
Console.WriteLine(binomTotal);

P.S. I have written the GetR() and Factorial() functions elsewhere in the same class; GetR() asks the user for the value of 'r', and Factorial() is defined as follows:
public static int Factorial(int x)
{
    return x <= 1 ? 1 : x * Factorial(x - 1);
}

I tested the code with n = 10, p = 0.5 and r = 5, and the output is 0.623046875, which is correct. However, with n = 13, p = 0.35 and r = 7, I get 0.297403640622647 instead of the expected 0.9538.
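For reference, here is a small sketch I used to confirm what the answer should be. It computes the same cumulative sum, but builds C(n, k) incrementally in double arithmetic rather than from integer factorials; the BinomialCdf name is just for this sketch and is not part of my program:

```csharp
using System;

class BinomialCheck
{
    // Cumulative binomial probability P(X <= r) for X ~ B(n, p).
    static double BinomialCdf(int n, double p, int r)
    {
        double total = 0;
        for (int k = 0; k <= r; k++)
        {
            // Build C(n, k) as a double via C(n, k) = C(n, k-1) * (n - k + 1) / k,
            // so no integer factorials are ever formed.
            double nCk = 1;
            for (int i = 1; i <= k; i++)
                nCk = nCk * (n - i + 1) / i;

            total += nCk * Math.Pow(p, k) * Math.Pow(1 - p, n - k);
        }
        return total;
    }

    static void Main()
    {
        Console.WriteLine(BinomialCdf(10, 0.5, 5));   // 0.623046875
        Console.WriteLine(BinomialCdf(13, 0.35, 7));  // ~0.9538
    }
}
```

This sketch agrees with both of the expected answers above, so the discrepancy must be in how my original code evaluates the terms, not in the formula itself.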
Any help would be much appreciated.