aL_eX

I'm trying to calculate the cumulative binomial probability of at most 'r' successes in 'n' trials, where each trial succeeds with probability 'p'. I have written the following code, which works sometimes, but not always:

Console.WriteLine ();
Console.WriteLine ("B~(n, p)");

incorrectN:
Console.WriteLine ("Enter value of 'n': ");
int n = Convert.ToInt32 (Console.ReadLine ());
if (n < 0)
{
    Console.WriteLine ("ERROR: 'n' must be greater than 0");
    goto incorrectN;
}

incorrectP:
Console.WriteLine ();
Console.WriteLine ("Enter value of 'p': ");
double p = Convert.ToDouble (Console.ReadLine ());
if (p > 1)
{
    Console.WriteLine ();
    Console.WriteLine ("ERROR: 'p' must be between 0 and 1");
    goto incorrectP;
}

Console.WriteLine ();

incorrectS:
int r = GetR();
int k = r;
double binomTotal = 0;
for (int j = r + 1; j > 0; j--)
{
    int nCr = Factorial(n) / (Factorial(n - (r - k)) * Factorial(r - k));
    binomTotal = binomTotal + nCr * Math.Pow(p, (r - k)) * Math.Pow(1 - p, (n - (r - k)));
    k--;
}
Console.WriteLine();
Console.WriteLine(binomTotal);

P.S. I have written the GetR() and Factorial() functions elsewhere within the class, where GetR() asks the user for the value of 'r' and Factorial() is defined as follows:

public static int Factorial(int x)
{
    return x <= 1 ? 1 : x * Factorial(x - 1);
}

I tested the code with n = 10, p = 0.5 and r = 5, and the output is 0.623046875, which is correct. However, with n = 13, p = 0.35 and r = 7, I get 0.297403640622647 instead of the expected 0.9538.

Any help would be much appreciated.
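For reference, the failing case can be checked independently of the C# code. A likely suspect (an assumption on my part, not confirmed in the post): Factorial returns int, and 13! = 6,227,020,800 exceeds Int32.MaxValue (2,147,483,647), so the nCr division works on overflowed values once n = 13. A quick standalone check in Python (used here only because it has arbitrary-precision integers; the question's code is C#):

```python
import math

# Parameters from the failing test case: X ~ Binomial(n, p), want P(X <= r).
n, p, r = 13, 0.35, 7

# 13! no longer fits in a signed 32-bit int, which would corrupt any
# int-based Factorial/nCr arithmetic in the C# version.
print(math.factorial(13))               # 6227020800
print(math.factorial(13) > 2**31 - 1)   # True

# Correct cumulative probability, summing the binomial pmf directly.
total = sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(r + 1))
print(round(total, 4))  # 0.9538
```

This reproduces the 0.9538 the question expects, consistent with the bug being in the integer arithmetic rather than in the summation logic.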
