I have a problem showing the decimals in the average: it keeps showing .00 or .1. I tried declaring it as a double, but I still get the same result. Also, I want to include the first integer that I input in the sum, but I have no idea how. Please help:
import java.io.*;
import java.util.*;

public class WhileSentinelSum
{
    static final int SENTINEL = -999;

    public static void main(String[] args)
    {
        // Keyboard Initialization
        Scanner kbin = new Scanner(System.in);

        // Variable Initialization and Declaration
        int number, counter;
        int sum = 0;
        int average;

        // Input
        System.out.print("Enter an integer number: ");
        number = kbin.nextInt();
        counter = 0;

        // Processing
        // Output
        while (number != SENTINEL)
        {
            counter++;
            System.out.print("Enter an integer number: ");
            number = kbin.nextInt();
            sum += number;
        }

        if (counter != 0)
            System.out.println("\nCounter is: " + counter);
        else
            System.out.println("no input");

        average = sum / counter;
        System.out.println("The sum is " + sum);
        System.out.println("The average is " + average);
        System.out.println();
    } // end main
} // end class
Even if you declared average to be double, the sum and counter variables are int, so Java still uses integer division before assigning the result to average; e.g. 5 / 10 results in 0 instead of 0.5.
Cast one of the variables to double before the division to force a floating-point operation:
double average;
...
average = (double) sum / counter;
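As a quick sanity check (the values below are made up just to illustrate the difference), you can compare the two forms directly:

int sum = 7, counter = 2;                    // example values for illustration only
System.out.println(sum / counter);           // prints 3   -- integer division truncates
System.out.println((double) sum / counter);  // prints 3.5 -- the cast forces floating-point division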
You can include the first number in your calculation by processing it before you go into the while loop (if it's not the sentinel value).
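For example, here is a minimal sketch of the loop restructured that way, reusing the names from your program (SENTINEL, sum, counter); the class name is changed only to keep it distinct from your original:

import java.util.Scanner;

public class WhileSentinelSumFixed
{
    static final int SENTINEL = -999;

    public static void main(String[] args)
    {
        Scanner kbin = new Scanner(System.in);
        int sum = 0;
        int counter = 0;

        System.out.print("Enter an integer number: ");
        int number = kbin.nextInt();

        // Add each value (including the very first one) before reading the next,
        // so the first input is counted and the sentinel is never added to the sum.
        while (number != SENTINEL)
        {
            sum += number;
            counter++;
            System.out.print("Enter an integer number: ");
            number = kbin.nextInt();
        }

        if (counter != 0)
        {
            double average = (double) sum / counter; // cast forces floating-point division
            System.out.println("The sum is " + sum);
            System.out.println("The average is " + average);
        }
        else
        {
            System.out.println("no input");
        }
    } // end main
} // end class

Adding each value before reading the next one also keeps the sentinel itself out of the sum, and dividing only inside the counter != 0 branch avoids a division by zero when the very first input is the sentinel.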