What is the difference between the #.## and ##.## patterns in DecimalFormat?

Gokul Nath KP · Apr 24, 2014 · Viewed 13.5k times

In my program I'm using the pattern #.## in DecimalFormat as shown below:

DecimalFormat df = new DecimalFormat("#.##"); 

By mistake I added an extra # as shown below:

DecimalFormat df = new DecimalFormat("##.##");

But this does not affect my output. I've tried different combinations of inputs, and there is no difference in the output. I tried Googling, but found no proper explanation.

So what is the exact difference between "#.##" and "##.##"?

If both are the same, why does Java allow the extra #?

If both are different, why is the output the same in both cases?

EDIT:

Sample program:

import java.text.DecimalFormat;

public class Decimals {

    public static void main(String[] args) {

        double[] d1 = {100d, -1d, -0.111111d, 2.555666d, 55555555555d, 0d};

        DecimalFormat df1 = new DecimalFormat("#.##");
        DecimalFormat df2 = new DecimalFormat("##.##");

        for (double d : d1) {
            System.out.println(df1.format(d));
            System.out.println(df2.format(d));
        }
    }

}

Output:

100
100
-1
-1
-0.11
-0.11
2.56
2.56
55555555555
55555555555
0
0

Answer

Tristan Burnside · Apr 24, 2014

According to the DecimalFormat JavaDoc, # means "Digit, zero shows as absent".

So # is an optional digit placeholder: it shows a digit only when one is actually there, and never pads with a leading zero. Since DecimalFormat already prints as many integer digits as the value needs, an extra # before the decimal separator adds nothing, and the two patterns give the same results.
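
To make the # versus 0 behaviour concrete, here is a minimal sketch (the class name HashVersusZero and the sample value are mine; the expected output is noted in the comments):

import java.text.DecimalFormat;

public class HashVersusZero {

    public static void main(String[] args) {
        // '#' is an optional digit: a zero integer part is simply dropped.
        System.out.println(new DecimalFormat("#.##").format(0.25));  // prints ".25"

        // '0' is a mandatory digit: the zero is printed even though absent.
        System.out.println(new DecimalFormat("0.##").format(0.25));  // prints "0.25"

        // An extra '#' before the decimal separator changes nothing, because
        // the integer part already expands to fit any value.
        System.out.println(new DecimalFormat("##.##").format(0.25)); // prints ".25"
    }
}

This is also why the questioner's 55555555555d prints in full under "#.##": the # placeholders limit only the fraction digits, never the integer digits.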

Edit: In fact the extra # is simply discarded when the pattern is applied; both patterns normalize to the same format (no required integer digits, at most two fraction digits), as the sketch below shows.
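
You can check the normalization with toPattern(), which reports the pattern DecimalFormat actually stored (a quick sketch; in my reading of the JDK source, both calls should print the same canonical pattern):

import java.text.DecimalFormat;

public class NormalizedPattern {

    public static void main(String[] args) {
        // Both patterns parse to the same internal state (no minimum integer
        // digits, at most two fraction digits), so toPattern() reports the
        // same canonical pattern for both.
        System.out.println(new DecimalFormat("#.##").toPattern());  // "#.##"
        System.out.println(new DecimalFormat("##.##").toPattern()); // "#.##"
    }
}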