How do I interpret precision and scale of a number in a database?

mezoid · Mar 4, 2010 · Viewed 290.4k times

I have the following column specified in a database: decimal(5,2)

How does one interpret this?

According to the column's properties as viewed in SQL Server Management Studio, I can see that it means: decimal(Numeric precision, Numeric scale).

What do precision and scale mean in real terms?

It would be easy to interpret this as a decimal with 5 digits before the decimal point and two decimal places, i.e. 12345.12

P.S. I've been able to determine the correct answer from a colleague but had great difficulty finding an answer online. As such, I'd like to have the question and answer documented here on stackoverflow for future reference.

Answer

mezoid · Mar 4, 2010

Numeric precision refers to the maximum total number of digits in the number, counting both sides of the decimal point.

e.g. 1234567.89 has a precision of 9.

Numeric scale refers to the maximum number of digits to the right of the decimal point.

e.g. 123456.789 has a scale of 3.

Thus the maximum allowed value for decimal(5,2) is 999.99: five digits in total, two of which are after the decimal point.
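
You can confirm this yourself. Here is a minimal T-SQL sketch (the temp table and column names are made up for illustration) showing which values a decimal(5,2) column will and won't accept:

    -- Sketch: decimal(5,2) holds at most 5 digits in total, 2 of them after the decimal point.
    CREATE TABLE #PriceDemo
    (
        Price decimal(5, 2)
    );

    INSERT INTO #PriceDemo (Price) VALUES (999.99);   -- OK: the largest value that fits
    INSERT INTO #PriceDemo (Price) VALUES (-999.99);  -- OK: the smallest value that fits
    INSERT INTO #PriceDemo (Price) VALUES (123.456);  -- stored as 123.46 (extra decimal places are rounded)
    INSERT INTO #PriceDemo (Price) VALUES (1000.00);  -- fails with an arithmetic overflow error

    DROP TABLE #PriceDemo;

In general, the largest value a decimal(p,s) column can hold is 10^(p-s) - 10^(-s), so decimal(5,2) tops out at 1000 - 0.01 = 999.99.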