Difference between DECIMAL and NUMERIC

leeeroy · Dec 3, 2009

What's the difference between the SQL data types NUMERIC and DECIMAL? If databases treat these differently, I'd like to know how, for at least:

  • SQL Server
  • Oracle
  • DB2
  • MySQL
  • PostgreSQL

Furthermore, are there any differences in how database drivers interpret these types?

Answer

David · Dec 3, 2009

They are the same for almost all purposes.

At one time, different vendors used the two names (NUMERIC/DECIMAL) for almost the same thing. SQL-92 made them the same, with one minor difference that can be vendor-specific:

NUMERIC must be exactly as precise as it is defined: if you define 4 digits before the decimal point and 4 digits after it, the DB must always store exactly 4 + 4 digits, no more, no less.

DECIMAL is free to allow more precision if that's easier to implement: the database may reserve storage for more digits than you specified, and so may accept larger values than the declared precision allows. In the 4 + 4 example above, the database might allow storing 12345.0000, but storing 1.00005 is still not allowed, because the scale (the number of digits after the decimal point) is fixed by the standard for both types.
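A minimal SQL sketch of that distinction, using an invented table (the names price_demo, strict_val, and loose_val are made up for this example). On most real systems the two columns behave identically, so the overflowing INSERT usually fails on both:

    -- Hypothetical table illustrating the SQL-92 wording.
    CREATE TABLE price_demo (
        strict_val NUMERIC(8, 4),  -- exactly 8 total digits, 4 after the point
        loose_val  DECIMAL(8, 4)   -- at least 8 total digits, 4 after the point
    );

    -- Fits both columns: 4 digits before the point, 4 after.
    INSERT INTO price_demo (strict_val, loose_val)
    VALUES (1234.5678, 1234.5678);

    -- 5 digits before the point: must overflow NUMERIC(8,4); a conforming
    -- DECIMAL(8,4) may accept it if the implementation reserved a larger
    -- underlying precision. Most systems reject it for both types.
    INSERT INTO price_demo (loose_val) VALUES (12345.0000);

    -- 5 digits after the point: exceeds the declared scale, which the
    -- standard fixes for both types; the value is rejected or rounded
    -- to 4 places, depending on the system.
    INSERT INTO price_demo (strict_val) VALUES (1.00005);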

Most current database systems treat DECIMAL and NUMERIC either as perfect synonyms or as two distinct types with exactly the same behavior. If the types are considered distinct at all, you might not be able to define a foreign key constraint on a DECIMAL column that references a NUMERIC column, or vice versa.
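A quick way to probe how your own database behaves is a cross-type foreign key like the sketch below (the tables parent and child are invented for this example). On PostgreSQL, where DECIMAL is simply an alias for NUMERIC, this succeeds:

    -- Invented tables to test whether DECIMAL can reference NUMERIC.
    CREATE TABLE parent (
        id NUMERIC(10, 0) PRIMARY KEY
    );

    -- Accepted on systems that treat the types as synonyms; a system
    -- that considers them distinct could reject the foreign key because
    -- the column types do not match exactly.
    CREATE TABLE child (
        parent_id DECIMAL(10, 0) REFERENCES parent (id)
    );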