I'm curious whether there is a real difference between the money datatype and something like decimal(19,4) (which is what money uses internally, I believe).

I'm aware that money is specific to SQL Server. I want to know if there is a compelling reason to choose one over the other; most SQL Server samples (e.g. the AdventureWorks database) use money and not decimal for things like price information.

Should I just continue to use the money datatype, or is there a benefit to using decimal instead? Money is fewer characters to type, but that's not a valid reason :)
Never ever should you use money. It is not precise, because intermediate results of money arithmetic are forced back into money's fixed four decimal places, and it is pure garbage; always use decimal/numeric.
Run this to see what I mean:
DECLARE
    @mon1 MONEY, @mon2 MONEY, @mon3 MONEY, @mon4 MONEY,
    @num1 DECIMAL(19,4), @num2 DECIMAL(19,4), @num3 DECIMAL(19,4), @num4 DECIMAL(19,4);

-- identical starting values for both types
SELECT
    @mon1 = 100, @mon2 = 339, @mon3 = 10000,
    @num1 = 100, @num2 = 339, @num3 = 10000;

-- same arithmetic, different types
SET @mon4 = @mon1 / @mon2 * @mon3;  -- money/money keeps only 4 decimal places in the quotient
SET @num4 = @num1 / @num2 * @num3;  -- the decimal intermediate keeps its extra precision

SELECT @mon4 AS moneyresult,
       @num4 AS numericresult;
Output:

moneyresult  numericresult
2949.0000    2949.8525
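The damage is done at the division step: money divided by money yields money, so everything past the quotient's fourth decimal place is lost before the multiplication ever happens, while decimal division widens the result's scale. A minimal check of just the intermediate value, using the same numbers as above:

DECLARE @monq MONEY = 100, @numq DECIMAL(19,4) = 100;

SELECT @monq / CAST(339 AS MONEY) AS money_quotient,            -- 0.2949
       @numq / CAST(339 AS DECIMAL(19,4)) AS decimal_quotient;  -- 0.2949852507...

Multiply 0.2949 by 10000 and you get exactly the 2949.0000 shown above.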
To some of the people who said that you don't divide money by money: here is one of my queries, which calculates correlations, and changing the monret column to money gives wrong results.
select t1.index_id, t2.index_id,
       (avg(t1.monret * t2.monret) - (avg(t1.monret) * avg(t2.monret)))
       / ((sqrt(avg(square(t1.monret)) - square(avg(t1.monret))))
        * (sqrt(avg(square(t2.monret)) - square(avg(t2.monret))))),
       current_timestamp, @MaxDate
from Table1 t1
join Table1 t2 on t1.Date = t2.Date
group by t1.index_id, t2.index_id
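If you're stuck with money columns (a legacy schema, a vendor database), the usual workaround is to cast to decimal before the arithmetic so intermediate results aren't squashed to four decimal places. A quick sketch, reusing the values from the first example:

DECLARE @mon1 MONEY = 100, @mon2 MONEY = 339, @mon3 MONEY = 10000;

-- decimal has higher type precedence than money, so once one operand
-- is cast to decimal the whole expression is evaluated as decimal
SELECT @mon1 / @mon2 * @mon3 AS money_math,                        -- 2949.0000
       CAST(@mon1 AS DECIMAL(19,4)) / @mon2 * @mon3 AS cast_math;  -- 2949.8525...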