My fellow programmer has a strange requirement from his team leader: the team leader insists on creating varchar columns with a length of 16*2^n.
What is the point of such a restriction?
I can suppose that short strings (less than 128 chars, for example) are stored directly in the record of the table, and from this point of view the restriction would help to align fields within the record, while larger strings are stored in the database "heap" and only a reference to the string is kept in the table record.
Is it so?
Does this requirement have a reasonable background?
BTW, the DBMS is SQL Server 2008.
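To make the requirement concrete, a table under this rule might look like the following sketch (table and column names are invented for illustration): every varchar column gets a declared length of 16*2^n.

```sql
-- Hypothetical schema following the team leader's rule:
-- every varchar length is 16 * 2^n.
CREATE TABLE dbo.Customer
(
    id        int IDENTITY PRIMARY KEY,
    code      varchar(16)  NOT NULL,  -- 16 * 2^0
    city      varchar(32)  NOT NULL,  -- 16 * 2^1
    street    varchar(64)  NULL,      -- 16 * 2^2
    full_name varchar(128) NOT NULL   -- 16 * 2^3
);
```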
Completely pointless restriction as far as I can see. Assuming standard FixedVar format (as opposed to the formats used with row/page compression or sparse columns), and assuming you are talking about varchar(1-8000) columns:
All varchar data is stored at the end of the row in a variable-length section (or in off-row pages if it can't fit in row). The amount of space it consumes in that section (and whether or not it ends up off row) is entirely dependent upon the length of the actual data, not the column declaration.
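You can see this for yourself with a small experiment (a minimal sketch; the table names here are made up): two heaps with different declared lengths holding identical data report the same stored byte count and the same average record size.

```sql
-- Same 20-character value stored under two very different declarations.
CREATE TABLE dbo.Narrow (val varchar(32)   NOT NULL);
CREATE TABLE dbo.Wide   (val varchar(8000) NOT NULL);

INSERT INTO dbo.Narrow (val) VALUES (REPLICATE('x', 20));
INSERT INTO dbo.Wide   (val) VALUES (REPLICATE('x', 20));

-- DATALENGTH reports the bytes actually stored, not the declared maximum.
SELECT DATALENGTH(val) FROM dbo.Narrow;  -- 20
SELECT DATALENGTH(val) FROM dbo.Wide;    -- 20

-- The on-disk record size is the same for both tables.
SELECT OBJECT_NAME(object_id) AS table_name, avg_record_size_in_bytes
FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID('dbo.Narrow'), NULL, NULL, 'DETAILED')
UNION ALL
SELECT OBJECT_NAME(object_id), avg_record_size_in_bytes
FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID('dbo.Wide'), NULL, NULL, 'DETAILED');
```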
SQL Server will use the length declared for the column when allocating memory (e.g. for sort operations). The assumption it makes in that case is that varchar columns will be filled to 50% of their declared size on average, so this might be a better thing to look at when choosing a size.
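A rough way to observe this effect (a sketch with hypothetical names, not a benchmark): sort the same short data stored in a varchar(100) column and in a varchar(8000) column and compare the memory grants, which are estimated from the declared length rather than the stored data.

```sql
-- Both columns hold identical 20-character strings.
CREATE TABLE dbo.GrantDemo
(
    id        int IDENTITY PRIMARY KEY,
    short_col varchar(100)  NULL,
    wide_col  varchar(8000) NULL
);

INSERT INTO dbo.GrantDemo (short_col, wide_col)
SELECT TOP (100000) REPLICATE('x', 20), REPLICATE('x', 20)
FROM sys.all_objects AS a CROSS JOIN sys.all_objects AS b;

-- Run each query with the actual execution plan enabled (or watch
-- sys.dm_exec_query_memory_grants while it runs): the sort on the
-- varchar(8000) column requests a much larger memory grant, even though
-- the data being sorted is identical.
SELECT short_col FROM dbo.GrantDemo ORDER BY short_col;
SELECT wide_col  FROM dbo.GrantDemo ORDER BY wide_col;
```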