How do I interpret precision and scale of a number in a database?

I have the following column specified in a database: decimal(5,2)

How does one interpret this?

According to the column properties shown in SQL Server Management Studio, it means: decimal(Numeric precision, Numeric scale).

What do precision and scale mean in real terms?

It would be easy to interpret this as a decimal with 5 digits and two decimal places, i.e. 12345.12.

P.S. I’ve been able to determine the correct answer from a colleague, but I had great difficulty finding one online. As such, I’d like to have the question and answer documented here on Stack Overflow for future reference.
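For future readers, the semantics can be sketched in code. In SQL Server, precision is the *total* number of digits stored and scale is how many of those fall to the right of the decimal point, so decimal(5,2) allows at most 3 digits before the point (max 999.99), not 5 as the 12345.12 reading would suggest. Below is a minimal Python sketch of that rule; the helper name `fits_decimal` is hypothetical, not part of any library:

```python
from decimal import Decimal, ROUND_HALF_UP

def fits_decimal(value, precision, scale):
    """Return True if value fits a DECIMAL(precision, scale) column.

    precision = total number of significant digits stored
    scale     = digits to the right of the decimal point
    so at most (precision - scale) digits may appear left of the point.
    """
    # Round to the column's scale first (as the database would on insert).
    q = Decimal(str(value)).quantize(
        Decimal(1).scaleb(-scale), rounding=ROUND_HALF_UP
    )
    # The rounded value must fit in (precision - scale) integer digits.
    return abs(q) < Decimal(10) ** (precision - scale)

# DECIMAL(5,2): 5 digits total, 2 after the point -> values up to 999.99
print(fits_decimal("999.99", 5, 2))    # True
print(fits_decimal("12345.12", 5, 2))  # False: needs 5 digits before the point
```

So a decimal(5,2) column stores values from -999.99 to 999.99, always with two digits after the decimal point.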
