I’m using a decimal column to store money values in a database, and today I was wondering what precision and scale to use.
Since fixed-width char columns are supposedly more efficient, I was thinking the same could be true for decimal columns. Is it?
And what precision and scale should I use? I was thinking precision 24, scale 8. Is that overkill, not enough, or about right?
This is what I’ve decided to do:
- Store the conversion rates (when applicable) in the transaction table itself, as a float
- Store the currency in the account table
- The transaction amount will be a DECIMAL(19,4)
- All calculations using a conversion rate will be handled by my application, so I keep control of rounding issues
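To illustrate the last point, here is a minimal sketch of what "handling rounding in the application" could look like, using Python's `decimal` module as an assumed application language. The 4-decimal-place quantum matches the DECIMAL(19,4) column, and `ROUND_HALF_UP` is just one possible rounding rule you might pick:

```python
from decimal import Decimal, ROUND_HALF_UP

# Quantum matching the DECIMAL(19,4) column: four decimal places.
FOUR_PLACES = Decimal("0.0001")

def convert(amount: Decimal, rate: Decimal) -> Decimal:
    """Apply a conversion rate and round explicitly, so the rounding
    rule lives in the application code rather than in the database."""
    return (amount * rate).quantize(FOUR_PLACES, rounding=ROUND_HALF_UP)

print(convert(Decimal("100.00"), Decimal("0.8567")))  # 85.6700
```

The point is that the rounding mode is stated once, in one place, instead of being left to whatever the database or driver does implicitly.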
I don’t think a float for the conversion rate is an issue, since it’s mostly for reference, and I’ll be casting it to a decimal anyway.
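One caveat worth showing about that cast (again sketched in Python, as an assumed application language): constructing a decimal directly from a float carries the float's binary expansion along, so going through its string form is a common way to get the value as displayed:

```python
from decimal import Decimal

rate = 0.8567  # float, e.g. as read from the conversion-rate column

# Direct construction exposes the float's binary representation:
# a long tail of digits rather than exactly 0.8567.
direct = Decimal(rate)

# Casting via str() yields the value as it was displayed/stored:
via_str = Decimal(str(rate))

print(direct)   # not exactly 0.8567
print(via_str)  # 0.8567
```

Since the rate is only for reference here, either form may be acceptable, but the `str()` route avoids surprising extra digits in any downstream calculation.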
Thank you all for your valuable input.