Money
Represents a monetary value.
A large proportion of the computers in this world manipulate money, so it's always puzzled me that money
isn't actually a first class data type in any mainstream programming language. The lack of a type causes
problems, the most obvious surrounding currencies. If all your calculations are done in a single currency, this
isn't a huge problem, but once you involve multiple currencies you want to avoid adding your dollars to your
yen without taking the currency differences into account. The more subtle problem is with rounding.
Monetary calculations are often rounded to the smallest currency unit. When you do this it's easy to lose
pennies (or your local equivalent) because of rounding errors.
The basic idea is to have a Money class with fields for the numeric amount and the currency. You can store
the amount as either an integral type or a fixed decimal type.
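The basic shape can be sketched like this, storing the amount as an integral count of the currency's smallest unit (the class name and fields here are a minimal sketch, not a prescribed design):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Money:
    """A monetary value: an integral amount in the currency's
    smallest unit, plus the currency it is denominated in."""
    amount: int       # minor units, e.g. 1050 means $10.50
    currency: str     # ISO 4217 code, e.g. "USD"

price = Money(1050, "USD")   # $10.50 stored as 1050 cents
```

Storing minor units in an integer sidesteps binary floating-point error; a fixed-decimal type works equally well.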
Money is a Value Object
Money needs arithmetic operations so that you can use money objects as easily as you use numbers. But arithmetic operations for money differ in important ways from arithmetic on plain numbers. Most obviously, any addition or subtraction needs to be currency-aware so you can react if you try to add together monies of different currencies.
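Currency-aware addition and subtraction might look like this (a minimal Python sketch; the method names are assumptions):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Money:
    amount: int       # minor units (e.g. cents)
    currency: str

    def _check_currency(self, other: "Money") -> None:
        # React to a mismatch instead of silently mixing units.
        if self.currency != other.currency:
            raise ValueError(
                f"cannot mix {self.currency} and {other.currency}")

    def add(self, other: "Money") -> "Money":
        self._check_currency(other)
        return Money(self.amount + other.amount, self.currency)

    def subtract(self, other: "Money") -> "Money":
        self._check_currency(other)
        return Money(self.amount - other.amount, self.currency)
```

Here a mismatch raises an exception; a richer design could instead return a multi-currency bag of monies.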
The awkward complication comes with rounding, particularly when allocating money between different
places.
Here's Matt Foemmel's simple conundrum. Suppose I have a business rule that says that I have to allocate the whole amount of a sum of money to two accounts: 70% to one and 30% to another. I have 5 cents to allocate. If I do the math I end up with 3.5 cents and 1.5 cents. Whichever way I round these I get into trouble. If I do the usual rounding to nearest then 1.5 becomes 2 and 3.5 becomes 4. So I end up gaining a penny. Rounding down gives me 4 cents and rounding up gives me 6 cents. There's no general rounding scheme I can apply to both that will avoid losing or gaining a penny.
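The arithmetic of the conundrum can be checked directly (note that Python's built-in `round` uses ties-to-even, which on 1.5 and 3.5 happens to give the same answers as the round-to-nearest described above):

```python
import math

total = 5                              # 5 cents to split 70/30
shares = [total * 0.7, total * 0.3]    # exact shares: [3.5, 1.5]

nearest = sum(round(s) for s in shares)       # 4 + 2 = 6: a penny appears
floored = sum(math.floor(s) for s in shares)  # 3 + 1 = 4: a penny vanishes
ceiled = sum(math.ceil(s) for s in shares)    # 4 + 2 = 6: a penny appears
```

No uniform rounding rule applied to both shares reproduces the original 5 cents.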
I've seen various solutions to this problem.
- Perhaps the most common is to ignore it; after all, it's only a penny here and there. However, this tends to make accountants understandably nervous.
- When allocating you always do the last allocation by subtracting from what you've allocated so far. This avoids losing pennies, but the last allocation can accumulate the rounding error from all the others.
- Allow users of a Money class to declare the rounding scheme when they call the method. This permits a programmer to say that the 70% case rounds up and the 30% rounds down. Things can get complicated when you allocate across ten accounts instead of two. You also have to remember to round. To encourage people to remember I've seen some Money classes force a rounding parameter into the multiply operation. Not only does this force the programmer to think about what rounding she needs, it also might remind her of the tests to write. However, it gets messy if you have a lot of tax calculations that all round the same way.
- My favorite solution: have an allocator function on the money. The parameter to the allocator is a list
of numbers, representing the ratio to be allocated (it would look something like
aMoney.allocate([7,3])). The allocator returns a list of monies, guaranteeing that no pennies
get dropped by scattering them across the allocated monies in a way that looks pseudo-random from
the outside. The allocator has faults: you have to remember to use it, and any precise rules about
where the pennies go are difficult to enforce.
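The allocator idea above can be sketched as a function over minor units. This simplified variant hands the leftover pennies to the earliest shares deterministically, rather than scattering them pseudo-randomly as described:

```python
def allocate(amount_cents: int, ratios: list[int]) -> list[int]:
    """Split amount_cents by the given ratios without dropping
    pennies: give each share its floor, then distribute the
    remainder one cent at a time to the first shares."""
    total = sum(ratios)
    shares = [amount_cents * r // total for r in ratios]
    remainder = amount_cents - sum(shares)
    for i in range(remainder):
        shares[i] += 1
    return shares

allocate(5, [7, 3])   # -> [4, 1]: all 5 cents accounted for
```

Because the shares are built from the original amount and then topped up from the exact remainder, the results always sum to the input, however many accounts you allocate across.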
You may want to convert from one currency to another with a method like
aMoney.convertTo(Currency.DOLLARS).
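A conversion method might look like the following sketch (the rate table, method name, and round-half-up policy are all illustrative assumptions; real code would obtain rates from a service and choose its rounding policy deliberately):

```python
from dataclasses import dataclass

# Hypothetical fixed rate table for illustration only.
RATES = {("EUR", "USD"): 1.10}

@dataclass(frozen=True)
class Money:
    amount: int       # minor units
    currency: str

    def convert_to(self, currency: str) -> "Money":
        if currency == self.currency:
            return self
        rate = RATES[(self.currency, currency)]
        # Round half up to the nearest minor unit.
        return Money(int(self.amount * rate + 0.5), currency)

Money(1000, "EUR").convert_to("USD")   # 10.00 EUR -> 11.00 USD
```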
Comparison operations allow you to sort monies.
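Comparisons can be sketched by defining ordering on the amount while refusing to compare across currencies (a minimal Python sketch, not a prescribed design):

```python
from dataclasses import dataclass
from functools import total_ordering

@total_ordering
@dataclass(frozen=True)
class Money:
    amount: int       # minor units
    currency: str

    def __lt__(self, other: "Money") -> bool:
        # Ordering across currencies is undefined without conversion.
        if self.currency != other.currency:
            raise ValueError("cannot compare different currencies")
        return self.amount < other.amount

# Sorting then works directly on lists of same-currency monies.
sorted([Money(300, "USD"), Money(100, "USD"), Money(200, "USD")])
```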