A common problem developers run into when working with currency is floating-point precision errors.
0.1 + 0.2 = 0.30000000000000004 // javascript
On small amounts this looks harmless, but at scale it becomes dangerous! Over millions of transactions these tiny inaccuracies compound and can lead to reconciliation and reporting errors.
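To see the compounding in action, here's a minimal sketch in Node.js: add ten cents a million times and the total has already drifted away from the exact answer.

```javascript
// Floating-point currency math drifts: 0.1 has no exact binary
// representation, so every addition carries a tiny rounding error.
let total = 0;
for (let i = 0; i < 1_000_000; i++) {
  total += 0.10; // add ten cents, a million times
}
console.log(total);            // close to 100000, but not exactly
console.log(total === 100000); // false
```

The drift here is tiny in absolute terms, but it is exactly the kind of error that makes ledgers fail to balance.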
So why not just use cents?
100 + 256 = 356 // R3.56 or $3.56
Simple, right?
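A minimal sketch of the cents approach: store amounts as integer cents and only convert at the display boundary. The `formatRands` helper is a hypothetical name for illustration, not part of any library.

```javascript
// Keep money as integer cents; format only when showing it to a human.
function formatRands(cents) {
  const sign = cents < 0 ? "-" : "";
  const abs = Math.abs(cents);
  // Integer division for rands, remainder for the cents part.
  return `${sign}R${Math.floor(abs / 100)}.${String(abs % 100).padStart(2, "0")}`;
}

const total = 100 + 256;         // exact integer arithmetic, no drift
console.log(total);              // 356
console.log(formatRands(total)); // "R3.56"
```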
But what about currencies with huge numbers?
Let’s take the Indonesian Rupiah as an example.
At current exchange rates, R1,000 is roughly Rp 1,000,000.
Most programming languages offer a signed 64-bit integer, which tops out at 9,223,372,036,854,775,807 (roughly 9.2 quintillion).
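One JavaScript-specific caveat worth flagging: a plain `Number` is only a safe integer up to 2^53 - 1 (about 9 quadrillion), well short of the 64-bit limit. `BigInt` covers the full range:

```javascript
// Plain Numbers lose integer precision past 2^53 - 1.
console.log(Number.MAX_SAFE_INTEGER);   // 9007199254740991

// BigInt handles the full 64-bit range (and beyond) exactly.
const int64Max = 9223372036854775807n;
console.log(int64Max);                  // 9223372036854775807n
console.log(int64Max + 1n);             // 9223372036854775808n, still exact
```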
In rand cents, that limit works out to roughly 92 quadrillion rand of transaction volume. Even a sliver of it, 1 quadrillion rand, is absurd.
To reach 1 quadrillion rand, you would need to average:
R10 trillion per year FOR 100 years.
That’s more than the entire South African GDP, every year, for a century.
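A back-of-envelope check of those numbers, done in cents with `BigInt` so nothing can overflow along the way:

```javascript
// R10 trillion per year, expressed in cents.
const perYearCents = 10_000_000_000_000n * 100n;
// Sustained for 100 years.
const centuryCents = perYearCents * 100n;
console.log(centuryCents);            // 1 quadrillion rand, in cents

// How much headroom is left before a signed 64-bit counter overflows?
const int64Max = 9223372036854775807n;
console.log(int64Max / centuryCents); // ~92x that entire century of volume
```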
“But what about performance?” Really?
If your system is processing that much transaction volume, you probably already have serious infrastructure, accounting, and scaling strategies in place. At this scale the price of RAM is not a concern.
