Precise vs Accurate in Arbitrary-Precision Arithmetic
When Math isn't accurate in code
Precise vs Accurate
So here's a simple example to get you started: punch the calculation 0.1 + 0.2
into any calculator you like, whether scientific, Google, or whatever you have handy.
What result do you get?
You should see 0.3, right?
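Before going further, try the same sum in code. Here is a minimal Python sketch; most languages behave the same way, because like Python they represent 0.1 and 0.2 as IEEE 754 double-precision floats:

```python
# The same calculation the calculator handled fine,
# evaluated with ordinary floating-point numbers.
a = 0.1 + 0.2

print(a)         # 0.30000000000000004
print(a == 0.3)  # False
```

That surprising output is what the rest of this post is about.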
The code issues