C# decimal multiplication strange behavior

decimal stores 28 or 29 significant digits: the mantissa is a 96-bit integer, so it can hold values up to 79,228,162,514,264,337,593,543,950,335 (the sign and the decimal scaling factor are stored separately).
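That maximum is exposed directly as `decimal.MaxValue`, which a quick check confirms:

```csharp
using System;

class MaxValueDemo
{
    static void Main()
    {
        // The largest value the 96-bit mantissa can represent:
        // 79,228,162,514,264,337,593,543,950,335 (29 digits)
        Console.WriteLine(decimal.MaxValue); // 79228162514264337593543950335
    }
}
```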

That means that for values whose leading digits are below about 7.9228… you can get 29 significant digits accurately, but above that you can't. That's why the results for 8 and 9 go wrong while the earlier values don't: the 29-significant-digit products starting with 8 and 9 exceed the mantissa's maximum, so they have to be rounded. In general you should only rely on 28 significant digits, to avoid odd situations like this.
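You can see the rounding kick in by keeping a 29-significant-digit input and multiplying by 8 and 9; a minimal sketch:

```csharp
using System;

class RoundingDemo
{
    static void Main()
    {
        // 29 significant digits: 1 followed by 28 ones
        decimal input = 1.1111111111111111111111111111m;

        // The exact product 8.8888...8 would need 29 significant digits
        // starting with 8, which exceeds the 96-bit mantissa, so the
        // result is rounded to 28 significant digits.
        Console.WriteLine(input * 8m); // 8.888888888888888888888888889

        // 9.9999...9 (29 nines) rounds all the way up to 10.
        Console.WriteLine(input * 9m); // 10.000000000000000000000000000
    }
}
```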

Once you reduce your original input to 28 significant figures, you’ll get the output you expect:

using System;

class Test
{
    static void Main()
    {
        var input = 1.111111111111111111111111111m; // 28 significant digits
        for (int i = 1; i < 10; i++)
        {
            decimal output = input * (decimal) i;
            Console.WriteLine(output);
        }
    }
}
