This has been discussed many times before. Binary floating point cannot represent most decimal fractions exactly (0.1, for instance, has no finite binary representation) - it doesn't matter how "simple" the number looks in decimal.
en.wikipedia.org/wiki/Floating-point_arithmetic
If you don't actually need floating point, keep your numbers as integers and convert to floating point only at the last moment, when it's really needed. (For example, multiply by 10 initially, work with the integers, and divide by 10 later.)
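A quick sketch in Python showing both the problem and the integer-scaling workaround (the scale factor of 10 here just means "work in tenths"; for currency you'd typically use 100 and work in cents):

```python
# Binary floating point cannot represent most decimal fractions
# exactly, so small errors creep into the arithmetic:
total = 0.1 + 0.2
print(total)          # 0.30000000000000004, not 0.3
print(total == 0.3)   # False

# Scaled-integer approach: represent 0.1 as 1 and 0.2 as 2
# (i.e. work in tenths). Integer arithmetic is exact.
a = 1                 # represents 0.1
b = 2                 # represents 0.2
total_scaled = a + b  # exactly 3, i.e. 0.3

# Convert back to floating point only for display:
print(total_scaled / 10)  # 0.3
```

The equality check at the end works because the rounding now happens once, at display time, instead of accumulating through every intermediate operation.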