If we multiply each number by 10000, we lose the decimal point but we don't change which number is bigger. And all we have to do to multiply by 10000 is move the decimal point 4 places to the right, appending a zero if we run out of digits: 0.002 becomes 20 and 0.0016 becomes 16 (no decimal point is needed, because there's nothing to follow it). Now it's clear that 20 is bigger than 16, so 0.002% is bigger than 0.0016%.
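The scaling trick above can be sketched in a few lines of Python. This is just an illustration, not anything from the original text; `Decimal` is used so the decimal-point shift stays exact rather than picking up floating-point noise.

```python
from decimal import Decimal

a = Decimal("0.002")   # 0.002%
b = Decimal("0.0016")  # 0.0016%

# Multiplying by 10000 shifts the decimal point 4 places right,
# turning both values into whole numbers.
shifted_a = int(a * 10000)  # 20
shifted_b = int(b * 10000)  # 16

print(shifted_a, shifted_b)  # → 20 16

# Scaling both numbers by the same positive factor preserves order.
assert (shifted_a > shifted_b) == (a > b)
```

Because both numbers are multiplied by the same positive factor, their order is preserved, which is exactly why the comparison of 20 and 16 settles the comparison of the original percentages.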