0.03 x 1,000 = 30! When you multiply a decimal by a power of ten, you move the decimal point to the right by as many places as there are zeros in the multiplier. 1,000 has 3 zeros, so the point moves 3 places: 0.03 becomes 0.3, then 3, then 30. So basically he was wrong because 0.03 times 1,000 doesn't equal 3; it equals 30. His mistake was not moving the decimal point far enough.
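If you want to double-check it, here's a tiny Python sketch (just an illustration, using the decimal module so the point shifts exactly like it does on paper, with no floating-point surprises):

from decimal import Decimal

# Shift rule check: multiplying by a power of ten moves the decimal point
# one place to the right for each zero in the multiplier.
value = Decimal("0.03")

for zeros, multiplier in ((1, 10), (2, 100), (3, 1000)):
    product = value * multiplier
    print(f"0.03 x {multiplier} = {product}  (point moved {zeros} place(s) right)")

Running it prints 0.30, 3.00, and 30.00, so multiplying by 1,000 really does give 30, not 3.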