Usually the square root of a number is much smaller than the number you start with, because you can multiply the two square roots together and get back your original number. BUT, brain quiz: why is it that when you take the square root of a decimal like 0.7, the square root comes out LARGER than the number you started with? The square root of 0.7 is about 0.84. It is very strange that a square root can be bigger than the original number.
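Here is a quick numerical check of the observation (a minimal Python sketch using math.sqrt; the specific sample values are just illustrations). It shows that multiplying 0.84 by itself does land back near 0.7, and that the same pattern holds for other decimals between 0 and 1:

```python
import math

# Check the example from the question: sqrt(0.7) is bigger than 0.7 itself.
x = 0.7
root = math.sqrt(x)
print(root)          # about 0.8367 (rounds to 0.84)
print(root * root)   # multiplying the root by itself gives back ~0.7

# The same thing happens for any decimal strictly between 0 and 1.
for x in [0.9, 0.5, 0.1, 0.01]:
    print(x, math.sqrt(x), math.sqrt(x) > x)   # prints True every time
```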