A student wants to estimate the mean score of all college students for a particular exam. First use the range rule of thumb to make a rough estimate of the standard deviation of those scores. Possible scores range from 100 to 2000. Use technology and the estimated standard deviation to determine the sample size corresponding to a 90% confidence level and a margin of error of 100 points. What isn't quite right with this exercise?
The range rule of thumb estimate for the standard deviation is __
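Below is a minimal sketch of the computation, assuming the common form of the range rule of thumb (standard deviation ≈ range / 4) and the usual sample-size formula for a mean, n = (z<sub>α/2</sub>·σ / E)², rounded up to a whole number. The variable names (`sigma`, `E`, `n`) are illustrative, not part of the original exercise.

```python
from math import ceil
from scipy.stats import norm

# Range rule of thumb: sigma is roughly the range divided by 4.
low, high = 100, 2000
sigma = (high - low) / 4                  # (2000 - 100) / 4 = 475

# Sample size for estimating a mean: n = (z_{alpha/2} * sigma / E)^2.
confidence = 0.90
E = 100                                   # margin of error, in points
z = norm.ppf(1 - (1 - confidence) / 2)    # z_{0.05} ≈ 1.645

# Round up, since the sample size must be a whole number at least this large.
n = ceil((z * sigma / E) ** 2)
print(f"sigma ≈ {sigma}, z ≈ {z:.3f}, required n = {n}")   # n = 62
```

Under these assumptions the blank would be filled with 475, and the required sample size comes out to 62. The rounding is always upward (a ceiling, not ordinary rounding), because rounding down would give a margin of error larger than the one specified.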