
Why a normal distribution would not give a good approximation to the distribution of marks

An examination is marked out of $100$. It is taken by a large number of candidates. The mean mark, for all candidates, is $72.1$ and the standard deviation is $15.2$. Give a reason why a normal distribution, with this mean and standard deviation, would not give a good approximation to the distribution of marks.

My answer: Since the standard deviation is quite large ($15.2$), the normal curve will be very widely dispersed. Hence, it is not a good approximation. Is my answer acceptable?

The maximum possible mark of $100$ is $(100 - 72.1)/15.2 \approx 1.836$ standard deviation units above the mean.

There is a probability of about $3.3\%$ that a normally distributed random variable is more than $1.836$ standard deviation units above the mean. A normal distribution with the given mean and standard deviation would therefore assign non-negligible probability to marks above $100$, which are impossible, so at the upper end of the range at least it gives a poor fit.
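
As a quick numerical check, here is a minimal sketch (assuming SciPy is available; the variable names are mine) that computes the $z$-score of the maximum mark and the tail probability the normal model places above it:

```python
from scipy.stats import norm

mean, sd, max_mark = 72.1, 15.2, 100

# z-score of the maximum possible mark
z = (max_mark - mean) / sd  # (100 - 72.1) / 15.2

# survival function: P(X > 100) under the proposed normal model
p_above_max = norm.sf(max_mark, loc=mean, scale=sd)

print(f"z = {z:.3f}")                        # z = 1.836
print(f"P(mark > 100) = {p_above_max:.4f}")  # ~0.0332, about 3.3%
```

All of that probability mass sits above the maximum attainable mark, which is exactly where the fit breaks down.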

_Remark_: Without additional detail, your proposed answer is not sufficient. If the mean were something like $55$, with a standard deviation of $15.2$, that would be compatible with the normal providing a reasonably good fit.
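
To see why, a quick check under the same SciPy assumption: with a hypothetical mean of $55$ and the same standard deviation, the normal model places almost no mass above the maximum mark.

```python
from scipy.stats import norm

# Hypothetical mean of 55 from the remark, same standard deviation
p = norm.sf(100, loc=55, scale=15.2)
print(f"P(mark > 100) = {p:.4f}")  # ~0.0015, only about 0.15%
```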
