
Which is a better measure of uncertainty: standard deviation or resolution uncertainty? Let's say I take a ton of measurements of the diameter of a marble with a 0.001" resolution micrometer, and I calculate the standard deviation of the sample set. Would the standard deviation be a better measure of uncertainty than the resolution of the instrument?

The resolution of the micrometer only tells you that you cannot resolve differences smaller than the spacing of the tick marks; it says nothing about how consistently results are recorded. Repeated measurement of an object of known length directly measures the variability of the recording process through the estimated standard deviation. That estimate is not exact: it is subject to sampling variability (i.e., it is a sample estimate of a population parameter).
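
As an illustration, here is a minimal sketch with simulated data (the true diameter, the true scatter, and the number of readings are assumptions chosen for the example, not values from the question) showing how the sample standard deviation is estimated from repeated readings and how it compares with the 0.001" resolution:

```python
import numpy as np

# Hypothetical example: simulate repeated micrometer readings of one marble.
# true_diameter, true_scatter, and n_readings are assumed for illustration.
rng = np.random.default_rng(0)
true_diameter = 0.500      # inches (assumed)
true_scatter = 0.0004      # inches, assumed scatter of the measurement process
resolution = 0.001         # inches, micrometer resolution from the question

n_readings = 50
raw = rng.normal(true_diameter, true_scatter, n_readings)
# Readings get rounded to the instrument resolution (quantization).
readings = np.round(raw / resolution) * resolution

sample_sd = readings.std(ddof=1)   # ddof=1 -> divisor n-1
print(f"sample standard deviation: {sample_sd:.5f} in")
print(f"instrument resolution:     {resolution:.5f} in")
```

The sample standard deviation reflects the combined scatter of the measurement process and the rounding to the nearest tick, whereas the resolution by itself only bounds the size of the rounding step.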

The standard deviation is a measure of accuracy that includes both measurement precision and bias in recording, whereas the instrument resolution reflects only precision, so relying on resolution alone can understate the uncertainty. The sample variance is a statistically unbiased estimate of the population variance, which does capture this overall variability.
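
On the unbiasedness point, here is a small simulation sketch (the population variance, sample size, and number of trials are arbitrary assumptions) showing that the sample variance with the n-1 divisor averages out to the population variance, while the n divisor systematically understates it:

```python
import numpy as np

# Hypothetical check: average the variance estimate over many simulated
# samples and compare it to the true (assumed) population variance.
rng = np.random.default_rng(1)
true_var = 0.0004 ** 2          # assumed population variance
n, trials = 10, 100_000

samples = rng.normal(0.0, np.sqrt(true_var), size=(trials, n))
var_unbiased = samples.var(axis=1, ddof=1).mean()   # divisor n-1
var_biased = samples.var(axis=1, ddof=0).mean()     # divisor n

print(f"true variance:       {true_var:.3e}")
print(f"mean of ddof=1 est.: {var_unbiased:.3e}")   # close to true variance
print(f"mean of ddof=0 est.: {var_biased:.3e}")     # about (n-1)/n of it
```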
