
What is the expected error of a randomly generated number? Forgive me if this question is unclear, as I'm not a mathematician. The question has come up in an industrial sensor application: I am trying to make the displayed sensor value more steady, similar to a hysteresis calculation. The following graph comes from the "Expected value" article on Wikipedia. It shows how the mean of a set of die rolls converges to its expectation as the number of rolls grows. What is the expected deviation of the average of the die rolls from their expectation, given a number of rolls? Said differently, is there a way to calculate bounds for the difference between the red and green lines on the graph below? What about for the case of a set of random integers in $[-x, x]$? A simple pointer to further reading would be great. [Figure: "Expected value" graphic from Wikipedia, showing the running mean of die rolls converging to its expectation]

The standard deviation of the sample mean is $\sigma/\sqrt{n}$, where $\sigma$ is the standard deviation of a single sample (equivalently, the variance of the sample mean is $\sigma^2/n$). By the central limit theorem, the sample mean is approximately normally distributed, with mean equal to the population mean and the standard deviation given above.
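
As a worked example for the die rolls in the question: a fair six-sided die has

$$\mu = 3.5, \qquad \sigma^2 = \frac{1}{6}\sum_{k=1}^{6}(k-3.5)^2 = \frac{35}{12}, \qquad \operatorname{SD}(\bar{X}_n) = \sqrt{\frac{35}{12\,n}} \approx \frac{1.708}{\sqrt{n}}.$$

For the question's random integers drawn uniformly from $[-x, x]$, the mean is $0$, and the variance of a discrete uniform on $2x+1$ values gives $\sigma^2 = \frac{(2x+1)^2 - 1}{12} = \frac{x(x+1)}{3}$.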

This can be used to work out the approximate probability that the sample mean lies in a given interval around the true mean.
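
Here is a minimal simulation sketch of that bound for the die-roll graph, assuming NumPy is available (1.96 is the usual two-sided 95% normal quantile, so the band $\mu \pm 1.96\,\sigma/\sqrt{n}$ should contain the running mean about 95% of the time at each $n$):

```python
import numpy as np

rng = np.random.default_rng(seed=0)

n_rolls = 1000
rolls = rng.integers(1, 7, size=n_rolls)  # fair six-sided die: values 1..6

# Running mean after each roll (the converging line on the graph).
n = np.arange(1, n_rolls + 1)
running_mean = np.cumsum(rolls) / n

# Population parameters of a single roll (see the worked variance above).
mu = 3.5
sigma = np.sqrt(35.0 / 12.0)  # ~1.708

# Approximate 95% band around the expectation: mu +/- 1.96*sigma/sqrt(n).
half_width = 1.96 * sigma / np.sqrt(n)

for k in (10, 100, 1000):
    print(f"n={k:5d}  mean={running_mean[k-1]:.3f}  "
          f"95% band=[{mu - half_width[k-1]:.3f}, {mu + half_width[k-1]:.3f}]")
```

For the $[-x, x]$ case, replace `mu` with $0$ and `sigma` with $\sqrt{x(x+1)/3}$; everything else is unchanged. Search terms for further reading: "standard error of the mean" and "central limit theorem".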
