
Mixing Randomness and Determinism

I was thinking of putting this on Philosophy SE, but maybe the terms are too technical for over there. Thinking about the probability densities of random variables, there seem to be two extremes: the uniformly distributed random variable (the 'most' random) and the constant random variable (the 'least' random), which can be thought of as having a Dirac delta function as its pdf (see: Is a constant a random variable?). The normal distribution can be seen as a midway point, with the variance going to $0$ or $\infty$ being the limiting cases. Is there any literature on this relationship between total randomness and determinism (for lack of a better word)?

Sort of, but it's not super profound. The density of the normal distribution can be written as:

$$f(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}$$

We usually consider $x$ to be the variable and $\mu$ and $\sigma$ to be the known parameters of the underlying random variable. However, it doesn't have to be that way. Think about the limit as $\sigma \rightarrow 0$. You end up with a variable whose value is almost surely the mean, do you not? That would be what many might call a constant.
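If it helps, here is a quick numerical sketch of that limit (my own illustration, not part of the argument above; it assumes NumPy and arbitrarily chosen values of $\mu$ and $\sigma$). Draws from $N(\mu, \sigma^2)$ pile up on $\mu$ as $\sigma$ shrinks:

```python
# Illustrative only: show that samples from N(mu, sigma^2) collapse onto mu
# as sigma -> 0, i.e. the random variable behaves like a constant.
import numpy as np

rng = np.random.default_rng(0)
mu = 3.0  # arbitrary mean for the demonstration

for sigma in [1.0, 0.1, 0.01, 0.001]:
    samples = rng.normal(loc=mu, scale=sigma, size=100_000)
    # As sigma shrinks, the sample spread vanishes and every draw is ~mu.
    print(f"sigma={sigma:<6} sample mean={samples.mean():.4f} "
          f"sample std={samples.std():.4f} "
          f"max |x - mu|={np.abs(samples - mu).max():.4f}")
```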

So if you consider, instead of "the" normal distribution, the family of normal distributions indexed by $\mu$ and $\sigma$, then you end up with something like what you are suggesting. Your task, then, would be to find some family of curves that, when suitably parameterized, gives you pdfs with the properties you want.
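As one concrete (and purely illustrative, assuming SciPy is available) instance of such a family: the symmetric Beta$(a, a)$ distribution on $[0, 1]$ is exactly uniform at $a = 1$ and concentrates all of its mass at $1/2$ as $a \to \infty$, so the single parameter $a$ slides you from your "most random" extreme toward a near-constant:

```python
# Illustrative only: the symmetric Beta(a, a) family on [0, 1].
#   a = 1        -> uniform distribution (the "most random" case on [0, 1])
#   a -> infinity -> mass concentrates at 1/2 (a near-constant)
from scipy.stats import beta

for a in [1, 5, 50, 500]:
    # As a grows, the density spikes at the centre and the variance shrinks.
    print(f"a={a:<4} pdf at 0.5 = {beta.pdf(0.5, a, a):8.2f}   "
          f"variance = {beta.var(a, a):.5f}")
```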

TL;DR: I believe that a random variable becoming "less random" is akin to its parameters reaching some limiting values.
