
Calculating the "jaggedness" of a distribution

I'm sure "jaggedness" isn't the right term to use here, so please correct me. I'm trying to quantify how jagged a distribution is. For example, this one is moderately jagged: [distribution #1]. This one is really jagged: [distribution #2]. Essentially, I want to quantify the number of peaks and the disparity between peaks and valleys in a meaningful way, normalized for the size of the data. In the above example, distribution #2 contains more data than distribution #1, but I don't want that to affect the quantification. How would I do this?

Try calculating the $L_2$ norm; it's going to be much bigger for distribution #2. If the histogram is $h_i$, normalized so that the $L_1$ norm is $1$, i.e. $\sum_i h_i = 1$, then the squared $L_2$ norm is $$L_2^2 = \sum_i h_i^2.$$ One problem you're going to encounter is that the counting noise grows like $\sqrt{n}$, so you should normalize (divide) the $L_2^2$ norm by $n$ (I think). To make sure you have the correct normalization, subsample your input and compare the normalized values: they should come out about the same whether you take all the points, half of them, or even just a tenth.
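
To make this concrete, here is a minimal sketch in Python (NumPy assumed; the two-Gaussian toy data, the bin count of 50, and the function name `l2_squared` are my own illustrative choices, not from the question). It computes the squared $L_2$ norm of the $L_1$-normalized histogram and runs the subsampling comparison described above, so you can check empirically whether any further normalization by $n$ is needed:

```python
import numpy as np

def l2_squared(samples, bins=50, hist_range=None):
    """Squared L2 norm of the L1-normalized histogram of `samples`."""
    counts, _ = np.histogram(samples, bins=bins, range=hist_range)
    h = counts / counts.sum()   # L1-normalize: sum_i h_i = 1
    return np.sum(h ** 2)       # L2^2 = sum_i h_i^2

rng = np.random.default_rng(0)
# Toy "lumpy" distribution (an assumption for demonstration only):
data = np.concatenate([rng.normal(0.0, 1.0, 20000),
                       rng.normal(3.0, 0.2, 5000)])

# Fix the histogram range so all subsamples are binned identically.
r = (data.min(), data.max())

full  = l2_squared(data, hist_range=r)
half  = l2_squared(rng.choice(data, size=len(data) // 2,  replace=False), hist_range=r)
tenth = l2_squared(rng.choice(data, size=len(data) // 10, replace=False), hist_range=r)

# The subsampling check from the answer: if the statistic is properly
# normalized for data size, these should be roughly equal.
print(full, half, tenth)
```

One design note: the bin edges are held fixed across the full sample and the subsamples; if each call chose its own edges, the comparison would mix binning effects with sample-size effects and the check would be harder to interpret.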
