
What is this scaling operation called? We have values representing measurements stored in a grid, ranging over $[0,1]$. These measurements are degraded using the following formula: $$\text{measurement} = \text{measurement} / (1 + \text{degradation factor})$$ The degradation factor is usually in $(0,1)$. What is this operation called, and does it have any significance?

We have

$M' = \frac{M}{1+d} = M \cdot \frac{1}{1+d}$

and $d \in (0,1) \implies 1+d \in (1,2) \implies \frac{1}{1+d} \in (0.5,1)$

thus we are simply scaling ("squishing") every value by the same factor, probably to make the data fit with other data. Only the magnitude changes: in particular, the proportion of one point to another is preserved. I think "scaling" is correct, or "down-scaling", since the factor is strictly less than $1$ and the values can only get smaller.
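A minimal sketch in Python/NumPy (the grid shape and the factor `d = 0.3` are made-up values for illustration) showing that the operation is a uniform rescale that leaves ratios between points intact:

```python
import numpy as np

# Hypothetical 2D grid of measurements in [0, 1).
rng = np.random.default_rng(0)
grid = rng.random((4, 4))

d = 0.3  # degradation factor, assumed to lie in (0, 1)

# The degradation: equivalent to multiplying by 1/(1+d), which is in (0.5, 1).
degraded = grid / (1 + d)

# Every value shrinks by the same factor...
print(np.allclose(degraded, grid * (1 / (1 + d))))            # True

# ...so the ratio between any two points is unchanged.
print(np.allclose(grid[0, 0] / grid[1, 1],
                  degraded[0, 0] / degraded[1, 1]))           # True
```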
