
Formula to round decimal values

I'm using an application which offers a feature for creating user-defined functions. The available set of mathematical operations that can be incorporated is rather small; I can use:

1. addition
2. subtraction
3. multiplication
4. division

I can also use the IIF control statement, with the default signature _IIF(expression, truePart, falsePart)_.

My goal is to create a function that calculates rounded values from input decimals, with the precision set to the second position after the decimal point. However, it would be very nice if the precision could be parametrized through some input argument to the formula. For example, if the input variable holds the value 3.14159, the result of the formula would be 3.14.

Can you advise me on how to define this formula?

Given the poverty of the instruction set, this is barely possible.

If there is a known maximum possible value, say `32767`, you can implement a floor function for non-negative inputs by dichotomic (binary) search: repeatedly subtract each power of two that fits, until only the fractional part remains.


d= IIF(x<16384, x, x-16384)
d= IIF(d<8192, d, d-8192)
d= IIF(d<4096, d, d-4096)
d= IIF(d<2048, d, d-2048)
d= IIF(d<1024, d, d-1024)
d= IIF(d<512, d, d-512)
d= IIF(d<256, d, d-256)
d= IIF(d<128, d, d-128)
d= IIF(d<64, d, d-64)
d= IIF(d<32, d, d-32)
d= IIF(d<16, d, d-16)
d= IIF(d<8, d, d-8)
d= IIF(d<4, d, d-4)
d= IIF(d<2, d, d-2)
d= IIF(d<1, d, d-1)


Then `x-d` is `floor(x)`.
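To check the idea outside the application, here is a Python sketch of the same cascade. The `iif` helper and the loop are stand-ins for the IIF chain above (your formula language presumably has no loops, so there you must write out all fifteen steps by hand):

```python
def iif(cond, true_part, false_part):
    # Mimics the application's IIF(expression, truePart, falsePart).
    return true_part if cond else false_part

def floor_iif(x):
    # Valid for 0 <= x < 32768: strip each power of two that fits,
    # so d ends up holding only the fractional part of x.
    d = x
    p = 16384
    while p >= 1:
        d = iif(d < p, d, d - p)
        p //= 2
    # What was stripped away is the integer part.
    return x - d
```

Running `floor_iif(3.14159)` yields `3.0`, matching `floor(3.14159)`.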

From that you derive


round(x)= floor(x+0.5)


and


round(x, scale)= round(x * scale) / scale


where `scale` is a power of `10`; for two decimal places of precision, use `scale = 100`.
