What can be improved in a given random function to make it more random, give it a bigger range, or something else?
and therefore no SRANDOM can be used so far.
How to improve the randomness of the function above, if possible?
Sooo write your own SRANDOM with your own semantics. Ex:
srandom() {
    # take a random number from /dev/urandom;
    # we read just 4 bytes, i.e. one number in the range 0..2^32-1
    printf "%d\n" "0x$(
        dd if=/dev/urandom of=/dev/stdout bs=4 count=1 status=none |
        xxd -p)"
}
and then:
normalize_value(){
    ...
    rnd=$(srandom)
    rnd_count=$((rnd / ...))
}
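For illustration only, here is one hedged way such a helper could map the srandom output onto a range; the pick_in_range name, its min/max parameters and the modulo reduction are my own assumptions, not the logic of the original normalize_value:
pick_in_range() {
    # hypothetical helper: map a 0..2^32-1 value onto min..max
    # with a simple modulo reduction
    local min=$1 max=$2 rnd
    rnd=$(srandom)
    echo $(( rnd % (max - min + 1) + min ))
}
Usage would look like pick_in_range 1 100, still subject to the usual 64-bit limits of shell arithmetic.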
Accepting a wider range of numbers
If you are not happy with the way shell arithmetic expansion works, then… use a different tool. The bc calculator works with arbitrary-precision numbers, so its range is effectively unlimited.
rnd_count=$(echo "
    # see https://superuser.com/questions/31445/gnu-bc-modulo-with-scale-other-than-0
    scale=0;
    # super big random number from three 2^32 numbers
    rnd = $(srandom) * 2^(32*2) + $(srandom) * 2^32 + $(srandom)
    rnd % ($max - $min + 1) + $min
" | bc)
You can write your own C program with getrandom() and compile it on the fly:
echo "int main() { stuff(); }" | gcc -xc - && ./a.out; rm ./a.out
basically granting you any semantics you want. There are also other scripting languages, like Perl, Python or Ruby, most probably with their own big-number libraries and urandom-based random number generators. The sky is the limit.
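A minimal sketch of that compile-on-the-fly idea, assuming gcc and a glibc recent enough to ship getrandom(3); the rnd64 file name is just a placeholder:
# compile a tiny C program from a heredoc and run it once
gcc -xc -o rnd64 - <<'EOF'
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>
#include <inttypes.h>
#include <sys/random.h>

int main(void) {
    uint64_t value;
    /* fill the 8-byte buffer from the kernel CSPRNG */
    if (getrandom(&value, sizeof value, 0) != sizeof value) {
        perror("getrandom");
        return EXIT_FAILURE;
    }
    printf("%" PRIu64 "\n", value);
    return EXIT_SUCCESS;
}
EOF
rnd=$(./rnd64)
rm ./rnd64
echo "$rnd"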
Every improvement should use bash only.
This is, from my perspective, a pointless limitation: overall, I am paid for results, not really for how I solve problems. Anyway, you could try; here is a bunch of ideas on how to proceed:
- First write a function that would read from /dev/urandom and convert the bytes into a number.
  - I have no good idea how to do it in pure bash while keeping the randomness at sane levels. I suspect the input will drain fast.
  - You could read one byte from urandom. You'll have to ignore the read exit status, because the byte may be a zero byte or a newline.
  - Then check if that byte is a digit. If it's not, repeat the previous step.
  - Treat such an algorithm as a generator of random numbers in the range 0-9. Build bigger numbers from those digits (see the sketch after this list).
- Then develop your own big-number library, using arithmetic expansion as a "backend", written in bash.
  - Seems pretty pointless, because bc is commonly available.
  - This would work as usual big-number libraries do.
  - I suggest storing the number as an array of numbers, each at most 2^16. For inspiration, research similar libraries written in C and C++, and convert one to bash.
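As a rough sketch of the digit idea from the first bullet above: pure bash, reading /dev/urandom with read; the function names are mine, it assumes bash >= 4.1 for read -N, and it will be slow:
random_digit() {
    local byte
    while :; do
        # read exactly one raw byte; ignore read's exit status because the
        # byte may be a NUL byte or a newline
        IFS= read -r -N1 byte < /dev/urandom || true
        # keep the byte only if it happens to be an ASCII digit
        case $byte in
            [0123456789]) printf '%s' "$byte"; return ;;
        esac
    done
}

random_number() {
    # build a bigger number out of, say, 9 random digits
    local i digits=
    for ((i = 0; i < 9; i++)); do
        digits+=$(random_digit)
    done
    # force base 10 so leading zeros are not treated as octal
    echo $(( 10#$digits ))
}
Calling random_number then prints one number between 0 and 999999999 built from uniformly chosen digits.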