I assume it is this part of the line that is confusing:
((1 << collision.gameObject.layer) & groundLayerMask)
You could try to read up on “bit fields”, the first Google hit is: https://en.wikipedia.org/wiki/Bit_field
What’s happening here is that “groundLayerMask” is what its name implies: a binary mask that can specify zero, one, or any combination of 32 possible layers/bits (the mask is stored in a 32-bit int), while “collision.gameObject.layer” is an integer index referring to just one single layer/bit.
So “1 << collision.gameObject.layer” converts from a position (0,1,2,…) to a single bit (1,2,4,8,…), like this:
0 -> 0000 0001 = 1
1 -> 0000 0010 = 2
2 -> 0000 0100 = 4
3 -> 0000 1000 = 8
and so on
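To make the mask side concrete, here is a small sketch. The layer names “Ground” and “Platform” and the indices 8 and 9 are made-up examples, they depend entirely on your project’s Tags & Layers settings:

using UnityEngine;

public class MaskDemo : MonoBehaviour
{
    void Start()
    {
        // Suppose "Ground" is layer 8 and "Platform" is layer 9 in this project.
        int handBuiltMask = (1 << 8) | (1 << 9);                   // bits 8 and 9 set
        int byNameMask = LayerMask.GetMask("Ground", "Platform");  // same value, built from the layer names

        Debug.Log(handBuiltMask + " and " + byNameMask);           // both print 768 with these indices
    }
}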
And so “((1 << collision.gameObject.layer) & groundLayerMask)” will be zero if groundLayerMask does not contain the layer that collision.gameObject.layer refers to, and non-zero if it does, which is why the result is usually compared against zero.
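Putting it together, here is a minimal sketch of how this check typically sits in a collision callback. I am assuming a 2D setup with OnCollisionEnter2D and a groundLayerMask field set in the inspector; swap in Collision/OnCollisionEnter if your project is 3D:

using UnityEngine;

public class GroundCheck : MonoBehaviour
{
    [SerializeField] private LayerMask groundLayerMask;   // set in the inspector

    private void OnCollisionEnter2D(Collision2D collision)
    {
        // Turn the object's layer index into a single bit, then AND it with the mask.
        // A non-zero result means that layer is part of groundLayerMask.
        if (((1 << collision.gameObject.layer) & groundLayerMask) != 0)
        {
            Debug.Log(collision.gameObject.name + " is on a layer included in groundLayerMask");
        }
    }
}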