It’s simple: multiply the number of dollars by 100 to get the number of cents, then work out how many half dollars, quarters, dimes, nickels, and pennies that is.
For example, $5.26 is 526 cents, which is 10 half dollars ($5.00), 1 quarter, and 1 penny.
Integer division (X / Y) gives you the number of times Y is contained in X, and the modulus (X % Y) gives you the remainder.
526 / 50 = 10    (10 half dollars)
526 % 50 = 26    (26 cents left over)
 26 / 25 = 1     (1 quarter)
 26 % 25 = 1     (1 cent left over)
  1 / 10 = 0     (no dimes)
  1 % 10 = 1
  1 / 5  = 0     (no nickels)
  1 % 5  = 1
  1 / 1  = 1     (1 penny)
  1 % 1  = 0     (nothing left over)
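In code, each pair of lines above is one integer division and one modulus. Here's a minimal Python sketch of the same calculation (divmod does both operations at once; the variable names are just for illustration):

    cents = 526  # $5.26 expressed in cents

    half_dollars, cents = divmod(cents, 50)  # 10 half dollars, 26 cents left
    quarters, cents = divmod(cents, 25)      # 1 quarter, 1 cent left
    dimes, cents = divmod(cents, 10)         # 0 dimes, 1 cent left
    nickels, cents = divmod(cents, 5)        # 0 nickels, 1 cent left
    pennies, cents = divmod(cents, 1)        # 1 penny, 0 cents left

    print(half_dollars, quarters, dimes, nickels, pennies)  # 10 1 0 0 1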
What that does is break the amount of money into the fewest coins, by giving you as many coins of the highest denomination as possible first, then as many of the next denomination, and so on. (This greedy approach gives the minimum number of coins for US denominations.)
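The same idea generalizes to a loop over the denominations, highest first. A sketch, assuming the amount starts as a dollar value (the round() call guards against floating-point error when converting to cents):

    def make_change(dollars):
        """Break a dollar amount into the fewest US coins (greedy)."""
        cents = round(dollars * 100)  # convert to whole cents
        coins = {}
        for denomination in (50, 25, 10, 5, 1):  # highest first
            coins[denomination], cents = divmod(cents, denomination)
        return coins

    print(make_change(5.26))  # {50: 10, 25: 1, 10: 0, 5: 0, 1: 1}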