[Solved] Creating a Program that slowly Increases the brightness of an LED as a start-up


For pulse width modulation, you’d want to turn the LED off for a certain amount of time, then turn the LED on for a certain amount of time, where the amounts of time depend on how bright you want the LED to appear and the total period (“on time + off time”) stays constant.

In other words, you want a relationship like period = on_time + off_time where period is constant.

You also want the LED to increase brightness slowly. E.g. maybe go from off to max. brightness over 10 seconds. This means you’ll need to loop total_time / period times.
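
To make that concrete with the same numbers as the code further down (a 1 ms period and a 10 second ramp), and picking 25% brightness as an arbitrary example point part-way through the ramp:

    int period   = 1000;                 // 1 ms, in microseconds
    int on_time  = period * 25 / 100;    // at 25% brightness: 250 us on...
    int off_time = period - on_time;     // ...and 750 us off, so on_time + off_time = period

    // 10 seconds / 1 ms per period = 10000 periods, so the loop needs to run 10000 times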

How bright the LED should be, and therefore how long the on_time should be, will depend on how much time has passed since the start of the 10 seconds (e.g. 0 microseconds at the start of the 10 seconds and period microseconds at the end of the 10 seconds). Once you know the on_time you can calculate off_time by rearranging that “period = on_time + off_time” formula.

In C it might end up something like:

#define TOTAL_TIME    10000000               // 10 seconds, in microseconds
#define PERIOD        1000                   // 1 millisecond, in microseconds
#define LOOP_COUNT    (TOTAL_TIME / PERIOD)

    int on_time;     // time the LED spends on during each period, in microseconds
    int off_time;    // time the LED spends off during each period, in microseconds

    for(int t = 0; t < LOOP_COUNT; t++) {
         on_time = PERIOD * t / LOOP_COUNT;     // on-time grows from 0 towards PERIOD as t increases
         off_time = PERIOD - on_time;

         turn_LED_off();
         __delay_us(off_time);

         turn_LED_on();
         __delay_us(on_time);
    }

Note: on_time = PERIOD * t / LOOP_COUNT; is a little tricky. You can think of it as on_time = PERIOD * (t / LOOP_COUNT); where t / LOOP_COUNT is a fraction that goes from 0.000000 to 0.999999 representing the fraction of the period that the LED should be turned on; but if you wrote it like that the compiler would truncate the result of t / LOOP_COUNT to an integer (round it towards zero), so the result would always be zero. Written without the parentheses, C does the multiplication first, so it behaves like on_time = (PERIOD * t) / LOOP_COUNT; and truncation (or rounding) isn’t a problem.

Sadly, doing the multiplication first solves one problem while possibly causing another – PERIOD * t might be too big for an int and might cause an overflow (especially on small embedded systems where an int could be 16 bits). You’ll have to figure out how big an int is for your computer (and for the values you use – changing TOTAL_TIME or PERIOD will change the maximum value that PERIOD * t can reach) and use something larger (e.g. a long maybe) if an int isn’t enough.
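
For example, on a compiler where an int is 16 bits, one way to deal with that is to force the multiplication (and division) to happen with a wider type. This is just a sketch, assuming a long is at least 32 bits (which the C standard guarantees):

         on_time = (long)PERIOD * t / LOOP_COUNT;     // done as long; the final result is small enough for an int
         off_time = PERIOD - on_time;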

You should also be aware that the timing won’t be exact, because it ignores the time spent executing your code and anything else the OS might be doing (IRQs, other programs using the CPU); so the “10 seconds” might actually be 10.5 seconds (or worse). To fix that you need something more complex than a __delay_us() function (e.g. some kind of __delay_until(absolute_time) maybe).
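
A rough sketch of that idea, where read_timer_us() is a hypothetical function that returns a free-running microsecond counter (the real name, and how you’d read such a timer, depends on your hardware and compiler):

    unsigned long next_time = read_timer_us();

    for(int t = 0; t < LOOP_COUNT; t++) {
         on_time = (long)PERIOD * t / LOOP_COUNT;
         off_time = PERIOD - on_time;

         turn_LED_off();
         next_time += off_time;
         while( (long)(read_timer_us() - next_time) < 0 ) { }    // wait until an absolute time, not "for a duration"

         turn_LED_on();
         next_time += on_time;
         while( (long)(read_timer_us() - next_time) < 0 ) { }
    }

Because next_time advances by exactly off_time + on_time = PERIOD each iteration, the error caused by your code’s overhead doesn’t accumulate (as long as the code never takes longer than one period).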

Also, you might find that the LED doesn’t increase brightness linearly (e.g. it might slowly go from off to dull in 8 seconds then go from dull to max. brightness in 2 seconds). If that happens, you might need a lookup table and/or more complex maths to correct it.
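
As a crude example of the “more complex maths” option, you could square the ramp so that on_time grows with t*t instead of t; this keeps the early part of the ramp dimmer and makes the later part steeper, which is a common cheap approximation for how eyes perceive brightness. It’s just a sketch, and whether it actually looks right for your LED is something you’d have to test:

         on_time = ((long)t * t / LOOP_COUNT) * PERIOD / LOOP_COUNT;    // roughly PERIOD * (t / LOOP_COUNT) squared
         off_time = PERIOD - on_time;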
