The idea is to calculate how the execution time of a piece of code, typically some algorithm, depends on its input. If a function takes N as input, how long will it take to execute for N=5 compared to N=10? Will it take twice as long? Will it take the same time? Or will it take more than twice as long?
In your case:
The first program doesn't depend on any input, so it runs in constant time: O(1).
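Your first program isn't quoted here, but a constant-time program looks, in sketch form, something like this (the exact statements are my assumption):

x = n + 1;   // one operation
y = x * 2;   // one operation
// the operation count is fixed no matter how big n is, so O(1)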
Your second program depends on n. The loop body executes n/2 times because of the a = a + 2 increment, giving T(n) = n/2. However, constant factors are conventionally dropped, so one writes O(n).
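Based on the a = a + 2 you describe, that loop presumably looks roughly like this (a reconstruction, not necessarily your exact code):

for (a = 0; a < n; a = a + 2)
{
    // a takes the values 0, 2, 4, ..., so this body runs about n/2 times
    // do something
}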
If you had code like this:
for (a = 0; a < n; a++)
{
    // outer loop: runs n times
    for (b = 0; b < n; b++)
    {
        // inner loop: runs n times per outer iteration
        // do something
    }
}
the execution time grows as n^2, because both loops iterate n times: each time the outer loop runs, the inner loop executes n times, and the outer loop itself runs n times, so the body executes n * n times in total. Thus T(n) = n^2 and the complexity is O(n^2). For example, going from n=5 to n=10 quadruples the work: 25 iterations versus 100.