O(log n):
O(log n) means that an algorithm’s run time grows logarithmically as the size of the input grows: with each operation, the algorithm reduces the size of the problem by a certain factor, often halving it. This makes the algorithm very efficient, especially for large datasets.
In O(log n), you can eliminate a large portion of your problem space at each step. For example, if you’re searching for a specific item in a sorted list, you don’t need to check each item one by one; you can start in the middle, see if the item you’re looking for is larger or smaller, and then eliminate half of the list from consideration. You can keep doing this, cutting your work in half each time, until you find what you’re looking for.
Think of it like searching for a word in a dictionary. You wouldn’t start from the first page and go word by word. You’d open the dictionary somewhere near the middle and adjust your search based on whether the word comes before or after the page you’re currently looking at. You keep reducing the number of pages you have to look through by half each time until you find the word you’re looking for.
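The dictionary search described above is exactly binary search. Here is a minimal sketch in C; the function name binary_search and the use of a plain int array are illustrative choices, not part of the original text:

```c
/* Binary search over a sorted array: each comparison discards half
 * of the remaining range, so at most O(log n) comparisons are made.
 * Returns the index of target, or -1 if it is not present. */
int binary_search(const int *arr, int n, int target) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;   /* avoids overflow of (lo + hi) */
        if (arr[mid] == target)
            return mid;
        else if (arr[mid] < target)
            lo = mid + 1;               /* discard the lower half */
        else
            hi = mid - 1;               /* discard the upper half */
    }
    return -1;
}
```

For a list of one million items, this needs at most about 20 comparisons, because 2^20 is already over a million.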
double power(double x, int n) {
    if (n == 0)
        return 1.0;                  // base case: x^0 = 1 (assumes n >= 0)
    double temp = power(x, n / 2);   // halve the problem size
    if (n % 2 == 0) {
        return temp * temp;
    } else {
        return x * temp * temp;
    }
}
The function power is called with n/2, effectively reducing the problem size by half in each recursive call. This behaviour makes it O(log n).
while (b != 0) {
int temp = b;
b = a % b;
a = temp;
}
Euclid’s algorithm for finding the GCD of two numbers runs in time logarithmic in the smaller of the two inputs. It repeatedly replaces the pair (a, b) with (b, a % b) until b becomes zero, at which point a holds the GCD.
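Wrapping the loop above in a function makes the behaviour easy to trace; the name gcd is a naming choice for this sketch:

```c
/* Euclid's algorithm: with each iteration the pair (a, b) becomes
 * (b, a % b); the smaller value at least halves every two steps,
 * so the loop runs O(log min(a, b)) times. Assumes a, b >= 0. */
int gcd(int a, int b) {
    while (b != 0) {
        int temp = b;
        b = a % b;   /* replace b with the remainder */
        a = temp;
    }
    return a;        /* when b reaches 0, a holds the GCD */
}
```

For example, gcd(48, 18) steps through (48, 18) → (18, 12) → (12, 6) → (6, 0) and returns 6.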
for (int i = 1; i <= n; i *= 2) {
// Some constant-time operations
}
Here, the loop index i starts from 1 and doubles in each iteration. This means the loop will run for the values 1, 2, 4, 8, 16, … until i exceeds n. The number of times this loop runs is proportional to the number of times you can double 1 before surpassing n. This is logarithmic behaviour, giving us O(log n) complexity.
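To see this concretely, you can count the iterations; count_doubling is a hypothetical helper written for this illustration:

```c
/* Counts how many times the doubling loop body executes for a given n.
 * The count is floor(log2(n)) + 1, i.e. O(log n). */
int count_doubling(int n) {
    int count = 0;
    for (int i = 1; i <= n; i *= 2)
        count++;                 /* stands in for a constant-time operation */
    return count;
}
```

For n = 16 the loop body runs 5 times (i = 1, 2, 4, 8, 16), and for n = 1024 it runs only 11 times.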
while (n > 1) {
// Some constant-time operations
n = n / 2;
}
In this loop, the value of n is halved in each iteration. Therefore, the loop runs as many times as n can be halved before it reaches 1. This halving behaviour leads to logarithmic time complexity, O(log n).
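The halving loop can be counted the same way as the doubling one; count_halving is again a hypothetical helper for illustration only:

```c
/* Counts how many times the halving loop body executes for a given n.
 * The count is floor(log2(n)) for n >= 1, i.e. O(log n). */
int count_halving(int n) {
    int count = 0;
    while (n > 1) {
        count++;                 /* stands in for a constant-time operation */
        n = n / 2;
    }
    return count;
}
```

For n = 16 the body runs 4 times (16 → 8 → 4 → 2 → 1), and for n = 1024 only 10 times.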
for (int i = 1; i <= n; i *= 2) {
for (int j = 1; j <= i; j++) {
// Some constant-time operations
}
}
When i = 1, the inner loop runs 1 time.
When i = 2, the inner loop runs 2 times.
When i = 4, the inner loop runs 4 times.
And so on…
If you sum up these runs, you get 1 + 2 + 4 + 8 + … up to the largest power of two not exceeding n. This summation is a geometric series and is bounded by 2n. Therefore, despite the nested loops, the overall complexity is O(n). It’s essential to understand that not every loop structure with apparent logarithmic traits (like doubling or halving) will result in O(log n) time complexity.
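One way to convince yourself of the bound is to count the inner-loop iterations directly; count_nested is an illustrative helper, not part of the original snippet:

```c
/* Counts the total number of inner-loop iterations of the nested
 * doubling loop. The total is 1 + 2 + 4 + ... = 2^(floor(log2 n)+1) - 1,
 * which is at most 2n - 1, hence O(n) despite the nesting. */
long count_nested(int n) {
    long total = 0;
    for (int i = 1; i <= n; i *= 2)
        for (int j = 1; j <= i; j++)
            total++;             /* stands in for a constant-time operation */
    return total;
}
```

For n = 8 the total is 1 + 2 + 4 + 8 = 15, comfortably below 2n = 16.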
LinkedIn Post: https://www.linkedin.com/posts/t-yashwanth-naidu_learnwithyash-earlycareer-embeddedsystems-activity-7105560439652028418-jkJk/?utm_source=share&utm_medium=member_desktop
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
An Article by: Yashwanth Naidu Tikkisetty
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
