Big Oh notation loop manipulation
If I had two nested loops where the outer had a Big-O of log n and the inner had one of n, does that mean the overall complexity would be O(n log n)? Since n changes every time the outer loop executes, the inner loop is technically running log n times, but it loops n times on each of those passes. I apologise if this question sounds stupid. This is what the loop looks like:

    outer loop runs while n > 0
        inner loop runs n times
        n = (1/4)n

I'm sorry if my formatting is off; I spent a few minutes trying to figure out how to use LaTeX here and couldn't quite crack it.
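A minimal Python sketch of the loop described above, assuming n shrinks by integer division (the function name `nested_loops` is just for illustration); it counts the total number of inner-loop iterations:

```python
def nested_loops(n):
    """Count total inner-loop iterations of the loop in the question."""
    total = 0
    while n > 0:            # outer loop: runs while n > 0
        for _ in range(n):  # inner loop: runs n times
            total += 1
        n = n // 4          # n shrinks to a quarter each pass
    return total

print(nested_loops(1024))  # 1024 + 256 + 64 + 16 + 4 + 1 = 1365
```

Note that 1365 is just under 4·1024/3 ≈ 1365.33, which hints that the total work is linear rather than n log n.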
Time complexity is O(n):

- First time, the inner loop iterates n times
- Second time, the inner loop iterates n/4 times
- Third time, the inner loop iterates n/16 times
- ...
- k'th time, the inner loop iterates n/4^k times

Summing them up:

    n + n/4 + n/16 + ... + n/4^k + ... + n/4^(log_4 n)

This is the sum of a geometric series with a = n and r = 1/4. According to the formula for |r| < 1, its sum is bounded by a/(1 - r) = n/(3/4) = 4n/3, which is O(n).
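The 4n/3 bound from the geometric series can be checked empirically; this sketch (helper name `total_iterations` is my own) sums the per-pass work directly and compares it to the bound:

```python
def total_iterations(n):
    """Sum the inner-loop work n + n/4 + n/16 + ... directly."""
    total = 0
    while n > 0:
        total += n  # inner loop contributes n iterations this pass
        n //= 4     # outer loop quarters n
    return total

for n in (100, 10_000, 1_000_000):
    t = total_iterations(n)
    # Geometric-series bound: a/(1 - r) = n/(1 - 1/4) = 4n/3
    assert t <= 4 * n / 3
    print(n, t, 4 * n / 3)
```

The total stays under 4n/3 for every n tried, consistent with the O(n) result above.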