big-O notation

A theoretical measure of the execution of an algorithm, usually the time or memory needed, given the problem size n, which is usually the number of items. Informally, f(n) = O(g(n)) means that f(n) is bounded above by some constant multiple of g(n) for all sufficiently large n. The notation is read "f of n is big oh of g of n". Formal definition: f(n) = O(g(n)) means there are positive constants c and k such that 0 ≤ f(n) ≤ cg(n) for all n ≥ k. The values of c and k must be fixed for the function f and must not depend on n.
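
As a concrete illustration of the formal definition (not part of the definition itself), the short Python sketch below numerically checks the inequality 0 ≤ f(n) ≤ cg(n) for the example f(n) = 3n² + 5n + 2 and g(n) = n², using the hypothetical witness constants c = 4 and k = 6; these values are only one valid choice, chosen here for demonstration.

    # Minimal sketch: spot-check the big-O condition 0 <= f(n) <= c*g(n)
    # for all n >= k, using the illustrative example f(n) = 3n^2 + 5n + 2.

    def f(n):
        return 3 * n**2 + 5 * n + 2

    def g(n):
        return n**2

    # Hypothetical witness constants for this example (not the only choice).
    c, k = 4, 6

    # Check the inequality over a finite range of n >= k; a full proof would
    # argue it for all n >= k (here: 3n^2 + 5n + 2 <= 4n^2 once n >= 6).
    assert all(0 <= f(n) <= c * g(n) for n in range(k, 10_000))
    print(f"f(n) = O(g(n)) holds with c = {c}, k = {k} on the checked range")

A finite check like this cannot prove the bound for every n, but it shows how the constants c and k play their role: once both are fixed, the inequality must hold for every n at or beyond k.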
