• uis@lemm.ee · 5 months ago

    Basically, you can say that the time it takes never goes above the graph of some function scaled by a constant.

    A fun side effect of this is that you can call your O(1) algorithm an O(n!) algorithm and be technically correct.
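
    In symbols, that informal description matches the standard textbook definition (stated here for reference, not part of the original comment):

    $$
    f(n) = O(g(n)) \iff \exists\, c > 0,\ \exists\, n_0 : \forall n \ge n_0,\ f(n) \le c \cdot g(n)
    $$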

    • Mikina@programming.dev · edited · 4 months ago

      Here is a picture that may help a little bit. The n is the input size, and f(n) is how long the algorithm runs (i.e. how many instructions it takes) for an input of size n. For example, for finding the smallest element in an array, n would be the number of elements in the array. g(n) is then the function you have inside the O, so if you have an O(n^2) algorithm, then g(n) = n^2.
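
      As a concrete sketch of that example (my own illustration, not part of the original comment), finding the smallest element does one pass over the input, so f(n) grows roughly linearly and g(n) = n works:

      ```python
      def find_smallest(items):
          # One pass over the array: the loop body runs n - 1 times,
          # so the work f(n) grows linearly with n and the algorithm is O(n).
          smallest = items[0]
          for x in items[1:]:
              if x < smallest:
                  smallest = x
          return smallest
      ```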

      Basically, you are looking at how quickly it grows for extreme values of n, while also disregarding constants. The graph representation probably isn't too useful for figuring out the O value, but it can help a little bit with understanding it - you want to find a function g(n) where, from one point onward (n0), f(n) stays under c·g(n) all the way to infinity.
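
      A tiny numeric illustration of that "from n0 onward" idea (my own example, not from the picture): take f(n) = 3n + 5 and g(n) = n; with c = 4 and n0 = 5, f(n) ≤ c·g(n) holds for every n ≥ 5, so f(n) is O(n).

      ```python
      # f(n) = 3n + 5 (pretend instruction count), g(n) = n.
      # With c = 4 and n0 = 5, f(n) <= c * g(n) from n0 onward,
      # which is exactly the condition for f(n) being O(n).
      f = lambda n: 3 * n + 5
      g = lambda n: n
      c, n0 = 4, 5

      # We can't check "all the way to infinity" numerically, but a big range
      # makes the point: the inequality never fails past n0.
      print(all(f(n) <= c * g(n) for n in range(n0, 100_000)))  # True
      ```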