MIT Algorithms (1)

1)  f(n) = O(g(n))
    It means there are constants c > 0 and n0 > 0 such that 0 <= f(n) <= c * g(n) for all n >= n0 (i.e., for sufficiently large n).
    f(n) = Ω(g(n))
    It means there are constants c > 0 and n0 > 0 such that 0 <= c * g(n) <= f(n) for all n >= n0 (i.e., for sufficiently large n).
    
    O notation, less than or equal to.
    Ω notation, greater than or equal to.
    Θ notation, less than or equal to AND greater than or equal to.
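    
    A tiny Python check of what the constants mean (an illustrative sketch; the choices
    f(n) = 3n + 8, g(n) = n, c = 4, n0 = 8 are mine, and a finite loop only illustrates
    the definition, it does not prove the asymptotic claim):
    
      def f(n):
          return 3 * n + 8
      
      def g(n):
          return n
      
      # f(n) = O(g(n)) with witnesses c = 4, n0 = 8: 3n + 8 <= 4n once n >= 8
      c, n0 = 4, 8
      assert all(0 <= f(n) <= c * g(n) for n in range(n0, 10_000))
      print("0 <= f(n) <= c * g(n) for all checked n >= n0")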
2)  Macro convention
    A set in a formula represents an anonymous function in that set.
    
3)  big O is an upper bound
    O notation, less than or equal to.

    e.g
      f(n) = n^3 + O(n^2)
      there is a function h(n) = O(n^2) such that f(n) = n^3 + h(n)
    e.g
      n^2 + O(n) = O(n^2)
      the equals sign is asymmetric
      you can read it as 'everything on the left side is something on the right side':
      anything that is n^2 + O(n) is also O(n^2)
      
      for any f(n) ∈ O(n), there is an h(n) ∈ O(n^2) such that n^2 + f(n) = h(n)
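      
      A concrete (made-up) instance of that reading, sketched in Python with f(n) = 3n:
      
        # n^2 + O(n) = O(n^2): take f(n) = 3n, which is O(n); then
        # h(n) = n^2 + f(n) satisfies h(n) <= 2 * n^2 for all n >= 3, so h(n) = O(n^2)
        def f(n): return 3 * n
        def h(n): return n * n + f(n)
        assert all(h(n) <= 2 * n * n for n in range(3, 10_000))
        print("n^2 + 3n <= 2 * n^2 for all checked n >= 3")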
    
4)  Ω notation is a lower bound
    Ω notation, greater than or equal to.
    e.g
      √n = Ω(lg n)
      √n is at least a constant times lg n for sufficiently large n.
    
5)  Θ notation
    Θ notation, less than or equal to AND greater than or equal to.
    
    Θ(g(n)) = O(g(n)) ∩ Ω(g(n))
    
6)  strict notation
    o & ω
    
    o is going to correspond to less than
    ω is going to correspond to greater than
    
    the inequality must hold for every constant c > 0, not just for one particular c
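    
    A rough numeric illustration of the 'every constant' requirement (the function 100n
    and the search loop are my own example): 100n = o(n^2), because no matter how small
    c > 0 is, 100n < c * n^2 eventually holds.
    
      # little-o: for EVERY c > 0 there must be an n0 with 100n < c * n^2 for all n >= n0.
      # Here 100n < c * n^2  <=>  n > 100/c, so once the inequality holds it stays true.
      # For big O, a single working c would already be enough.
      def first_n0(c, limit=10**7):
          for n in range(1, limit):
              if 100 * n < c * n * n:
                  return n
          return None
      
      for c in (10, 1, 0.1, 0.01):
          print(f"c = {c}: 100n < c * n^2 from n = {first_n0(c)} onwards")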
    
7)  Solving recurrences
    there are 3 main methods,
    a. substitution method
      (1) Guess the form of the answer
      (2) Verify by induction
      (3) Solve for constants
      
      e.g
        Question:
          T(n) = 4 * T(n/2) + n
          ( T(1) = Θ(1) )
        
        Solve:
          Guess:
            T(n) = O(n^3)
          Assume:
            T(k) <= C * k^3 for k < n
            T(n) = 4 * T(n/2) + n
                 <= 4 * C * (n/2)^3 + n
                 = 1/2 * C * n^3 + n
                 = C * n^3 - (1/2 * C * n^3 - n)
                
            we need (1/2 * C * n^3 - n) >= 0;
            this holds e.g. for C = 2 and n >= 1, so T(n) <= C * n^3
            (but this is NOT the tight upper bound)
          Guess:
            T(n) = O(n^2)
          Assume:
            T(k) <= C * k^2 for k < n
            T(n) = 4 * T(n/2) + n
                 <= 4 * C * (n/2)^2 + n
                 = C * n^2 + n
                 = C * n^2 - (-n)
                 (doesn't work: the leftover term -n is negative, so we cannot conclude T(n) <= C * n^2)
          Assume:
            T(k) <= C1 * k^2 - C2 * k for k < n
            T(n) = 4 * T(n/2) + n
                 = 4 * (C1 * (n/2)^2 - C2 * (n/2)) + n
                 = C1 * n^2 + (1 - 2 * C2) * n
                 = C1 * n^2 - C2 * n - (C2 - 1) * n
                
                 want (C2 - 1) * n >= 0
                 if C2 >= 1, then it works:
                
                 <= C1 * n^2 - C2 * n if C2 >= 1
                 (and C1 has to be sufficiently large to cover the base case)
                
                 base case:
                   we need T(1) <= C1 * 1^2 - C2 * 1 = C1 - C2
                   since T(1) = Θ(1), choosing C1 >= C2 + T(1) makes this hold
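      
      A quick numeric check of the strengthened guess (a sketch only; the base case
      T(1) = 1 and the constants C1 = 2, C2 = 1 are illustrative choices that satisfy
      C2 >= 1 and T(1) <= C1 - C2):
      
        from functools import lru_cache
        
        # T(n) = 4 * T(n/2) + n on powers of two, with the assumed base case T(1) = 1
        @lru_cache(maxsize=None)
        def T(n):
            return 1 if n == 1 else 4 * T(n // 2) + n
        
        C1, C2 = 2, 1
        # the induction bound T(n) <= C1 * n^2 - C2 * n should hold at every power of two
        assert all(T(2 ** k) <= C1 * 4 ** k - C2 * 2 ** k for k in range(21))
        print("T(n) <= 2 * n^2 - n for n = 2^0 .. 2^20")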
                
    b. Recursion-tree method
      e.g
        T(n) = T(n/4) + T(n/2) + n^2
        Solve:
          Expand:
             (1)
              T(n) =      n^2
                         /    \
                    T(n/4)    T(n/2)
               
              (2)
                             n^2
                          /       \
                    (n/4)^2       (n/2)^2
                    /     \       /     \
              T(n/16)  T(n/8)  T(n/8)  T(n/4)
                .
                .
                .
              
              (n)
                the tree has strictly fewer than n leaves
              
             Sum (cost per level):
               n^2
               5/16 * n^2
               25/256 * n^2
               .
               .
               .
              
               total <= n^2 * (1 + 5/16 + 25/256 + ... + (5/16)^k + ...)
                      = n^2 / (1 - 5/16)
                      = 16/11 * n^2
                      < 2 * n^2,  so T(n) = O(n^2)
               the root level alone already costs n^2, so T(n) = Ω(n^2) as well
               => T(n) = Θ(n^2)
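              
      A small numeric cross-check of the < 2 * n^2 conclusion (a sketch; the base case
      T(0) = T(1) = 1 and the integer-division form of the recurrence are my own
      assumptions):
      
        from functools import lru_cache
        
        # T(n) = T(n/4) + T(n/2) + n^2 with floor division and assumed base case 1
        @lru_cache(maxsize=None)
        def T(n):
            return 1 if n <= 1 else T(n // 4) + T(n // 2) + n * n
        
        assert all(T(n) <= 2 * n * n for n in range(2, 5000))
        print("T(n) <= 2 * n^2 for 2 <= n < 5000, ratio at n = 4096:", T(4096) / 4096 ** 2)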
              
    c. Master method
      only applies to recurrences of the form T(n) = a * T(n/b) + f(n)
      where a >= 1, b > 1, and f(n) is asymptotically positive.
      
      asymptotically positive:
        for large enough n, f(n) is positive
        
      Apply:
        Compare f(n) with n^logb(a)
            Case 1: f(n) is polynomially smaller
              f(n) = O(n^(logb(a) - ε)) for some ε > 0
              then T(n) = Θ(n^logb(a))
            
            Case 2: f(n) is pretty much the same size as n^logb(a)
              f(n) = Θ(n^logb(a) * (lg n)^k) for some k >= 0
              then T(n) = Θ(n^logb(a) * (lg n)^(k+1))
              
            Case 3: f(n) grows polynomially faster than n^logb(a)
              f(n) = Ω(n^(logb(a) + ε)) for some ε > 0
              
              AND (the regularity condition):
                a * f(n/b) <= (1 - ε') * f(n) for some ε' > 0 and all sufficiently large n
              then T(n) = Θ(f(n))
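      
      Before the examples, a rough Python sketch of this case analysis for the special
      (but common) situation f(n) = n^d, a pure power: Case 2 is then the k = 0 case,
      and Case 3's regularity condition holds automatically. The name master_method and
      the output format are just illustrative.
      
        import math
        
        # master method restricted to recurrences T(n) = a * T(n/b) + n^d
        def master_method(a, b, d):
            crit = math.log(a, b)              # critical exponent log_b(a)
            if math.isclose(d, crit):          # Case 2 (with k = 0): f(n) matches n^logb(a)
                return f"Theta(n^{crit:g} * lg n)"
            if d < crit:                       # Case 1: f(n) polynomially smaller
                return f"Theta(n^{crit:g})"
            return f"Theta(n^{d:g})"           # Case 3: f(n) polynomially larger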
              
    e.g
      T(n) = 4 * T(n/2) + n
      Solve:
        a = 4
        b = 2
        f(n) = n
          
        n^logb(a) = n^log2(4) = n^2
          
        f(n) = n = O(n^(2 - ε)) for ε = 1, i.e. polynomially smaller than n^2
        Case 1:
          T(n) = Θ(n^2)
            
    e.g
      T(n) = 4 * T(n/2) + n^2
      Solve:
        a = 4
        b = 2
        f(n) = n^2
        n^logb(a) = n^2
      
        f(n) = n^2 = n^logb(a) * (lg n)^0, i.e. Case 2 with k = 0
        Case 2:
          T(n) = Θ(n^2 * lg n)
          
    e.g
      T(n) = 4 * T(n/2) + n^3
      Solve:
        a = 4
        b = 2
        f(n) = n^3
        n^logb(a) = n^2
      
        f(n) = n^3 = Ω(n^(2 + ε)) for ε = 1, i.e. polynomially larger than n^2
        regularity check: a * f(n/b) = 4 * (n/2)^3 = n^3 / 2 <= (1/2) * f(n), OK
        Case 3:
          T(n) = Θ(n^3)
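      
      Running the three examples through the illustrative master_method sketch from above
      gives the same answers:
      
        print(master_method(4, 2, 1))   # 4T(n/2) + n    -> Theta(n^2)
        print(master_method(4, 2, 2))   # 4T(n/2) + n^2  -> Theta(n^2 * lg n)
        print(master_method(4, 2, 3))   # 4T(n/2) + n^3  -> Theta(n^3)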
