Tuesday, 22 October 2024

The Big O Cheat Sheet For Code Performance

Following on from my previous blog post about what Big O is all about, let's dive into Big O notation and break down what it really means. This cheat sheet covers the most common Big O terms you might encounter in technical interviews or algorithm discussions. We'll walk through each one, explain what it represents, and include Python examples to make the concepts clearer.

1. O(1): Constant Time

This is the holy grail. O(1) means no matter how big your input gets, the time to run the operation stays the same. It’s like looking up a value in a dictionary—no drama, no sweat, always the same time.

Example:


def get_first_element(items):
    return items[0]  # Input size doesn't matter, always O(1)

get_first_element([1, 2, 3, 4, 5])  # O(1)
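Since the dictionary analogy comes up a lot, here's the same idea sketched with a dict (the names and data here are made up for illustration):

```python
def get_user_age(users, name):
    # A dict lookup hashes the key directly, so it stays O(1)
    # no matter how many entries the dict holds.
    return users[name]

ages = {"alice": 30, "bob": 25, "carol": 41}
get_user_age(ages, "bob")  # O(1)
```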

2. O(n): Linear Time

O(n) means the time it takes to run your code increases directly with the size of the input. The bigger the list, the longer it takes. A simple for-loop over a list? Yep, that’s O(n). Welcome to average coding life.

Example:

def print_all_elements(items):
    for item in items:
        print(item)  # The more items, the more work. O(n)

print_all_elements([1, 2, 3, 4, 5])  # O(n)

3. O(n^2): Quadratic Time

O(n^2) is the “I’m about to make your life hell” of algorithms. This is what happens when you start nesting loops, and it doesn’t scale well: the running time grows quadratically with the input size, so doubling the input quadruples the work.

Example:

def print_all_pairs(items):
    for i in items:
        for j in items:
            print(i, j)  # If items has 5 elements, you're printing 25 pairs. O(n^2)

print_all_pairs([1, 2, 3, 4, 5])  # O(n^2)

4. O(log n): Logarithmic Time

O(log n) is your performance-friendly pal. Think binary search—cutting the problem size in half at every step. You barely do any work while still looking smart.

Example:

def binary_search(items, target):
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1

# Searching a sorted list is O(log n)
binary_search([1, 2, 3, 4, 5, 6, 7, 8, 9], 7)  # O(log n)
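You don't have to hand-roll binary search every time, either. Python's standard library ships the same halving strategy in the bisect module:

```python
import bisect

items = [1, 2, 3, 4, 5, 6, 7, 8, 9]

# bisect_left does an O(log n) binary search on a sorted list
# and returns the leftmost insertion point for the target.
idx = bisect.bisect_left(items, 7)
print(idx)  # 6 -- the index where 7 lives
```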

5. O(n log n): Linearithmic Time

O(n log n) is your fancy-sounding way to describe algorithms like merge sort or quicksort. It’s better than O(n^2), but not as slick as O(n). Still, not bad for a sorting algorithm.

Example:

def merge_sort(items):
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    return merge(left, right)

def merge(left, right):
    sorted_list = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] < right[j]:
            sorted_list.append(left[i])
            i += 1
        else:
            sorted_list.append(right[j])
            j += 1
    sorted_list.extend(left[i:])
    sorted_list.extend(right[j:])
    return sorted_list

# Merge sort is O(n log n)
merge_sort([5, 3, 8, 6, 2, 7, 4, 1])  # O(n log n)
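In day-to-day code you'd rarely write merge sort yourself. Python's built-in sorted() uses Timsort, which is also O(n log n) in the worst case:

```python
data = [5, 3, 8, 6, 2, 7, 4, 1]

# Timsort under the hood: O(n log n) worst case,
# and faster on data that's already partially sorted.
result = sorted(data)
print(result)  # [1, 2, 3, 4, 5, 6, 7, 8]
```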

6. O(2^n): Exponential Time

O(2^n) is when things start to spiral out of control. Your algorithm doubles in execution time with each additional element. Recursion-heavy algorithms, like the naïve Fibonacci calculation, live here—and it's not a fun neighborhood.

Example:

def fibonacci(n):
    if n <= 1:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)

# Every new 'n' means roughly twice the work. O(2^n)
fibonacci(5)  # O(2^n)
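To see the blow-up for yourself, you can tally the recursive calls (a throwaway instrumentation sketch, not part of the algorithm itself):

```python
def fibonacci_counted(n, counter):
    # Same naive recursion, but counting every call made.
    counter[0] += 1
    if n <= 1:
        return n
    return fibonacci_counted(n - 1, counter) + fibonacci_counted(n - 2, counter)

counter = [0]
fibonacci_counted(10, counter)
print(counter[0])  # 177 calls just to compute fibonacci(10)
```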

7. O(n!): Factorial Time

O(n!) is the algorithmic equivalent of lighting your server on fire. This is what happens when you brute-force permutations. It gets ugly fast. Think of trying to compute all possible orderings of a list.

Example:

import itertools

def permutations(items):
    return list(itertools.permutations(items))

# Generating every ordering is factorial. O(n!)
permutations([1, 2, 3])  # O(n!)
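If you want numbers to back up the "it gets ugly fast" claim, math.factorial shows how quickly the count of orderings explodes:

```python
import math

# The number of orderings of n items is n!, which grows brutally fast.
for n in (3, 5, 10):
    print(n, math.factorial(n))  # 3 -> 6, 5 -> 120, 10 -> 3628800
```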

Wrapping It Up

There you go—a quick rundown of Big O notations with Python examples. Will you need to use O(n^2) or O(log n) in every codebase? Probably not. But when someone asks, now you can spit out a couple of Big O terms, drop a “yeah, this could be optimized,” and move on like a pro.
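If you want a gut feel for how these classes separate as input grows, plugging a couple of sizes into each formula tells the story (a rough back-of-the-envelope sketch):

```python
import math

# Compare the growth of each complexity class at two input sizes.
for n in (10, 1000):
    print(f"n={n}: O(log n) ~ {math.log2(n):.0f}, O(n) = {n}, "
          f"O(n log n) ~ {n * math.log2(n):.0f}, O(n^2) = {n ** 2}")
```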
