# Big O Notation Cheat Sheet

Complete reference for time and space complexity with examples and visual comparison.
## 📊 Complexity Growth Comparison
## 🎯 Time Complexities
- **O(1) — Constant:** Time remains constant regardless of input size.
- **O(log n) — Logarithmic:** Time grows logarithmically. Very efficient for large inputs.
- **O(n) — Linear:** Time grows linearly with input size.
- **O(n log n) — Linearithmic:** Common for efficient comparison-based sorting algorithms.
- **O(n²) — Quadratic:** Time grows quadratically. Avoid for large inputs.
- **O(n³) — Cubic:** Very slow. Practical only for small inputs.
- **O(2ⁿ) — Exponential:** Extremely slow; running time roughly doubles with each additional input element.
- **O(n!) — Factorial:** The worst common complexity. Viable only for tiny inputs.
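As a sketch, here are minimal Python functions exhibiting a few of these classes (the function names are illustrative, not part of the cheat sheet):

```python
def constant(items):
    # O(1): a single operation, regardless of input size
    return items[0]

def logarithmic(sorted_items, target):
    # O(log n): binary search halves the remaining range each step
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

def linear(items, target):
    # O(n): may touch every element once
    for i, x in enumerate(items):
        if x == target:
            return i
    return -1

def quadratic(items):
    # O(n^2): nested iteration over the same input
    return [(a, b) for a in items for b in items]
```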
## 📚 Data Structure Complexities
| Data Structure | Access | Search | Insert | Delete |
|---|---|---|---|---|
| Array | O(1) | O(n) | O(n) | O(n) |
| Stack | O(n) | O(n) | O(1) | O(1) |
| Queue | O(n) | O(n) | O(1) | O(1) |
| Linked List | O(n) | O(n) | O(1) | O(1) |
| Hash Table | N/A | O(1) | O(1) | O(1) |
| Binary Search Tree | O(log n) | O(log n) | O(log n) | O(log n) |
| Heap | O(1) | O(n) | O(log n) | O(log n) |

Figures are average-case: hash table operations degrade to O(n) under pathological collisions, and binary search tree operations degrade to O(n) when the tree is unbalanced. Linked list O(1) insert/delete assumes you already hold a reference to the node, and heap O(1) access refers to peeking at the root element.
## Understanding Big O Notation
Big O notation is a mathematical notation that describes the limiting behavior of a function when the argument tends towards a particular value or infinity. In computer science, it's used to classify algorithms according to how their run time or space requirements grow as the input size grows.
This cheat sheet provides a comprehensive overview of common time complexities, from the most efficient O(1) constant time to the least efficient O(n!) factorial time. Understanding these complexities is essential for writing efficient code and passing technical interviews.
### Why Big O Matters
Big O notation allows you to compare algorithms objectively. An O(n log n) sorting algorithm will outperform an O(n²) algorithm once the input is large enough, even if constant factors favor the O(n²) algorithm on small inputs. This understanding helps you choose the right data structures and algorithms for your specific use case.
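A quick back-of-the-envelope comparison makes the gap concrete — the operation counts below are the raw growth functions, not measured running times:

```python
import math

# Compare n log n against n^2 as n grows
for n in (10, 1_000, 1_000_000):
    n_log_n = n * math.log2(n)
    n_squared = n * n
    print(f"n = {n:>9,}   n log n ~ {n_log_n:>14,.0f}   n^2 = {n_squared:>16,}")
```

At n = 1,000,000 the quadratic count is roughly 50,000 times larger, which is why the asymptotically better algorithm wins for large inputs.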
### Common Complexity Classes
O(1) operations like array access take the same time no matter how large the input is. O(log n) operations like binary search halve the problem with each step. O(n) operations touch each element once. O(n²) often indicates nested loops and should be avoided when possible.
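The "nested loops" pattern can often be eliminated with a hash-based structure. A sketch of the same duplicate check written both ways (both functions are illustrative):

```python
def has_duplicate_quadratic(items):
    # O(n^2): compares every pair of elements
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_linear(items):
    # O(n) average: a hash set remembers what we've already seen
    seen = set()
    for x in items:
        if x in seen:
            return True
        seen.add(x)
    return False
```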
### Space vs Time Complexity
Algorithms can be optimized for time (speed) or space (memory). Sometimes you can trade one for the other. Dynamic programming often uses extra space to achieve better time complexity. Understanding this trade-off is crucial for system design.
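The classic illustration of this trade-off is Fibonacci: spending O(n) extra space on a memo table collapses exponential time to linear. A sketch using Python's standard-library cache:

```python
from functools import lru_cache

def fib_naive(n):
    # O(2^n) time: recomputes the same subproblems over and over
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_memo(n):
    # O(n) time, bought with O(n) extra space for cached results
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)
```

`fib_memo(30)` finishes in 31 calls; `fib_naive(30)` makes over a million.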
### Interview Tip
In coding interviews, always analyze the time and space complexity of your solution. Interviewers expect you to identify the Big O of your code and discuss potential optimizations. Practice recognizing patterns that indicate specific complexities.