Big O Notation Explained
Master Time and Space Complexity Analysis
What is Big O Notation?
Big O notation describes how an algorithm's runtime or space requirements grow as the input size increases. It helps us compare algorithms and predict performance at scale.
The "O" stands for "order of" — we care about the order of magnitude, not exact timings. Big O gives us the upper bound (worst-case scenario).
The Common Complexities (Best to Worst)
O(1) — Constant
The best. Always takes the same amount of time, regardless of input size.
function getFirstElement(arr) {
  return arr[0]; // Always one operation
}
// n = 10: 1 op
// n = 1,000,000: still 1 op
O(log n) — Logarithmic
Excellent. Halves the problem each step (binary search, balanced trees).
function binarySearch(arr, target) {
  let left = 0, right = arr.length - 1;
  while (left <= right) {
    const mid = Math.floor((left + right) / 2);
    if (arr[mid] === target) return mid;
    if (arr[mid] < target) left = mid + 1;
    else right = mid - 1;
  }
  return -1;
}
// n = 1,000,000: ~20 comparisons (log₂ 1,000,000)
O(n) — Linear
Good. Operations grow proportionally with input size.
function findMax(arr) {
  let max = arr[0];
  for (let i = 1; i < arr.length; i++) { // Visit each element once
    if (arr[i] > max) max = arr[i];
  }
  return max;
}
// n = 1,000: 1,000 operations
// n = 2,000: 2,000 operations (2x)
O(n log n) — Linearithmic
Acceptable. This is as fast as comparison-based sorting gets (merge sort, and quicksort on average).
function mergeSort(arr) {
  if (arr.length <= 1) return arr;
  const mid = Math.floor(arr.length / 2);
  const left = mergeSort(arr.slice(0, mid)); // Divide: log n levels
  const right = mergeSort(arr.slice(mid));
  return merge(left, right); // Merge: n work per level
}

function merge(a, b) { // Combine two sorted arrays in linear time
  const out = [];
  let i = 0, j = 0;
  while (i < a.length && j < b.length) out.push(a[i] <= b[j] ? a[i++] : b[j++]);
  return out.concat(a.slice(i), b.slice(j));
}
// n = 1,000: ~10,000 operations (1,000 × log₂ 1,000)
O(n²) — Quadratic
Acceptable for small inputs. Nested loops over the same data.
function hasDuplicates(arr) {
  for (let i = 0; i < arr.length; i++) { // Outer: n times
    for (let j = i + 1; j < arr.length; j++) { // Inner: up to n times
      if (arr[i] === arr[j]) return true;
    }
  }
  return false;
}
// n = 100: ~10,000 operations
// n = 1,000: ~1,000,000 operations (100x slower!)
O(2ⁿ) — Exponential
Bad. Work doubles with each additional input element. Typical of naive recursive solutions.
function fibonacci(n) {
  if (n <= 1) return n;
  return fibonacci(n - 1) + fibonacci(n - 2); // 2 recursive calls each time
}
// n = 5: 15 calls
// n = 20: 21,891 calls
// n = 40: 331,160,281 calls (over 300 million!)
O(n!) — Factorial
Terrible. Generates all permutations. Only feasible for tiny inputs (n ≤ 10).
function generatePermutations(arr) {
  if (arr.length === 0) return [[]];
  const result = [];
  for (let i = 0; i < arr.length; i++) {
    const rest = [...arr.slice(0, i), ...arr.slice(i + 1)];
    const perms = generatePermutations(rest);
    result.push(...perms.map(p => [arr[i], ...p]));
  }
  return result;
}
// n = 5: 120 permutations
// n = 10: 3,628,800 permutations
// n = 15: 1,307,674,368,000+ (impossible)
How to Calculate Big O from Code
1. Count the Dominant Operations
function example(arr) {
  let sum = 0; // O(1) - one assignment
  for (let i = 0; i < arr.length; i++) { // O(n) - loop n times
    sum += arr[i]; // O(1) per iteration
  }
  return sum; // O(1) - one return
}
// Total: O(1) + O(n) × O(1) + O(1) = O(n)
// We keep only the dominant term: O(n)
2. Drop Constants and Lower Terms
function process(arr) {
  for (let i = 0; i < arr.length; i++) { // O(n)
    console.log(arr[i]);
  }
  for (let i = 0; i < arr.length; i++) { // O(n)
    console.log(arr[i] * 2);
  }
  for (let i = 0; i < arr.length; i++) { // O(n)
    for (let j = 0; j < arr.length; j++) { // O(n)
      console.log(arr[i] + arr[j]);
    }
  }
}
// O(n) + O(n) + O(n²) = O(n² + 2n)
// Drop 2n (lower term): O(n²)
3. Different Inputs = Different Variables
function compare(arr1, arr2) {
  for (let i = 0; i < arr1.length; i++) { // O(n)
    for (let j = 0; j < arr2.length; j++) { // O(m)
      if (arr1[i] === arr2[j]) return true;
    }
  }
  return false;
}
// O(n × m) NOT O(n²)
// Different arrays might have different sizes!
Space Complexity
Big O also describes memory usage. We count: new data structures created, recursive call stack depth, copied arrays/objects.
O(1) Space — Constant
function sum(arr) {
  let total = 0; // Only 1 variable, regardless of arr size
  for (let num of arr) {
    total += num;
  }
  return total;
}
// Space: O(1) — always 1 variable
O(n) Space — Linear
function double(arr) {
  const result = []; // New array grows with input
  for (let num of arr) {
    result.push(num * 2);
  }
  return result;
}
// Space: O(n) — result array has n elements
O(log n) Space — Logarithmic
function binarySearch(arr, target) {
  // Iterative version
  let left = 0, right = arr.length - 1;
  while (left <= right) {
    const mid = Math.floor((left + right) / 2);
    if (arr[mid] === target) return mid;
    if (arr[mid] < target) left = mid + 1;
    else right = mid - 1;
  }
  return -1;
}
// Space: O(1) iterative
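For contrast, here is a sketch of the same search written recursively (the name `binarySearchRecursive` and the default-parameter style are our choices). Each halving step adds a frame to the call stack, which is exactly where logarithmic space comes from:

```javascript
// Recursive binary search: O(log n) time AND O(log n) space,
// because each halving step adds a frame to the call stack.
function binarySearchRecursive(arr, target, left = 0, right = arr.length - 1) {
  if (left > right) return -1; // Base case: empty range, not found
  const mid = Math.floor((left + right) / 2);
  if (arr[mid] === target) return mid;
  if (arr[mid] < target) {
    return binarySearchRecursive(arr, target, mid + 1, right); // Search right half
  }
  return binarySearchRecursive(arr, target, left, mid - 1);    // Search left half
}
```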
// But a recursive version would be O(log n) due to the call stack!
Amortized Analysis
Sometimes an operation is usually fast but occasionally slow. We analyze the average cost over many operations.
// Dynamic array (like JavaScript array or Python list)
class DynamicArray {
  constructor() {
    this.data = new Array(1); // Start with capacity 1
    this.length = 0;
  }

  push(value) {
    if (this.length === this.data.length) {
      // Occasionally resize: O(n)
      const newData = new Array(this.data.length * 2);
      for (let i = 0; i < this.length; i++) {
        newData[i] = this.data[i];
      }
      this.data = newData;
    }
    this.data[this.length++] = value; // Usually: O(1)
  }
}
// Resizes at: 1, 2, 4, 8, 16, 32, 64, 128, ...
// 100 pushes: ~7 resizes copying 127 elements total
// Amortized: O(1) per push (not O(n)!)
Common Mistakes to Avoid
❌ Thinking O(2n) is different from O(n)
Drop constants! Both are O(n).
❌ Confusing O(n + m) with O(n²)
Different inputs = different variables. Don't assume n = m.
❌ Ignoring space complexity
Interviews often ask "Can you do this in O(1) space?"
❌ Not considering best/average/worst cases
Quicksort: O(n log n) average, O(n²) worst. Be specific.
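The quicksort point is easy to demonstrate. This sketch (our own instrumented version, deliberately using a first-element pivot to trigger the worst case) counts pivot comparisons on random versus already-sorted input:

```javascript
// Quicksort with a first-element pivot, instrumented to count
// pivot comparisons: O(n log n) on average, O(n²) on sorted input.
function quicksortComparisons(arr) {
  let comparisons = 0;
  function sort(a) {
    if (a.length <= 1) return a;
    const [pivot, ...rest] = a;
    comparisons += rest.length; // Every element is compared to the pivot
    const left = rest.filter(x => x < pivot);
    const right = rest.filter(x => x >= pivot);
    return [...sort(left), pivot, ...sort(right)];
  }
  sort(arr);
  return comparisons;
}

const size = 1000;
const randomInput = Array.from({ length: size }, () => Math.random());
const sortedInput = Array.from({ length: size }, (_, i) => i);
console.log(quicksortComparisons(randomInput)); // average case: ~n log n
console.log(quicksortComparisons(sortedInput)); // worst case: n(n-1)/2 = 499,500
```

Same algorithm, same input size, roughly a 40x difference, which is why stating which case you mean matters.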
Practice: Identify the Complexity
function mystery1(n) {
  for (let i = 1; i < n; i *= 2) {
    console.log(i);
  }
}
Answer:
O(log n) — i doubles each iteration (1, 2, 4, 8, ...), so we do log₂ n iterations.
function mystery2(arr) {
  for (let i = 0; i < arr.length; i++) {
    for (let j = 0; j < 100; j++) {
      console.log(arr[i]);
    }
  }
}
Answer:
O(n) — Inner loop is constant 100 iterations. O(n × 100) = O(n).
function mystery3(n) {
  if (n <= 1) return 1;
  return mystery3(n - 1) + mystery3(n - 1);
}
Answer:
O(2ⁿ) — Each call makes 2 recursive calls, creating a binary tree of depth n.
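A small change, caching each subresult so it is computed only once (memoization), collapses this kind of exponential call tree to O(n) time, at the cost of O(n) space for the cache. A sketch using the classic Fibonacci example from earlier:

```javascript
// Memoized Fibonacci: each value is computed once, so O(n) calls
// instead of O(2ⁿ). The cache costs O(n) space.
function fibonacciMemo(n, cache = new Map()) {
  if (n <= 1) return n;
  if (cache.has(n)) return cache.get(n); // Reuse an earlier result
  const result = fibonacciMemo(n - 1, cache) + fibonacciMemo(n - 2, cache);
  cache.set(n, result);
  return result;
}

console.log(fibonacciMemo(40)); // 102334155 — instant, vs hundreds of millions of naive calls
```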
Key Takeaways
- ✅ Big O describes growth rate, not exact time
- ✅ Drop constants and lower terms: O(3n² + 5n + 2) → O(n²)
- ✅ Different inputs = different variables: O(n × m) ≠ O(n²)
- ✅ Consider both time AND space complexity
- ✅ Know the hierarchy: O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(2ⁿ) < O(n!)
- ✅ Nested loops often (but not always!) mean O(n²)
- ✅ Recursion call stack counts toward space complexity
Next Steps
Now that you understand Big O, practice analyzing real algorithms:
- Easy: Two Pointers, Sliding Window (mostly O(n))
- Medium: Binary Search (O(log n)), Hash Maps (O(1) lookups that turn O(n²) scans into O(n))
- Hard: Dynamic Programming (often O(n²) or O(n³))
Use our practice section to see Big O in action with real interview questions!
Continue Learning
- Two Pointers Pattern Explained — Master this O(n) technique for array problems
- Sliding Window Technique — Another essential O(n) pattern
- Hash Maps: When and Why — Understand O(1) lookups and their trade-offs
- Start Practicing — Apply Big O analysis to real problems