Big O Notation
Big O describes how an algorithm's time or space usage scales with input size. It lets you compare algorithms independent of hardware — an O(log n) solution beats O(n) at any scale large enough to matter.
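The scaling difference is easy to see by counting operations instead of timing them. A minimal sketch (the helper names `countLinearSteps` and `countBinarySteps` are illustrative, not from any library) comparing how many comparisons a linear scan and a binary search make on the same million-element array:

```javascript
// Count the comparisons each search strategy makes on the same sorted input.
function countLinearSteps(arr, target) {
  let steps = 0;
  for (let i = 0; i < arr.length; i++) {
    steps++;
    if (arr[i] === target) break;
  }
  return steps;
}

function countBinarySteps(arr, target) {
  let steps = 0, lo = 0, hi = arr.length - 1;
  while (lo <= hi) {
    steps++;
    const mid = (lo + hi) >> 1;
    if (arr[mid] === target) break;
    if (arr[mid] < target) lo = mid + 1;
    else hi = mid - 1;
  }
  return steps;
}

const data = Array.from({ length: 1_000_000 }, (_, i) => i);
console.log(countLinearSteps(data, 999_999)); // 1,000,000 comparisons
console.log(countBinarySteps(data, 999_999)); // at most 20 (~log2 of 1e6)
```

At a million elements the gap is five orders of magnitude, and it only widens as n grows.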
Sorting Algorithms
Different sorting algorithms trade off simplicity, memory, and worst-case guarantees. Merge sort and heap sort guarantee O(n log n); quicksort is fast in practice but degrades to O(n²) on bad pivots.
| Algorithm | Time | Space | Notes |
|---|---|---|---|
| Bubble Sort | O(n²) | O(1) | simple, rarely used in practice |
| Merge Sort | O(n log n) | O(n) | stable, predictable |
| Quick Sort | O(n log n) avg, O(n²) worst | O(log n) | fastest in practice |
| Heap Sort | O(n log n) | O(1) | in-place, not stable |
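Merge sort's guarantees come from its structure: split the array, recursively sort each half, then merge. A minimal sketch (a textbook top-down version, not the only way to implement it):

```javascript
// Merge sort: split, recursively sort halves, merge in linear time.
function mergeSort(arr) {
  if (arr.length <= 1) return arr;
  const mid = arr.length >> 1;
  const left = mergeSort(arr.slice(0, mid));
  const right = mergeSort(arr.slice(mid));
  const merged = [];
  let i = 0, j = 0;
  while (i < left.length && j < right.length) {
    // <= keeps equal elements in their original order, making the sort stable
    if (left[i] <= right[j]) merged.push(left[i++]);
    else merged.push(right[j++]);
  }
  return merged.concat(left.slice(i), right.slice(j));
}

console.log(mergeSort([5, 2, 8, 1, 9, 3])); // [1, 2, 3, 5, 8, 9]
```

The O(n) extra space pays for the O(n log n) worst-case bound and stability; heap sort trades both away for O(1) space.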
Searching Algorithms
Linear search scans every element: O(n), with no preconditions. Binary search halves the search space each step: O(log n), but requires a sorted array. If you'll search the same data repeatedly, paying O(n log n) to sort once is usually worth it.
```javascript
function binarySearch(arr, target) {
  let lo = 0, hi = arr.length - 1;
  while (lo <= hi) {
    const mid = (lo + hi) >> 1; // integer midpoint (shift = divide by 2, floor)
    if (arr[mid] === target) return mid;
    if (arr[mid] < target) lo = mid + 1; // target is in the upper half
    else hi = mid - 1;                   // target is in the lower half
  }
  return -1; // not found
}
```

Common Patterns
Most algorithm problems map to a handful of recurring patterns. Recognizing the pattern — not memorizing solutions — is the skill that transfers to new problems.
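One pattern that recurs constantly is two pointers: walk a sorted array from both ends, moving whichever pointer brings you closer to the goal. A sketch applied to the classic pair-sum problem (the function name `pairWithSum` is illustrative):

```javascript
// Two-pointer pattern: find a pair in a sorted array summing to target, O(n).
function pairWithSum(sorted, target) {
  let lo = 0, hi = sorted.length - 1;
  while (lo < hi) {
    const sum = sorted[lo] + sorted[hi];
    if (sum === target) return [sorted[lo], sorted[hi]];
    if (sum < target) lo++; // need a larger sum: advance the left pointer
    else hi--;              // need a smaller sum: retreat the right pointer
  }
  return null; // no such pair
}

console.log(pairWithSum([1, 3, 4, 6, 8, 11], 10)); // [4, 6]
```

The brute-force version checks every pair in O(n²); recognizing the sorted input as a two-pointer setup cuts it to O(n) without any clever insight beyond the pattern itself.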
When to Optimize
Premature optimization wastes time. Measure first with real data, then target the hottest path — usually a data structure switch or algorithm class change delivers far more than micro-tuning.
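A minimal sketch of "measure first", comparing an O(n) `Array.prototype.includes` lookup against an O(1) average `Set.has` lookup; the absolute numbers will vary by machine, which is exactly why you measure rather than guess:

```javascript
// Measure before optimizing: Array.includes (O(n)) vs Set.has (O(1) average).
const n = 100_000;
const items = Array.from({ length: n }, (_, i) => i);
const set = new Set(items);

let t = performance.now();
for (let i = 0; i < 1000; i++) items.includes(n - 1); // worst case: last element
const arrayMs = performance.now() - t;

t = performance.now();
for (let i = 0; i < 1000; i++) set.has(n - 1);
const setMs = performance.now() - t;

console.log(`array: ${arrayMs.toFixed(1)}ms, set: ${setMs.toFixed(1)}ms`);
```

Swapping the data structure here is the kind of algorithm-class change that dwarfs micro-tuning: the loop bodies are identical, only the container changed.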