“Smart code isn’t just about working—it’s about working well, even when your input grows big.”
If you’re diving into data structures and algorithms (DSA), the terms time complexity and space complexity will keep popping up. This post will walk you through both—what they mean, why they matter, and how to reason about them—without drowning you in theory.
⏱️ What is Time Complexity?
Time complexity tells you how long an algorithm takes as the input size grows. It doesn't measure real seconds, but how the number of steps grows with the input size `n`.
We use Big O notation for this.
🔹 Common Time Complexities
Big O | Name | What it means |
---|---|---|
O(1) | Constant | Runs in same time no matter the input |
O(log n) | Logarithmic | Shrinks problem each time (e.g. binary search) |
O(n) | Linear | Scales with input size |
O(n log n) | Linearithmic | Often from smart sorting algorithms |
O(n²) | Quadratic | Nested loops (bad as n grows) |
O(2ⁿ), O(n!) | Exponential, Factorial | Grows insanely fast (avoid if possible) |
🔸 Example 1: Constant Time – O(1)
```python
def get_first_item(arr):
    return arr[0]
```
No matter if `arr` has 10 or 10 million items, this always takes one step.
🔸 Example 2: Linear Time – O(n)
```python
def print_all_items(arr):
    for item in arr:
        print(item)
```
As the array grows, the time taken grows proportionally.
🔸 Example 3: Logarithmic Time – O(log n)
```python
def binary_search(arr, target):
    low = 0
    high = len(arr) - 1
    while low <= high:
        mid = (low + high) // 2
        if arr[mid] == target:
            return True
        elif arr[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return False
```
Each iteration cuts the search range in half. That's logarithmic behavior—super efficient for large data.
A Better Way to Understand O(log n)
Logarithmic time means your algorithm gets faster in relative terms as the input grows. Here’s a practical way to think about it:
Consider what happens when you double the size of your input:
- For an O(n) algorithm, the runtime also roughly doubles.
- For an O(log n) algorithm, the runtime increases by only a small constant.
That’s because:
log(2n) = log(2) + log(n)
So, doubling the input adds just one more step (log(2)), not double the work.
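You can check this directly: a short script (using Python's `math` module) counts roughly how many halving steps binary search needs as `n` doubles.

```python
import math

# Binary search halves the range each step, so it needs about
# log2(n) steps. Watch what doubling n does to that count.
for n in [1_000, 2_000, 1_000_000, 2_000_000]:
    steps = math.ceil(math.log2(n))
    print(f"n = {n:>9,} -> about {steps} steps")
```

Doubling from 1,000 to 2,000 items (or from 1 million to 2 million) adds just one step—exactly what the log identity predicts.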
🔸 Example 4: Quadratic Time – O(n²)
```python
def print_pairs(arr):
    for i in arr:
        for j in arr:
            print(i, j)
```
If `arr` has 5 items, this prints 25 pairs. 100 items? 10,000 pairs.
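The table earlier lists O(n log n) for "smart sorting" but we haven't seen it in code. Here's a minimal merge sort sketch showing where that bound comes from: the list is split about log n times, and each level of splitting does O(n) work merging.

```python
def merge_sort(arr):
    # Base case: lists of length 0 or 1 are already sorted.
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])   # ~log n levels of splitting...
    right = merge_sort(arr[mid:])
    # ...and O(n) merging work per level -> O(n log n) overall.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

Python's built-in `sorted()` is also O(n log n), so in practice you'd reach for that—this sketch just makes the cost visible.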
💾 What is Space Complexity?
Space complexity measures how much extra memory your algorithm uses.
This includes variables, data structures, recursive calls, etc.
You’re still using Big O notation here, but this time it’s about memory used.
🔸 Example: O(1) Space – No extra storage
```python
def sum_array(arr):
    total = 0
    for num in arr:
        total += num
    return total
```
Just one variable: `total`. Memory stays constant no matter the input.
🔸 Example: O(n) Space – Depends on input
```python
def double_array(arr):
    result = []
    for num in arr:
        result.append(num * 2)
    return result
```
We store a new array the same size as the input → linear space.
🧠 Time vs Space: A Tradeoff
Sometimes, you can speed things up by using more memory.
Example: Detecting duplicates
```python
def has_duplicate(arr):
    seen = set()
    for item in arr:
        if item in seen:
            return True
        seen.add(item)
    return False
```
- Time: O(n)
- Space: O(n) → we use a set to remember what we’ve seen
Without the set, you'd need nested loops (O(n²) time), but only O(1) extra memory.
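For comparison, the nested-loop version might look like this (the function name is just for illustration): quadratic time, but the only extra memory is a pair of loop counters.

```python
def has_duplicate_nested(arr):
    # Compare every pair of positions: O(n^2) time, O(1) extra space.
    for i in range(len(arr)):
        for j in range(i + 1, len(arr)):
            if arr[i] == arr[j]:
                return True
    return False
```

Same answer, opposite tradeoff—which one is "better" depends on whether time or memory is scarcer.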
🧮 How to Analyze Code
When asked for time or space complexity, ask yourself:
- Loops? Count how many times they run (especially nested loops).
- Recursion? Consider how deep it goes.
- New data structures? Arrays, dicts, sets—how big do they grow?
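The recursion point is easy to miss: even when no new data structure is created, every pending call keeps a frame on the call stack. A small sketch (summing a list recursively, using an index so no list copies are made):

```python
def sum_recursive(arr, i=0):
    # Each call waits for the next one to finish, so up to n frames
    # sit on the call stack at once -> O(n) space, despite creating
    # no lists, sets, or dicts.
    if i == len(arr):
        return 0
    return arr[i] + sum_recursive(arr, i + 1)
```

The iterative `sum_array` earlier computes the same result in O(1) space—recursion depth is the hidden cost here.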
📝 Cheat Sheet: Patterns to Remember
Pattern | Time Complexity | Space Complexity |
---|---|---|
Single loop | O(n) | O(1) |
Nested loops | O(n²) | O(1) |
Binary search | O(log n) | O(1) |
Recursive DFS | O(n) | O(n) (call stack) |
Hashing / Sets | O(n) | O(n) |
🔍 Real-Life Analogy
Think of algorithms like preparing for a trip.
- Time complexity is how long it takes to pack your bags.
- Space complexity is how much space your stuff takes in the suitcase.
You might pack faster if you throw everything in without folding (faster, more space), or pack slower but more efficiently (slower, less space). Similarly, some algorithms save time but use more memory—others do the opposite.
👋 Final Thoughts
Don’t overthink Big O at first. Focus on recognizing patterns and reasoning through how your code grows with input. You’ll get faster at this with time and practice.
Understanding time and space complexity helps you write efficient, scalable code—and crush those technical interviews along the way.
Thanks for reading! For more hands-on dev content, check out workdone0.github.io