Core Algorithms and Patterns for Programming Interviews
July 20, 2025
Basic Algorithmic Techniques
Two Pointers
Definition: An algorithmic pattern where two pointers iterate through a data structure until one or both have met a certain condition.
Time Complexity: Typically O(N).
Space Complexity: Typically O(1).
Python Example: This example reverses a list in-place using two pointers.
def reverse_list(arr):
    left, right = 0, len(arr) - 1
    while left < right:
        arr[left], arr[right] = arr[right], arr[left]
        left += 1
        right -= 1
    return arr

my_list = [1, 2, 3, 4, 5]
print(f"Reversed: {reverse_list(my_list)}")
# Expected Output: Reversed: [5, 4, 3, 2, 1]
Sliding Window
Definition: A specific type of two-pointer technique where the pointers define a "window" over a contiguous part of the data (e.g., a subarray or substring), which is adjusted to solve the problem.
Time Complexity: Typically O(N).
Space Complexity: Often O(k) or O(1), where k is the size of the window or the number of distinct elements in the window.
Python Example: Finds the maximum sum of any contiguous subarray of size k.
def max_sum_subarray(arr, k):
    max_val = float('-inf')
    current_sum = 0
    start = 0
    for end in range(len(arr)):
        current_sum += arr[end]
        if end >= k - 1:
            max_val = max(max_val, current_sum)
            current_sum -= arr[start]
            start += 1
    return max_val

print(f"Max sum of subarray size 3: {max_sum_subarray([2, 1, 5, 1, 3, 2], 3)}")
# Expected Output: Max sum of subarray size 3: 9
Sorting
Definition: The process of arranging items in a sequence ordered by some criterion.
Time Complexity:
- Comparison Sorts: Merge Sort is O(N log N) in both the average and worst case; Quicksort is O(N log N) on average but O(N²) in the worst case.
- Non-Comparison Sorts: Counting Sort is O(N + k), where k is the range of input values; Radix Sort is O(d · (N + k)), where d is the number of digits.
Space Complexity:
- Merge Sort: O(N)
- Quicksort: O(logN) average, O(N) worst case.
- Counting Sort: O(k)
Python Example: Uses Python's highly optimized built-in sorting.
my_list = [3, 1, 4, 1, 5, 9, 2, 6]
my_list.sort() # In-place sort
print(f"Sorted list: {my_list}")
# Expected Output: Sorted list: [1, 1, 2, 3, 4, 5, 6, 9]
Core Search & Traversal
Binary Search
Definition: An efficient algorithm for finding an item in a sorted list. It works by repeatedly halving the portion of the list that could still contain the item.
Time Complexity: O(logN).
Space Complexity: O(1) for iterative, O(logN) for recursive due to the call stack.
Python Example: An iterative implementation of binary search.
def binary_search(arr, target):
    left, right = 0, len(arr) - 1
    while left <= right:
        mid = (left + right) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            left = mid + 1
        else:
            right = mid - 1
    return -1  # Not found

sorted_arr = [2, 5, 7, 8, 11, 12]
print(f"Index of 11: {binary_search(sorted_arr, 11)}")
# Expected Output: Index of 11: 4
print(f"Index of 6: {binary_search(sorted_arr, 6)}")
# Expected Output: Index of 6: -1
Breadth-First Search (BFS)
Definition: A traversal algorithm for trees or graphs that explores neighbor nodes first, before moving to the next level neighbors.
Time Complexity: O(V+E), where V is vertices and E is edges.
Space Complexity: O(W), where W is the maximum width of the graph/tree (can be O(V) in the worst case).
Python Example: Performs a BFS traversal on a graph represented by an adjacency list.
from collections import deque

def bfs(graph, start_node):
    visited, queue, result = {start_node}, deque([start_node]), []
    while queue:
        node = queue.popleft()
        result.append(node)
        for neighbor in graph.get(node, []):
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(neighbor)
    return result

graph = {'A': ['B', 'C'], 'B': ['D'], 'C': [], 'D': []}
print(f"BFS from A: {bfs(graph, 'A')}")
# Expected Output: BFS from A: ['A', 'B', 'C', 'D']
Depth-First Search (DFS)
Definition: A traversal algorithm for trees or graphs that explores as far as possible along each branch before backtracking.
Time Complexity: O(V+E).
Space Complexity: O(H), where H is the maximum height/depth of the graph/tree (can be O(V) in the worst case for skewed trees/paths).
Python Example: Recursive DFS implementation.
def dfs_recursive(graph, start_node, visited=None):
    if visited is None:
        visited = set()
    visited.add(start_node)
    result = [start_node]
    for neighbor in graph.get(start_node, []):
        if neighbor not in visited:
            result.extend(dfs_recursive(graph, neighbor, visited))
    return result

graph = {'A': ['B', 'C'], 'B': ['D'], 'C': [], 'D': []}
print(f"DFS from A: {dfs_recursive(graph, 'A')}")
# Expected Output: DFS from A: ['A', 'B', 'D', 'C']
Interview Preparation Strategy
Two Pointers Technique
Concepts: Fast and slow pointers, left and right pointers, maintaining invariants.
Why: Efficiently solve problems that require comparing or processing elements from different positions in a sequence.
Key Problems:
- Two Sum II - Input Array Is Sorted [#167] (sorted array)
- Container With Most Water [#11] (area calculation)
- 3Sum [#15] (three pointers)
- Remove Duplicates from Sorted Array [#26] (in-place)
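The fast and slow pointer variant mentioned above is not shown in the earlier in-place reversal example. A minimal sketch of Floyd's cycle detection on a linked list (the `ListNode` class and function name are illustrative):

```python
class ListNode:
    def __init__(self, val):
        self.val = val
        self.next = None

def has_cycle(head):
    # Fast pointer moves two steps per iteration, slow moves one;
    # they can only meet again if the list contains a cycle.
    slow = fast = head
    while fast and fast.next:
        slow = slow.next
        fast = fast.next.next
        if slow is fast:
            return True
    return False

# Build 1 -> 2 -> 3 and link the tail back to node 2.
a, b, c = ListNode(1), ListNode(2), ListNode(3)
a.next, b.next, c.next = b, c, b
print(has_cycle(a))  # True
print(has_cycle(ListNode(4)))  # False
```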
Sliding Window
Concepts: Fixed-size vs. variable-size windows, maintaining window invariants, expanding/contracting.
Why: Efficiently solve substring/subarray problems with O(N) time complexity.
Key Problems:
- Longest Substring Without Repeating Characters [#3] (variable window)
- Minimum Size Subarray Sum [#209] (variable window)
- Longest Repeating Character Replacement [#424] (variable window)
- Sliding Window Maximum [#239] (monotonic queue)
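The earlier example used a fixed-size window; the variable-size pattern (expand with one pointer, contract with the other while an invariant holds) can be sketched as follows, in the style of Minimum Size Subarray Sum (function name is illustrative):

```python
def min_subarray_len(target, nums):
    # Expand the window by advancing `end`; contract from `start`
    # while the window sum still meets the target.
    start, current_sum = 0, 0
    best = float('inf')
    for end, value in enumerate(nums):
        current_sum += value
        while current_sum >= target:
            best = min(best, end - start + 1)
            current_sum -= nums[start]
            start += 1
    return 0 if best == float('inf') else best

print(min_subarray_len(7, [2, 3, 1, 2, 4, 3]))  # 2 (the subarray [4, 3])
```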
Binary Search
Concepts: Search space reduction, finding insertion points, binary search on answer.
Why: Efficiently find elements in sorted data or solve optimization problems.
Key Problems:
- Search in Rotated Sorted Array [#33] (modified binary search)
- Find First and Last Position of Element in Sorted Array [#34] (boundary search)
- Median of Two Sorted Arrays [#4] (hard, binary search on answer)
- Split Array Largest Sum [#410] (binary search on answer)
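The "binary search on answer" idea deserves a sketch: instead of searching an array, search the space of candidate answers for the smallest feasible one. A rough version for the Split Array Largest Sum problem (names are illustrative):

```python
def split_array_min_largest_sum(nums, k):
    def pieces_needed(limit):
        # Greedily count subarrays needed if no subarray
        # sum may exceed `limit`.
        count, running = 1, 0
        for n in nums:
            if running + n > limit:
                count += 1
                running = n
            else:
                running += n
        return count

    # The answer lies between max(nums) and sum(nums);
    # binary search for the smallest feasible limit.
    low, high = max(nums), sum(nums)
    while low < high:
        mid = (low + high) // 2
        if pieces_needed(mid) <= k:
            high = mid      # feasible: try a smaller limit
        else:
            low = mid + 1   # infeasible: need a larger limit
    return low

print(split_array_min_largest_sum([7, 2, 5, 10, 8], 2))  # 18
```

The feasibility check is monotonic (a larger limit never needs more pieces), which is what makes binary search valid here.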
Graph Traversal
Concepts: BFS vs. DFS choice, visited tracking, cycle detection, topological sorting.
Why: Many problems can be modeled as graph traversal, especially grid problems and dependency resolution.
Key Problems:
- Number of Islands [#200] (DFS/BFS on grid)
- Clone Graph [#133] (BFS with hash map)
- Course Schedule [#207] (topological sort)
- Word Ladder [#127] (BFS with word transformation)
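Topological sorting was not covered in the traversal section above; a minimal sketch using Kahn's algorithm, in the style of Course Schedule (function name is illustrative):

```python
from collections import deque

def can_finish(num_courses, prerequisites):
    # Kahn's algorithm: repeatedly take courses with no remaining
    # prerequisites; if every course is taken, there is no cycle.
    graph = {i: [] for i in range(num_courses)}
    indegree = [0] * num_courses
    for course, prereq in prerequisites:
        graph[prereq].append(course)
        indegree[course] += 1
    queue = deque(i for i in range(num_courses) if indegree[i] == 0)
    taken = 0
    while queue:
        node = queue.popleft()
        taken += 1
        for nxt in graph[node]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                queue.append(nxt)
    return taken == num_courses

print(can_finish(2, [[1, 0]]))          # True
print(can_finish(2, [[1, 0], [0, 1]]))  # False (cyclic dependency)
```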
Dynamic Programming Patterns
Concepts: Memoization vs. tabulation, state definition, transition functions.
Why: Solve optimization problems by breaking them into overlapping subproblems.
Key Problems:
- Climbing Stairs [#70] (1D DP)
- House Robber [#198] (1D DP with constraints)
- Longest Increasing Subsequence [#300] (1D DP with binary search)
- Coin Change [#322] (unbounded knapsack)
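The memoization vs. tabulation distinction can be sketched with one small example of each, roughly following Climbing Stairs (top-down) and Coin Change (bottom-up); function names are illustrative:

```python
from functools import lru_cache

def climb_stairs_memo(n):
    # Top-down (memoization): cache the recurrence
    # f(i) = f(i - 1) + f(i - 2).
    @lru_cache(maxsize=None)
    def f(i):
        return i if i <= 2 else f(i - 1) + f(i - 2)
    return f(n)

def coin_change_tab(coins, amount):
    # Bottom-up (tabulation): dp[a] is the fewest coins summing to a.
    INF = float('inf')
    dp = [0] + [INF] * amount
    for a in range(1, amount + 1):
        for c in coins:
            if c <= a and dp[a - c] + 1 < dp[a]:
                dp[a] = dp[a - c] + 1
    return dp[amount] if dp[amount] != INF else -1

print(climb_stairs_memo(5))            # 8
print(coin_change_tab([1, 2, 5], 11))  # 3 (5 + 5 + 1)
```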
Greedy Algorithms
Concepts: Local optimal choices, proof of correctness, when to use.
Why: Efficiently solve optimization problems where local optimal choices lead to global optimum.
Key Problems:
- Jump Game [#55] (greedy with reachability)
- Gas Station [#134] (greedy with circuit)
- Task Scheduler [#621] (greedy with frequency)
- Meeting Rooms II [#253] (greedy with sorting)
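A compact illustration of the greedy idea, roughly following Jump Game (function name is illustrative): track only the farthest reachable index and never look back.

```python
def can_jump(nums):
    # Greedily maintain the farthest index reachable so far;
    # if the current index exceeds it, the end is unreachable.
    farthest = 0
    for i, jump in enumerate(nums):
        if i > farthest:
            return False
        farthest = max(farthest, i + jump)
    return True

print(can_jump([2, 3, 1, 1, 4]))  # True
print(can_jump([3, 2, 1, 0, 4]))  # False
```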
Backtracking
Concepts: State space exploration, pruning, constraint satisfaction.
Why: Solve problems that require exploring all possible combinations or permutations.
Key Problems:
- Subsets [#78] (backtracking with choices)
- Permutations [#46] (backtracking with swaps)
- N-Queens [#51] (backtracking with constraints)
- Sudoku Solver [#37] (backtracking with validation)
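The choose / explore / un-choose shape of backtracking can be sketched with the Subsets problem (function names are illustrative):

```python
def subsets(nums):
    # At each index, either include nums[i] or skip it,
    # then backtrack by undoing the choice.
    result, path = [], []

    def backtrack(i):
        if i == len(nums):
            result.append(path[:])
            return
        path.append(nums[i])   # choose
        backtrack(i + 1)       # explore with nums[i] included
        path.pop()             # un-choose
        backtrack(i + 1)       # explore with nums[i] skipped

    backtrack(0)
    return result

print(subsets([1, 2]))  # [[1, 2], [1], [2], []]
```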
Advanced Techniques
Monotonic Stack/Queue
- Next Greater Element [#496] (monotonic stack)
- Sliding Window Maximum [#239] (monotonic queue)
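A minimal monotonic-stack sketch in the spirit of Next Greater Element (function name is illustrative): the stack holds indices whose answer is still unknown, and its values stay strictly decreasing.

```python
def next_greater_elements(nums):
    # Pop indices off the stack whenever the current value
    # is their next greater element.
    result = [-1] * len(nums)
    stack = []  # indices with decreasing values
    for i, value in enumerate(nums):
        while stack and nums[stack[-1]] < value:
            result[stack.pop()] = value
        stack.append(i)
    return result

print(next_greater_elements([2, 1, 2, 4, 3]))  # [4, 2, 4, -1, -1]
```

Each index is pushed and popped at most once, so the whole pass is O(N).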
Union-Find
- Number of Connected Components in an Undirected Graph [#323]
- Redundant Connection [#684]
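A sketch of a Union-Find structure with path compression and union by size (class and method names are illustrative):

```python
class UnionFind:
    def __init__(self, n):
        self.parent = list(range(n))
        self.size = [1] * n
        self.components = n

    def find(self, x):
        # Path compression: point x at its grandparent as we walk up.
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return False  # already in the same component
        if self.size[ra] < self.size[rb]:
            ra, rb = rb, ra  # attach the smaller tree under the larger
        self.parent[rb] = ra
        self.size[ra] += self.size[rb]
        self.components -= 1
        return True

uf = UnionFind(5)
uf.union(0, 1)
uf.union(1, 2)
print(uf.components)             # 3
print(uf.find(0) == uf.find(2))  # True
```

With both optimizations, each operation runs in near-constant amortized time, which is what makes this structure ideal for connected-component counting.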
Trie (Prefix Tree)
- Implement Trie (Prefix Tree) [#208]
- Word Search II [#212] (hard, backtracking + trie)
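A minimal trie sketch using nested dictionaries as nodes (class design and the `'$'` end-of-word marker are illustrative choices):

```python
class Trie:
    def __init__(self):
        self.root = {}  # each node maps a character to a child node

    def insert(self, word):
        node = self.root
        for ch in word:
            node = node.setdefault(ch, {})
        node['$'] = True  # end-of-word marker

    def search(self, word):
        node = self._walk(word)
        return node is not None and '$' in node

    def starts_with(self, prefix):
        return self._walk(prefix) is not None

    def _walk(self, s):
        # Follow s character by character; None if the path breaks.
        node = self.root
        for ch in s:
            if ch not in node:
                return None
            node = node[ch]
        return node

trie = Trie()
trie.insert("apple")
print(trie.search("apple"))     # True
print(trie.search("app"))       # False
print(trie.starts_with("app"))  # True
```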
Segment Tree / Binary Indexed Tree
- Range Sum Query - Mutable [#307]
- Count of Smaller Numbers After Self [#315]
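A Binary Indexed Tree (Fenwick tree) sketch supporting point updates and prefix sums in O(log N), which is the core operation behind Range Sum Query - Mutable (class and method names are illustrative):

```python
class FenwickTree:
    def __init__(self, n):
        self.n = n
        self.tree = [0] * (n + 1)  # 1-indexed internally

    def update(self, i, delta):
        # Add delta to element i (0-indexed).
        i += 1
        while i <= self.n:
            self.tree[i] += delta
            i += i & (-i)  # jump to the next responsible node

    def prefix_sum(self, i):
        # Sum of elements [0, i] inclusive.
        i += 1
        total = 0
        while i > 0:
            total += self.tree[i]
            i -= i & (-i)  # strip the lowest set bit
        return total

    def range_sum(self, left, right):
        low = self.prefix_sum(left - 1) if left > 0 else 0
        return self.prefix_sum(right) - low

bit = FenwickTree(5)
for idx, val in enumerate([1, 3, 5, 7, 9]):
    bit.update(idx, val)
print(bit.range_sum(1, 3))  # 15 (3 + 5 + 7)
bit.update(2, 2)            # element 2 becomes 7
print(bit.range_sum(1, 3))  # 17
```

A segment tree supports the same queries and also handles non-invertible operations (min, max), at the cost of more code.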
Remember: The goal isn't just to memorize implementations, but to understand the underlying principles and trade-offs that make each algorithm suitable for specific problems. Practice recognizing patterns and choosing the right algorithmic approach based on problem constraints and requirements.
For a comprehensive guide to data structures, check out Core Data Structures for Programming Interviews.