Why Algorithms Is the Crown Jewel of GATE CSE
Along with Theory of Computation, Algorithms consistently ranks among the highest-value sections in GATE CSE, typically worth 7–10 marks per paper, and it rewards students who understand the structural logic of computation, not just code.
GATE algorithms questions don't ask you to write code. They ask you to trace what an algorithm does on a specific input, compare the behaviour of two algorithms, or compute the exact output after k steps. Understanding trumps memorisation.
Part 1: Algorithm Analysis — Complexity and Asymptotic Notation
Big-O, Omega, and Theta
- O(f(n)) is an upper bound: the running time grows no faster than f(n)
- Ω(f(n)) is a lower bound: the running time grows at least as fast as f(n)
- Θ(f(n)) is a tight bound: both O(f(n)) and Ω(f(n)) hold
Note that O, Ω, and Θ bound whichever case you are analysing. "O means worst case" is a common misconception: each of best, average, and worst case has its own upper, lower, and tight bounds.
GATE question type: "The recurrence T(n) = 2T(n/2) + n has which of the following time complexities?" → Use the Master Theorem: a = 2, b = 2, f(n) = n = Θ(n^(log_2 2)), so Case 2 gives Θ(n log n).
Master Theorem
For T(n) = aT(n/b) + f(n):
Let k = log_b(a).
- If f(n) = O(n^(k-ε)) for some ε > 0 → T(n) = Θ(n^k) [Case 1]
- If f(n) = Θ(n^k) → T(n) = Θ(n^k · log n) [Case 2]
- If f(n) = Ω(n^(k+ε)) for some ε > 0, and the regularity condition a·f(n/b) ≤ c·f(n) holds for some constant c < 1 → T(n) = Θ(f(n)) [Case 3]
GATE uses the Master Theorem in almost every algorithms paper. Know all three cases and when each applies. Also know when Master Theorem doesn't apply (e.g., T(n) = 2T(n/2) + n/log n).
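As a sanity check on Case 2, here is a short Python sketch (illustrative only, with the base case T(1) = 1 assumed) that unrolls T(n) = 2T(n/2) + n and confirms it equals n(log₂ n + 1) exactly for powers of two:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """Unroll T(n) = 2T(n/2) + n with T(1) = 1, for n a power of 2."""
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

# Case 2 predicts Theta(n log n); here T(n) = n * (log2(n) + 1) exactly.
for k in range(1, 11):
    n = 2 ** k
    assert T(n) == n * (k + 1)
print(T(1024))  # 11264 = 1024 * 11
```

Tracing a few values by hand (T(2) = 4, T(4) = 12, T(8) = 32) is exactly the kind of check GATE numerical-answer questions expect.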
Part 2: Sorting Algorithms
GATE tests sorting on two axes: time complexity (best, average, worst) and stability.
| Algorithm | Best | Average | Worst | Stable? | In-Place? |
|-----------|------|---------|-------|---------|-----------|
| Bubble Sort | Ω(n) | Θ(n²) | O(n²) | Yes | Yes |
| Selection Sort | Θ(n²) | Θ(n²) | O(n²) | No | Yes |
| Insertion Sort | Ω(n) | Θ(n²) | O(n²) | Yes | Yes |
| Merge Sort | Θ(n log n) | Θ(n log n) | O(n log n) | Yes | No |
| Quick Sort | Ω(n log n) | Θ(n log n) | O(n²) | No | Yes |
| Heap Sort | Θ(n log n) | Θ(n log n) | O(n log n) | No | Yes |
| Counting Sort | Θ(n+k) | Θ(n+k) | O(n+k) | Yes | No |
Stability means equal elements maintain their original relative order after sorting. GATE frequently asks: "Which of the following sorting algorithms is NOT stable?" → Selection Sort, Quick Sort, Heap Sort.
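Stability is easiest to see on tagged records with equal keys. A minimal Python sketch (the records are made up) contrasting stable insertion sort with unstable selection sort:

```python
def insertion_sort(a):
    """Stable: equal keys never jump over each other."""
    a = a[:]
    for i in range(1, len(a)):
        x, j = a[i], i - 1
        while j >= 0 and a[j][0] > x[0]:   # strict >, so equal keys keep order
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = x
    return a

def selection_sort(a):
    """Not stable: long-range swaps can reorder equal keys."""
    a = a[:]
    for i in range(len(a)):
        m = min(range(i, len(a)), key=lambda j: a[j][0])
        a[i], a[m] = a[m], a[i]
    return a

records = [(2, 'a'), (2, 'b'), (1, 'c')]
print(insertion_sort(records))  # [(1, 'c'), (2, 'a'), (2, 'b')]: 'a' stays before 'b'
print(selection_sort(records))  # [(1, 'c'), (2, 'b'), (2, 'a')]: equal keys flipped
```

The swap in selection sort's first pass moves (2, 'a') past (2, 'b'), which is precisely why it appears in the "NOT stable" answer list.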
Quick Sort worst case: Occurs when the pivot is always the smallest or largest element (e.g., sorted array with first element as pivot). This gives O(n²).
GATE question type: "What is the minimum number of comparisons needed to sort n elements using comparison-based sorting?" → Ω(n log n) by information-theoretic lower bound.
Part 3: Graph Algorithms
BFS and DFS
Both traverse all vertices: O(V + E) time, O(V) space.
BFS properties:
- Uses a queue (FIFO)
- Finds shortest path in unweighted graphs
- BFS tree gives shortest paths from source
- Can detect bipartiteness (the graph is bipartite iff no edge joins two vertices at the same BFS level; "back edge" is DFS terminology)
DFS properties:
- Uses a stack (or recursion)
- Produces DFS tree with tree edges, back edges, forward edges, cross edges
- Back edges indicate cycles in directed graphs
- Topological sort = list vertices in decreasing order of DFS finish time (defined only for DAGs)
GATE question type: Given a graph and a starting vertex, trace BFS/DFS and determine the order of vertex discovery, or the DFS tree structure.
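A small trace of both traversals on a hypothetical adjacency list (vertex names and edges are illustrative, with neighbours visited in list order):

```python
from collections import deque

graph = {'A': ['B', 'C'], 'B': ['D'], 'C': ['D'], 'D': []}  # hypothetical digraph

def bfs_order(g, src):
    """Return vertices in BFS discovery order (queue, FIFO)."""
    seen, order, q = {src}, [], deque([src])
    while q:
        u = q.popleft()
        order.append(u)
        for v in g[u]:
            if v not in seen:
                seen.add(v)
                q.append(v)
    return order

def dfs_order(g, src, seen=None):
    """Return vertices in DFS discovery order (recursion = implicit stack)."""
    if seen is None:
        seen = set()
    seen.add(src)
    order = [src]
    for v in g[src]:
        if v not in seen:
            order += dfs_order(g, v, seen)
    return order

print(bfs_order(graph, 'A'))  # ['A', 'B', 'C', 'D']
print(dfs_order(graph, 'A'))  # ['A', 'B', 'D', 'C']
```

Note the divergence at the third vertex: BFS finishes level 1 (B, C) before reaching D, while DFS plunges to D before backtracking to C.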
Shortest Path Algorithms
Dijkstra's Algorithm:
- Single-source shortest path for weighted graphs with non-negative weights
- Greedy: always processes the unvisited vertex with minimum tentative distance
- Time complexity: O(V²) with adjacency matrix, O((V+E) log V) with binary heap
- Fails with negative weights
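The greedy extract-and-relax loop can be sketched with a binary heap (Python's heapq; the example graph is hypothetical):

```python
import heapq

def dijkstra(g, src):
    """Dijkstra with a binary heap: O((V + E) log V), non-negative weights only."""
    dist = {v: float('inf') for v in g}
    dist[src] = 0
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:            # stale heap entry, skip
            continue
        for v, w in g[u].items():
            if d + w < dist[v]:    # relax edge (u, v)
                dist[v] = d + w
                heapq.heappush(pq, (dist[v], v))
    return dist

# hypothetical weighted digraph
g = {'S': {'A': 1, 'B': 4}, 'A': {'B': 2, 'T': 6}, 'B': {'T': 3}, 'T': {}}
print(dijkstra(g, 'S'))  # {'S': 0, 'A': 1, 'B': 3, 'T': 6}
```

Tracing the pops in order (S, A, B, T) is exactly what a "state after k iterations" GATE question asks for; note B is settled at 3 via A, not at its first tentative value 4.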
Bellman-Ford Algorithm:
- Single-source shortest path, handles negative weights
- Time complexity: O(VE)
- Can detect negative-weight cycles (if a vertex is still relaxable after V-1 iterations)
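A minimal sketch of the V-1 relaxation rounds plus the extra detection pass (the edge list is made up):

```python
def bellman_ford(edges, vertices, src):
    """V-1 rounds of relaxing every edge; a further relax => negative cycle."""
    dist = {v: float('inf') for v in vertices}
    dist[src] = 0
    for _ in range(len(vertices) - 1):
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    # one extra pass: any remaining improvement means a negative-weight cycle
    for u, v, w in edges:
        if dist[u] + w < dist[v]:
            return dist, True
    return dist, False

edges = [('S', 'A', 4), ('S', 'B', 5), ('A', 'B', -3), ('B', 'T', 2)]  # hypothetical
dist, has_neg_cycle = bellman_ford(edges, ['S', 'A', 'B', 'T'], 'S')
print(dist['T'], has_neg_cycle)  # 3 False
```

The negative edge A→B pulls B's distance down to 1, which Dijkstra's settled-vertex assumption would have missed.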
Floyd-Warshall:
- All-pairs shortest path
- Dynamic programming: O(V³) time, O(V²) space
- Works with negative weights (not negative cycles)
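The triple loop is short enough to show in full; a sketch over a hypothetical 3-vertex weight matrix:

```python
def floyd_warshall(w):
    """DP over intermediate vertices: after round k, dist[i][j] may route
    through vertices 0..k. O(V^3) time, O(V^2) space."""
    n = len(w)
    dist = [row[:] for row in w]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist

INF = float('inf')
# hypothetical weight matrix with one negative edge, no negative cycle
w = [[0, 4, INF],
     [INF, 0, -2],
     [1, INF, 0]]
d = floyd_warshall(w)
print(d[0][2])  # 2, via 0 -> 1 -> 2: 4 + (-2)
```

The loop order matters: k (the allowed intermediate vertex) must be outermost, a detail GATE has tested directly.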
GATE question: Given a weighted graph, trace Dijkstra from source vertex S and find the shortest path distance to vertex T after k iterations.
Minimum Spanning Tree
Prim's Algorithm: Greedy, starts from a vertex and expands by always adding the minimum weight edge connecting the MST to a non-MST vertex. O(E log V) with binary heap.
Kruskal's Algorithm: Sort all edges by weight; add the next smallest edge if it doesn't form a cycle (use Union-Find to check). O(E log E), which is the same as O(E log V) since E ≤ V².
If all edge weights are distinct, the MST is unique and both algorithms produce it; with repeated weights there may be several MSTs of equal total weight, and the two algorithms may return different ones.
Key property: MST contains exactly V-1 edges for a graph with V vertices.
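Kruskal's cycle check via Union-Find can be sketched as follows (the edge list is illustrative):

```python
def kruskal(n, edges):
    """Sort edges by weight; Union-Find rejects edges that would close a cycle."""
    parent = list(range(n))

    def find(x):                      # find root, compressing the path as we go
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst = []
    for w, u, v in sorted(edges):     # edges as (weight, u, v) tuples
        ru, rv = find(u), find(v)
        if ru != rv:                  # different components: edge is safe to add
            parent[ru] = rv
            mst.append((w, u, v))
    return mst

# hypothetical graph on vertices 0..3
edges = [(1, 0, 1), (2, 1, 2), (3, 0, 2), (4, 2, 3)]
mst = kruskal(4, edges)
print(len(mst), sum(w for w, _, _ in mst))  # 3 7
```

Note the V-1 = 3 edges in the result, and that edge (3, 0, 2) is rejected because 0 and 2 are already connected.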
Part 4: Dynamic Programming
DP is the most conceptually rich algorithms topic in GATE. Questions require identifying subproblems, writing recurrences, and computing optimal values.
Key DP Problems
Longest Common Subsequence (LCS):
- LCS(i, j) = LCS(i-1, j-1) + 1, if s1[i] == s2[j]
- LCS(i, j) = max(LCS(i-1, j), LCS(i, j-1)), otherwise
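The LCS recurrence tabulated bottom-up, on a standard textbook string pair (a sketch; the 1-indexed recurrence is mapped onto 0-indexed Python strings):

```python
def lcs_length(s1, s2):
    """Bottom-up LCS table; dp[i][j] = LCS length of s1[:i] and s2[:j]."""
    m, n = len(s1), len(s2)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if s1[i - 1] == s2[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1        # characters match
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]

print(lcs_length("ABCBDAB", "BDCAB"))  # 4 ("BCAB" is one such subsequence)
```

GATE questions usually ask for one table entry or the final length, so practise filling the table row by row.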
0/1 Knapsack:
- dp[i][w] = max(dp[i-1][w], value[i] + dp[i-1][w-weight[i]]), if weight[i] ≤ w
- dp[i][w] = dp[i-1][w], otherwise
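The knapsack recurrence tabulated directly (the values and weights are a standard textbook example, not from this text):

```python
def knapsack_01(values, weights, W):
    """dp[i][w] = best value achievable with the first i items and capacity w."""
    n = len(values)
    dp = [[0] * (W + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for w in range(W + 1):
            dp[i][w] = dp[i - 1][w]                     # skip item i
            if weights[i - 1] <= w:                     # or take item i once
                dp[i][w] = max(dp[i][w],
                               values[i - 1] + dp[i - 1][w - weights[i - 1]])
    return dp[n][W]

print(knapsack_01([60, 100, 120], [10, 20, 30], 50))  # 220 (take items 2 and 3)
```

The "take" branch reads from row i-1, which is what makes this 0/1 rather than unbounded knapsack.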
Matrix Chain Multiplication: Find the minimum number of scalar multiplications needed to compute a chain of matrix products. Classic DP with O(n³) time.
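A sketch of the standard O(n³) interval DP, where m[i][j] is the cheapest way to multiply matrices i..j (the dimension list is illustrative: matrix i has shape p[i-1] × p[i]):

```python
def matrix_chain(p):
    """Minimum scalar multiplications to compute A1 * A2 * ... * An,
    where Ai has dimensions p[i-1] x p[i]."""
    n = len(p) - 1                      # number of matrices
    m = [[0] * (n + 1) for _ in range(n + 1)]
    for length in range(2, n + 1):      # chain length
        for i in range(1, n - length + 2):
            j = i + length - 1
            # try every split point k between i and j
            m[i][j] = min(m[i][k] + m[k + 1][j] + p[i - 1] * p[k] * p[j]
                          for k in range(i, j))
    return m[1][n]

print(matrix_chain([10, 30, 5, 60]))  # 4500: (A1 A2) A3 beats A1 (A2 A3)
```

Hand-checking the two parenthesisations (1500 + 3000 = 4500 versus 9000 + 18000 = 27000) is the typical GATE numerical question here.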
GATE approach: When you see a DP problem, identify: What are the subproblems? What's the recurrence? What's the base case? Then trace the table for the given input.
Part 5: Greedy Algorithms
Greedy algorithms make locally optimal choices at each step. Not all problems have greedy solutions — only those with the greedy choice property and optimal substructure.
Activity Selection: Choose maximum non-overlapping activities by always selecting the one that finishes earliest.
Huffman Coding: Build an optimal prefix-free binary code tree by repeatedly combining the two lowest-frequency symbols. Creates variable-length codes where more frequent symbols have shorter codes.
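The repeated merging of the two lowest-frequency nodes can be sketched with a heap; tracking only code lengths (tree depths) keeps it short. The frequencies are the classic CLRS example; the tie-breaking counter is an implementation detail, not part of the algorithm:

```python
import heapq

def huffman_code_lengths(freq):
    """Merge the two lowest-frequency nodes until one tree remains;
    a symbol's depth in that tree is its code length."""
    heap = [(f, i, {s: 0}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)                    # unique counter so tuples never compare dicts
    while len(heap) > 1:
        f1, _, d1 = heapq.heappop(heap)
        f2, _, d2 = heapq.heappop(heap)
        merged = {s: d + 1 for s, d in {**d1, **d2}.items()}  # one level deeper
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

freq = {'a': 45, 'b': 13, 'c': 12, 'd': 16, 'e': 9, 'f': 5}
lengths = huffman_code_lengths(freq)
print(sum(freq[s] * lengths[s] for s in freq))  # 224 total weighted code length
```

As expected, the most frequent symbol 'a' gets a 1-bit code while the rarest symbols 'e' and 'f' get 4-bit codes.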
Fractional Knapsack: Unlike 0/1 Knapsack (which needs DP), fractional knapsack has a greedy solution: sort by value/weight ratio and fill greedily.
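The greedy ratio rule as a sketch (the item data is illustrative, reusing the 0/1 example to show the contrast):

```python
def fractional_knapsack(values, weights, W):
    """Greedy by value/weight ratio; only the last item taken may be fractional."""
    items = sorted(zip(values, weights),
                   key=lambda vw: vw[0] / vw[1], reverse=True)
    total = 0.0
    for v, w in items:
        if W >= w:
            total += v          # take the whole item
            W -= w
        else:
            total += v * (W / w)  # take the fraction that still fits
            break
    return total

print(fractional_knapsack([60, 100, 120], [10, 20, 30], 50))  # 240.0
```

On the same data, 0/1 knapsack achieves only 220; the extra 20 comes from taking two-thirds of the last item, which is exactly what the 0/1 constraint forbids.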