Greedy algorithms make the best local choice at each step in the hope that these local choices add up to a global optimum. The examples below, from the Coin Change Problem to Dijkstra's Algorithm, show how this approach efficiently solves a range of classic optimization problems over common data structures.
-
Coin Change Problem
- Aim: To find the minimum number of coins needed to make a specific amount of money.
- Greedy Choice: Always choose the largest denomination coin available that does not exceed the remaining amount.
- Optimal Substructure: The problem can be broken down into smaller subproblems, where the solution to the overall problem depends on the solutions to these subproblems.
- Limitations: The greedy approach is optimal only for canonical coin systems such as U.S. denominations; for arbitrary denominations (e.g., {1, 3, 4}) it can return a suboptimal answer, and dynamic programming is needed instead. A sketch of the greedy version follows this list.
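Below is a minimal Python sketch of the greedy rule, assuming coins are passed as a plain list of integer denominations; the function name and return convention are illustrative, not a standard API.

```python
def greedy_coin_change(amount, denominations):
    """Repeatedly take the largest coin that still fits into the remaining amount."""
    coins = []
    for coin in sorted(denominations, reverse=True):  # largest denomination first
        while amount >= coin:
            amount -= coin
            coins.append(coin)
    return coins if amount == 0 else None  # None if exact change is impossible

# 63 cents with U.S. denominations -> [25, 25, 10, 1, 1, 1] (optimal here)
print(greedy_coin_change(63, [1, 5, 10, 25]))
# With denominations [1, 3, 4] and amount 6, the greedy answer [4, 1, 1]
# is worse than the optimal [3, 3] -- the limitation noted above.
print(greedy_coin_change(6, [1, 3, 4]))
```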
-
Fractional Knapsack Problem
- Aim: To maximize the total value in a knapsack with a weight limit by allowing fractional items.
- Greedy Choice: Select items based on the highest value-to-weight ratio until the knapsack is full.
- Continuous Items: Unlike the 0/1 knapsack problem, items can be divided, allowing for partial inclusion.
- Optimal Solution: The greedy method guarantees an optimal solution for this problem; see the sketch below.
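A short sketch of the ratio-based selection, assuming items are given as (value, weight) pairs; the function name is illustrative.

```python
def fractional_knapsack(items, capacity):
    """items: list of (value, weight) pairs; fractions of an item may be taken."""
    # Best value-to-weight ratio first.
    items = sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True)
    total = 0.0
    for value, weight in items:
        if capacity <= 0:
            break
        take = min(weight, capacity)       # take as much of this item as fits
        total += value * (take / weight)   # proportional value for a fraction
        capacity -= take
    return total

# Capacity 50: all of the first two items plus 2/3 of the third -> 240.0
print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))
```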
-
Activity Selection Problem
- Aim: To select the maximum number of activities that don't overlap in time.
- Greedy Choice: Always choose the next activity that finishes the earliest and is compatible with previously selected activities.
- Sorting Requirement: Activities must be sorted by their finish times to apply the greedy approach effectively.
- Optimal Substructure: The solution can be built incrementally by selecting compatible activities, as in the sketch below.
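A minimal sketch of the earliest-finish-time rule, assuming activities are (start, finish) pairs; the function name is illustrative.

```python
def max_activities(activities):
    """Return the maximum number of mutually compatible (start, finish) activities."""
    count, last_finish = 0, float("-inf")
    for start, finish in sorted(activities, key=lambda a: a[1]):  # earliest finish first
        if start >= last_finish:      # compatible with the last selected activity
            count += 1
            last_finish = finish
    return count

# Four compatible activities can be chosen: (1,2), (3,4), (5,7), (8,9)
print(max_activities([(1, 2), (3, 4), (0, 6), (5, 7), (8, 9), (5, 9)]))
```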
-
Huffman Coding
- Aim: To compress data by assigning variable-length codes to input characters based on their frequencies.
- Greedy Choice: Build a binary tree by repeatedly merging the two least frequent nodes until only one tree remains.
- Prefix Codes: The resulting codes are prefix-free, ensuring no code is a prefix of another, which allows for unambiguous decoding.
- Optimality: Produces an optimal prefix code: no other prefix code achieves a shorter expected code length for the given character frequencies. A sketch of the construction follows this list.
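One way to sketch the construction in Python with the standard heapq module, assuming frequencies arrive as a symbol-to-count dict; the tie-breaking counter and the dict-of-codes representation are implementation choices, not part of the algorithm itself.

```python
import heapq

def huffman_codes(frequencies):
    """frequencies: dict of symbol -> count. Returns dict of symbol -> bit string."""
    # Heap entries: (total frequency, tie-breaker, {symbol: code built so far}).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(frequencies.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)    # two least frequent subtrees
        f2, _, right = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in left.items()}          # left branch gets a leading 0
        merged.update({s: "1" + c for s, c in right.items()})   # right branch gets a leading 1
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

# More frequent symbols receive shorter codes.
print(huffman_codes({"a": 45, "b": 13, "c": 12, "d": 16, "e": 9, "f": 5}))
```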
-
Dijkstra's Shortest Path Algorithm
- Aim: To find the shortest path from a source node to all other nodes in a weighted graph.
- Greedy Choice: Always process the unvisited node with the smallest known distance from the source.
- Priority Queue: Utilizes a priority queue to efficiently select the next node to process based on the shortest known distance.
- Optimal Solution: Guarantees shortest paths in graphs with non-negative edge weights (graphs with negative weights require Bellman-Ford instead); see the sketch below.
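A compact sketch using a binary heap as the priority queue, assuming the graph is an adjacency dict of node -> list of (neighbor, weight) pairs; the lazy check for stale queue entries is one common implementation choice.

```python
import heapq

def dijkstra(graph, source):
    """graph: dict of node -> [(neighbor, weight), ...] with non-negative weights."""
    dist = {source: 0}
    pq = [(0, source)]                       # (known distance, node)
    while pq:
        d, u = heapq.heappop(pq)             # unprocessed node with smallest distance
        if d > dist.get(u, float("inf")):
            continue                         # stale entry: a shorter path was already found
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd                 # relax the edge
                heapq.heappush(pq, (nd, v))
    return dist

g = {"A": [("B", 1), ("C", 4)], "B": [("C", 2), ("D", 5)], "C": [("D", 1)], "D": []}
print(dijkstra(g, "A"))  # {'A': 0, 'B': 1, 'C': 3, 'D': 4}
```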
-
Prim's Minimum Spanning Tree Algorithm
- Aim: To find a minimum spanning tree for a connected, weighted graph.
- Greedy Choice: Start with a single vertex and repeatedly add the smallest edge that connects a vertex in the tree to a vertex outside the tree.
- Priority Queue: Uses a priority queue to efficiently select the next edge with the minimum weight.
- Optimality: Guarantees a minimum spanning tree for any connected, weighted graph; see the sketch below.
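A sketch of the edge-growing loop with heapq, assuming an undirected, connected graph stored as an adjacency dict; names are illustrative.

```python
import heapq

def prim_mst(graph, start):
    """graph: dict of vertex -> [(neighbor, weight), ...]. Returns (total weight, MST edges)."""
    visited = {start}
    edges = [(w, start, v) for v, w in graph[start]]   # candidate edges leaving the tree
    heapq.heapify(edges)
    total, mst = 0, []
    while edges and len(visited) < len(graph):
        w, u, v = heapq.heappop(edges)       # cheapest edge crossing the cut
        if v in visited:
            continue
        visited.add(v)
        total += w
        mst.append((u, v, w))
        for nxt, nw in graph[v]:             # new candidate edges from the added vertex
            if nxt not in visited:
                heapq.heappush(edges, (nw, v, nxt))
    return total, mst

g = {"A": [("B", 2), ("C", 3)], "B": [("A", 2), ("C", 1), ("D", 4)],
     "C": [("A", 3), ("B", 1), ("D", 5)], "D": [("B", 4), ("C", 5)]}
print(prim_mst(g, "A"))  # (7, [('A', 'B', 2), ('B', 'C', 1), ('B', 'D', 4)])
```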
-
Kruskal's Minimum Spanning Tree Algorithm
- Aim: To find a minimum spanning tree for a connected, weighted graph.
- Greedy Choice: Sort all edges by weight and add the smallest edge to the tree, ensuring no cycles are formed.
- Union-Find Structure: Utilizes a union-find data structure to efficiently manage and detect cycles.
- Optimality: Guarantees a minimum spanning tree for the graph, as in the sketch below.
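A sketch pairing the sorted-edge loop with a small union-find, assuming vertices are labeled 0..n-1 and edges arrive as (weight, u, v) triples; the path-compression-only union-find is a simplification (no union by rank).

```python
def kruskal_mst(num_vertices, edges):
    """edges: list of (weight, u, v). Returns (total weight, MST edges)."""
    parent = list(range(num_vertices))

    def find(x):                       # root representative, with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    total, mst = 0, []
    for w, u, v in sorted(edges):      # lightest edges first
        ru, rv = find(u), find(v)
        if ru != rv:                   # different components, so adding the edge creates no cycle
            parent[ru] = rv            # union the two components
            total += w
            mst.append((u, v, w))
    return total, mst

edges = [(2, 0, 1), (1, 1, 2), (3, 0, 2), (4, 1, 3), (5, 2, 3)]
print(kruskal_mst(4, edges))  # (7, [(1, 2, 1), (0, 1, 2), (1, 3, 4)])
```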
-
Job Sequencing Problem
- Aim: To maximize profit by scheduling jobs within deadlines.
- Greedy Choice: Sort jobs by profit in descending order and schedule each one in the latest free time slot at or before its deadline.
- Feasibility Check: Ensure that each job can be scheduled without overlapping with previously scheduled jobs.
- Optimal Solution: The greedy approach yields a profit-maximizing schedule when each job takes one unit of time; a sketch follows this list.
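A sketch assuming unit-time jobs given as (profit, deadline) pairs; the slot array indexed by time is one simple way to run the feasibility check.

```python
def job_sequencing(jobs):
    """jobs: list of (profit, deadline) pairs, each job taking one time unit."""
    max_deadline = max(d for _, d in jobs)
    slot = [None] * (max_deadline + 1)       # slot[t] holds the job scheduled at time t
    total = 0
    for profit, deadline in sorted(jobs, reverse=True):   # most profitable jobs first
        # Latest free slot at or before the deadline, if any.
        for t in range(min(deadline, max_deadline), 0, -1):
            if slot[t] is None:
                slot[t] = (profit, deadline)
                total += profit
                break
    return total

# Best schedule: profit-27 job at t=1, profit-100 job at t=2, profit-15 job at t=3 -> 142
print(job_sequencing([(100, 2), (19, 1), (27, 2), (25, 1), (15, 3)]))
```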
-
Interval Scheduling
- Aim: To select the maximum number of non-overlapping intervals from a set.
- Greedy Choice: Always select the interval that finishes the earliest and is compatible with previously selected intervals.
- Sorting Requirement: Intervals must be sorted by their finish times to apply the greedy approach effectively.
- Optimal Substructure: The solution can be built incrementally by selecting compatible intervals (sketched below).
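The rule is the same earliest-finish-time choice as in the Activity Selection Problem; this sketch returns the chosen intervals themselves rather than just their count (names are illustrative).

```python
def select_intervals(intervals):
    """intervals: list of (start, end) pairs. Returns the selected non-overlapping intervals."""
    selected, last_end = [], float("-inf")
    for start, end in sorted(intervals, key=lambda iv: iv[1]):  # earliest finish first
        if start >= last_end:            # compatible with everything chosen so far
            selected.append((start, end))
            last_end = end
    return selected

print(select_intervals([(6, 8), (1, 4), (2, 5), (5, 7), (3, 9)]))  # [(1, 4), (5, 7)]
```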
-
Huffman Decoding
- Aim: To decode a sequence of bits into the original characters using Huffman codes.
- Tree Traversal: Use the Huffman tree to traverse based on the bit sequence, moving left for '0' and right for '1'.
- Prefix Property: The unique prefix property of Huffman codes ensures that each bit sequence corresponds to exactly one character.
- Efficiency: Allows compressed data to be decoded efficiently back into its original form; a sketch follows below.
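A small sketch of the bit-by-bit tree walk, assuming the Huffman tree is represented as nested (left, right) tuples with single-character strings at the leaves; the tree in the example is made up for illustration.

```python
def huffman_decode(bits, tree):
    """bits: string of '0'/'1' characters. Returns the decoded text."""
    decoded, node = [], tree
    for bit in bits:
        node = node[0] if bit == "0" else node[1]   # '0' -> left child, '1' -> right child
        if isinstance(node, str):                   # reached a leaf: emit its symbol
            decoded.append(node)
            node = tree                             # restart at the root for the next code
    return "".join(decoded)

# Hypothetical code table: 'a' = 0, 'b' = 10, 'c' = 11
tree = ("a", ("b", "c"))
print(huffman_decode("010110", tree))  # "abca"
```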