Dijkstra's Algorithm
#programming #algorithms #graphs
Dijkstra's algorithm is a popular path-finding algorithm, commonly employed to find
single-source shortest paths in a graph with non-negative edge weights.
Given a weighted graph G = (V, E) with non-negative edge weights and a source vertex s, it finds the shortest distance from s to every other vertex.
Greedy approach
It always selects the nearest unvisited node to process next, assuming that this local choice will lead to a globally optimal solution
This only works because the weights are non-negative: path distance can only increase (or stay the same) as more edges are added. So once a node is reached via its shortest path, it is guaranteed that no future path can improve it.
Coding approach
- Maintains dist[v], the current best known distance from the source for each vertex v.
- Keeps track of vertices whose distance has already been finalized, using a priority queue (or a plain array/matrix scan).
- For every vertex, dist[v] is essentially an approximation, the "current best" path to it; during processing it is updated whenever a shorter path is found.
- A visited flag marks whether a node has been processed, also signifying whether its distance is finalized. A node is not marked visited until it is popped from the priority queue, at which point every path leading up to it has been accounted for, so its stored distance must be the best possible.
Algorithm Pseudo-code
Initialize:
dist[v] = ∞ for all v
dist[s] = 0
priority queue pq
push (0, s)
while pq not empty:
(d, u) = pq.pop_min() // smallest tentative distance = closest
if u already visited: continue
mark u visited
for each edge (u → v) with weight w:
if dist[v] > dist[u] + w:
dist[v] = dist[u] + w
push (dist[v], v)
This essentially ensures that all nodes are processed in non-decreasing order of their distance from the source.
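The pseudo-code above maps almost directly onto Python's heapq module. A minimal sketch, assuming the graph is given as an adjacency list of (neighbor, weight) pairs:

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest distances on a graph with
    non-negative edge weights.

    graph: dict mapping vertex -> list of (neighbor, weight) pairs.
    Returns a dict mapping vertex -> shortest distance from source.
    """
    dist = {v: float('inf') for v in graph}
    dist[source] = 0
    visited = set()
    pq = [(0, source)]             # (distance, vertex) min-heap

    while pq:
        d, u = heapq.heappop(pq)   # closest unprocessed vertex
        if u in visited:           # stale entry, distance already final
            continue
        visited.add(u)
        for v, w in graph[u]:
            if dist[v] > d + w:    # relaxation step
                dist[v] = d + w
                heapq.heappush(pq, (dist[v], v))
    return dist

# Example (hypothetical graph):
g = {
    'a': [('b', 1), ('c', 4)],
    'b': [('c', 2), ('d', 6)],
    'c': [('d', 3)],
    'd': [],
}
dijkstra(g, 'a')  # {'a': 0, 'b': 1, 'c': 3, 'd': 6}
```

Since heapq has no decrease-key, a relaxed vertex is simply pushed again; the stale, larger entry is discarded by the visited check when popped later.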
Why greedy works
Suppose we pop a node u from the priority queue with tentative distance dist[u]. This means that dist[u] is the smallest tentative distance currently in the queue.
Assume, for contradiction: that dist[u] is not the true shortest distance to u.
This means that there must be some unvisited node x lying on a genuinely shorter path to u.
But it also means that the tentative distance of x is smaller than dist[u], since edge weights are non-negative, so the prefix of that shorter path ending at x costs no more than the whole path.
Essentially, this means that node x would have been popped from the priority queue before u, because the queue always yields the smallest tentative distance.
This means that node u's distance would already have been relaxed through x, making dist[u] at most the length of that shorter path.
But that contradicts our initial assumption that dist[u] was not optimal.
Hence, proved by contradiction: the greedy choice actually works.
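The contradiction breaks down as soon as a negative edge appears. A hypothetical counterexample (the small inline routine mirrors the heap-based pseudo-code above; the graph and its labels are made up for illustration):

```python
import heapq

def dijkstra(graph, source):
    # Same heap-based routine as the pseudo-code above.
    dist = {v: float('inf') for v in graph}
    dist[source] = 0
    visited = set()
    pq = [(0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if u in visited:
            continue
        visited.add(u)
        for v, w in graph[u]:
            if dist[v] > d + w:
                dist[v] = d + w
                heapq.heappush(pq, (dist[v], v))
    return dist

# Hypothetical graph with one negative edge (b -> a, weight -2):
g = {
    's': [('a', 1), ('b', 2)],
    'a': [('c', 1)],
    'b': [('a', -2)],
    'c': [],
}
dist = dijkstra(g, 's')
# 'a' is finalized with distance 1 before the cheaper route
# s -> b -> a (cost 0) is discovered, so 'c' is never re-relaxed:
# dijkstra reports dist['c'] == 2, but the true shortest path
# s -> b -> a -> c costs 1.
```

With a negative edge, "the prefix of a shorter path costs no more than the whole path" is false, so a node can be finalized too early.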
Time complexity analysis
We have two major operations that contribute to the time complexity.
- Extract-min: pick the unvisited vertex with the smallest current distance.
- Relaxation: update, or relax, the distances of the vertices adjacent to the extracted node.
Using an array for storing distances
- O(V) linear scan to find the minimum distance. Repeated for V vertices, so its contribution is O(V²).
- Updating edges: O(E), since relaxation is constant time per edge, and there are E edges in total.
Total: O(V² + E) = O(V²)
Best for dense graphs, when E ≈ V².
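The array-based O(V²) variant can be sketched with an adjacency matrix. A minimal sketch; representing missing edges as INF is an assumption:

```python
INF = float('inf')

def dijkstra_matrix(adj, source):
    """O(V^2) Dijkstra using a plain array instead of a heap.

    adj: V x V matrix of edge weights, INF where no edge exists.
    Returns a list of shortest distances from source.
    """
    n = len(adj)
    dist = [INF] * n
    dist[source] = 0
    visited = [False] * n

    for _ in range(n):
        # Extract-min: O(V) linear scan over unvisited vertices.
        u = min((v for v in range(n) if not visited[v]),
                key=lambda v: dist[v], default=None)
        if u is None or dist[u] == INF:
            break                  # remaining vertices unreachable
        visited[u] = True
        # Relaxation: O(V) pass over u's row of the matrix.
        for v in range(n):
            if adj[u][v] != INF and dist[v] > dist[u] + adj[u][v]:
                dist[v] = dist[u] + adj[u][v]
    return dist

# Same example graph as a matrix (vertices 0..3 = a..d):
adj = [
    [INF,   1,   4, INF],
    [INF, INF,   2,   6],
    [INF, INF, INF,   3],
    [INF, INF, INF, INF],
]
dijkstra_matrix(adj, 0)  # [0, 1, 3, 6]
```

Every iteration touches a full row regardless of how many edges exist, which is why this version only pays off on dense graphs.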
Binary heap
- O(log V): extracting the min from the current heap; repeated V times → O(V log V)
- O(log V): updating the heap (heapify) after each edge relaxation; repeated E times → O(E log V)
Total: O((V + E) log V)
Fibonacci heap
Theoretical best:
- Extract-min: O(log V) amortized
- Decrease-key: O(1) amortized
Total: O(E + V log V)
Better when E ≫ V, i.e., for dense graphs.
In practice, though, Fibonacci heaps carry large constant factors and are therefore rarely used.