tl;dr
An algorithm is a computational method that terminates: for every input $x \in I$, there exists some $k$ such that $f^k(x) \in \Omega$. In other words, repeatedly applying $f$ starting from any valid input eventually reaches a terminal state.
Computational Method vs. Algorithm
Knuth defines a computational method as a quadruple $(Q, I, \Omega, f)$ where:
- $Q$ is the set of all possible states the computation can be in
- $I \subseteq Q$ is the set of input (starting) states
- $\Omega \subseteq Q$ is the set of output (terminal) states
- $f : Q \to Q$ is the transition function, where $f(q) = q$ for all $q \in \Omega$
The transition function $f$ maps each state to the next. Terminal states map to themselves: once an output state is reached, the computation stays there.
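The quadruple can be sketched directly in code. Here is a minimal illustration (not Knuth's own notation) using Euclid's gcd algorithm, where states are pairs $(m, n)$ and terminal states are those with $n = 0$:

```python
def f(state):
    """Transition function: one division step of Euclid's algorithm.

    States are pairs (m, n). Terminal states (Omega) have n == 0 and
    map to themselves, matching f(q) = q for all q in Omega.
    """
    m, n = state
    if n == 0:           # terminal state: stay put
        return state
    return (n, m % n)    # one step of the algorithm

def run(x):
    """Repeatedly apply f from input state x until a terminal state is reached."""
    while x[1] != 0:     # x is not yet in Omega
        x = f(x)
    return x

print(run((119, 544)))   # -> (17, 0), since gcd(119, 544) = 17
```

Note that `run` encodes exactly the "repeatedly apply $f$" reading of the definition; whether it halts for every input is the termination question the rest of this post is about.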
The Five Properties
Knuth defines an algorithm as a computational method possessing five properties:
1. Finiteness: an algorithm must terminate after a finite number of steps (although the number of steps can grow arbitrarily large).
2. Definiteness: each step must be precise and unambiguous.
3. Input: it must have zero or more input values. These can be supplied initially before the algorithm begins, or dynamically as it runs.
4. Output: it must produce one or more outputs: quantities that have a specified relation to the inputs.
5. Effectiveness: each operation must be basic enough for a human to do exactly, on paper, in a finite amount of time.
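The caveat in the finiteness property (the step count can grow arbitrarily large) is worth seeing concretely. A small sketch: Euclid's algorithm always terminates, but feeding it consecutive Fibonacci numbers (its classic worst case) makes the number of division steps grow without bound as the inputs grow:

```python
def gcd_steps(m, n):
    """Run Euclid's algorithm and return (gcd, number of division steps)."""
    steps = 0
    while n != 0:
        m, n = n, m % n
        steps += 1
    return m, steps

a, b = 1, 1
for _ in range(5):
    a, b = b, a + b            # next pair of consecutive Fibonacci numbers
    print(gcd_steps(a, b))     # gcd is always 1; the step count keeps rising
```

Every individual run is finite, so finiteness holds, yet no single bound covers all inputs.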
Why?
By defining an algorithm as a precise structure, we are able to analyse various algorithms’ time and space complexities, understand tradeoffs between different algorithms for the same problem, and ensure that an algorithm will reach a solution in a finite amount of time.
Focusing on the logic of the algorithm over the details of a specific programming language also aids in understanding why and where a technique is best suited, rather than just learning to type it out.
A computational method that does not terminate for every input is not an algorithm. It is something weaker, and the distinction matters when later proving algorithms correct, since termination must be shown separately from correctness.
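To make the distinction concrete, here is a sketch (my own toy example, not from Knuth) of a computational method that is not an algorithm: the transition function is perfectly well defined, but from input state 1 it never reaches the terminal set, so finiteness fails. A guard keeps the demonstration itself finite:

```python
def f(x):
    """Transition function on states Q = {0, 1, 2, ...}.

    Omega = {0}, and f(0) = 0 as required. But for any x >= 1 the state
    only grows, so a computational sequence starting at 1 never terminates.
    """
    return x if x == 0 else x + 1

x, steps = 1, 0
while x != 0 and steps < 10:   # guard: cut the demo off after 10 steps
    x = f(x)
    steps += 1
print(x, steps)                # still not in Omega after 10 steps
```

This quadruple satisfies definiteness and effectiveness, yet it is only a computational method, not an algorithm.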
Footnotes
1) Knuth specifically warns against instructions like “divide $x$ by $y$ and let $r$ be the remainder” when $x$ or $y$ could be zero: the step in question is undefined for that input.
2) This rules out operations like “if $x$ is irrational, then…” because a human cannot determine irrationality in finite time for arbitrary $x$.