A mathematical fallacy is a series of seemingly correct steps that conceal a flawed argument and “prove” an obvious contradiction or absurdity, such as 1 = 2. Fallacies differ from simple mistakes in that there is an element of concealment in the presentation of the proof, perhaps hidden by the algebraic notation being used; not only do they lead to an absurd result, they do so in a deliberately deceptive way.[1]
Example
The following appears to prove that 1 = 2.
Let $a = b$. Then:
$a^2 = ab$ (Multiply both sides by $a$)
$a^2 - b^2 = ab - b^2$ (Subtract $b^2$ from both sides)
$(a + b)(a - b) = b(a - b)$ (Factor both sides)
$a + b = b$ (Divide both sides by $a - b$)
$b + b = b$ (Since $a = b$)
$2 = 1$ (Divide both sides by $b$)
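One quick way to see where the argument breaks is to substitute concrete equal values into each step. The following Python sketch is an illustration only (the choice $a = b = 1$ is arbitrary): every equation from the subtraction step onward holds only because both sides equal zero, and the fifth step is a literal division by zero.

```python
# Numeric sanity check: substitute concrete equal values (a = b = 1,
# an arbitrary choice) into each step of the "proof".
a = b = 1

steps = [
    ("a = b",                 a,                b),
    ("a^2 = ab",              a**2,             a * b),
    ("a^2 - b^2 = ab - b^2",  a**2 - b**2,      a * b - b**2),
    ("(a+b)(a-b) = b(a-b)",   (a + b) * (a - b), b * (a - b)),
]

for label, lhs, rhs in steps:
    print(f"{label}: {lhs} == {rhs}")

# From the subtraction step on, both sides are 0, so the equations hold
# only trivially. The fifth step divides by (a - b) = 0, which Python
# rejects outright:
try:
    (a + b) * (a - b) / (a - b)
except ZeroDivisionError as err:
    print("Dividing by (a - b):", err)  # -> division by zero
```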
The deception occurs in the fifth step of the “proof”: the division of both sides of the equation by $a - b$. As the initial premise is that $a = b$, it follows that $a - b = 0$, so this step is a division by zero, which is an undefined operation; the “proof” is therefore invalid.
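A computer algebra system makes the same point without any division. The sketch below, which assumes the SymPy library and reuses the proof’s variable names, solves the factored equation for $a$ directly: the legitimate conclusions are $a = 0$ or $a = b$, and dividing by $a - b$ silently discards the $a = b$ branch, which is the very case the premise establishes.

```python
# Symbolic treatment with SymPy (an assumed dependency): instead of
# dividing by (a - b), solve the factored equation and keep every case.
from sympy import Eq, solve, symbols

a, b = symbols("a b")

# Step four of the "proof": (a + b)(a - b) = b(a - b)
equation = Eq((a + b) * (a - b), b * (a - b))

# The equation is equivalent to a*(a - b) = 0, so a = 0 or a = b.
# Dividing by (a - b) throws away the a = b branch, the only branch
# consistent with the premise "let a = b".
print(solve(equation, a))  # [0, b]
```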