A mathematical fallacy is a series of steps that appear correct but conceal a flawed argument, used to prove an obvious contradiction or absurdity, such as 1 = 2. Fallacies differ from simple mistakes in that there is an element of concealment in the presentation of the proof, perhaps hidden by the algebraic notation being used; not only do they lead to an absurd result, they do so in a deliberately deceptive way.[1]

Example


The following appears to prove that 1 = 2.

Let a=b:
a\times{a}=b\times{a} (Multiply both sides by a)
a^2=ab
a^2-b^2=ab-b^2 (Subtract b^2 from both sides)
(a+b)(a-b)=b(a-b) (Factor both sides)
a+b=b (Divide both sides by (a-b))
a+a=a (Since b=a)
2a=a
2=1 (Divide both sides by a)
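
As a purely illustrative check (the value a = b = 1 is arbitrary; any equal pair behaves the same way), substituting a concrete number into the same steps shows every equation holding until the division:

1\times{1}=1\times{1} (Both sides equal 1)
1^2-1^2=1\times{1}-1^2 (Both sides equal 0)
(1+1)(1-1)=1\times(1-1) (Both sides equal 0)
1+1=1 (False; obtained only by cancelling the factor (1-1)=0)

Every line before the last is a true statement because both sides reduce to 0; the false conclusion appears only when the zero factor (1-1) is cancelled.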

The deception occurs in the fifth step of the “proof”, the division of both sides of the equation by (a-b). Since the initial premise is that a=b, the quantity (a-b) is zero, so this step divides by zero, which is an undefined operation; the cancellation implicitly treats \frac{0}{0} as equal to 1, which it is not.
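
The same error can be isolated on its own; the numbers 5 and 7 below are arbitrary and serve only to illustrate why a zero factor cannot be cancelled:

5\times{0}=7\times{0} (True; both sides equal 0)
5=7 (False; the zero factor cannot be cancelled)

Because x\times{0}=y\times{0} holds for every x and y, an equation in which both sides carry the factor (a-b)=0 gives no information about the remaining factors, which is exactly what the fallacious step ignores.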
