Want to know how?
Let a=b.
Multiply both sides by a.
Now we have a^2=ab.
Subtract b^2 from both sides.
Now we have a^2-b^2=ab-b^2.
Factor.
(a+b)(a-b)=b(a-b)
Divide by (a-b).
a+b=b
But a=b, so
b+b=b
Simplify.
2b=b
Divide by b.
2=1
You’re welcome.
Edit: Congrats to LahDsai for finding the error. Yes, the proof technically divides by zero. Not sarcastic at all, nice job. It took me a little while to find the error myself.
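For anyone who wants to see the failure concretely, here's a quick Python trace of the same steps with the arbitrary choice a = b = 1 (any value works); the interpreter flags the bad step on its own:

[code]
# Trace the "proof" with concrete numbers (a = b = 1, arbitrary choice).
a = 1
b = 1                 # Let a = b.

lhs = a * a           # Multiply both sides by a: a^2 = ab
rhs = a * b
assert lhs == rhs     # 1 == 1, fine so far

lhs -= b * b          # Subtract b^2: a^2 - b^2 = ab - b^2
rhs -= b * b
assert lhs == rhs     # 0 == 0 -- both sides are already zero

lhs / (a - b)         # "Divide by (a - b)": a - b == 0, so this
                      # raises ZeroDivisionError
[/code]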
-
[quote]Let a=b.
Multiply both sides by a.
Now we have a^2=ab.
Subtract b^2 from both sides.
Now we have a^2-b^2=ab-b^2.
Factor.
(a+b)(a-b)=b(a-b)
Divide by (a-b).[/quote] Let's stop right here. As you stated before, a=b. Therefore, a-b=0, which means both sides of (a+b)(a-b)=b(a-b) are just 0. Dividing by a-b is dividing by zero: what you actually have is 0/0 on both sides, which is an indeterminate form, not a number. Congratulations, the equation is now broken, and any math done within it after that step is meaningless. So no, 2≠1.
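To put numbers on that (again with the arbitrary choice a = b = 1): the factored equation really is just 0 = 0, and "cancelling" the zero factor is exactly what smuggles in the false conclusion. A minimal check:

[code]
a = b = 1                 # a = b, so a - b = 0
lhs = (a + b) * (a - b)   # (a+b)(a-b) = 2 * 0 = 0
rhs = b * (a - b)         # b(a-b)     = 1 * 0 = 0
print(lhs == rhs)         # True: 0 == 0, true but tells us nothing
# Cancelling the common factor (a - b) would assert a + b == b, i.e. 2 == 1:
print(a + b == b)         # False -- the cancellation was invalid
[/code]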