10-22-2007, 10:45 PM
Suppose:
a = 1
b = -2
a + b = c
Now:
a + b = c
(a + b) . (a + b) = c . (a + b)
a² + b² + 2ab = ac + bc
a² + ab - ac = -ab - b² + bc
a . (a + b - c) = -b . (a + b - c)
a = -b
At the risk of being a party pooper, let me debunk this one. It's easier to see with actual numbers, so let me substitute them in (up to line v, after which I change things a bit).
Suppose:
a = 1
b = -2
a + b = c = -1
Now:
i) 1 + -2 = -1
ii) (1 + -2)(1 + -2) = -1(1 + -2)
iii) 1² + (-2)² + 2(1)(-2) = 1(-1) + -2(-1)
iv) 1² + 1(-2) - 1(-1) = -1(-2) - (-2)² + -2(-1)
v) 1(1 + -2 - -1) = -(-2)(1 + -2 - -1)
Good for us; we've arrived at 0 = 0. Now, the next line is where it falls apart:
vi) [1(1 + -2 - -1)] / (1 + -2 - -1) = [-(-2)(1 + -2 - -1)] / (1 + -2 - -1)
...which, if you cancel out the (1 + -2 - -1)'s, leads to the conclusion that 1 = 2. But dividing by (1 + -2 - -1) is an invalid operation, because (1 + -2 - -1) = 0.
Or, in other words, you're writing 0/0 = 0/0, which doesn't tell you anything, because 0/0 is undefined. 8-)

