Question:
Discrete Mathematics: Proofs HELP?
2010-04-17 10:46:26 UTC
Suppose that a and b are odd integers with a not equal to b. Show there is a unique integer c such that |a-c| = |b-c|. THANKS
Three answers:
cyphre
2010-04-17 11:23:53 UTC
To explore this, what does absolute value of the difference between two numbers give you? The distance between the two numbers, right? So we are looking for a third number that is equidistant from both a and b. That would be the 'piggy-in-the-middle', the average of the two numbers, right?

Wait a minute! What if the average is not an integer? Dang! ..... But wait, OUR integers are spaced so that their average IS an integer!



So we need to show that c EXISTS, that c is an INTEGER and that c is UNIQUE.



EXISTENCE: Let a and b be two distinct odd integers. By definition of odd integer, a = 2x + 1 and b = 2y + 1,

with x not equal to y. Without loss of generality assume x > y, so that a > b. To find c, let's solve the equation

|a - c| = |b - c|. We have two cases: the expressions inside the absolute values have the same sign or opposite signs.

Case 1) a - c = b - c ---> a = b, which contradicts a not equal to b, so this case gives no solution.

Case 2) a - c = -(b - c) ---> a - c = -b + c ---> a + b = 2c ---> c = (a + b)/2, which means c is the average of a and b.



C IS AN INTEGER: Since c = (a + b)/2, we have c = (2x + 1 + 2y + 1)/2 = (2x + 2y + 2)/2 = 2(x + y + 1)/2 = x + y + 1.

Since x, y, and 1 are integers, x + y + 1 is an integer.



C IS UNIQUE: Any integer n satisfying |a - n| = |b - n| must, by the same case analysis, equal (a + b)/2, so n = c. That is enough to declare c unique.
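The three steps above (existence, integrality, uniqueness) can be spot-checked numerically. This is just an illustrative sketch, not part of the proof; the function name `midpoint` and the test ranges are my own choices:

```python
def midpoint(a, b):
    """Return the unique integer equidistant from distinct odd integers a and b."""
    assert a % 2 == 1 and b % 2 == 1 and a != b
    # a + b is even (odd + odd = even), so the division is exact
    return (a + b) // 2

# Check every pair of distinct odd integers in a small range
for a in range(-9, 10, 2):
    for b in range(-9, 10, 2):
        if a == b:
            continue
        c = midpoint(a, b)
        assert abs(a - c) == abs(b - c)  # c is equidistant from a and b
        # uniqueness: no other integer in a wide window is equidistant
        others = [n for n in range(-30, 31)
                  if abs(a - n) == abs(b - n) and n != c]
        assert others == []
```

The loop finding `others` empty for every pair mirrors the uniqueness argument: any equidistant integer is forced to be the average.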

ITTI (I tink tats it --- a kind of Norwegian QED)



I hope this helps
holdm
2010-04-17 11:06:25 UTC
without loss of generality, assume b>a

set c = (a+b)/2. Since a and b are odd, a + b is even, so c is an integer.

|a-c| = c-a = (a+b)/2 - a = (b-a)/2

|b-c| = b-c = b - (a+b)/2 = (b-a)/2



Note: the stipulation a != b is needed for uniqueness. If a = b, then every integer c satisfies |a-c| = |b-c|, so no unique c exists.
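A quick numeric spot check of the two distance computations above, using sample values a = 3 and b = 11 chosen for illustration:

```python
# With b > a, c = (a + b)/2 lies strictly between a and b,
# so |a - c| = c - a and |b - c| = b - c, and both equal (b - a)/2.
a, b = 3, 11
c = (a + b) // 2          # = 7; exact since a + b is even
assert abs(a - c) == (b - a) // 2   # c - a = 4
assert abs(b - c) == (b - a) // 2   # b - c = 4
```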
2016-10-06 14:45:17 UTC
Assume that n1 + n2 + n3 + n4 + n5 = 1. Assume by contradiction that there is no ni (1 <= i <= 5) such that ni >= 1/5. Then each ni < 1/5, so n1 + n2 + n3 + n4 + n5 < 1/5 + 1/5 + 1/5 + 1/5 + 1/5 = 1, implying that n1 + n2 + n3 + n4 + n5 < 1. But this is a contradiction, because we know that n1 + n2 + n3 + n4 + n5 = 1. Consequently, our assumption that there is no ni with ni >= 1/5 is false. Therefore, there exists ni such that ni >= 1/5.
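The averaging argument above can be sketched in a few lines. This is just an illustrative check with example fractions of my own choosing; `has_big_part` is a made-up helper name:

```python
from fractions import Fraction

def has_big_part(parts):
    """Given numbers summing to 1, check that some part is >= 1/5."""
    assert sum(parts) == 1
    return any(p >= Fraction(1, 5) for p in parts)

# Example: five parts summing to 1; 3/10 >= 1/5, so the claim holds
parts = [Fraction(1, 10), Fraction(1, 10), Fraction(1, 5),
         Fraction(3, 10), Fraction(3, 10)]
assert has_big_part(parts)
```

If every part were below 1/5, the five parts would sum to less than 1, which is exactly the contradiction used in the proof.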


This content was originally posted on Y! Answers, a Q&A website that shut down in 2021.