
## Topic review (newest first)

Ricky
2006-04-05 14:12:59

If d | n, then n = kd for some integer k. Since d = ax + by for some integers x and y, n = kd = a(kx) + b(ky), so n is in S.
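Here's a quick sanity check of that step in Python (my own sketch, with made-up numbers a = 12, b = 18, and a standard extended-Euclid helper not from the thread): scaling the Bézout coefficients x, y by k shows every multiple of d lands in S.

```python
def extended_gcd(a, b):
    """Return (g, x, y) with g = gcd(a, b) and g = a*x + b*y."""
    if b == 0:
        return (a, 1, 0)
    g, x, y = extended_gcd(b, a % b)
    return (g, y, x - (a // b) * y)

a, b = 12, 18
g, x, y = extended_gcd(a, b)          # g = ax + by
for k in range(-3, 4):
    n = k * g                         # an arbitrary multiple of d
    # n = kd = a(kx) + b(ky), so n is of the form ax' + by', i.e. n is in S
    assert n == a * (k * x) + b * (k * y)
```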

Kazy
2006-04-05 12:36:14

For the second part, I see that d | n, but I don't see how that shows S is the set of all integer multiples of d.

Kazy
2006-04-05 12:10:39

Thanks! The book is called "Introduction to Abstract Mathematics", and the class is titled "Intro to logic for secondary mathematics". Thanks for all your help. I took this class as an elective thinking it would be easier than most of the other upper-level math classes, but it's turned into a real pain in the math.

Ricky
2006-04-05 11:58:51

For the second one, you use the fact that if d = gcd(a, b) and n = ax + by for some x, y ∈ Z, then d | n.  It should be straightforward from there.
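A quick numeric illustration of that fact (my own sketch, with made-up values a = 12, b = 18): since d divides both a and b, it divides every combination ax + by.

```python
from math import gcd

a, b = 12, 18
d = gcd(a, b)                         # d divides both a and b
for x in range(-5, 6):
    for y in range(-5, 6):
        # d | a and d | b, so d | (a*x + b*y)
        assert (a * x + b * y) % d == 0
```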

And Kazy, what math class are you in?  You've been asking for proofs from abstract algebra, set theory, number theory, and probably other areas I forget.  I'm just kind of curious.

Ricky
2006-04-05 11:55:36

To prove it's unique, let d = gcd(a, b), and then assume there is another gcd, e.  It should be fairly easy from here, because by definition the gcd is the *greatest* common divisor of a and b.  Can you have two greatests?  (If d and e are both greatest, then d ≤ e and e ≤ d.)

Kazy
2006-04-05 09:56:29

Given this theorem:
Let a, b ∈ Z, not both zero. Then a greatest common divisor d of a and b exists and is unique. Moreover, there exist integers x and y such that d = ax + by.

I need to prove the uniqueness part of the theorem. I figured out the proof of the existence, but I have no idea how to prove that the gcd of a and b is unique.

After that, I also need to prove the following:
Let a, b ∈ Z, not both zero. Let S = {n ∈ Z | n = ax + by, for some x, y ∈ Z}. Let d = (a, b), where (a, b) is the gcd of a and b. Prove that S is the set of all integer multiples of d.

I used the set S in the previous part to prove the first part of the theorem, and I'm guessing I need to use the theorem and manipulate it somehow to prove that S is the set of all integer multiples of d.
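The claim can at least be checked numerically over a bounded window (my own sketch, not a proof, with made-up values a = 12, b = 18): every element of S is a multiple of d, and every small multiple of d shows up in S.

```python
from math import gcd

a, b = 12, 18
d = gcd(a, b)
R = 20
# A bounded slice of S = {a*x + b*y : x, y in Z}
S = {a * x + b * y for x in range(-R, R + 1) for y in range(-R, R + 1)}
multiples = {k * d for k in range(-10, 11)}

assert all(n % d == 0 for n in S)     # everything in S is a multiple of d
assert multiples <= S                 # and each small multiple of d is in S
```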

Can anyone help?