
**ryos** (Member)
- Registered: 2005-08-04
- Posts: 394

Will you please check this proof? It feels like I'm cheating.

Prove that two vectors **a** and **b** are linearly dependent if and only if **a** is a scalar multiple of **b**.

Proof:

If **a** = k**b**, then **a** - k**b** = 0, and the system is linearly dependent.

If, however, **a** ≠ k**b**, then **a** - k**b** ≠ 0, and the system is not linearly dependent.

My book gives the hint that we should consider separately the case where **a** = **0** (the zero vector), but that just seems superfluous to me.

What do you guys think?

El que pega primero pega dos veces. (He who strikes first strikes twice.)


**Dross** (Member)
- Registered: 2006-08-24
- Posts: 325

What definition of linearly dependent have you been given? As far as I can remember, **a** and **b** are linearly dependent iff there exist scalars *a* and *b*, not both zero, such that *a***a** + *b***b** = **0**.

The first part of your proof correctly shows that **a** being a scalar multiple of **b** implies that such scalars exist, and so **a** and **b** are linearly dependent. The second part of your proof, I think, can be made more rigorous by saying that **a** not being a scalar multiple of **b** means there exists no *k* such that **a** = *k***b**, so the scalars required for linear dependence do not exist.
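A sketch of the converse direction along these lines (the notation is assumed; *α* and *β* stand in for the scalars to avoid clashing with the vector names):

```latex
Suppose $\alpha\mathbf{a} + \beta\mathbf{b} = \mathbf{0}$ with $(\alpha, \beta) \neq (0, 0)$.

\textbf{Case 1:} $\alpha \neq 0$. Then
\[
  \mathbf{a} = -\frac{\beta}{\alpha}\,\mathbf{b},
\]
so $\mathbf{a}$ is a scalar multiple of $\mathbf{b}$.

\textbf{Case 2:} $\alpha = 0$. Then $\beta \neq 0$, and $\beta\mathbf{b} = \mathbf{0}$ forces
$\mathbf{b} = \mathbf{0}$. Here $\mathbf{a} = k\mathbf{b}$ can hold only if $\mathbf{a} = \mathbf{0}$,
which may be why the book asks for the zero vector to be treated separately.
```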

All in all though, I think your proof is valid.

Bad speling makes me [sic]


**ryos** (Member)
- Registered: 2005-08-04
- Posts: 394

Thanks; I got full credit.

El que pega primero pega dos veces. (He who strikes first strikes twice.)
