Fundamentals of Statistics contains material from various lectures and courses of H. Lohninger on statistics, data analysis and chemometrics.


Linear Dependence

Linear (in)dependence: A given set of k vectors a1, ..., ak is called linearly independent if the equation s1a1 + s2a2 + ... + skak = o (the zero vector) has no solution other than the trivial one, in which all scalars sj are zero. If a solution with at least one scalar sj different from zero exists, the set of vectors is called linearly dependent.
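This definition can be checked numerically: a set of vectors is linearly independent exactly when the rank of the matrix formed from them equals the number of vectors. A small sketch using NumPy, with made-up example vectors a1, a2, a3:

```python
import numpy as np

# Two linearly independent vectors in 3-dimensional space: the only
# solution of s1*a1 + s2*a2 = o is s1 = s2 = 0, so the rank of the
# matrix built from them equals the number of vectors.
a1 = np.array([1.0, 0.0, 2.0])
a2 = np.array([0.0, 1.0, 1.0])
A = np.column_stack([a1, a2])
print(np.linalg.matrix_rank(A))   # 2 -> independent

# Constructing a3 = 2*a1 - a2 makes the set linearly dependent:
# the scalars (s1, s2, s3) = (2, -1, -1) give the zero vector.
a3 = 2 * a1 - a2
B = np.column_stack([a1, a2, a3])
print(np.linalg.matrix_rank(B))   # 2 < 3 -> dependent
```

Here the rank test stands in for solving the equation directly: a nontrivial solution exists if and only if the rank is smaller than the number of vectors.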

Linear independence is important for many aspects of data analysis. A general rule is that any set of n vectors of order m (i.e., vectors with m components) is necessarily linearly dependent if n is greater than m.
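This rule can be illustrated with arbitrary example vectors: three vectors of order two can never be independent, because the rank of the matrix they form is at most two.

```python
import numpy as np

# n = 3 vectors of order m = 2 (made-up values); the 2x3 matrix whose
# columns are these vectors has rank at most 2, so the set must be
# linearly dependent regardless of the values chosen.
v1 = np.array([1.0, 4.0])
v2 = np.array([2.0, -1.0])
v3 = np.array([0.5, 3.0])
V = np.column_stack([v1, v2, v3])
print(np.linalg.matrix_rank(V) < 3)   # True -> dependent
```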

Linear independence is closely related to the rank of a matrix. If we regard a matrix as a set of n row (or column) vectors, we see immediately that linear dependence among the row or column vectors reduces the rank of the matrix: the rank equals the number of linearly independent rows or, equivalently, columns.
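The rank reduction can be demonstrated with a small made-up matrix in which one row is the sum of the other two:

```python
import numpy as np

# A 3x3 matrix whose third row equals row1 + row2: the dependence
# among the rows reduces the rank from 3 to 2.
M = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 4.0],
              [1.0, 3.0, 7.0]])   # row3 = row1 + row2
print(np.linalg.matrix_rank(M))   # 2, not 3
```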