# Linear dependence

Bases, such as comma bases, are considered linearly dependent when they share a common vector: that is, when each of them can form an identical vector through linear combinations of its own member vectors.

When basis vector sets do not share a common basis vector like this, they are linearly independent. Linearly dependent basis vector sets are in a sense more closely related to each other than linearly independent basis vector sets.

Linear dependence is involved in certain operations used in regular temperament theory, such as the wedge product or temperament addition, which are defined for objects that can be interpreted as basis vector sets, such as matrices or multivectors, and that also represent regular temperaments.

## Linear dependence as defined for various types of vector sets

Linear dependence is defined for several objects relevant to RTT that can be defined as basis vector sets. These objects will each be discussed in detail below.

### Linear dependence between basis matrices

Linear dependence is defined on sets of basis matrices (matrices acting as bases), such as two temperaments' mappings, or two temperaments' comma bases.

A set of basis matrices is linearly dependent when some vector can be found that each basis matrix can produce through a linear combination of its own constituent basis vectors. For a very simple example, the mappings [⟨5 8 12] ⟨7 11 16]⟩ and [⟨7 11 16] ⟨15 24 35]⟩ are linearly dependent because both mappings contain the vector ⟨7 11 16]. For a less obvious example, the mappings [⟨1 0 -4] ⟨0 1 4]⟩ and [⟨1 2 3] ⟨0 3 5]⟩ are also linearly dependent, because the vector ⟨7 11 16] can be found through linear combinations of each of their rows; in the first mapping's case, ⟨7 11 16] = 7⟨1 0 -4] + 11⟨0 1 4], and in the second mapping's case, ⟨7 11 16] = 7⟨1 2 3] + -1⟨0 3 5].
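The linear combinations above can be checked with a short sketch, treating each map as a plain list of integers (the helper name `combine` is illustrative, not a standard RTT library function):

```python
# Verify the linear combinations claimed above: each map is a plain
# list of integers, one entry per prime.

def combine(coeffs, maps):
    """Linear combination of maps with the given integer coefficients."""
    return [sum(c * m[i] for c, m in zip(coeffs, maps))
            for i in range(len(maps[0]))]

# ⟨7 11 16] = 7⟨1 0 -4] + 11⟨0 1 4]
print(combine([7, 11], [[1, 0, -4], [0, 1, 4]]))  # → [7, 11, 16]

# ⟨7 11 16] = 7⟨1 2 3] + -1⟨0 3 5]
print(combine([7, -1], [[1, 2, 3], [0, 3, 5]]))   # → [7, 11, 16]
```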

Sometimes basis matrices can share not just one basis vector, but multiple basis vectors. For example, the comma basis ⟨[-30 19 0 0⟩ [-26 15 1 0⟩ [-17 9 0 1⟩] and the comma basis ⟨[-19 12 0 0⟩ [-15 8 1 0⟩ [-6 2 0 1⟩] share both the vector [4 -4 1 0⟩ as well as the vector [13 -10 0 1⟩:

• [4 -4 1 0⟩ = [-26 15 1 0⟩ - [-30 19 0 0⟩
• [4 -4 1 0⟩ = [-15 8 1 0⟩ - [-19 12 0 0⟩
• [13 -10 0 1⟩ = [-17 9 0 1⟩ - [-30 19 0 0⟩
• [13 -10 0 1⟩ = [-6 2 0 1⟩ - [-19 12 0 0⟩

These two basis matrices are the comma bases dual to the 7-limit uniform maps for 12-ET and 19-ET, respectively. [4 -4 1 0⟩ is the meantone comma and [13 -10 0 1⟩ is Harrison's comma, so we can say that both of these temperaments temper out both of these commas.
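As a sketch of how one might verify these shared vectors, the subtractions above can be checked entrywise, and each shared vector converted to its frequency ratio (helper names here are illustrative only):

```python
from fractions import Fraction

def sub(v, w):
    """Entrywise difference of two prime-count vectors."""
    return [a - b for a, b in zip(v, w)]

def to_ratio(monzo, primes=(2, 3, 5, 7)):
    """Interpret a prime-count vector as a frequency ratio."""
    r = Fraction(1)
    for p, e in zip(primes, monzo):
        r *= Fraction(p) ** e
    return r

print(sub([-26, 15, 1, 0], [-30, 19, 0, 0]))  # → [4, -4, 1, 0]
print(sub([-6, 2, 0, 1], [-19, 12, 0, 0]))    # → [13, -10, 0, 1]
print(to_ratio([4, -4, 1, 0]))                # → 80/81
print(to_ratio([13, -10, 0, 1]))              # → 57344/59049
```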

#### For a given set of basis matrices, how to compute a basis for their linearly dependent vectors

A basis for the linearly dependent vectors of a set of basis matrices, or in other words their linear-dependence basis $L_{\text{dep}}$, can be computed using temperament merging.

• To check if two mappings are linearly dependent, we use a comma-merge. That is, we take the dual of each mapping to find its corresponding comma basis. Then we concatenate these two comma bases into one bigger comma basis. Finally, we take the dual of this comma basis to get back into mapping form. If this result is an empty matrix, then the mappings are linearly independent, and otherwise the mappings are linearly dependent and the result gives their linear-dependence basis.
• To check if two comma bases are linearly dependent, we use a map-merge. This process exactly parallels the process for checking two mappings for linear dependence. Take the duals of the comma bases to get two mappings, concatenate them into a single mapping, and take the dual again to get back to comma basis form. If the result is an empty matrix, the comma bases are linearly independent, and otherwise they are linearly dependent and the result gives their linear-dependence basis.

Certainly there are other ways to determine linear dependence, but this method is handy because, if the basis matrices are linearly dependent, it also gives you $L_{\text{dep}}$.
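In the 5-limit (three primes), both dualizations reduce to cross products, so the comma-merge procedure can be sketched in a few lines of plain Python. This is a simplification for illustration (real RTT software handles arbitrary dimensions and ranks, and the function names are mine); note that in this special case any two distinct rank-2 mappings necessarily share a map, which is then their one-dimensional linear-dependence basis:

```python
def cross(a, b):
    """3D cross product: the dual operation in the 5-limit special case."""
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

def linear_dependence_basis(m1, m2):
    """Comma-merge two rank-2 5-limit mappings: dualize each mapping to
    its comma, then dualize the merged pair of commas back to the map
    shared by both mappings."""
    c1 = cross(m1[0], m1[1])  # comma tempered out by m1
    c2 = cross(m2[0], m2[1])  # comma tempered out by m2
    return cross(c1, c2)      # map that annihilates both commas

mapping_a = [[1, 0, -4], [0, 1, 4]]  # [⟨1 0 -4] ⟨0 1 4]⟩
mapping_b = [[1, 2, 3], [0, 3, 5]]   # [⟨1 2 3] ⟨0 3 5]⟩
print(linear_dependence_basis(mapping_a, mapping_b))
# → [-7, -11, -16], a multiple of ⟨7 11 16]
```

Duals computed this way are only defined up to scale and sign, which is why the result comes out negated here.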

### Linear dependence between multivectors

Linear dependence is defined for sets of multivectors, such as two temperaments' multimaps, or two temperaments' multicommas. For more information, see Douglas Blumeyer and Dave Keenan's Intro to exterior algebra for RTT#Linear dependence between multivectors.

### Linear dependence within a single basis matrix

Linear dependence is defined among the basis vectors of a single basis matrix. For more information, see rank-deficiency and full-rank.

### Linear dependence between individual vectors

Linear dependence is defined between sets that each contain only a single basis vector. The sense in which individual basis vectors can be linearly dependent is the simplest of all: they are linearly dependent only if they are multiples of each other. For example, ⟨12 19 28] and ⟨24 38 56] are linearly dependent, because ⟨24 38 56] = 2⟨12 19 28]. But ⟨12 19 28] and ⟨12 19 27] are not.
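A minimal check of this pairwise condition, using exact cross-multiplication to avoid division (the name `proportional` is mine, purely illustrative):

```python
def proportional(v, w):
    """True if vectors v and w are scalar multiples of each other,
    checked exactly via cross-multiplication of every pair of entries."""
    return all(v[i] * w[j] == v[j] * w[i]
               for i in range(len(v))
               for j in range(i + 1, len(v)))

print(proportional([12, 19, 28], [24, 38, 56]))  # → True
print(proportional([12, 19, 28], [12, 19, 27]))  # → False
```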

### Linear dependence between temperaments

The conditions of temperament addition motivate a special definition of linear dependence for temperaments. For more information, see: Temperament addition#2. Linear dependence between temperaments.

## RTT applications involving linear dependence

### Wedge product

Linear dependence has an interesting effect on the wedge product, which otherwise produces the same result on a set of vectors as one would get by treating those same vectors as basis matrices and performing a temperament merge. The wedge product of any two linearly dependent multivectors has all zeros for entries, and thereby does not represent an interesting new temperament. The wedge product of linearly independent multivectors, by contrast, does represent an interesting new temperament sharing properties of the input temperaments, just as the equivalent temperament merge operation in linear algebra would. For more information, see: Douglas Blumeyer and Dave Keenan's Intro to exterior algebra for RTT#Linearly dependent exception
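As a small illustration of the all-zeros behavior in the simplest (grade-1) case, the wedge product of two maps can be written out as the list of their 2-by-2 minors; linearly dependent inputs make every minor vanish. This is a sketch of the grade-1 case only, not a general multivector implementation:

```python
from itertools import combinations

def wedge(a, b):
    """Wedge product of two maps, as the list of their 2x2 minors."""
    return [a[i] * b[j] - a[j] * b[i]
            for i, j in combinations(range(len(a)), 2)]

# linearly dependent: ⟨24 38 56] = 2⟨12 19 28], so every minor is zero
print(wedge([12, 19, 28], [24, 38, 56]))  # → [0, 0, 0]

# linearly independent: the 12-ET and 19-ET maps give, up to sign,
# meantone's multimap ⟨⟨1 4 4]]
print(wedge([12, 19, 28], [19, 30, 44]))  # → [-1, -4, -4]
```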

## Variance

Linear dependence is defined for basis vector sets whether they are covariant ("covectors", such as maps) or contravariant (plain "vectors", such as prime-count vectors). For simplicity, this article uses the word "vector" in its general sense, which includes both plain/contravariant vectors and (covariant) covectors.

Plain vectors and covectors cannot be compared with each other, however. Linear dependence is defined for a set of basis vector sets, or for a set of basis covector sets, but not for a set including both. For example, a set including one mapping (a basis covector set) and one comma basis (a basis plain-vector set) has no directly meaningful notion of linear dependence. So, while it is convenient to use "vector" for either type, it is important to use only one type at a time, never mixing the two.

## Versus collinearity

Basis vector sets would be considered collinear if, beyond merely being linearly dependent, every vector that can be formed from any of their basis vectors reduced to a single basis vector. This would mean that literally every formable vector in any of the basis vector sets falls along the same geometric line. So this matches the notion of collinearity in geometry, where three or more points found on the same line are said to be collinear; the term applies equally to a set of lines or line segments lying along the same line, and in geometric terms a vector can be considered a directed line segment.

## Footnotes

1. Mappings are not typically thought of as bases, but their row vectors can be considered to span rowspaces in an analogous way that comma bases span spaces.
2. This article will also use "multivector" to refer to either plain/contravariant multivectors or (covariant) multicovectors (elsewhere on the wiki you will find "varianced multivector" to refer unambiguously to either type in the general sense).
3. Though the two temperaments here — the one defined by this mapping, and the other defined by this comma basis — can have a notion of linear dependence, as can be understood by finding the comma basis that is the dual of the mapping and checking the two comma bases for linear dependence, or vice versa, finding the mapping that is the dual of the comma basis and checking the two mappings for linear dependence. This notion of linear dependence is discussed in more detail here.