Talk:Linear dependence
Collinearity vs linear dependence
Collinearity is not the same as linear dependence. Say you put your row vectors in a matrix. The vectors are said to be collinear if the matrix is rank 1, and they are linearly dependent if the matrix is rank deficient (i.e. has rank smaller than the number of vectors). This article actually seems to just be about linear (in)dependence, so just call it that.
- Sintel (talk) 22:54, 18 December 2021 (UTC)
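A concrete way to see the distinction described above, sketched in Python with SymPy; the vectors below are arbitrary illustrations, not taken from this discussion:

```python
from sympy import Matrix

# Three row vectors that are all scalar multiples of one another:
# the matrix has rank 1, so the vectors are collinear.
collinear = Matrix([[1, 2, 3], [2, 4, 6], [-3, -6, -9]])

# Three row vectors where the third row is the sum of the first two:
# the matrix is rank deficient (rank 2 < 3 vectors), so the vectors are
# linearly dependent, but they are not collinear.
dependent = Matrix([[1, 2, 3], [0, 1, 1], [1, 3, 4]])

print(collinear.rank())  # 1
print(dependent.rank())  # 2
```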
- Thanks for the critical feedback! This page may still be due for some revision as I iron out the final insights regarding it and temperament arithmetic. Okay, I see the difference in meaning. I came to this conclusion from interpreting a post I saw in the old Yahoo Groups thread archives. Perhaps in xenharmonics the usage of the word got stretched a bit. So would you say my section on this page, "Vs geometry", is just totally inaccurate? If so, I can delete it.
- My remaining concern, then, is that I need a word to describe a condition on the possibility of temperament arithmetic. For now I've called it "monononcollinearity", and it builds upon the notion of collinearity that is described here, i.e. linear dependence. So I should replace that term, I suppose, with "singularly linearly independent", meaning that two temperaments share all but one vector. The problem is that I need to use that word *A LOT*, and "singularly linearly independent" is a mouthful. Do you happen to know if there's already an established term for this? I tried searching online for it but had no success. If you entry-wise add the multivectors of two temperaments that are not singularly linearly independent, you get a multivector that is nondecomposable, i.e. one that cannot be expressed as the wedge product of vectors. --Cmloegcmluin (talk) 03:16, 19 December 2021 (UTC)
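One way to pin down the "share all but one vector" condition operationally, sketched in Python with SymPy; reading the condition as a statement about the subspaces spanned by the comma bases, and the function name, are assumptions of this sketch, not established terminology:

```python
from sympy import Matrix

def shares_all_but_one(basis_a, basis_b):
    """True when two equal-rank comma bases (rows are vectors) overlap in
    all but one dimension: stacking them adds exactly one dimension."""
    rank_a = basis_a.rank()
    rank_b = basis_b.rank()
    joint_rank = Matrix.vstack(basis_a, basis_b).rank()
    return rank_a == rank_b and joint_rank == rank_a + 1
```

On this reading, "sharing a vector" is a statement about the subspaces the bases span, so the check does not depend on which particular basis vectors happen to be listed.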
- I would avoid anything with "singular" since that has yet another meaning. How about just talking about rank deficiency? Something like "1-deficient" would work if you define it clearly.
- The section on geometry is inaccurate, yeah. AFAIK "collinear" just comes from lying on the same (com-) line (linear).
- Alright, now that I understand the difference between collinearity and linear dependence clearly, I will remove the geometry section or drastically revise it. Thanks for explaining it; Dave (Keenan) and I were concerned about this and tried to figure it out for ourselves, but failed to do so.
- Also, thanks for your advice to talk about rank deficiency instead. I agree "1-deficient" could work well in some cases, such as for the rows of a single temperament's mapping matrix. Unfortunately, this is a bit different... it's about comparing two temperaments with each other. Let me give an example.
- One comma basis for 11-limit meantone is ⟨[4 -4 1 0 0⟩ [-1 2 0 -2 1⟩ [1 2 -3 1 0⟩], and one basis for 11-limit mothra is ⟨[4 -4 1 0 0⟩ [-1 2 0 -2 1⟩ [-7 -1 1 1 1⟩]. As you can see, they share 2 out of 3 vectors. So they only have 1 unshared vector.
- I've been calling these types of temperament pairs "monononcollinear"; it's the compared pair that's monononcollinear, not either temperament individually. But that term is built on an incorrect definition of "collinear", so I should call them "mono- linearly independent" instead, but that's no good either. So as far as you know, there's no established term for this?
- An example that is not monononcollinear: migration ⟨[4 -4 1 0 0⟩ [1 2 -3 1 0⟩ [-1 5 0 0 -2⟩], and baragon ⟨[4 -4 1 0 0⟩ [-7 -1 1 1 1⟩ [7 -4 0 1 -1⟩]. They share only one vector, the meantone comma. This means that if you convert both of these matrices to multivectors and sum them, the resulting multivector is nondecomposable, which means it doesn't represent a usable temperament. --Cmloegcmluin (talk) 04:36, 19 December 2021 (UTC)
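Both comparisons above can be checked mechanically. Below is a sketch in Python with SymPy: the joint ranks distinguish the two pairs, and the trivector / Hodge-dual / Plücker-relation helpers are one illustrative way to carry out the "convert to multivectors, sum, test decomposability" step described above; the helper names are not established terminology:

```python
from itertools import combinations
from sympy import Matrix

# The comma bases quoted above (rows are vectors over primes 2 3 5 7 11).
meantone  = Matrix([[4, -4, 1, 0, 0], [-1, 2, 0, -2, 1], [1, 2, -3, 1, 0]])
mothra    = Matrix([[4, -4, 1, 0, 0], [-1, 2, 0, -2, 1], [-7, -1, 1, 1, 1]])
migration = Matrix([[4, -4, 1, 0, 0], [1, 2, -3, 1, 0], [-1, 5, 0, 0, -2]])
baragon   = Matrix([[4, -4, 1, 0, 0], [-7, -1, 1, 1, 1], [7, -4, 0, 1, -1]])

# Joint ranks: 3 + 1 = 4 for the pair sharing two vectors,
# 5 for the pair sharing only the meantone comma.
print(Matrix.vstack(meantone, mothra).rank())    # 4
print(Matrix.vstack(migration, baragon).rank())  # 5

def trivector(basis):
    """Plücker coordinates of the wedge of the three rows:
    one 3x3 minor per choice of three prime columns."""
    return {c: basis[:, list(c)].det() for c in combinations(range(5), 3)}

def parity(perm):
    """Sign of a permutation of 0..4, by counting inversions."""
    inversions = sum(1 for i in range(len(perm))
                     for j in range(i + 1, len(perm)) if perm[i] > perm[j])
    return -1 if inversions % 2 else 1

def dual_bivector(tri):
    """Hodge dual of a trivector in 5 dimensions, as a bivector."""
    bi = {}
    for ijk, value in tri.items():
        lm = tuple(x for x in range(5) if x not in ijk)
        bi[lm] = parity(ijk + lm) * value
    return bi

def decomposable(tri):
    """A trivector is a wedge product of 3 vectors iff its dual bivector B
    satisfies the three-term Plücker relations (i.e. B wedge B = 0)."""
    b = dual_bivector(tri)
    return all(b[(p, q)] * b[(r, s)] - b[(p, r)] * b[(q, s)] + b[(p, s)] * b[(q, r)] == 0
               for p, q, r, s in combinations(range(5), 4))

def entrywise_sum(t1, t2):
    return {c: t1[c] + t2[c] for c in t1}

# Per the discussion: the sum for the "monononcollinear" pair should come out
# decomposable, and the sum for migration + baragon should not.
print(decomposable(entrywise_sum(trivector(meantone), trivector(mothra))))
print(decomposable(entrywise_sum(trivector(migration), trivector(baragon))))
```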