Talk:Transversal generators
This page also contains archived Wikispaces discussion.
Can anyone explain why the algorithm works?
I'm the person who added the "example" section as well as the "Wolfram Language implementation" there, so I understand how to do it, but I have no real idea why it works.
It's clear enough to me that the pseudoinverse step is how we convert the input mapping into generators. But then how do we "massage" those generators by combining them one at a time with the comma basis, defactoring, dot-producting with the original row to get a thing we prepend before HNF'ing so that it has some effect on what remains when we grab that row back out...? It's quite tricksy.
--Cmloegcmluin (talk) 18:04, 6 October 2021 (UTC)
- Here's how it works:
- We want to find u such that V*u = [0, ..., 0, 1, 0, ..., 0], where the 1 is at the i-th index.
- Let's split up the problem:
- V[i]*u = 1
- V[no_i]*u = [0, 0, ..., 0] (where V[no_i] is V with the i-th row deleted)
- First calculate S. This is a kernel/comma basis for V[no_i].
- The vector we want to find is in the rowspace of S (aka it is a linear combination of its rows). This is because V[no_i]*u = [0, 0, ..., 0] <=> u is in the kernel of V[no_i].
- By calculating R = V[i] * S, we are basically calculating the number of steps V[i] maps each comma of S to.
- Build the block matrix [R | S] (prepending R to S)
- If we do some row operations on [R | S], it stays valid. Say we multiply it by an arbitrary unimodular matrix U:
[R' | S'] = U*[R | S] = [U*R | U*S]
This is true because R = V[i]*S, so U*R = U*V[i]*S = V[i]*U*S
=> R' = V[i]*S'
- Since we want V[i]*u = 1, we want to find some U so that R' = [1, 0, ..., 0]. This is exactly what the HNF does. The first row of S' is then the vector u we are after.
- I have left out some details and matrix transposes. Hopefully this helps clear things up.
- -Sintel (talk) 00:51, 18 December 2021 (UTC)
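The procedure described above can be sketched in Python (with sympy) as follows. This is only an illustration: the function names are made up for this sketch, the kernel basis is assumed to come out saturated (in general it would need defactoring first, as mentioned in the question above), and the transposes are handled by treating the commas as rows of S.
<pre>
# A sketch of the procedure above, for illustration only.
from math import gcd, lcm
from sympy import Matrix

def integer_kernel_basis(m):
    """Rows of the result span the kernel of m, scaled to integers.
    NOTE: assumed to come out saturated; in general it may need defactoring."""
    rows = []
    for v in m.nullspace():                      # rational column vectors
        v = v * lcm(*[int(x.q) for x in v])      # clear denominators
        g = gcd(*[int(x) for x in v])            # remove any common factor
        rows.append([int(x) // g for x in v])
    return Matrix(rows)

def transversal_generator(v_map, i):
    """Find u with V*u = e_i: kernel of V without row i, R = steps of V[i]
    on each comma, unimodular row reduction of [R | S], read u off row 0."""
    v = Matrix(v_map)
    v_noi = v.copy()
    v_noi.row_del(i)                             # V with the i-th row deleted
    s = integer_kernel_basis(v_noi)              # commas as rows (k x n)
    r = s * v.row(i).T                           # k x 1: V[i] applied to each comma
    b = [list(map(int, row)) for row in r.row_join(s).tolist()]   # [R | S]
    # Unimodular row operations (what the first column of an HNF step does):
    # drive the first column to [gcd, 0, ..., 0] by extended Euclid on rows.
    for j in range(1, len(b)):
        while b[j][0] != 0:
            q = b[0][0] // b[j][0]
            b[0] = [x - q * y for x, y in zip(b[0], b[j])]
            b[0], b[j] = b[j], b[0]
    assert abs(b[0][0]) == 1, "mapping/kernel basis needs defactoring"
    return [b[0][0] * x for x in b[0][1:]]       # fix sign, drop the R column

# e.g. the meantone mapping [<1 1 0], <0 1 4]]:
V = [[1, 1, 0], [0, 1, 4]]
print([transversal_generator(V, i) for i in range(2)])
# -> [[1, 0, 0], [-1, 1, 0]]   (V times each gives [1, 0] and [0, 1])
</pre>
The explicit Euclid loop stands in for the HNF step here: it only needs to clear the first column, which is all the construction above uses.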
- Thanks as always, Sintel. I haven't taken the time to add this to the main page because I haven't understood it fully myself yet. But you are welcome to do so. --Cmloegcmluin (talk) 19:05, 20 January 2022 (UTC)
Simpler algorithm
The algorithm given in the text is needlessly complex. I will give a simpler one here:
Given a saturated mapping matrix M, calculate the Smith normal form:
[math] D = LMR [/math]
where D is rectangular diagonal and L, R are unimodular. The transversals can be found as the columns of X:
[math] X = R D^{\mathsf T} L [/math]
It works because we have:
[math] D = LMR \\ \Rightarrow M = L^{-1}DR^{-1} [/math]
This holds because L and R are invertible over [math]\mathbb{Z}[/math], by the definition of the Smith normal form. We want to solve:
[math] MX = I \\ \Rightarrow L^{-1}DR^{-1}X = I \\ \Rightarrow DR^{-1}X = L [/math]
Now [math]D D^{\mathsf T} = I[/math] (equivalently, the upper left submatrix of [math]D^{\mathsf T}D[/math] is the identity) iff M is saturated, so:
[math] \Rightarrow R^{-1}X = D^{\mathsf T} L \\ \Rightarrow X = R D^{\mathsf T} L \quad \text{qed.} [/math]
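Substituting X = R D^T L back into MX makes the check direct:
[math] MX = L^{-1}DR^{-1} \, R D^{\mathsf T} L = L^{-1} D D^{\mathsf T} L = L^{-1} L = I [/math]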
-Sintel (talk) 18:51, 18 December 2021 (UTC)
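As a concrete check of the formula, here is a short Python (sympy) verification on the meantone mapping, with a Smith decomposition D = LMR written out by hand rather than computed by a library (this particular L, D, R is just one valid choice):
<pre>
# Check X = R*D^T*L on a concrete saturated mapping (meantone).
from sympy import Matrix, eye

M = Matrix([[1, 1, 0],
            [0, 1, 4]])            # saturated mapping, rank 2

# One Smith decomposition D = L*M*R, worked out by hand:
L = eye(2)                          # unimodular (here just the identity)
R = Matrix([[1, -1,  4],
            [0,  1, -4],
            [0,  0,  1]])           # unimodular, det = 1
D = Matrix([[1, 0, 0],
            [0, 1, 0]])             # Smith normal form of M

assert L * M * R == D               # the decomposition is valid
assert L.det() == 1 and R.det() == 1

X = R * D.T * L                     # the claimed transversal matrix
assert M * X == eye(2)              # M*X = I, so columns of X are transversals

print(X)                            # Matrix([[1, -1], [0, 1], [0, 0]])
</pre>
The columns of X here are the monzos for 2/1 and 3/2, one valid choice of preimages for the two generators of meantone.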
- Love it! I've added my interpretation of this to the main page. Feel free to tweak if you see necessary. --Cmloegcmluin (talk) 19:05, 20 January 2022 (UTC)