Talk:Interior product


This page also contains archived Wikispaces discussion.

Questions, observations, suggestions

This is an adaptation of something I worked on with Dave Keenan recently. We posted it on Facebook in an earlier form, and due to a combination of some technical errors we made, not choosing the most appropriate tone, bad timing, etc., it didn't go over very well; I'm sorry that I did it that way at first. Dave Keenan no longer has any involvement in this effort and wishes to be left out of it. But I didn't want the effort we made together to go to waste, and I still think the insights we attained while studying this material would be valuable to offer to the community, to eventually do with as it will. So this time around I've corrected the mistakes and am trying to take a different tone. I'm not assuming my perspective is correct, nor pitching this as a necessary correction. These are just my questions, observations, and suggestions.

This article says that "the interior product is a notion dual to the wedge product". This is similar to the language in the MathWorld article https://mathworld.wolfram.com/InteriorProduct.html, which says "the interior product is a dual notion of the wedge product".

But I think it might be more accurate to say the interior product is only almost dual to the wedge product. Perhaps that's why they took the edge off and only cast it as a "dual notion", whatever that means exactly. The actual dual of the wedge product seems to be the regressive product. The wedge product is sometimes called the progressive product in contrast, because the wedge product increases the grade of a multivector or multicovector while the regressive product decreases it (the grade of a multivector is nullity; the grade of a multicovector is rank).

https://math.stackexchange.com/a/971881 https://www.cefns.nau.edu/~schulz/grassmann.pdf

I can definitely see why the interior product might be thought of as the dual of the wedge product, because the wedge product is also sometimes called the exterior product.

After saying that the interior product is the dual, the article says "so we will denote it using ∨ rather than ∧." But I'm pretty sure that the vee symbol ∨ is standard for the regressive product, not for the interior product. As the MathWorld article shows, the standard symbol for the (left) interior product is ⨼. In fact the Unicode name for that symbol is INTERIOR PRODUCT (U+2A3C). It can be seen as a tilted, asymmetrical vee. And the right interior product's symbol is ⨽, just horizontally flipped. (I'll say more about left vs. right in a bit.)

Here is the regressive product or "vee product" defined in terms of the wedge product, where ∗ gives the dual:

a ∨ b = ∗(∗a ∧ ∗b)

For comparison, here is the right (or righthand) interior product defined in terms of the wedge product:

a ⨽ b = ∗(∗a ∧ b)

And by combining these, the relationship between the right interior product and the regressive product:

a ⨽ b = a ∨ ∗b
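
To spell out how these combine: substitute ∗b into the regressive product's definition, keeping in mind that ∗∗ is the identity only up to a grade- and dimension-dependent sign, so this holds up to sign:

a ∨ ∗b = ∗(∗a ∧ ∗∗b) = ±∗(∗a ∧ b) = ±(a ⨽ b)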

So you can see the sense in which the right interior product is only "almost dual" to the wedge product, while the regressive product is properly its dual. My understanding is that two operations are duals of each other when taking the dual of all the arguments, applying the other operation, and then taking the dual of the result gives the same answer as the original operation. A classic example would be the duality between AND and OR in classical logic via De Morgan's law, A | B = ~(~A & ~B). So if we said wedge was like AND, then regressive would be like OR, and the interior product would be more like IMPLIES, because A ⇒ B = ~A | B = ~(A & ~B).
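
In symbols, the way I understand it: an operation f is dual to g when (up to the usual sign caveats)

f(a, b) = ∗g(∗a, ∗b)

which the regressive product satisfies with respect to the wedge product by definition, but which neither interior product satisfies, since each of them dualizes only one of its arguments.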

Okay, but why did I suddenly start talking about a "right" and "left" interior product? As you might expect, they're quite similar to each other. The only difference is which input, a or b, you take the dual of:

a ⨼ b = ∗(a ∧ ∗b)

So it is similarly "almost dual". And again, here it is in terms of the regressive product:

a ⨼ b = ∗a ∨ b

I can clearly see that the wiki article is not mistakenly using the regressive product under the regressive product symbol ∨, because it makes clear that its two arguments come from opposite sides of the duality, i.e. one is a multivector and the other is a multicovector. The wiki article is definitely using an interior product. But which one: the right, or the left? What the article seems to be using, as far as I can tell, is what could be called a "symmetrical interior product", which works like this:

a • b = if grade(a) ≥ grade(b), a ⨽ b; else a ⨼ b

In other words, this operation figures out which one to use, left or right, and then does it.
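
If it helps to see this concretely, here's a minimal sketch of all of these products in Python (my own stab at it, not taken from the wiki or from my Wolfram implementation; the coefficient-dict representation, the function names, and especially the sign convention of the dual are my choices, so outputs may differ from the wiki's examples by an overall sign):

 def merge_sign(ia, ib):
     """Sign of the permutation that sorts the concatenation of two
     strictly increasing index tuples; 0 if they share an index."""
     if set(ia) & set(ib):
         return 0
     inversions = sum(1 for i in ia for j in ib if i > j)
     return -1 if inversions % 2 else 1
 
 def wedge(a, b):
     """Progressive (wedge) product. Multi(co)vectors are dicts mapping
     sorted index tuples to coefficients; variance is tracked by hand."""
     out = {}
     for ia, ca in a.items():
         for ib, cb in b.items():
             s = merge_sign(ia, ib)
             if s:
                 key = tuple(sorted(ia + ib))
                 out[key] = out.get(key, 0) + s * ca * cb
     return {k: v for k, v in out.items() if v}
 
 def dual(a, d):
     """Send each basis k-blade e_I to its complementary (d-k)-blade,
     signed so that e_I ∧ dual(e_I) = e_(0,...,d-1)."""
     out = {}
     for idx, c in a.items():
         comp = tuple(i for i in range(d) if i not in idx)
         out[comp] = merge_sign(idx, comp) * c
     return out
 
 def grade(a):
     """All keys share one length (homogeneous grade); 0 for the zero element."""
     return len(next(iter(a), ()))
 
 def regressive(a, b, d):      # a ∨ b = ∗(∗a ∧ ∗b)
     return dual(wedge(dual(a, d), dual(b, d)), d)
 
 def right_interior(a, b, d):  # a ⨽ b = ∗(∗a ∧ b)
     return dual(wedge(dual(a, d), b), d)
 
 def left_interior(a, b, d):   # a ⨼ b = ∗(a ∧ ∗b)
     return dual(wedge(a, dual(b, d)), d)
 
 def symmetric_interior(a, b, d):
     """The conditional dispatch described above."""
     if grade(a) >= grade(b):
         return right_interior(a, b, d)
     return left_interior(a, b, d)

For instance, wedging the 12- and 19-vals as grade-1 dicts:

 p12 = {(i,): c for i, c in enumerate([12, 19, 28, 34])}
 p19 = {(i,): c for i, c in enumerate([19, 30, 44, 53])}
 wedge(p12, p19)
 # {(0, 1): -1, (0, 2): -4, (0, 3): -10, (1, 2): -4, (1, 3): -13, (2, 3): -12}

which is the meantone multicovector ⟨⟨1 4 10 4 13 12]] up to an overall sign (normalization conventions differ).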

The reason the symmetrical version does this dispatch is that if you got the grades the other way around, the raw output grade would be 0 or less, which clamps to 0. In other words, you'd receive a scalar, which could be read as either a multicovector or a multivector. Whatever this thing is, it's certainly a trivial temperament: either everything or nothing is tempered out (I'm really not sure which in this case). Generally what you will be looking for is a case where you receive something with grade 1 or more.

The reason I think this article describes a symmetrical interior product is the last paragraph of the "definition" section. I think there may be an error there where it says "we can take the wedge product m∨W from the front". I think what it's showing is actually an example of the interior product, and this is the one place in the article where I see the input on the left having a smaller grade than the thing on the right. And I think the last clause of this sentence, "this can only lead to a difference in sign", could be continued with the phrase "compared to W∨m", i.e. with the inputs reordered so that the lower-grade input is on the right. So if it's possible to reorder the inputs like this, then that implies this article assumes the symmetrical interior product. The change in sign is due to how the wedge product is sometimes anticommutative. It always is for (mono)vectors or (mono)covectors, but I believe it's commutative whenever either input's grade is even. I think the statement means it can at most lead to a mere change in sign, but won't necessarily; for example, I did [-3 2 -1 2 -1⟩ ⨼ ⟨⟨⟨1 2 -3 -2 1 -4 -5 12 9 -19]]] and ⟨⟨⟨1 2 -3 -2 1 -4 -5 12 9 -19]]] ⨽ [-3 2 -1 2 -1⟩ and they both give me ⟨⟨6 -7 -2 15 -25 -20 3 15 59 49]].
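
For the record, the general rule for a grade-p thing wedged with a grade-q thing is

a ∧ b = (−1)^(pq) b ∧ a

so the wedge product anticommutes only when both grades are odd. In the example just given, the dualized input has grade 2, which is even, hence no sign flip.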

Any one of these products has a formula which can tell you what the output grade will be. For the progressive product it's simply g(a) + g(b). For the regressive product it's g(a) + g(b) - d. For the right interior product it's g(a) - g(b), and for the left interior product it's g(b) - g(a). All of these formulas cap out at d (you can't have a higher grade than the dimensionality) and bottom out at 0 (there's no such thing as negative grade). I can give derivations for these if anyone wants.
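
With the clamping made explicit:

g(a ∧ b) = min(g(a) + g(b), d)
g(a ∨ b) = max(g(a) + g(b) − d, 0)
g(a ⨽ b) = max(g(a) − g(b), 0)
g(a ⨼ b) = max(g(b) − g(a), 0)

(Strictly speaking, when the raw value falls outside 0 through d the product simply vanishes; the clamp is just shorthand for that.)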

So if we simply wanted to take what is on the page now and help it conform better with established mathematical usage, I would recommend we remove the line about the interior product being the wedge's dual, and change the symbol the wiki uses for the interior product. Above I used •, the fat dot, which came up on that Facebook post recently as a reasonable choice for this operation. I'm not attached to it, though, so if anyone has other suggestions I'm not at all opposed to considering them. I'm just concerned that ∨ should probably only be used for the regressive product. If we went this direction, then ∨ would also need to be replaced with • on the following pages, too:

But I think we can do even better than simply massaging what we've already got to be more correct while leaving it basically as it is. I think we could do some further upgrading here. While one could definitely make the case that the symmetry of the interior product is a convenience, I'm concerned that the pros may not outweigh the cons. The conditional overcomplicates things, making it difficult to reason about the operator or to solve equations containing it. Consider the subtraction or division operators: we've just chosen one orientation and stuck with it, and if necessary we swap the order of the operands, or take the absolute value. I think it might be better if we just picked one of the left or right interior products and went with it. The wiki article currently mostly uses the right interior product, and the left interior product in one case (as I showed above), so perhaps we should choose the right; I think that's also the way that parallels subtraction and division more closely. I would rather that, using the example above, m∨W gave 0 than gave the same thing as W∨m with a potentially flipped sign. So that would be one stage further of suggested revision. If we went this route, I should say, by the way, that good ASCII approximations of ⨼ and ⨽ are -' and '- (hyphens, and single quotes), so you can write things like a -' b = *a \/ b or a '- b = a \/ *b when you need to.

But I think there's an even better solution than reforming the use of the interior product: I would propose that we actually just switch to using the regressive product, the one that really does use the ∨ symbol. It's simpler, more generally useful, and has a more direct relationship with what people already know (the wedge/progressive product).

Here's my full breakdown:

You can take the progressive product of two things with the same variance and get something of the same variance whose grade is greater than (or equal to) the greater of the input grades.

Progressing two temperaments' multivectors together gives a multivector for the temperament associated with the comma basis you'd find by concatenating and reducing the comma bases of the two original temperaments. So it's like meet, except that instead of being defined for temperaments in the abstract, it's the operation you perform on multivectors to achieve it. The grade of the output is equal to the sum of the two input multivectors' grades (or less, if they share a comma in common), so it's equal to or greater than the greater of the two inputs' grades, and it caps out at the dimensionality of the system.

Progressing two temperaments' multicovectors together gives a multicovector for the temperament associated with the mapping you'd find by concatenating and reducing the mappings of the two original temperaments. So it's like join, except that instead of being defined for temperaments in the abstract, it's the operation you perform on multicovectors to achieve it. The grade of the output follows the same idea as in the previous statement (although here it's less if they share a mapping-row in common).

And you can take the regressive product of two things with the same variance and get something of the same variance whose grade is less than (or equal to) the lesser of the input grades.

Regressing two temperaments' multivectors together gives a multivector for the temperament associated with the mapping you'd find by concatenating and reducing the mappings of the two original temperaments. So it's like join, except that instead of being defined for temperaments in the abstract, it's the operation you perform on multivectors to achieve it. As for the grade of the output: take the case of two multivectors, both with n = 2, and d = 3. You take both of their duals, so those grades are d − n = 1 each; then you wedge those, which sums them, giving 2; and then you take the dual again, so it's back to 3 − 2 = 1, matching the formula g(a) + g(b) − d = 2 + 2 − 3 = 1. Again, it must be equal to or less than the lesser of the two inputs' grades (or less still, depending on whether they share a comma in common), and it bottoms out at 0, since you can't go to negative grade. (Negative grade is a different idea from the signed grade we use in the Wolfram implementation to package the variance information along with the grade in a single handy package.)

Regressing two temperaments' multicovectors together gives a multicovector for the temperament associated with the comma basis you'd find by concatenating and reducing the comma bases of the two original temperaments. So it's like meet, except that instead of being defined for temperaments in the abstract, it's the operation you perform on multicovectors to achieve it. The grade of the output follows the same idea as in the previous statement.

So the progressive and regressive products are flip-flopped in that way. And it makes sense: the regressive product takes the duals of both inputs, so it does the opposite of what a straight wedging would do, and then takes the dual again at the end just so you get back something with the variance you put in.
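
As a quick sanity check on that flip-flop, here's the regressive product example from my summary table at the bottom of this post, run through the Python sketch from earlier (same caveats about my conventions):

 # two grade-2 multivectors in d = 3 (the duals of ⟨19 30 44] and ⟨12 19 28])
 w19 = {(0, 1): 44, (0, 2): -30, (1, 2): 19}
 w12 = {(0, 1): 28, (0, 2): -19, (1, 2): 12}
 sorted(regressive(w19, w12, 3).items())
 # [((0,), 4), ((1,), -4), ((2,), 1)]

i.e. [4 -4 1⟩, the multivector for meantone, the join of the 12 and 19 equal temperaments, matching the summary table.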

But what about the interior product? Well, the output of the right interior product always has the same variance as its left input, a grade equal to or lower than its left input's, and a grade equal to or lower than the dimension minus its right input's grade. So its grade can be higher than, lower than, or the same as its right input's.

Based on a⨽b = ∗(∗a∧b), if the left grade is r and the right grade is n, the output grade will be d-((d-r)+n) which simplifies to r-n.

e.g.:

Dimension 7, left input rank 6, right input nullity 2: output rank = r - n = 4 (greater than the right grade).
Dimension 7, left input rank 6, right input nullity 3: output rank = r - n = 3 (same as the right grade).
Dimension 7, left input rank 6, right input nullity 4: output rank = r - n = 2 (less than the right grade).

So if the left input is a multicovector, then the right interior product is like join. And if the right input is a multivector, then it is like meet. And you can infer the behavior for the left interior product from that.
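
And once more with the sketch from earlier, here's the left interior product example from the summary table:

 val12 = {(i,): c for i, c in enumerate([12, 19, 28])}
 w19 = {(0, 1): 44, (0, 2): -30, (1, 2): 19}
 sorted(left_interior(val12, w19, 3).items())
 # [((0,), -4), ((1,), 4), ((2,), -1)]

That's [-4 4 -1⟩, which is the summary table's [4 -4 1⟩ up to an overall sign; I assume the discrepancy comes down to my choice of sign convention for the dual.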

So I think it should be pretty clear from this breakdown that the interior product, either one, is more confusing, complicated, and convoluted than just using the regressive product. There doesn't seem to be anything you can do with it that you couldn't get done with the regressive product, which is generally quicker and cleaner.

To be clear, this is absolutely not meant as a criticism of Gene. I never would have been able to even get started on this type of stuff if it weren't for his explorations and documentation here. But reviewing all this, I am wondering if maybe he just missed the regressive product, or had a misconception, because otherwise it seems like he would have just used it. Of course, it's also highly probable that there is a specific reason he didn't use it, which I'm unaware of and haven't been able to figure out, but if that is so, I think it could be good to discover it and document it.

I don't have a strong preference about which way people want to go with this information. And certainly some combination of these recommendations could be adopted. I am very open to hearing that there are cases where the interior product does exactly what you want and the regressive product wouldn't cut it; I haven't gone through every use on the wiki (and outside it) to check. There's also the argument that this interior product, defined as it is, has been around for a while, and maybe even if it has some issues we should preserve it for backwards compatibility. All fine.

If it's of any assistance, I've implemented all of these products in Wolfram Language: https://github.com/cmloegcmluin/VEA

And here's a summary table that has helped me get my head around this situation:

operation: progressive product (AKA wedge product, exterior product)
definition: a ∧ b
resultant grade: grade(a) + grade(b)
resultant variance: same as a (and b)
multicovector with multicovector (⟨] ⟨]): ⟨12 19 28 34] ∧ ⟨19 30 44 53] = ⟨⟨1 4 10 4 13 12]]
multivector with multivector ([⟩ [⟩): [4 -4 1 0⟩ ∧ [13 -10 0 1⟩ = [[12 -13 4 10 -4 1⟩⟩
multicovector with multivector (⟨] [⟩): ND
multivector with multicovector ([⟩ ⟨]): ND

operation: regressive product (AKA vee product)
definition: a ∨ b = ∗(∗a ∧ ∗b)
resultant grade: grade(a) + grade(b) - dimensionality
resultant variance: same as a (and b)
multicovector with multicovector (⟨] ⟨]): ⟨⟨⟨1 2 -3 -2 1 -4 -5 12 9 -19]]] ∨ ⟨⟨⟨⟨1 2 1 2 3]]]] = ⟨⟨-6 7 2 -15 25 20 -3 -15 -59 -49]]
multivector with multivector ([⟩ [⟩): [[44 -30 19⟩⟩ ∨ [[28 -19 12⟩⟩ = [4 -4 1⟩
multicovector with multivector (⟨] [⟩): ND
multivector with multicovector ([⟩ ⟨]): ND

operation: right interior product
definition: a ⨽ b = ∗(∗a ∧ b) (examples given where grade(a) ≥ grade(b))
resultant grade: grade(a) - grade(b)
resultant variance: same as a
multicovector with multicovector (⟨] ⟨]): ND
multivector with multivector ([⟩ [⟩): ND
multicovector with multivector (⟨] [⟩): ⟨⟨⟨1 2 -3 -2 1 -4 -5 12 9 -19]]] ⨽ [-3 2 -1 2 -1⟩ = ⟨⟨6 -7 -2 15 -25 -20 3 15 59 49]]
multivector with multicovector ([⟩ ⟨]): [[44 -30 19⟩⟩ ⨽ ⟨12 19 28] = [-4 4 -1⟩

operation: (left) interior product
definition: a ⨼ b = ∗(a ∧ ∗b) (examples given where grade(a) < grade(b))
resultant grade: grade(b) - grade(a)
resultant variance: same as b
multicovector with multicovector (⟨] ⟨]): ND
multivector with multivector ([⟩ [⟩): ND
multicovector with multivector (⟨] [⟩): ⟨12 19 28] ⨼ [[44 -30 19⟩⟩ = [4 -4 1⟩
multivector with multicovector ([⟩ ⟨]): [-3 2 -1 2 -1⟩ ⨼ ⟨⟨⟨1 2 -3 -2 1 -4 -5 12 9 -19]]] = ⟨⟨-6 7 2 -15 25 20 -3 -15 -59 -49]]

operation: symmetrical interior product
definition: a • b = if grade(a) ≥ grade(b), a ⨽ b; else a ⨼ b
resultant grade: if grade(a) ≥ grade(b), grade(a) - grade(b); else grade(b) - grade(a)
resultant variance: if grade(a) ≥ grade(b), same as a; else same as b
multicovector with multicovector (⟨] ⟨]): ND
multivector with multivector ([⟩ [⟩): ND
multicovector with multivector (⟨] [⟩): in terms of the other two interior products
multivector with multicovector ([⟩ ⟨]): in terms of the other two interior products