# Lumma stability

**Lumma stability** is a concept in the theory of musical scales developed by Carl Lumma. It is the portion of a scale's period which is not covered by its interval classes. The "Lumma impropriety factor" is the portion which is more than singly-covered. Scala will report these for any scale via its "show data" command.

Rothenberg assumes a process whereby a listener hears melodic intervals between sounds and sorts them by size. This produces an "interval matrix" (IM) in the listener's mind. The listener continually refines this IM as new sounds arrive, and if a sound source conforms to a single IM long enough, he may remember it for later use.

The IM for a scale is simply a list of all its intervals (dyads) grouped by the modes in which they appear. Scala will display it with "show/line intervals". Every fixed scale corresponds to one and only one IM. So if we assume that listeners will eventually perfect their picture of a scale's IM, we can infer things about the scale from its IM.
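The construction of an IM can be sketched in a few lines of code. This is an illustrative implementation, not Scala's; the scale is assumed to be given as the cents values of its degrees within one period, and all names are my own.

```python
# Sketch: build the interval matrix (IM) of a periodic scale, assuming the
# scale is a list of cents values within one period (default 1200 cents).

def interval_matrix(scale, period=1200):
    """Interval class k (1-based) collects the interval spanned by k scale
    steps, one value per mode; duplicates are merged and sorted."""
    n = len(scale)
    classes = []
    for k in range(1, n + 1):
        ivals = []
        for i in range(n):
            iv = scale[(i + k) % n] - scale[i]
            if iv <= 0:          # wrapped past the end of the period
                iv += period
            ivals.append(iv)
        classes.append(sorted(set(ivals)))
    return classes

# 12-ET major scale: class 1 is the whole and half steps {100, 200};
# class 7 is always the full period.
major = [0, 200, 400, 500, 700, 900, 1100]
im = interval_matrix(major)
```

Here each interval class is reduced to its set of distinct sizes; a fuller IM would keep the per-mode layout, but the set of sizes per class is all that the stability calculation below needs.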

Imagine a log-frequency ruler whose total length is the interval of equivalence of our periodic scale (e.g. 1200 cents). Mark off each interval in the IM on this ruler. Now we'll draw line segments on the ruler with colored pencil, using the marks as endpoints. We'll connect all marks belonging to the same interval class with a single line, using a different color for each interval class. Lumma stability is the portion of the ruler that has no pencil on it. The **impropriety factor** is the portion that's more than singly covered—where different colors overlap. The idea is that when two interval classes overlap, listeners will not be able to distinguish them in all cases. Lumma stability measures how easily distinguishable the non-overlapping classes will be. Rothenberg stability is very similar, except that it counts the number of overlaps, so it can't distinguish a gross overlap (in cents) from a small one, or detect when two interval classes are arbitrarily close to overlapping.
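The ruler procedure can be sketched numerically with a sweep over the interval-class segments. As above, this is an illustrative sketch, not Scala's code; the scale is assumed to be in cents, and the function names are my own.

```python
# Sketch of the ruler procedure: each interval class covers the segment
# from its smallest to its largest size; Lumma stability is the uncovered
# fraction of the period, the impropriety factor the multiply-covered one.

def class_spans(scale, period=1200):
    """The segment [smallest, largest] covered by each interval class.
    Class n (the full period) is a single point and covers no length."""
    n = len(scale)
    spans = []
    for k in range(1, n):
        ivals = [(scale[(i + k) % n] - scale[i]) % period for i in range(n)]
        spans.append((min(ivals), max(ivals)))
    return spans

def lumma_measures(scale, period=1200):
    """Return (stability, impropriety) as fractions of the period."""
    events = []
    for lo, hi in class_spans(scale, period):
        events.append((lo, +1))   # a colored segment starts here
        events.append((hi, -1))   # ...and ends here
    events.sort()
    uncovered = multi = 0.0
    depth, prev = 0, 0
    for x, d in events:
        seg = x - prev            # ruler between consecutive marks
        if depth == 0:
            uncovered += seg      # no pencil at all
        elif depth >= 2:
            multi += seg          # two or more colors overlap
        depth += d
        prev = x
    uncovered += period - prev    # bare ruler after the last mark
    return uncovered / period, multi / period

# 12-ET major scale: stability 0.5, impropriety 0 (classes 3 and 4 share
# the 600-cent tritone, but they meet only at that single point).
print(lumma_measures([0, 200, 400, 500, 700, 900, 1100]))
```

Note that overlap at a single point, as with the tritone here, has zero length, so it contributes nothing to the impropriety factor even though the scale is not strictly proper.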