# Addition in Other Domains

There is a class of multivariate functions that have some kind of “univariate character” except that they “combine effects”. *A+B* is king of them. Some of his subordinates are:

1. 1/(1/A + 1/B)
2. √(A² + B²)
3. e^(ln A + ln B)

These expressions are all commonly used. Number 3 is so common that it has its own shorthand: A·B.

Let’s define a family of operators ⊕φ by

A ⊕φ B = φ⁻¹(φ(A) + φ(B))

where φ is a monotonic univariate function that takes us to the domain in which the summing happens, and φ⁻¹ brings us back. So by definition, with φ the identity we recover plain addition: A ⊕x B = A + B.
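As a quick sketch in code (the helper name `plus_under` is mine, not established notation):

```python
def plus_under(phi, phi_inv):
    """Build the operator A ⊕φ B = φ⁻¹(φ(A) + φ(B))."""
    return lambda a, b: phi_inv(phi(a) + phi(b))

# With φ the identity we recover ordinary addition.
plain_plus = plus_under(lambda x: x, lambda x: x)
print(plain_plus(2, 3))  # → 5
```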

Those three expressions listed in the beginning then become:

- A ⊕1/x B = 1/(1/A + 1/B) (because φ(x) = 1/x and φ⁻¹(y) = 1/y)
- A ⊕x² B = √(A² + B²) (because φ(x) = x² and φ⁻¹(y) = √y)
- A ⊕ln B = A·B (because φ(x) = ln x and φ⁻¹(y) = e^y)

We can combine, for example, parallel resistors like this:

R = R₁ ⊕1/x R₂ = 1/(1/R₁ + 1/R₂)

and independent Gaussian deviates like this:

σ = σ₁ ⊕x² σ₂ = √(σ₁² + σ₂²)
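Numerically, assuming a small helper `plus_under` (my name, not from any library) that implements ⊕φ:

```python
import math

def plus_under(phi, phi_inv):
    return lambda a, b: phi_inv(phi(a) + phi(b))

parallel = plus_under(lambda x: 1 / x, lambda x: 1 / x)  # sum under reciprocal
quadrature = plus_under(lambda x: x * x, math.sqrt)      # sum under square

# Two parallel resistors, 4 Ω and 12 Ω: 1/(1/4 + 1/12) = 3 Ω.
print(parallel(4.0, 12.0))   # → 3.0

# Standard deviations 3 and 4 of independent Gaussians: √(9 + 16) = 5.
print(quadrature(3.0, 4.0))  # → 5.0
```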

Short-hand could be useful in itself, but what properties/identities might these operators permit? It isn’t hard to verify that the entire family inherits the associative and commutative properties of its inner champion +. This symmetry allows us to construct multivariate expressions in exactly the same way as it works for +:

A ⊕φ B ⊕φ C = φ⁻¹(φ(A) + φ(B) + φ(C))
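Since only the inner + needs to be n-ary, a hypothetical n-ary version (names again mine) is a one-liner:

```python
def plus_under_n(phi, phi_inv, *args):
    """φ⁻¹(φ(A) + φ(B) + φ(C) + …), the n-ary sum under φ."""
    return phi_inv(sum(phi(a) for a in args))

# Four 4 Ω resistors in parallel: 1/(1/4 + 1/4 + 1/4 + 1/4) = 1 Ω.
print(plus_under_n(lambda x: 1 / x, lambda x: 1 / x, 4.0, 4.0, 4.0, 4.0))  # → 1.0
```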

In calculus we have the product rule to help us with (u·v)′. The generalization for u ⊕φ v turns out to be

(u ⊕φ v)′ = (φ′(u)·u′ + φ′(v)·v′) / φ′(u ⊕φ v)

and again, this generalizes straightforwardly to higher dimensions.
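This rule is easy to sanity-check with finite differences; a sketch for φ(x) = x², u(x) = x, v(x) = x² (all choices mine, just for illustration):

```python
import math

phi, dphi, phi_inv = (lambda x: x * x), (lambda x: 2 * x), math.sqrt
u, du = (lambda x: x), (lambda x: 1.0)        # u(x) = x
v, dv = (lambda x: x * x), (lambda x: 2 * x)  # v(x) = x²

def h(x):
    return phi_inv(phi(u(x)) + phi(v(x)))     # u ⊕x² v = √(x² + x⁴)

def h_prime(x):
    # (φ′(u)·u′ + φ′(v)·v′) / φ′(u ⊕φ v)
    return (dphi(u(x)) * du(x) + dphi(v(x)) * dv(x)) / dphi(h(x))

x, eps = 1.0, 1e-6
finite_diff = (h(x + eps) - h(x - eps)) / (2 * eps)
print(abs(h_prime(x) - finite_diff) < 1e-8)   # → True
```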

The proof isn’t difficult, but for now let’s see how the product rule emerges as a special case of this rule. With φ = ln we have u ⊕ln v = u·v and φ′(x) = 1/x, so

(u·v)′ = (u′/u + v′/v) / (1/(u·v)) = u′·v + u·v′

Might this family yield more identities that allow it to share in the abilities of its special cases (+ and ·) to reduce complex expressions to simple insights?

Giving these expressions full multivariate-function status just seems like overkill. The area and the diagonal length of a (b&w) photograph are simpler things than the photograph itself, and deserve a simpler abstraction.

TODO: Check more identities.

TODO: Note on nomograms. Each of ⊕1/x, ⊕x², ⊕ln has a simple visualization using perpendicular axes.

This reminds me of the “under” operator from J, which I learned about from an article by James Hague: http://prog21.dadgum.com/121.html

For example, the three plus operators could be called “Sum under reciprocal”, “Sum under square”, and “Sum under log”.