Monoidal sparks

While at ICFP in St. Louis this past week, I discovered an interesting construction on monoids that no one I talked to (including Kenny Foner and Edward Kmett) seemed to have heard of or thought about before. In this post I’ll present the abstract construction itself, and in another post I’ll explain the particular context in which I first came up with it. (Normally I would put the examples first, before explaining the idea in general; but I only know of one example so far, and I’m curious to see if anyone will come up with more. I don’t want to bias people’s thinking too much with my one example!)

The bigger context here is thinking about different ways of putting a monoid structure on a product type A \times B, assuming A and B are themselves monoids. Previously I knew of two ways to do this. The obvious way is to just have the two sides operate in parallel, without interacting at all: (a_1,b_1) \diamond (a_2,b_2) = (a_1 \diamond a_2, b_1 \diamond b_2) and so on. Alternatively, if A acts on B, we can form the semidirect product.
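For concreteness, here is a quick Haskell sketch of the first construction (the newtype name Parallel is just made up for this post; base already provides exactly this instance for plain pairs):

```haskell
-- The "parallel" product monoid: the two sides combine independently
-- and never interact.  (This is exactly the (a, b) instance from base;
-- the newtype just keeps the example self-contained.)
newtype Parallel a b = Parallel (a, b)
  deriving Show

instance (Semigroup a, Semigroup b) => Semigroup (Parallel a b) where
  Parallel (a1, b1) <> Parallel (a2, b2) = Parallel (a1 <> a2, b1 <> b2)

instance (Monoid a, Monoid b) => Monoid (Parallel a b) where
  mempty = Parallel (mempty, mempty)
```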

But here is a third way. Suppose (A, \varepsilon_A, \diamond) is a monoid, and (B, \varepsilon_B, \oplus) is a commutative monoid. To connect them, we also suppose there is another binary operation - \cdot - : A \to A \to B, which I will pronounce “spark”. The way I like to think of it is that two values of type A, in addition to combining to produce another A via the monoid operation, also produce a “spark” of type B. That is, values of type B somehow capture information about the interaction between pairs of A values.

Sparking needs to interact sensibly with the monoid structures on A and B: in particular, fixing either argument must result in a monoid homomorphism from A to B. That is, for any choice of a \in A, we have a \cdot \varepsilon_A = \varepsilon_B (i.e. \varepsilon_A is boring and can never produce a nontrivial spark with anything else), and a \cdot (a_1 \diamond a_2) = (a \cdot a_1) \oplus (a \cdot a_2) (i.e. sparking commutes with the monoid operations). Similar laws should hold if we fix the second argument of - \cdot - instead of the first.
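One plausible way to package this up in Haskell is as a multi-parameter type class; the class name Spark, the operator (><), and the functional dependency are all choices I am making up here for illustration:

```haskell
{-# LANGUAGE MultiParamTypeClasses, FunctionalDependencies #-}

-- A hypothetical class for the spark operation.  The functional
-- dependency records the design choice that each A determines its
-- spark type B.
class (Monoid a, Monoid b) => Spark a b | a -> b where
  (><) :: a -> a -> b   -- pronounced "spark"

-- Laws (in addition to <> on b being commutative):
--   a >< mempty       == mempty
--   mempty >< a       == mempty
--   a >< (a1 <> a2)   == (a >< a1) <> (a >< a2)
--   (a1 <> a2) >< a   == (a1 >< a) <> (a2 >< a)
```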

Given all of this, we can now define a monoid on A \times B as follows:

(a_1, b_1) \otimes (a_2, b_2) = (a_1 \diamond a_2, b_1 \oplus b_2 \oplus (a_1 \cdot a_2))

That is, we combine the A values normally, and then we combine the B values together with the “spark” of type B produced by the two A values.
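Continuing the hypothetical sketch above, this is a newtype wrapper (again, the name Sparked is invented for this post) whose Semigroup and Monoid instances transcribe the definition of \otimes:

```haskell
-- The spark product on pairs, transcribing the definition of ⊗ above.
newtype Sparked a b = Sparked { unSparked :: (a, b) }
  deriving Show

instance Spark a b => Semigroup (Sparked a b) where
  Sparked (a1, b1) <> Sparked (a2, b2) =
    Sparked (a1 <> a2, b1 <> b2 <> (a1 >< a2))

instance Spark a b => Monoid (Sparked a b) where
  mempty = Sparked (mempty, mempty)
```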

Let’s see that this does indeed define a valid monoid:

  • The identity is (\varepsilon_A, \varepsilon_B), since

    (\varepsilon_A, \varepsilon_B) \otimes (a,b) = (\varepsilon_A \diamond a, \varepsilon_B \oplus b \oplus (\varepsilon_A \cdot a)) = (a, b \oplus \varepsilon_B) = (a,b)

    Notice how we used the identity laws for A (once) and B (twice), as well as the law that says \varepsilon_A \cdot a = \varepsilon_B. The proof that (\varepsilon_A, \varepsilon_B) is a right identity for \otimes is similar.

  • To see that the combining operation is associative, we can reason as follows:

    \begin{array}{rcl} & & ((a_1,b_1) \otimes (a_2,b_2)) \otimes (a_3,b_3) \\[0.5em] & = & \qquad \text{\{expand definition of \begin{math}\otimes\end{math}\}} \\[0.5em] & & (a_1 \diamond a_2, b_1 \oplus b_2 \oplus (a_1 \cdot a_2)) \otimes (a_3,b_3) \\[0.5em] & = & \qquad \text{\{expand definition of \begin{math}\otimes\end{math} again\}} \\[0.5em] & & (a_1 \diamond a_2 \diamond a_3, b_1 \oplus b_2 \oplus (a_1 \cdot a_2) \oplus b_3 \oplus ((a_1 \diamond a_2) \cdot a_3)) \\[0.5em] & = & \qquad \text{\{\begin{math}- \cdot a_3\end{math} is a homomorphism, \begin{math}\oplus\end{math} is commutative\}} \\[0.5em] & & (a_1 \diamond a_2 \diamond a_3, b_1 \oplus b_2 \oplus b_3 \oplus (a_1 \cdot a_2) \oplus (a_1 \cdot a_3) \oplus (a_2 \cdot a_3)) \end{array}

    and

    \begin{array}{rcl} & & (a_1,b_1) \otimes ((a_2,b_2) \otimes (a_3,b_3)) \\[0.5em] & = & \qquad \text{\{expand definition of \begin{math}\otimes\end{math}\}} \\[0.5em] & & (a_1,b_1) \otimes (a_2 \diamond a_3, b_2 \oplus b_3 \oplus (a_2 \cdot a_3)) \\[0.5em] & = & \qquad \text{\{expand definition of \begin{math}\otimes\end{math} again\}} \\[0.5em] & & (a_1 \diamond a_2 \diamond a_3, b_1 \oplus (b_2 \oplus b_3 \oplus (a_2 \cdot a_3)) \oplus (a_1 \cdot (a_2 \diamond a_3))) \\[0.5em] & = & \qquad \text{\{\begin{math}a_1 \cdot -\end{math} is a homomorphism, \begin{math}\oplus\end{math} is commutative\}} \\[0.5em] & & (a_1 \diamond a_2 \diamond a_3, b_1 \oplus b_2 \oplus b_3 \oplus (a_1 \cdot a_2) \oplus (a_1 \cdot a_3) \oplus (a_2 \cdot a_3)) \end{array}

    In a formal proof one would also have to explicitly note uses of the associativity of \diamond and \oplus, but I didn’t want to be that pedantic here.

In addition, if A is a commutative monoid and the spark operation - \cdot - commutes, then the resulting monoid (A \times B, \otimes) will be commutative as well.
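Given a concrete instance, these laws can also be checked mechanically. Here is a sketch of the identity and associativity laws as plain Bool-valued properties over the hypothetical Spark/Sparked code above; with Arbitrary and Show instances for a particular choice of A and B they can be handed straight to QuickCheck:

```haskell
-- Monoid laws for the spark product, stated as testable properties.
prop_leftIdentity :: (Spark a b, Eq a, Eq b) => (a, b) -> Bool
prop_leftIdentity p = unSparked (mempty <> Sparked p) == p

prop_rightIdentity :: (Spark a b, Eq a, Eq b) => (a, b) -> Bool
prop_rightIdentity p = unSparked (Sparked p <> mempty) == p

prop_assoc :: (Spark a b, Eq a, Eq b) => (a, b) -> (a, b) -> (a, b) -> Bool
prop_assoc p q r =
  unSparked ((Sparked p <> Sparked q) <> Sparked r)
    == unSparked (Sparked p <> (Sparked q <> Sparked r))
```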

The proof of associativity gives us a bit of insight into what is going on here. Notice that when reducing (a_1,b_1) \otimes (a_2,b_2) \otimes (a_3,b_3), we end up with all possible sparks between pairs of a’s, i.e. (a_1 \cdot a_2) \oplus (a_1 \cdot a_3) \oplus (a_2 \cdot a_3), and one can prove that this holds more generally. In particular, if we start with a list of A values:

[a_1, a_2, \dots, a_n],

then inject them all into A \times B by pairing them with \varepsilon_B:

[(a_1, \varepsilon_B), (a_2, \varepsilon_B), \dots, (a_n, \varepsilon_B)],

and finally fold this list with \otimes, the second element of the resulting pair is

\displaystyle \bigoplus_{i < j} (a_i \cdot a_j),

that is, the combination (via the monoid on B) of the sparks between all pairs a_i, a_j with i < j. Of course there are O(n^2) such pairs: the point is that whereas computing this via a straightforward fold over the list may well take O(n^2) time, by using a balanced fold (i.e. splitting the list in half recursively and then combining from the leaves up) it may be possible to compute it in O(n \log n) time instead (at least, this is the case for the example I have in mind!).
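In the hypothetical Haskell sketch, the whole computation might look like the following (sparkAll and treeFold are names I am making up; whether the balanced shape actually buys you O(n \log n) depends entirely on the particular instance, as noted above):

```haskell
-- Combine the sparks of all pairs a_i, a_j with i < j, by injecting
-- each a alongside mempty and folding with the spark product.
sparkAll :: Spark a b => [a] -> b
sparkAll = snd . unSparked . treeFold . map (\a -> Sparked (a, mempty))

-- A generic balanced fold: repeatedly combine adjacent elements, so
-- the combining tree has logarithmic depth.
treeFold :: Monoid m => [m] -> m
treeFold []  = mempty
treeFold [m] = m
treeFold ms  = treeFold (pairUp ms)
  where
    pairUp (x : y : rest) = (x <> y) : pairUp rest
    pairUp xs             = xs
```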

Please leave a comment if you can think of any instances of this pattern, or if you have seen this pattern discussed elsewhere. In a future post I will write about the instance I had in mind when coming up with this generalized formulation.


New ICFP functional pearl on subtracting bijections

Kenny Foner and I have a paper accepted to ICFP on subtracting bijections. Here’s the basic problem: suppose you have a bijection h between two sum types, h : A + B \leftrightarrow A' + B', and another bijection g : B \leftrightarrow B'. Of course A and A' must have the same size, but can you construct an actual bijection f between A and A'?

This problem and its solution have been well studied in the context of combinatorics; we were wondering what additional insight could be gained from a higher-level functional approach. You can get a preprint here.
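As a rough illustration of the problem (not of the paper's construction), here is a sketch of the classic back-and-forth argument from combinatorics, with bijections represented naively as pairs of inverse functions; the names (Bij, subtractBij, and so on) are made up, and termination relies on the types being finite:

```haskell
-- A bijection represented as a pair of mutually inverse functions.
data Bij x y = Bij (x -> y) (y -> x)

apply :: Bij x y -> x -> y
apply (Bij f _) = f

inverse :: Bij x y -> Bij y x
inverse (Bij f g) = Bij g f

-- Given h : A + B <-> A' + B' and g : B <-> B', build A <-> A' by
-- "ping-ponging": starting from an element of A, apply h; if we land
-- in A' we are done, otherwise map back through g and try again.
subtractBij :: Bij (Either a b) (Either a' b') -> Bij b b' -> Bij a a'
subtractBij h g = Bij fwd bwd
  where
    fwd a  = pingPong (apply h)           (apply (inverse g)) (Left a)
    bwd a' = pingPong (apply (inverse h)) (apply g)           (Left a')

    pingPong step back x = case step x of
      Left done -> done
      Right b   -> pingPong step back (Right (back b))
```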


Ertugrul Söylemez: 1985-2018

I do not know many details, but IRC user mupf joined the #haskell-ops channel yesterday to tell us that Ertugrul Söylemez died unexpectedly on May 12. This is a great loss for the Haskell community. Going variously by mm_freak, ertes, ertesx, or ertes-w, Ertugrul had been very active on the #haskell IRC channel for almost 11 years—since June 2007—and he will be remembered as a knowledgeable, generous, and patient teacher. In addition to teaching on IRC, he wrote several tutorials, such as this one on foldr and this one on monoids. The thing Ertugrul was probably most well-known for in the Haskell community was his netwire FRP package, which he had developed since 2011, and more recently its successor package wires. I think it is no exaggeration to say that his work on and insights into FRP helped shape a whole generation of approaches.

He will be greatly missed. I will leave the comments open for anyone to share memories of Ertugrul, his writing, or his code.


Future of Coding podcast interview

I recently did an interview with my friend and former student Steve Krouse for his Future of Coding podcast. I think I still agree with almost everything I said, and I didn’t say anything too embarrassing, so if you want to listen to it I won’t stop you! We chatted about teaching and learning, functional programming and type systems, how to read academic papers, and a bunch of other things. You can listen to it here.


Apply to attend the Oregon Programming Languages Summer School!

Although the official deadline has passed, I hear there are still a few spots left for the Oregon Programming Languages Summer School (OPLSS), hosted at the University of Oregon in Eugene. This is a really fantastic opportunity to take a deep dive in programming languages and type theory. I attended as a graduate student and although it was overwhelming at times—like drinking from a fire hose!—it was a wonderful experience. I made some great friends and learned a ton (I still occasionally refer back to my notes!). Eugene is also beautiful in the summer (I have fond memories of going geocaching in a nearby park).

The main summer school (July 9–21) has a focus on parallelism and concurrency, but for those with less background, there is also a one-week introduction on the foundations of programming languages (July 3–8). If you want, you can actually choose to attend only the July 3–8 session (or both, or just July 9–21 of course). Attending just the introductory session could be a great option for someone who knows some functional programming but wants a more solid grounding in the underlying theory.

If you’re a member of the Haskell/FP community who is thinking of applying, and I’m familiar with you or your work, I would be happy to write you a recommendation if you need one.


Conversations with a six-year-old on functional programming

My six-year-old son walked up to me yesterday. “What are you reading?”

At the time, I was reading part of Janis Voigtländer’s habilitation thesis. Unsure where to even start, I decided to just answer straightforwardly: “I’m reading a very long story about free theorems.”

He persisted. “What are free theorems?”

Never one to shrink from a pedagogical challenge, I thought for a moment, then began: “Do you know what a function is?” He didn’t. “A function is like a machine where you put something in one end and something comes out the other end. For example, maybe you put a number in, and the number that is one bigger comes out. So if you put in three, four comes out, or if you put in six, seven comes out.” This clearly made sense to him, so I continued, “The type of a function machine tells you what kinds of things you put in and what kinds of things come out. So maybe you put a number in and get a number out. Or maybe you put in a list of numbers and get a number out.” He interrupted excitedly, “Or maybe you could put words in??” “Yes, exactly! Maybe you can put words in and get words out. Or maybe there is a function machine where you put other function machines in and get function machines out!” He gasped in astonishment at the idea of putting function machines into function machines.

“So,” I concluded, “a free theorem is when you can say something that is always true about a function machine if you only know its type, but you don’t know anything about what it does on the inside.” This seemed a bit beyond him (and to be fair, free theorems are only interesting when polymorphism is involved, which I definitely didn’t want to go into). But the whole conversation had given me a different idea.

“Hey, I have a good idea for a game,” I said. “It’s called the function machine game. I will think of a function machine. You tell me things to put into the function machine, and I will tell you what comes out. Then you have to guess what the function machine does.” He immediately liked this game and it has been a huge hit; he wants to play it all the time. We played it while driving to a party yesterday, and we played it this morning while I was in the shower. So far, he has correctly guessed:

  • \lambda x.\, x + 1
  • \lambda x.\, x - 3
  • \lambda x.\, 10x
  • \lambda x.\, 2x
  • \lambda x.\, 6
  • \lambda w\!:\!\text{English noun}.\, \text{plural}(w)
  • \lambda x.\, 10 \lfloor x/10 \rfloor

I tried \lambda x.\, \min(x+2,10), but that was a bit tough for him. I realized that in some cases he may understand intuitively what the function does but have trouble expressing it in words (this was also a problem with \lambda x.\, 10 \lfloor x/10 \rfloor), so we started using the obvious variant where, once the guesser thinks they know what the function does, the players switch roles and the person who came up with the function specifies some inputs in order to test whether the guesser is able to produce the correct outputs.

\lambda x.\, 6 was also surprisingly difficult for him to guess (though he did get it right eventually). I think he was just stuck on the idea of the function doing something arithmetical to the input, and was having trouble coming up with some sort of arithmetic procedure which would result in 6 no matter what you put in! It simply hadn’t occurred to him that the machine might not care about the input. (Interestingly, many students in my functional programming class this semester were also confused by constant functions when we were learning about the lambda calculus; they really wanted to substitute the input somewhere and were upset/confused by the fact that the bound variable did not occur in the body at all!)

After a few rounds of guessing my functions, he wanted to come up with his own functions for me to guess (as I knew he would). Sometimes his functions are great and sometimes they don’t make sense (usually because his idea of what the function does changes over time, which of course he, in all sincerity, denies), but it’s fun either way. And after he finally understood \lambda x. \min(x+2, 10), he came up with his own function which was something like

\lambda x:\mathbb{N}. \begin{cases} 10 - x & x \leq 10 \\ 10 & \text{otherwise} \end{cases}

inspired, I think, by his kindergarten class where they were learning about pairs of numbers that added up to 10.

Definitely one of my better parenting days.


Call for submissions: FARM 2018

I’m the general chair this year for the ACM SIGPLAN International Workshop on Functional Art, Music, Modelling and Design (better known as FARM), which will be colocated with ICFP this September in St. Louis, Missouri, USA. The goal of the workshop is to “gather together people who are harnessing functional techniques in the pursuit of creativity and expression”. For me (and I think for many others as well), FARM, including the associated performance evening, is always one of the highlights of ICFP.

Please consider submitting something! As in previous years, we are accepting a wide range of types of submissions. Paper and demo submissions are due by June 28: you can submit original research, a technology tutorial, a call for collaboration, or a proposal for some sort of live demo. We also solicit performances to be given during the associated performance evening—performance proposals are due by July 8.

See the website for more info, and let me know if you have any questions!
