Monthly Archives: October 2009

Maximum entropy and negative probability

I have recently become fascinated with the concept of maximum entropy distributions, and went back and read Dan Piponi’s post on negative probabilities, link-surfing onward from there. Something sparked and I wondered what kind of connection there is between the two. A little experimenting in Mathematica later and I’m on to something curious.

First, a little background. E.T. Jaynes argues (so I have heard, I have not read the original) that if you have a set of constraints on a set of random variables and you would like a probability distribution over those variables, you should choose the distribution that has the most information entropy, as this is the “least biased” distribution.

The entropy of a distribution is defined as: H = -\sum_i{p_i \log{p_i}}.
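Just to fix ideas, here is a tiny sketch of that definition (Python here, rather than the Mathematica I was actually poking at), with the usual convention that x log x = 0 when x = 0:

    import math

    def entropy(ps):
        # H = -sum p log p, skipping zero terms (the 0 log 0 = 0 convention)
        return -sum(p * math.log(p) for p in ps if p != 0)

    print(entropy([0.25, 0.25, 0.25, 0.25]))  # log 4, about 1.386: the maximum for four outcomes
    print(entropy([0, 0, 0, 1]))              # 0.0: no uncertainty at all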

I am using Dan’s example, and I will quickly recapitulate the situation. You have a machine that produces boxes of ordered pairs of bits. It is possible to look at only one bit of the pair at a time; say, each bit is in its own little box. You do an experiment where you look at the first bit of each box, and it always comes out 1. You do a second experiment where you look at the second bit of each box, and it, too, always comes out 1.

Now, most reasonable people would draw the conclusion that the machine only produces boxes containing "1,1". However, if we wholeheartedly believe in Jaynes’s principle, we have to look deeper before drawing a conclusion like that.

The 4 probabilities we are interested in correspond to "0,0", "0,1", "1,0", "1,1". I will write them as 4-vectors in that order. So an equal chance of getting any combination is written as 1/4 <1,1,1,1>.

For the distribution <a,b,c,d>, our constraints are: a+b+c+d = 1 (claiming our basis is complete), c+d = 1 (the first bit is always 1), b+d = 1 (the second bit is always 1).

The “reasonable” distribution is <0,0,0,1>, which indeed satisfies these constraints. The entropy of this distribution is 0 (taking x log x = 0 when x = 0); of course, there is no uncertainty here. But are there more distributions which satisfy the constraints?

Well, if you require all the probabilities to be nonnegative, then no, that is the maximal entropy one, because it is the only one that satisfies the constraints: a+b+c+d = 1 together with c+d = 1 forces a+b = 0, so a = b = 0, and then b+d = 1 gives d = 1. But let’s be open-minded and lift that requirement.

We have to talk about what the entropy of a negative probability is, because the real log isn’t defined there. In the complex plane, the real part of the log is perfectly well defined, but the imaginary part is multi-valued with period 2π. I’m not experienced enough with this stuff to make the right decision, so I’m blindly taking the real part for now and pretending the imaginary part is 0, since there’s really no reasonable “magnitude” it could be.
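To spell out the convention I’m adopting (my own hack, nothing more principled than that): on the principal branch, \log p = \log{|p|} + i\pi when p < 0, so the real part of each term -p \log p is just -p \log{|p|}. The “real part of the entropy” below is therefore -\sum_i{p_i \log{|p_i|}}.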

Whew, okay, almost to the fun stuff. We have four variables and three constraints, so we have only one degree of freedom, which is a lot easier to analyze than four. We can express the distribution in terms of that one degree of freedom, d, as:

<d-1, 1-d, 1-d, d>
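(If you would rather not do the substitution by hand, the constraints are just a linear system; here is the same solve sketched in Python/SymPy, though Mathematica’s Solve gives the same thing:)

    from sympy import symbols, solve

    a, b, c, d = symbols('a b c d')
    # Solve the three constraints for a, b, c, leaving d free
    print(solve([a + b + c + d - 1, c + d - 1, b + d - 1], [a, b, c]))
    # {a: d - 1, b: 1 - d, c: 1 - d}, modulo how SymPy prints it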

And here is a plot of the real part of the entropy as a function of d:

[Plot: real part of the entropy as a function of d]

It achieves a maximum at d = 1/2, the distribution <-1/2, 1/2, 1/2, 1/2>, the same one Dan gave. In some sense, after observing that the first box is always 1 and, separately, that the second box is always 1, it is too biased to conclude that the output is always "1,1".
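If you want to check this without the plot, here is a quick numerical sketch (Python again; my own experimenting was in Mathematica) that traces the same curve and confirms where its maximum sits:

    import numpy as np

    def real_entropy(ps):
        # Real part of -sum p log p on the principal branch:
        # Re(log p) = log|p|, and terms with p = 0 contribute nothing.
        ps = np.asarray(ps, dtype=float)
        nonzero = ps[ps != 0]
        return -np.sum(nonzero * np.log(np.abs(nonzero)))

    ds = np.linspace(-1, 2, 3001)
    hs = [real_entropy([d - 1, 1 - d, 1 - d, d]) for d in ds]
    print(ds[int(np.argmax(hs))])  # ~0.5
    print(max(hs))                 # ~0.693, i.e. log 2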

I would like to patch up the “real part” hack in this argument. But more so, these exotic probability theories aren’t really doing it for me. I would like to understand what kinds of systems give rise to them (and how that means you must interpret probability). My current line of questioning: is the assumption that probabilities are always nonnegative connected to the assumption that objects have an intensional identity?

I would love to hear comments about this!

Gravmari

Most of my time has been spent preparing a game for submission to the Independent Games Festival at the Game Developers Conference. It’s called Gravmari, a game about gravity and the universe, roughly described as “Katamari in space”. The submission deadline is Nov 1, so we are working pretty hard to get it in tip-top shape. A demo version will go on the website shortly after that date.

I am very happy with the game so far. It is very polished, but remains simple and elegant. We have a lot of patience with our features, introducing new dynamics, music, and art gradually throughout the whole game. I am occasionally hit with the haunting sense of grand scale we are going for, and I think it will be very effective on people who don’t know what’s coming.

The other developer, Max, and I have informally started a studio together (we will register soon) called Hubris. This is our first game, and we hope to make many more. I am excited about this game company for two reasons:

  1. We are rejecting the typical workings of the game industry and moving more toward the film industry, in the sense that our games have a director with a unified vision in mind, and the rest of the crew is there to help the director achieve that vision. For example, the idea of a “team” of game designers is pretty absurd in this context. Gravmari doesn’t have a director, but the soul is there: the game is a work of art as a whole, the execution of a unified vision, not an idea and a list of features.
  2. We are both scientifically minded, principled individuals. A couple of indications of my little joys: most game developers know that you shouldn’t hard-code numbers, but should abstract them out into named tweakable parameters. But you can do better than that: do some math and discover what the right answer is, so that tweaking doesn’t make any sense. For example, you could have a tweak parameter for the speed of an orbiting ring galaxy, but it is better to write down an equation and find out at what speed a stable orbit forms (there is a sketch of this calculation after the list). This approach has worked wonders throughout this game. Another joy is that we fully embrace our 2-dimensional environment, so much so that we even changed the equation of gravitational attraction. In two dimensions, gravity ought to follow an inverse law rather than an inverse square law (I can explain why later if someone asks). This has the side effect of creating much more math for us to do, because everything you find online applies to the inverse square form.
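To make the orbit example concrete, here is a sketch of the kind of calculation I mean (the names and numbers are illustrative, not Gravmari’s actual code). With an inverse-law force F = GMm/r, a circular orbit needs the centripetal force mv^2/r to equal it, which gives v = sqrt(GM): the stable orbital speed is the same at every radius, so there is simply nothing left to tweak.

    import math

    # 2D (inverse-law) gravity: F = G*M*m / r.  Setting the centripetal
    # force m*v^2/r equal to it gives v = sqrt(G*M), independent of radius.
    # (Illustrative names only, not the game's actual code.)
    def circular_orbit_speed(G, M):
        return math.sqrt(G * M)

    print(circular_orbit_speed(1.0, 100.0))  # 10.0, whatever the orbit radius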

All in all, I think the game and the studio will be great. I’ll make another announcement when you can try it out.