Monthly Archives: November 2008

Relative time FRP

Conal seems to have discovered an essential change in the semantics of FRP. The change is to think in terms of relative time instead of absolute time (Einstein would approve). It really cleans up a lot of things in the semantics, and makes the implementation local (which functional programming is very good at making efficient).

So, for example, instead of Behavior a = Time -> a, we have Behavior a = Time -> a :-). The difference is simply that on the left, “now” is whatever time you give it, whereas on the right, “now” is 0.
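
To make the joke concrete, here is a minimal sketch of the two readings (my own notation, not Conal's actual library):

type Time = Double

-- Absolute time: sampling asks "what is the value at time t?"
newtype AbsBehavior a = AbsBehavior (Time -> a)

-- Relative time: sampling asks "what is the value t seconds from now?"
newtype RelBehavior a = RelBehavior (Time -> a)

-- In the relative reading, shifting "now" is a purely local operation:
later :: Time -> RelBehavior a -> RelBehavior a
later dt (RelBehavior f) = RelBehavior (\t -> f (t + dt))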

However, this change has far reaching consequences on how I think about FRP. The difficulties I have been trying to overcome in my implementations no longer make any sense. This is a good sign.

I have been gradually hacking on a “practical” implementation of FRP, i.e. putting aside the beauty of the implementation and doing whatever I can to make it efficient and easy to use. But man it’s awful. I have no certainty whatsoever that the 200 lines I have written so far are anywhere near correct, other than that they seem to work when I play with them. I don’t like that feeling.

I have the urge to revisit the precise and rigorous implementations I had been working on, which I gave up on because they didn’t efficiently express what I wanted them to. In particular, this relative time thing with comonads seems very simple. So I want to write it together with my “correct blocking thunks”, which are thunks that are technically blocking, but never actually block, because you have to prove that they won’t before you evaluate them :-)
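
For concreteness, here is the comonad structure such relative-time behaviors admit (using the RelBehavior sketch above; it is just the usual "function out of a monoid" comonad, so take it as an illustration rather than the final design):

-- The value "now" is at relative time 0.
extract :: RelBehavior a -> a
extract (RelBehavior f) = f 0

-- At each offset t, the whole behavior as seen with "now" shifted to t.
duplicate :: RelBehavior a -> RelBehavior (RelBehavior a)
duplicate (RelBehavior f) = RelBehavior (\t -> RelBehavior (\s -> f (t + s)))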

I’m pretty excited. I’m hopeful that this new FRP is expressive and powerful in addition to its beauty. I won’t know that until I rewire my brain a little.

Screw laziness (w.r.t. Fran semantics)

Reactive is getting more mature, but still does not support “feedback” behaviors (a bug), and still does not support true first-class Behaviors (an essential limitation); meaning you cannot efficiently put a Behavior in a data structure and save it for later use. Judging by the reactive mailing list, this limitation is proving unimportant; plenty of things are still possible. But I am a stubborn, hubristic person, and I simply insist on Fran semantics (Behavior & Future) with first-class Behaviors.

I have tried and I have tried, and I am quite convinced that the Fran semantics are not implementable lazily and efficiently in Haskell. I think I have come up with a garbage collector extension that would allow it, but I’m not going to pursue it. Knowing that such an extension exists is important to me, however, as I see hardware such as The Reduceron playing a big role in the future. The IO monad is not going to fly in a reduceronian world.

My impatience for a functional reactive game and UI library is growing. I could have made one in Fregl, but I became convinced that Fregl was too inefficient, and that I required weak-reference-free laziness to make it efficient. In retrospect, however, Fregl was inefficient for many more reasons than simply its eagerness. I am thus throwing aside my laziness requirement, and am now setting out to make a very efficient eager FRP library with Fran semantics.

Perl hackers know the three virtues: laziness, impatience, and hubris. I find it amusing that the one I’m having to throw away in a lazy language is laziness itself.

Restricted Data Types

With some spare boredom I had this morning, I was skimming the section on functors in Real World Haskell (RWH). A little ways down they introduce the type:

data Eq a => Bar a = Bar a
instance Functor Bar where
    fmap f (Bar a) = Bar (f a)

And point out how the constraint does not allow a valid functor instance. But what if it did?

What I’m getting at is that nobody ever uses constraints on data definitions like that, because they’re basically useless (AFAIK, they just impose the constraint on each constructor and nothing more). Perhaps they could work in this way: the mention of the type Bar a implies the constraint Eq a. Thus:

fmap :: (a -> b) -> Bar a -> Bar b

The caller already provides the dictionaries Eq a and Eq b for you, because the types “Bar a” and “Bar b” come from his environment. I’m not sure what I mean by environment there.
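
Concretely, with the constrained data declaration above, the function we would like fmap to be does typecheck once the implied constraints are written out by hand. This is just a sketch of what the proposal would elaborate to; it is not valid as an actual Functor instance today:

-- The mentions of Bar a and Bar b would implicitly supply Eq a and Eq b:
fmapBar :: (Eq a, Eq b) => (a -> b) -> Bar a -> Bar b
fmapBar f (Bar a) = Bar (f a)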

The only time you would be required to provide a dictionary yourself would be if you used a Bar in the body of a function without mentioning it in the type signature.

If this makes sense (I’m not convinced it does in the least), it might be a nice way to solve the class restriction problem.

Arizona Green Tea

I don’t know what it is. Coke doesn’t do it, coffee doesn’t do it, regular hot green or black tea doesn’t do it. Arizona green tea inspires me to code like no other substance. If I am vegging in front of the TV and go to the fridge to grab one, hardly five minutes after drinking some of it I get bored with the TV, fire up my text editor, and hack for at least a few hours with intense focus.

What is the magic ingredient? I conjecture that it is memory; i.e. it used to be my fuel when I was a mad high school code monkey, so it brings back that state of mind. Or maybe they inject it with speed and Ritalin.

Udon Sketch #2

The core library of Udon is basically finished (see my previous post for a sketch of what Udon is). It needs cleanup, documentation, and a QuickCheck suite, but the functionality is there and working in my small tests. I have begun udon-shell, which is a way to interact with Udon objects from the command line. Mostly it’s just a way to test things, but it may become an integral helper tool for the more complex tools to come.

My favorite thing about the Udon core is that there is nary a mention of IO anywhere. It is a purely functional core, the way I was told real software is written when I was a Haskell child. :-)

So, where to go from here? I have two projects in mind, which may just be one project. One of them is to make a version control system which blurs the idea of a “project”, so I don’t care whether it’s one or two. The beautiful thing about this library is that I can model these projects in a purely functional way (as long as I don’t reference any functions in my persistent data structures), and the persistence and distributedness come for free.

Here is how I picture the “cabal file” for the package manager.

type PackageName = [String]  -- list of components
type ImportList = [(PackageName, PackageName)] -- use module Foo, name it Bar
data Package = Package {
    name :: String,  -- whatever you want, no thought involved, rename at will
    depends :: [(ExtRef Package, ImportList)],
    ...
  }

That is the essence: packages are referred to by reference rather than by name, and modules from a package are explicitly imported and can be renamed to avoid conflicts. I’m not worrying about how this interacts with the state of the GHC art; it’ll work out.
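
As a tiny, hypothetical example of the import mechanism (module names invented for illustration), the same module could be pulled in from two different dependencies under different local names:

importsFromBinary, importsFromOldBinary :: ImportList
importsFromBinary    = [(["Data","Binary"], ["Data","Binary"])]       -- use Data.Binary as-is
importsFromOldBinary = [(["Data","Binary"], ["Data","Binary","Old"])] -- rename to avoid a clash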

I’ll come back to this in a bit. First I want to talk about smart versioning.

Smart versioning has little to do with Udon, but as long as I’m making a version control system I might as well play with it (though I doubt I have the attention span to carry it to its full potential). There are more types of content than just text and directories. For example, one might write a “Haskell source” plugin which understands Haskell source code, rather than just seeing it as a collection of lines. Then you could tell the plugin about the refactors you do, such as renaming a function. If that change is merged with a version from before that change, any true references to that function will get renamed (but shadows of that name would remain).

Taking this further, say there were a “Haskell package” content type. Then if you broke backward compatibility between two versions, you would annotate how. If it’s something simple like a renaming, the package manager could automatically upgrade your project to work with the new version. If it’s something complex like the semantics of a function, it could see if you ever used that function and mark the locations in your code that needed to be updated.

Such patch abstractions, which are of course custom depending on the content type, could be the basis for the version control system. I think Darcs does this kind of thing too; I’m not sure. But I use git’s “object identity” theory rather than Darcs’s patch theory.

So, given smart versioning patches, what is a proper package spec?

I want to support:

  • A package allowed to depend on multiple versions of another, with forward compatibility as I just described.
  • Splitting a package into two.
  • Merging two packages into one.
  • Renaming stuff (of course).

A package can vary by any combination of its dependencies. My hunch is that there should be a different object for each combination of dependencies (together with versions). But we don’t store all of them. Instead we store a root package, presumably depending on whatever precise versions of dependencies it was developed with, and then derive the package for other sets of dependencies using package diffs. To ensure robustness, a test suite should be included in the package to make sure the derivations were correct (though diffs ought to be pretty sure of themselves to begin with).

As scary as it might seem, this is already better than cabal. With cabal, the package makes some wild declaration about which versions of its dependencies it can support. And it can lie. There is no automated test that checks whether it told the truth, and no way to inform a package whose dependencies are too liberal that the semantics of a function changed while its type stayed the same (thus introducing subtle bugs). Basically, for cabal to work, you should have checked a package against every dependency you declare. But nobody does that.

Also, what of the case where a package needs modification between two versions of one of its dependencies? You have to resort to gross conditional compilation stuff.

Hmm, a troublesome case comes to mind. Suppose the case I just described happens, where foo depends on bar: foo-A works with bar-A, and foo-B works with bar-B. Now I find a bug in foo. I don’t want to have to fix it twice. So let’s say I fix it in foo-A. What happens to foo-B? How does it get the fix?

Well, this is a purely functional package manager, so I can’t really fix it in foo-A. So let’s say I fix foo-A and it becomes foo-A’. Then what I need to do to foo-B to fix it is apply the patch foo-A’/foo-A, to get foo-B’. Hmm, I guess that’s it. It will be tricky for a package author to keep this dependency information coherent for other packages to use.

Okay, I need to get to bed. Final sketch: packages do not refer to other versions of themselves. They stand alone, referring to their dependencies by exact version. Then edges (noodles?) are published, together with a possible “upgrade direction” (you always want the target of this edge if you can). Then some clever algorithm tries to upgrade the package and packages that depend on it. If one of the dependents fails, you can try it with an appropriate edge for the dependent (if one exists).
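
In data terms, one hypothetical shape for such an edge (record names invented, just to pin the idea down):

data Edge = Edge
    { source    :: ExtRef Package   -- the exact version the edge starts from
    , target    :: ExtRef Package   -- the exact version it upgrades to
    , preferred :: Bool             -- the "upgrade direction": take the target if you can
    }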

Hmm, a nice thing this solves is the ability to resolve (to some extent) that horrid dependency on multiple versions of the same package, by renaming one of the versions to something else. Then it is apparent and resolvable, at least, when you are treating two types from different versions as equal when they are not.

Sketch of Udon (Version Control/Packaging system)

I’m kind of tired, so don’t expect this post to be entirely coherent. I just wanted to write a little bit about the project I started recently, called Udon.

The idea came to me as the common ground between several problems I have seen recently. I will not go into the intricate details of why I consider them problems worth addressing. They are:

  • The namespace problem on Hackage, CPAN, and related module collections. Specifically, ridding ourselves of the rigid, arbitrary taxonomy of packages without the root namespace getting uselessly polluted.
  • The fact that no popular DVCS allows moving files between projects while preserving history in a sane way.
  • The desire for a purely functional operating system.

Incidentally, these are programming culture problems #2, #3, and #4 on my list (#1 is, of course, FRP). I’m glad I took a break from FRP so I could see this common ground.

Udon stands for “Universal Distributed Object Notation”, with a tip o’ the hat to JSON. And yes, it is a system for serializing objects. It is a gigantic distributed store for inter-referential stateless objects, and would support a huge distributed data structure with no single computer containing all of it.

I’ll get to the implementation details in a minute. I’ll first address how it helps to solve the problems above.

It will help in the implementation of a DVCS that does support moving files between projects, by more or less eliminating the idea of a project. The versioning is all in commit objects, which refer to changes in some tree data structure (representing the directory tree of the project). This is how git works. The difference is that there is no restriction about what tree this is; it could be a tree “from another project”. Just picture how you would write git as pure Haskell, never having to touch the filesystem or anything, everything being objects in memory.
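
To make the “git as pure Haskell” picture a bit more concrete, here is a hypothetical sketch of those objects as plain data structures, with the ExtRef type described below standing in for “a reference into the big distributed store”:

import Data.ByteString (ByteString)

data Object = Blob ByteString   -- file contents
            | Dir Tree          -- a nested directory

newtype Tree = Tree [(FilePath, ExtRef Object)]   -- named entries, by reference

data Commit = Commit
    { parents :: [ExtRef Commit]   -- zero for a root commit, two or more for a merge
    , tree    :: ExtRef Tree       -- the directory tree at this commit
    , message :: String
    }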

This DVCS, or something similar, helps to solve the namespace problem. A project could refer to the modules it uses by identity rather than name. So instead of the cabal file saying I depend on “binary”, the cabal file (well, data structure in the new model) would say I depend on “that thing over there”, and then import the modules from that package using an advanced import mechanism, which allows renaming or qualifying modules, etc. to avoid collisions.

After conceiving the idea, I realized that it is essentially the filesystem for a typed, purely functional operating system, which is another one of my fancies.

Okay, that’s probably more than enough vagueness for you math types.

The implementation is based on a flawed concept: the same flawed concept that all the DVCSes use, namely that the hash of an object is a unique identifier for that object. Basically we pretend that hash collisions don’t exist. Yuck. But I could not think of any other way to do it. (This may be the thing that prevents me from going very far with this project, since I am getting in the habit of proving my code correct, and I will never be able to do that with this code, since it is not correct.)

There is a special type called ExtRef. ExtRef a is semantically just a value of type a. The kicker is that it might be on someone else’s machine, on your usb drive, or even lost forever because nobody kept its data around. It is represented as data ExtRef a = ExtRef Hash (Maybe a). Hash is the “unique identifier” for the object, and you always have that. You might have an actual copy of the object too, but if not, the Hash is there so you can find it if you need it. This representation, and the whole Hashing idea, is hidden from the high-level interface, of course.

You work with ExtRefs through the External monad. It’s a coroutine monad (free monad over i × (o -> •)) which reports which ExtRefs need their associated objects to continue the computation. The low-level interface will allow implementation of a variety of ways to handle this. The simplest one is a local store of objects to be queried. As things get more mature, that could be expanded into one that looks at various remote network stores (based on some origin map or something), or asks the user where to look, etc.
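
Here is a rough sketch of what that might look like (my own reconstruction from the description above, not Udon’s actual code; I am using ByteString for the serialized payloads):

import qualified Data.ByteString as BS

type Hash = BS.ByteString

-- ExtRef as described above: a hash you always have, plus possibly a cached copy.
data ExtRef a = ExtRef Hash (Maybe a)

-- Free monad over the functor  f x = (Hash, BS.ByteString -> x):
-- a computation either finishes, or names the hash it needs and continues
-- once someone supplies the corresponding bytes.
data External a
    = Done a
    | Need Hash (BS.ByteString -> External a)

instance Functor External where
    fmap f (Done a)   = Done (f a)
    fmap f (Need h k) = Need h (fmap f . k)

instance Applicative External where
    pure      = Done
    mf <*> mx = mf >>= \f -> fmap f mx

instance Monad External where
    return         = pure
    Done a   >>= f = f a
    Need h k >>= f = Need h ((>>= f) . k)

-- Dereference an ExtRef, given some (hypothetical) decoder for the payload.
deref :: (BS.ByteString -> a) -> ExtRef a -> External a
deref _      (ExtRef _ (Just x)) = Done x
deref decode (ExtRef h Nothing)  = Need h (Done . decode)

-- The simplest low-level handler: answer every request from a local store.
runLocal :: Monad m => (Hash -> m BS.ByteString) -> External a -> m a
runLocal _     (Done a)   = return a
runLocal fetch (Need h k) = fetch h >>= runLocal fetch . k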

And my hope is that the little Udon framework I’ve described here will be sufficient to model the DVCS and the Hackage package system as if everything were just pure data structures in memory. Of course it won’t be quite as convenient (you need to annotate your data structures with some ExtRefs at least), but it would at least encourage pure modeling of these “real world” systems.

Captain Update

Here’s a quick update about what’s been going on in my life and in my head.

I’ve started a few brainstorming blog posts, but failed to get anywhere so I didn’t post them.

The first is “the fall of taxonomy”, explaining the copious problems with organizing a large volume of modules by hierarchical, taxonomic namespace, and a solution I have in mind involving a type of version control that doesn’t yet exist (a DVCS which blurs the distinction between separate projects).

The second is “semantics of computational complexity”, wherein I rant about the necessity of thinking operationally (after defining what it means to think operationally) to reason about computational complexity. I hoped to make some headway toward a more local solution, a “computational complexity type system”, but didn’t really get anywhere meaningful.

The “FRP as attribute grammar” algorithm flopped, and my confidence has been lowered that my goal is actually achievable at all. To be clear, my goal is an efficient, lazy implementation of the Fran semantics. The kicker was the following snippet:

kicker startEvent input someKey = pure 0 `until` do
        k <- someKey
        return $ if k == 'a' then a else b
    where
    a = pure 0 `until` (integral input `snapshot_` startEvent)
    b = pure 0 `until` (integral ((2*) <$> input) `snapshot_` startEvent)

The main idea for the algorithm was that it took a list of times at which to compute the main result and propagated that into lists of times at which to compute its constituents. But in this example, we do not have the luxury of knowing which constituent we are talking about until after someKey fires, yet the integral depends on values of input from before someKey, and not in any way that is transparent to the implementation. I’ve been roughly developing a principle of FRP, something like “information locality”, which is necessary for efficient implementations, and this example looks like it violates it. My next goal is to formalize this principle and show how the Fran semantics do not conform, thus giving some mathematical indication of why implementations are so elusive. I hope this will guide me to the most flexible semantics which do obey the principle.

Meanwhile, I’m idly hacking on a text adventure CU GameDev is making in Python. It has been such a long time since I did imperative programming that it’s hard to get back into. So hard, in fact, that I did not. My contribution was a purely functional parser combinator library in the style of ReadP, for dynamically building sentence parsers based on context. Python’s essential support for functional idioms is actually not bad; it’s just the syntax that makes it kind of a pain.

I haven’t been doing much music lately. Of course I am still improvising daily, but I don’t have a band going at the moment (mostly because I have been unsure whether I’m leaving the country). On Friday there is a big jam session with about 9 musicians scheduled. That will be a challenge, but if done well by all parties involved, it could be really cool. Unfortunately my recording gear does not work with Vista, so there will be no high-quality recording (Nolan will use his portable room mic instead).

And I might have a date with Karen on Saturday. By which I mean we’re going out to dinner together. I have no idea whether she considers it a date. I’m attempting to let it go and not worry about what it’s called or what it implies in the future. I can’t help being unreasonably excited, however. And you know what Obama says: “the bubble sort would be the wrong way to go”.

Karen The Strange

Alright, well I am convinced that nobody close enough to this group to care actually reads my blog, so I can write freely.  This is in the style of most of my personal posts: a reflective narrative.  I don’t know why I told you that.

The Two-Day Tragedy

Last Tuesday my fellow musician and friend Devon opened for the Andreas Kapsalis Trio, a fantastic band, at a local coffee house. Tuesday is the night of GameDev, and in particular we were playing Werewolf, one of my favorite games. But I figured I could miss out on some of my friends to support another one of my friends, since it was far more important to Devon that I was there than it was to GameDev. And also Devon is a pretty freaking talented musician, and I love listening to his stuff.

It was a great show all around.  The whole “The Moment” band came to hear him, and we scheduled a jam session for the following night.  It was about 11:00 at night at this point.

Nolan offered me a ride home and we started packing up, buying Andreas Kapsalis CDs, dawdling around. A (quite cute) girl there kept looking at me and smiling as we were shuffling about. I decided to follow her outside, and I made a joking comment on whatever conversation she and her friend were having. She immediately asked, “do you want some pizza?” To the half-confused look on my face, she clarified that they were going to a restaurant to grab some pizza and she was inviting me.

Great! I went with them, texting Nolan a short blurb about ditching him. The pizza place was closed, but the four of us meanwhile had a really fun walk up and down the mall. Her name is Jesse. In search of food, Jesse, Joshua and I (we lost #4: Gabriel) headed to Silvermine Subs for some sandwiches.

The three of us proceeded to talk until 3:00, while it became clear that Jesse liked me quite a bit, and not so much Joshua (who was “courting” her).  People get really funny (read: annoying) when they feel themselves losing hold of the situation.  Do I do that?  Am I that obvious?

We stepped outside as Silvermine closed; Joshua continued to talk about how great he was as Jesse and I rolled our eyes at each other. Eventually he left and Jesse and I walked together to the intersection of our paths home. She was cute, intelligent, and really fun to be around.

I walked the remainder of the way home, about two miles, with a big smile on my face.  At the very least, this will get my mind off of Karen.

The jam session on Wednesday was cancelled because musicians are flakes (surprise!). Looking for something to do, I texted Jesse to see if she wanted to hang out. “Sure!” It took a while to get the logistics organized, but we eventually met for pizza at the place we were planning to go the previous day. We had a reasonably good time, until we headed to the coffee shop where we met, and who was there but our old friend Joshua. He predictably clung to us for the rest of the night, which was a bit of a downer.

But that is not the downer.  The downer was that I found out Jesse had just gotten out of a year-long relationship in a rather violent manner, and she was pretty sensitive about it.  Everything reminded her of him.  At one point she got really quiet and just stood there.  Jumping at the opportunity, Joshua hugged her for about two minutes to comfort her (this did not work – it clearly made her uncomfortable).  He went off briefly to make a phone call or something.

While he was out, she and I just stood there.  I stood ten feet away from her, arms across my chest – to protect it.  Silence thickly padded the space between us, lasting hundreds of heartbeats.  

In a voice that I could hardly hear myself, slow as each second, I asked, “It feels like you want some time by yourself.  Should I go?”

“If you want to,” she said with tears in her voice.

“I don’t … really … want to go, but … … but, you know what I’m <mumble>”

“It’s up to you.”

More time struggled to pass in the silence.

“Well, okay, have a nice night,” I whispered uneasily.

“So you’re going?”

I sighed.  “Yeah.  See you around?” as I looked up into her eyes.

“Yeah.  Keep in contact.” 

We smiled at each other, and I began the six mile walk home.  Hardly one hundred steps later, I began regretting my decision to leave.  I sent an apologetic text message – impulsive fingers are a weakness of mine – walked indecisively in circles for a while, and finally decided to continue home.   It was cold.  I called a cab, and curled up in a little ball while I waited for it.

When I got home, I searched for a movie to match my mood.  Requiem for a Dream, that’ll hit the spot.

I haven’t seen Jesse again.  Much as I liked her, I don’t want to be in that kind of situation – it’s a replay of last year with Stephanie, when I futilely hung on much longer.

A New Old Friend

On Thursday, Karen came over to prepare for the CU International Halloween party she was hosting at our house (since it has a nice yard). Just as a way of hanging out with her, I helped her prepare some things, chatting about this and that. We searched for traditional Halloween recipes to show the international students, put up lights, cobwebs, and jack-o-lanterns. I could feel her love for this party: she really wanted to make it a great American experience for the foreigners. I was compelled to sympathize: we shopped together, cooked together, brainstormed together. I cared as much about the party as she did.

That night after she had gone home, my impulsive fingers instant messaged her: “hey — I just wanted to say, you’re turning out to be a pretty great friend. :-)”

“aww thank you =)”

She came back the following day to continue, and I continued.  At 6:00 when the party started, she ran off to the bathroom for half an hour to prepare her costume, when there was still much to be done.  I picked up the slack, doing everything I could to ensure that our guests had a good time at my own expense.  I have never done this at any party before.  When she came out, I continued, as she did this and that and interacted with the guests.  

By the end of the night when most of the guest-guests had left and mostly just our little social group was left over, I just sat by the wall, exhausted.  I could see it in Karen’s eyes: she loved the party, and she appreciated what I did for it.  A feeling of selfless euphoria engulfed me.

We grew a lot closer over those two days. I feel trusted now. Today she came over to “clean” – we did a little, but mostly we just hung out most of the day, joined by two other friends later on.

I feel very different about her.  Previously infatuated, my feeling has become something much closer to a genuine love, like how I feel about Karlin.  But it did not replace my infatuation, just bosonically superposed it.  It sums to a tragic, emotionally confusing dilemma.  I love her like – I don’t know how to say it – like Karlin, like my closest friend.  I want her to be happy and comfortable at all costs.  But also, I look into her eyes and my heart rate triples and my body floods with an incredible relaxing tonic.  I want to barely touch her hand with all the love in my soul, I want to kiss her, I want to roll around on the floor with her.  

At least as far as history speaks, these two desires are at odds with each other.  I have the impression she thinks of me now as a close friend, but not more.  I know in my heart that my first desire – my concern for her life irrespective of my own – wins if the two must duke it out.  But I beg them not to.

The F Word

I hate thinking about The Future. I am present tense. I can’t see any reasonable resolution that ends happily for me. She is in my dreams, and I am happy while I sleep. Reality slaps me sober in the morning. I think of her and I smile. Then I think of myself and I stop. It is God’s Pavlovian Buddhist training.

The end of this story is where I become a great musician. When I crawl heartbroken into my cave, fingers a-blazing. That is the upside of music. Emotions, both euphoric and dismal, are fuel. The greater their number and strength, the more beautiful the art they power. If its divine duality didn’t bless me, the imbalance in my life would drive me insane.