16 Dec 11

hg adventure

Inspired by some silliness on #developers:

<jgilbert>	well that was an hg adventure
<dholbert>	$ hg adventure
You are in a twisty maze of passageways, all alike...
<cpeterson>	$ hg look
It is pitch black. You are likely to be eaten by a grue.
<hub>		$ hg doctor
How can I help you?

I thought I’d stick to actual hg commands, and came up with:

You see a small hole leading to a dark passageway.
820:21d40b86ae37$ echo "enter passageway" > action
820:21d40b86ae37$ hg commit
It is pitch black. You are likely to be eaten by a grue.
821:0121fb347e18$ echo "look" > action
821:0121fb347e18$ hg commit
** You have been eaten by a grue **
822:b09217a7bbc1$ hg backout 822
It is pitch black. You are likely to be eaten by a grue.
821:0121fb347e18$ hg backout 821
You see a small hole leading to a dark passageway.
820:21d40b86ae37$ echo "turn on flashlight" > action
820:21d40b86ae37$ hg commit
Your flashlight is now on.
824:44a4e4bf5f0e$ hg merge 821
Your light reveals a forking passageway leading north and south.

Kinda makes you think, huh? Time-reversal games became popular semi-recently (e.g. Braid). Maybe the fad is over now; I’m *way* out of date.

But did any of them allow you to branch and merge? Push and pull from your friends’ distributed repos? Bisect to find the point where you unknowingly did something that prevented ever winning the game and either continue from there, merge a backout of that action, or create a new branch by splicing that action out?

It’s a whole new genre! It’ll be… um… fun.

(I’ll go back to work now)


03 Nov 11

Patch reordering

I have a patch queue that looks roughly like:

  initial-API
  consumer-1
  consumer-2
  unrelated
  consumer-3-plus-API-changes-and-consumer-1-and-2-updates-for-new-API

(So my base repo has a patch ‘initial-API’ applied to it, followed by a patch ‘consumer-1’, etc.)

The idea is that I am working on a new API of some sort, and have a couple of independent consumers of that API. The first two are “done”, but when working on the 3rd, I realize that I need to make changes to or clean up the API that they’re all using. So I hack away, and end up with a patch that contains both consumer 3 and some API changes; to get it to compile, I also update consumers 1 and 2 to accommodate the new changes. All of that is rolled up into a big hairball of a patch.

Now, what I want is:

  final-API
  consumer-1 (new API)
  consumer-2 (new API)
  unrelated
  consumer-3 (new API)

But how do I do that (using mq patches)? I can use qcrefresh+qnew to fairly easily get to:

  initial-API
  consumer-1 (old API)
  consumer-2 (old API)
  unrelated
  consumer-3 (new API)
  API-changes-plus-API-changes-for-consumers-1-and-2

or I could split out the consumer 1 & 2 API changes:

  initial-API
  consumer-1 (old API)
  consumer-2 (old API)
  unrelated
  consumer-3 (new API)
  API-changes
  consumer-2-API-changes
  consumer-1-API-changes

which I could theoretically qfold into the consumer 1 and consumer 2 patches:

  initial-API
  consumer-1 (new API)
  consumer-2 (new API)
  unrelated
  consumer-3 (new API)
  API-changes

Unfortunately, consumer-1-API-changes collides with API-changes, so the fold will fail. It shouldn’t collide, really, but it does because part of the code to “register” consumer-1 with the new API happens to sit right alongside the API itself. Even worse, how do I “sink” the ‘API-changes’ patch down so I can fold it into initial-API to produce final-API? (Apologies for displaying my stacks upside-down from my terminology!) A naive qfold will only work if the API-changes stuff is separate from all the consumer-* patches.

My manual solution is to start with the initial queue:

  initial-API
  consumer-1 (old API)
  consumer-2 (old API)
  unrelated
  consumer-3-plus-API-changes-and-consumer-1-and-2-updates-for-new-API

and then use qcrefresh to rip the API changes and their effects on consumers 1 & 2 back out, leaving:

  initial-API
  consumer-1 (old API)
  consumer-2 (old API)
  unrelated
  API-changes-and-consumer-1-and-2-updates-for-new-API
  (in working directory) consumer-3 (new API)

I qrename/qmv the current patch to ‘api-change’ and qnew ‘consumer-3’ (its original name), cursing about how my commit messages are now on the wrong patch. Now I have

  initial-API
  consumer-1 (old API)
  consumer-2 (old API)
  unrelated
  api-change (API changes and consumer 1 and 2 updates for new API)
  consumer-3 (new API)
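In actual commands, that step is roughly the following sketch (qcrefresh comes from the crecord extension; exactly how qnew picks up outstanding changes may depend on your Mercurial version):

 % hg qcrefresh        # crecord UI: keep only the API changes (and the consumer 1 & 2 updates) in the top patch
 % hg qmv api-change   # rename what's left of the top patch
 % hg qnew consumer-3  # the consumer-3 changes left in the working directory become their own patch again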

Now I know that ‘unrelated’ doesn’t touch any of the same files, so I can qgoto consumer-2 and qfold api-change safely, producing:

  initial-API
  consumer-1 (old API)
  consumer-2 (new API, but also with API change and consumer 1 updates)
  unrelated
  consumer-3 (new API)
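In command terms, that’s just:

 % hg qgoto consumer-2   # pop everything above consumer-2
 % hg qfold api-change   # fold the unapplied api-change patch into consumer-2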

I again qcrefresh/qmv/qnew to pull a reduced version of the api-change patch back out, giving:

  initial-API
  consumer-1 (old API)
  api-change (with API change and consumer 1 updates)
  consumer-2 (new API)
  unrelated
  consumer-3 (new API)

Repeat. I’m basically taking a combined patch and sinking it down towards its destination, carving off pieces to incorporate into patches as I pass them by. Now I have:

  initial-API
  api-change (with *only* the API change!)
  consumer-1 (new API)
  consumer-2 (new API)
  unrelated
  consumer-3 (new API)

and finally I can qfold api-change into initial-API, rename it to final-API, and have my desired result.
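That last step, in commands, is roughly:

 % hg qgoto initial-API  # pop down to the bottom patch
 % hg qfold api-change   # fold the pure API changes into it
 % hg qmv final-API      # give it its final name
 % hg qpush -a           # reapply the rest of the queue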

What a pain in the ass! Though the qcrefresh/qmv/qnew step is a lot better than what I’ve been doing up until now. Without qcrefresh, it would be

 % hg qrefresh -X .            # empty out the top patch; its changes drop back into the working directory
 % hg qcrecord api-change      # interactively pull just the API changes into a new patch
 % hg qnew consumer-n          # capture the remaining changes as the consumer patch
 % hg qpop                     # pop consumer-n...
 % hg qpop                     # ...api-change...
 % hg qpop                     # ...and the now-empty original patch
 % hg qpush --move api-change  # push the two new patches back, skipping the empty original
 % hg qpush --move consumer-n
 % hg qfold old-consumer-n     # fold the empty original in, to keep its commit message

which admittedly preserves the change message from old-consumer-n, an advantage over my qcrefresh version.

Or alternatively: fold all of the patches together, and qcrecord until you have your desired final result. In this particular case, the ‘unrelated’ patch was a whole series of patches, and they weren’t unrelated enough to just trivially reorder them out of the way.
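A rough sketch of that alternative, using the patch names from above (and again leaning on the crecord extension):

 % hg qgoto initial-API   # make the bottom patch current
 % hg qfold consumer-1 consumer-2 unrelated consumer-3-plus-API-changes-and-consumer-1-and-2-updates-for-new-API
 % hg qcrefresh           # crecord UI: keep only the final API in the now-giant bottom patch
 % hg qmv final-API
 % hg qcrecord consumer-1 # then carve the working-directory leftovers back off into patches,
 % hg qcrecord consumer-2 # one qcrecord at a time
 % hg qcrecord unrelated
 % hg qcrecord consumer-3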

Without qcrecord, this is intensely painful, and probably involves hand-editing patch files.

My dream workflow would be to have qfold do the legwork: first scan through all intervening patches and grab out the portions of the folded patch that only modify nonconflicting files. Then try to get clever and do the same thing for the portions of the conflicted files that are independent. (The cleverness isn’t strictly necessary, but I’ve found that I end up selecting the same portions of my sinking patch over and over again, which gets old.) Then sink the patch as far as it will go before hitting a still-conflicting file, and open up the crecord UI to pull out just the parts that belong to the patch being folded (aka sunk). Repeat this for every intervening conflicting patch until the patch has sunk to its destination, then fold it in. If things get too hairy, then at any point abort the operation, leaving behind a half-sunk patch sitting next to the unmodified patch it conflicted with. (Alternatively, undo the entire operation, but since I keep my mq repo revision-controlled, I don’t care all that much.)

I originally wanted something that would do 3-way merges instead of the crecord UI invocations, but merges really want to move you “forward” to the final result of merging separate patches/lines of development. Here, I want to go backwards to a patch that, if merged, would produce the result I already have. So merge(base,base+A,base+B) -> base+AB which is the same as base+BA. From that, I could infer a B’ such that base+A+B’ is my merged base+AB, but that doesn’t do me any good.

In my case, I have base+A+B and want B” and A” such that base+B”+A” == base+A+B.

To anyone who made it this far: is there already an easy way to go about this? Is there something wrong with my development style that I get into these sorts of situations? In my case, I had already landed ‘initial-API’; please don’t tell me that the answer is that I always have to get the API right in the first place. Does anyone else get into this mess? (I can’t say I’ve run into this all that often, but it’s happened more than once or twice.)

I suppose if I had landed consumers 1 and 2, I would’ve just had to modify their uses of the API afterwards. So I could do that here, too. But reviews could tangle things up pretty easily — if a reviewer of consumer 1 or 2 notices the API uglinesses that I fixed for consumer 3, then landing the earlier consumers becomes dependent on landing consumer 3, which sucks. But also, none of this is really ready to land, and I’d like to iterate the API in my queue for a while with all the different consumers as test users, *without* lumping everything together into one massive patch.


24 Aug 11

hg qedit

On his blog, Paul O’Shannessy came up with an ‘hg qedit’ alias that opens up an editor on your .hg/patches/series file for reordering your patch queue. It’s a nice simple solution to a common problem, so obviously I felt compelled to muck it up.

Here’s my version, for insertion into your ~/.hgrc:

[alias]
qedit = !S=$(hg root)/.hg/patches/series; cp $S $S.bak && perl -pale 'BEGIN { chomp(@a = qx(hg qapplied -q)); die if $?; @a{@a}=(); }; s/^/# (applied) / if exists $a{$F[0]}' $S > $S.new && ${EDITOR-vim} $S.new && sed -e 's/^# .applied. //' $S.new > $S
                                                                                                                                                                                                                   # Did you see this by scrolling over?
                                                                                                                                                                                                                   # I want better code snippet support

This fixes the main problem with zpao’s solution, which is that it’s too clean and simple.

No, wait, that’s not a problem.

The problem is that when I edit my series file, I often forget that I have some patches applied and end up reordering applied patches, which makes a complete mess. The above alias opens up an editor on your series file, only it also inserts comments showing which patches are already applied. (If you really, really want to mess yourself up, go ahead and reorder the commented lines. You’ll get what you deserve.)
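For anyone who doesn’t enjoy horizontal scrolling, here’s the same pipeline unrolled for readability (not meant to be pasted back into .hgrc as-is):

S=$(hg root)/.hg/patches/series
cp $S $S.bak &&                  # keep a backup of the series file
perl -pale '
  # build a hash of the currently applied patch names
  BEGIN { chomp(@a = qx(hg qapplied -q)); die if $?; @a{@a} = (); }
  # comment out any series line whose patch is applied
  s/^/# (applied) / if exists $a{$F[0]};
' $S > $S.new &&
${EDITOR-vim} $S.new &&          # edit the annotated copy
sed -e 's/^# .applied. //' $S.new > $S   # strip the markers back out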

Here’s what my queue looks like when editing the series file:

# (applied) better-dtrace-probes
# (applied) try-enable-dtrace
# (applied) bug-650078-no-remote
bug-677985-callouts
bug-677949-gc-roots
hack-stackiter

Come to think of it, mq really shouldn’t let you mess up that way in the first place. It knows the original patch names for your applied patches (unless you are really determined to make your life difficult, and commit things on top without going through mq at all). It could detect when you reordered applied patches, and just undo what you did. And call you names. But maybe that would slow things down.

Update: it wasn’t working for jlebar, which turned out to be because he had added qapplied=-v to his [defaults] section. The above is now fixed for that scenario by adding a -q flag to hg qapplied.
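For reference, that kind of [defaults] entry looks like:

[defaults]
qapplied = -v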


31 May 11

More stupid mercurial tricks

I think I’m missing something. How do people get those changeset URLs to paste into bugs? Ok, if I’m landing on mozilla-central or a project branch, I just get it from tbpl since I’ll be staring at it anyway. But what about some other repo? Like, say, ssh://hg.mozilla.org/users/mstange_themasta.com/tinderboxpushlog?

As usual, I coded my way around the problem before asking the question, which is stupid and backwards. But just in case there really isn’t a good way, here’s my silly hackaround. Put this in the [alias] section of your ~/.hgrc; then, after landing a change, do ‘hg urls -l 3’ or similar (that’ll give you the latest 3 changesets):

  urls = !$HG log --template='{node|short} {desc|firstline}\n' ${HG_ARGS/urls /} | perl -lpe 'BEGIN { ($url = shift) =~ s/^\w+/http/ }; s!^(?=\w+)!$url/rev/!' `hg path default`

Picking that apart, it removes the misfeature that $HG_ARGS contains the command you’re running, then passes the remaining command line to hg log with a template set to just print out the changeset shorthash and the first line of the commit message. It sends that, along with the URL of the default upstream repo, through a perl command that rewrites each hg log line to “<repo-url>/rev/<shorthash> <first line of commit message>”. Oh, and it changes the scheme of the repo URL to http, because my test case is actually an SSH URL and there just happens to be an HTTP server at the same address.
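So, with a made-up changeset hash and commit message, and taking the tinderboxpushlog repo above as the default path, a log line gets rewritten into something like:

21d40b86ae37 Imaginary commit message for illustration
  -> http://hg.mozilla.org/users/mstange_themasta.com/tinderboxpushlog/rev/21d40b86ae37 Imaginary commit message for illustration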

A mess, but it works for me.

And yes, I should switch to a blog that isn’t hostile to code. Sorry about that line up there.