January 2013 Archives

Tue Jan 22 21:20:31 EST 2013

Deadlands

While poking around a second-hand shop, I found a copy of the Deadlands RPG. It pitches itself as a "Weird West" game: take the wild west, add horrible things from beyond and folks who won't stay dead, and you've got a pretty interesting place to tell a story. Mad scientists build steam-powered machines that run on "ghost rock", a strange mineral that appeared after "The Reckoning" let spirits back into the world (and blew up California). Inventing things channels said spirits, which turns the scientist insane. Hoyle's writings on card games are secretly books of eldritch lore, and if you look closely you can see the cards appear in a huckster's hand as he casts his hex. It's a really cool setting, and if I had the time and players, I'd watch a few spaghetti westerns, tinker with the mechanics and play a couple of sessions.

Mechanically, the game is a mix of the cool and the clumsy. When a huckster casts a hex, he makes his roll to cast anything at all. On a success, he draws 5 or more cards from a deck, depending on how well he rolled, and makes the best possible poker hand from them. That determines how well the hex works. The initiative system is also pretty clever: the result of a speed check controls how many cards you draw, and then you turn in your cards to take actions as the GM counts down from Ace to Deuce. It's similar to the Shot Counter in Feng Shui, but with cards instead of numbers. In Feng Shui, players tend to forget their next shot number, so perhaps having cards will make it easier. I don't like initiative systems where fast characters get more actions than slow ones, because it takes the spotlight away from the slower ones.

On the clumsy side, there's a lot of randomness where I don't think there should be randomness. For example, when your character dies, you roll to see if he becomes one of the Harrowed: reanimated (and sometimes possessed) by an evil spirit. Such a major change in the character's nature should not, IMHO, rest on a single roll of the dice. Similarly, the shaman's spell-casting system can require a considerable amount of character sacrifice (like removing fingers), with a chance of absolutely nothing happening. A character's starting stats depend on a hand of dealt cards, which might be fine for some old-school types, but that's not something I like in my RPGs these days.

So for me, it's really good idea fuel. I could see myself bolting some of its mechanics onto another game (notably the card-based spell-casting), but as a whole system its publication date really shows.


Posted by Jack Kelly | Permanent link | File under: readings, rpg

Tue Jan 22 19:28:00 EST 2013

The Haskell Toolchain: Clarifications

My recent rant briefly appeared on the front pages of HN and /r/haskell, to a largely negative response. Most of the comments either accused me of having autotools-Stockholm-syndrome or insulted my reasoning ability. Since I still stand by what I said, I will respond to those main criticisms.

Autotools-Stockholm-syndrome: this argument is basically "I could write a 20-line Makefile to build my code instead of learning the autotools". Yes, you probably can. Now suppose you want to add automatic dependency tracking, so that when you compile foo.c into foo.o, gcc (and other compilers) can list any included .h files as extra dependencies for foo.o. Now suppose you want to support other compilers, and more OSes than just GNU/Linux (or your MacBook). Now suppose you want to support cross-compilation. Now suppose you want to support the GNU Makefile Conventions, including the standard targets for users. A tool to automatically generate makefiles starts to sound pretty good. The traditional ./configure && make && sudo make install dance is pretty good for the user (CMake, for all its faults, wins here with its wonderful GUI), and a properly autotooled package is quite easy for distros to package. In the past, I've hand-written Makefiles for specific projects (often internal ones that don't need an install step), and it is often the way to go. For free software projects written in a supported language, automake is an excellent (if warty) tool. You never know quite what users are going to want to do with your package.
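To make the dependency-tracking point concrete, here's the kind of thing a hand-rolled Makefile ends up growing (a sketch only; file names are placeholders):

```make
# Compile .c into .o, asking gcc to write a .d file listing the
# headers each .c file includes. -MMD emits the dependencies as a
# side-effect of compilation; -MP adds phony targets so that
# deleting a header doesn't break the build.
%.o: %.c
	$(CC) $(CFLAGS) -MMD -MP -c -o $@ $<

# Pull in whatever dependency files exist from previous builds.
-include $(wildcard *.d)
```

And that only handles one compiler on one OS. automake generates all of this for you, portably.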

Reasoning: The accusation here is that I'm appealing to tradition by saying that libtool is allowed to wrap the C compiler, but a new tool cannot. I maintain that wrapping the C compiler always was a bad idea to be avoided if possible. There's not much that can be done in libtool's case (especially when factoring in things like DLL support on Windows) but a pkg-config-ish tool for GHC really would help. There's nothing wrong with a convenience wrapper, but as Alan Kay says, "simple things should be simple, complex things should be possible". GHC's approach covers the "simple things" but not the "complex things".
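For comparison, this is the sort of thing a pkg-config-ish tool makes trivial from a Makefile (the library name is just an example, and a Haskell equivalent is hypothetical — nothing like it ships with GHC):

```make
# Ask pkg-config for the flags a dependency needs, instead of
# hard-coding include paths and library lists by hand.
CFLAGS += $(shell pkg-config --cflags libpng)
LDLIBS += $(shell pkg-config --libs libpng)
```

A tool that answered the same questions for GHC packages would let any build system drive the compiler, instead of the compiler driving the build.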

Saying that "your new tools should play nice with the old tools" is not an appeal to tradition. It makes no assertion about the quality of the old tools, but simply addresses a practical concern. Look at how long it took for Python 3 to get going because it tried to make a clean break from the past. Working in an ideal world is nice, but supporting the real world is better for getting things done.


Posted by Jack Kelly | Permanent link | File under: rants, coding

Sun Jan 6 09:49:58 EST 2013

Imprisoned by the Haskell Toolchain

I was going to write a new version of metscrape in C that supported plugins, so people could contribute modules for their local weather services. C is still one of the best ways to go for portable programs, and a plugin system means you don't need to build in support for unused countries.

But.

I wanted to write my plugins in Haskell, mainly because HXT is peerless when it comes to slicing and dicing XML.

So.

The Haskell FFI is pretty good. Actually, it's one of the best I've used: you mark functions as "foreign export", and you can fiddle with low-level details, protect objects from the garbage collector, and so on.
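For the curious, a foreign export looks something like this (a minimal sketch; the module and function names are made up):

```haskell
{-# LANGUAGE ForeignFunctionInterface #-}
module Plugin where

import Foreign.C.Types (CInt (..))

-- Exposed to C under the name "plugin_double". GHC generates a
-- Plugin_stub.h header declaring the matching C prototype.
pluginDouble :: CInt -> CInt
pluginDouble n = n * 2

foreign export ccall "plugin_double" pluginDouble :: CInt -> CInt
```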

However.

The toolchain support is terrible. Basically, it wraps gcc: if you want to compile a mixed Haskell/C library (and you probably do, to expose the correct entry points, call hs_init() or handle other low-level details), you have to compile your .c files with ghc, which means it won't play nice with automake or anything else that wants to use dependency tracking via -M or -MM. automake will want to invoke $(CC) to compile C files, possibly through libtool if you're building a shared library or module. Further, the link command doesn't even put in all the libraries that it needs, so you have to add them yourself.

Modern ghc supports shared libraries, but Debian doesn't ship them, so you can't rely on the dynamic linker to sort it out for you.

There's no command akin to pkg-config to give you the right cflags/ldflags to pass to the compiler.

So basically you can't get proper dependency tracking or anything. Ugh. Have fun reimplementing all the features required by the GNU Makefile Conventions.

The ultimate problem is that people insist on rolling their own sucky versions of build systems and package managers. (Though cabal and ghc --make suck less than most, I'll admit).

Lessons.

  1. IF YOU DIDN'T SET OUT TO WRITE A BUILD TOOL, DON'T WRITE YOUR OWN BUILD TOOL. Choose one or more of the following, instead:
    1. Emit make-format dependency information. UPDATE: It turns out ghc can do this. Unfortunately, it can't emit dependencies as a side-effect of compilation, which is what automake really likes.
    2. Emit C and write a suffix rule. TA-DA! You now play nice with the rest of the world. Once you've got the native-code backend going, you can emit dependency information (as above) without breaking the world. Use a sensible deprecation policy.
    3. Provide a foo-config script or program that will give you the correct compiler and linker flags.
  2. HAVE FLAGS SO YOU EMIT EXACTLY ONE OUTPUT FILE. This is a corollary to the above point. make(1) is a dinosaur, but it's everywhere and you have to play nice with it. Don't spew out half a dozen files each time you call the compiler (ocamlc, I'm looking at you), because then you have to be really careful with your make rules or you'll break parallel make. A "do everything at once" mode is fine for use from an interactive shell, but this is the age of multicore: you don't get to break parallel builds to save a couple of compiler invocations.
  3. DON'T WRITE YOUR OWN PACKAGE MANAGER. If your code plays well with automake (it should. Write some autofoo to help find paths &c. It's not hard.), installing things is really easy. Every major distribution has stuff to streamline making distro-packages from autotooled packages. Want to install in a custom prefix? Let the package manager do it for you. (What? Your package manager sucks, and doesn't let you do this? Fix it, and everyone's ecosystem benefits!). Dishonourable mention: rubygems.

    When I passed around a draft of this post, one reviewer asked me whether I seriously expected compiler writers to learn the autotools just to play nice with them. My answer is a resounding YES. The autotools aren't that hard to learn, and there's a fantastic tutorial to learn from. If you're smart enough to write a compiler, you're smart enough to learn how to make it play nice with the rest of the world.

  4. DON'T WRAP THE TOOLCHAIN. You're not the C compiler. You don't get to compile someone's C code. If lang1 and lang2 both wrap the toolchain and a developer is writing a lang1<->lang2 bridge, they're forced to use Nasty Hacks(tm) to make the wrappers play nice with each other. libtool gets a pass here because it's so entrenched and automake supports it natively. You're new. You don't get that excuse. Honourable mention: python. Its python-config script lets programs embedding the python interpreter build with correct flags.
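Putting lessons 1 and 2 together, the make-friendly interface I want from ghc looks roughly like this (a sketch; flag names from memory, and $(HS_SRCS) is a placeholder):

```make
.SUFFIXES: .hs .o

# One module in, one rule, so make can track staleness and
# parallelise. (ghc -c also writes a .hi interface file, which is
# exactly the multi-output wart lesson 2 complains about.)
.hs.o:
	ghc -c -o $@ $<

# ghc -M writes make-format import dependencies for the listed
# sources into the named file; emitting them as a side-effect of
# compilation, the way gcc -MMD does, would be better still.
depend:
	ghc -M -dep-makefile .depend $(HS_SRCS)
```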

Posted by Jack Kelly | Permanent link | File under: rants, coding

Wed Jan 2 23:09:58 EST 2013

King of the Derwent

Today was the 2013 King of the Derwent Yacht Race. Some years, the tall ships get their own race. There wasn't a race for us this year, but that wasn't going to stop us getting out on the water! We were handling sail for almost all of the day, trying to get the most out of the ship. By the time I'd noticed the fleet sailing towards us and got the camera from my cabin, they'd all rounded the mark and were sailing away. Even so, I did manage to get this lovely shot of Tasmania's other tall ship, Lady Nelson.

Lady Nelson

That was the last of our daysails from Waterman's Dock. Once the racing yachts and temporary fixtures are cleared out, we'll move back to our usual berth at Elizabeth Street. It's been an interesting couple of weeks. Because the berth is so narrow, it's easier to ferry the linesman in with the powerboat than try to have him jump ashore from the ship. I've been doing the boat driving, and it feels really good to be able to use my ticket. If the wind's unfavourable, I need to make like a tug and give the ship a shove. It must've looked pretty funny from the wharf, but it worked.


Posted by Jack Kelly | Permanent link | File under: windeward_bound