First, Some Backstory

When I first started getting serious about Unix around the turn of the century I got really hard into Vim. In 2003 I had already had my first glimpse of RSI so I switched to the Dvorak keyboard layout and bought myself a beautiful Kinesis Ergo keyboard.

For some reason I decided this would be a good time to switch to Emacs, so I did. I never really achieved in Emacs the fluency I’d had with Vim (though I did lose what I had with Vim). So I’ve slowly grown to resent Emacs over the years, and I made a point of trying out new editors pretty frequently.

Most recently this culminated in trying out Textadept.

Let me take a break here to make something really clear.

Textadept Rules

No, seriously. It does. Everyone should try it. It has a console mode, sensible modern keybindings, and it’s scripted in freaking Lua. It’s really awesome. Use it. Love it.

So it’s a real shame that I continue this post by telling you how I set up Emacs and how amazing it is, finally, after basically a decade of ignoring it, to actually use it.

Emacs Basics

My life with Emacs utterly sucked until I found out about the following settings. I’m going to explain them as I go.

If you don’t set this, Emacs may think you’re some kind of ISO 8859-1 jackass.

(prefer-coding-system 'utf-8)

While you’re learning Emacs, it’s going to beep at you a lot. This is infuriating. Make the bell visual instead of audible.

(setq visible-bell t)

Emacs by default thinks that you use the computer as if you emerged from some kind of 1970s typing class and your instructor took points off if you put a single space after the period. But it’s 2013, and nobody does this, and nobody should, because everything that matters either ignores the extra space (HTML) or typesets properly despite it (TeX). So fix it.

(setq sentence-end-double-space nil)

Impossibly, it will tell you the current line but not the current column by default. Fix it.

(setq column-number-mode t)

Emacs loves to tell you that it is Emacs and how proud you should feel for using it, so you actually have to modify bunches of settings to really make it clear to Emacs that you just don’t care about it or its feelings.

(setq inhibit-startup-message t)
(setq initial-scratch-message "")
(setq inhibit-splash-screen t)

Emacs defaults, in a depressingly predictable manner, to Lisp mode, as if that’s something you give a shit about. Let’s make it sensible:

(setq-default major-mode 'text-mode)
(setq initial-major-mode 'text-mode)

Of course you could put a language mode in there if you mostly edit a particular language, but I find I usually have no idea what I’m up to when I start it up so text mode is more honest. Fundamental mode is even more useless than Lisp mode to me.

Now let’s disable those hideous backup files.

(setq make-backup-files nil)

Emacs by default is configured to be some kind of idiot savant about indentation, switching between tabs and spaces as necessary to achieve the layout of your dreams. In practice nobody wants this ever, so shut it down:

;; these variables are buffer-local, so set the defaults
(setq-default indent-tabs-mode nil)
(setq-default tab-width 4)

Now, global key bindings are treacherous territory, but I think this one is completely obvious. Every other editor on earth proudly advertises that it indents properly after a newline, so Emacs should too:

(global-set-key (kbd "RET") 'newline-and-indent)

Finally, because Emacs has more keybindings than you can ever hope to master, let it tell you when it has a quicker way to do something than the M-x long-ass-name abomination you just used:

(setq suggest-key-bindings t)

A setup trick

So it happens that I have a fairly complex configuration now, but most of my settings are either host-specific or mode-specific. My ~/.emacs.d/init.el is actually very short:

;; load all my initialization files
(dolist (file (file-expand-wildcards "~/.emacs.d/init/*.el"))
  (load file))

;; load custom per-host files
(let ((host-file (format "~/.emacs.d/hosts/%s.el" system-name)))
  (setq custom-file host-file)
  (if (file-exists-p host-file)
      (load host-file)))

This file is really establishing two naming conventions: one for all my mode-specific crap and one for all my host-specific crap. The custom-file trick there ensures that if I make the mistake of using M-x customize the changes wind up cordoned off in the per-host file.

I’m sure other people have equally clever tricks, but this lets me use one repository for one Emacs configuration that varies slightly on each host. This is super handy because normally I’d make a small change to, say, the work configuration and then lose it on the way home or, possibly worse, go home and manually re-apply it slightly differently. Now, by simply choosing the right file, I can share things that should be shared or sequester things I don’t want shared in the per-host file.
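Concretely, the tree ends up looking something like this (the file names under init/ and hosts/ are just examples):

~/.emacs.d/
  init.el
  init/
    autocomp.el      ; mode- and feature-specific settings
    orgmode.el
  hosts/
    workbox.el       ; per-host settings, plus whatever customize writes
    homebox.el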

Another quick tip

Hey! Refuse to use Emacs < 24.3. Just say no. It’s bad enough that you’re learning Emacs; dealing with lots of different Emacses would be even worse.

Packages and Themes

Be sure to use the new package system for your own sanity:

(package-initialize)
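One caveat: the default archive (GNU ELPA) doesn’t carry everything, so depending on the package you may first need to add a third-party archive such as MELPA. Something like this, ahead of the package-initialize call:

(require 'package)
(add-to-list 'package-archives
             '("melpa" . "https://melpa.org/packages/") t)
(package-initialize)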

This lets you install stuff directly from Emacs. Give it a shot:

M-x package-install RET auto-complete RET

Tricked you! You just installed magic Emacs tab completion! Now add this to your ~/.emacs.d/init/autocomp.el file, or whatever:

(add-hook 'after-init-hook
  (lambda ()
    (require 'auto-complete-config)
    (ac-config-default)))

Restart Emacs and enjoy the fact that you now have freakin’ tab completion in there.

Oh, and get you a proper theme: M-x load-theme RET misterioso RET. On the house. Or you could go get solarized with package-install.

mu4e

Your next question is going to be, but Dan, why should I use Emacs if it isn’t even a decent MUA?

Well, my friend, it actually is a great MUA, using the elaborately-named mu4e. This mailer has the #1 most-needed feature of any mailer, which is that it is search-based. I could write at great length about how nice it has been using mu4e, but you should just try it out yourself. It’s really nice. It tab-completes addresses, has ido-like folder choosing, and it’s fast as hell. Emacs 24+ can connect directly to your SMTP host, so there’s no excuse there. Set up offlineimap to get your mail and get on with your day.
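To give you a flavor, the glue amounts to a few lines like these. Consider it a sketch: the maildir path and SMTP host are placeholders, and mu4e itself ships with mu rather than with Emacs.

;; mu4e reads the maildir that offlineimap keeps in sync
(require 'mu4e)
(setq mu4e-maildir "~/Maildir"
      mu4e-get-mail-command "offlineimap")

;; Emacs 24+ speaks SMTP natively via smtpmail
(setq send-mail-function 'smtpmail-send-it
      smtpmail-smtp-server "smtp.example.com"   ; placeholder host
      smtpmail-smtp-service 587)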

This can’t really be why you switched

No, and it isn’t. Unfortunately, nxml-mode is. It’s really fantastic at editing XML. I’m sure you didn’t want to hear that. I recently needed to edit some DocBook. DocBook is a ferocious XML format designed to kill your hands. However, Emacs makes it a lot more tolerable with nxml-mode, because while you’re working on your XML file it’s constantly telling you just how invalid your document is. And it has shortcuts for inserting partial tags and completing tags. And since it understands RelaxNG, it will also make autocomplete suggestions for you.

It’s worth noting that it really does understand RelaxNG. You can design your own schema from scratch and it will give you context-sensitive completions.
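Hooking up a schema of your own is just a matter of a schema locating file; roughly this, with a hypothetical path:

;; teach nxml where to find custom RELAX NG schemas
(require 'rng-loc)
(add-to-list 'rng-schema-locating-files "~/schemas/schemas.xml")

The schemas.xml file maps your documents onto .rnc schemas, and nxml validates and completes against whichever schema it selects.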

Misc

I’m just starting to use org-mode so I can’t give you a proper endorsement there, but suffice to say, it’s really handy to be able to make links to email messages and files on your filesystem to embed in your todo list.
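For what it’s worth, the org manual’s own suggested binding makes the link-storing half of that trivial; C-c C-l then inserts the stored link into your org file:

(global-set-key (kbd "C-c l") 'org-store-link)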

Conclusion

So there you have it.

Note that this is not an endorsement of Emacs. Please go on using Vim or Textadept as you please.

May 7, 2013






The old ORM chestnut is back and we’re seeing the usual mixture of defenders and aggressors. But why is there such a divide?

The argument is fundamentally about choosing what’s in charge of your system, the system being composed of your databases, your applications, and your supporting infrastructure (your scripts, your migrations, etc.). To relational database folk such as myself, the central authority is the database, and our principal interests are what ACID exists to provide: concurrent, isolated, atomic transactions that cannot be lost, on top of a well-defined schema with strong data validity guarantees. To us, what’s most important is the data, so everything else must serve that end: the data must always be valid and meaningful and flexible to query.

The side that argues for ORM has chosen the application, the codebase, to be in charge. The central authority is the code because all the data must ultimately enter or exit through the code, and the code has more flexible abstractions and better reuse characteristics.

It comes down to disagreement about what a database is about. To the OO programmer, strong validation is part of the behavior of the objects in a system: the objects are data and behavior, so they should know what makes them valid. So the OO perspective is that the objects are reality and the database is just the persistence mechanism. It doesn’t matter much to the programmer how the data is stored, just that it is stored, and it just happens that nowadays we use relational databases. This is the perspective that sees SQL as an annoying middle layer between the storage and the objects.

To the relational database person, the database is what is real, and the objects are mostly irrelevant. We want the database to enforce validity because there will always be tools outside the OO library that need to access the database and we don’t want those tools to screw up the data. To us, screwing up the data is far worse than making development a little less convenient. We see SQL not as primarily a transport between the reality of the code and some kind of meaningless storage mechanism, but rather as a general purpose data restructuring tool. Most any page on most websites can be generated with just a small handful of queries if you know how to write them to properly filter, summarize and restructure the data. We see SQL as a tremendously powerful tool for everyday tasks—not as a burdensome way of inserting and retrieving records, and not as some kind of vehicle reserved for performance optimization.

At the end of the day, we need both perspectives. If the code is tedious and unpleasant to write, it won’t be written correctly. The code must be written—the database absolutely should not be running a web server and servicing clients directly. OOP is still the dominant programming methodology, and for good reasons, but data encapsulation stands at odds with proper database design. But people who ignore data validity are eventually bitten by consistency problems. OODBs have failed to take off for a variety of reasons, but one that can’t be easily discounted is that they are almost always tied to one or two languages, which makes it very hard to do the kind of scripting and reporting that invariably crops up with long-lived data. What starts out as application-specific data almost invariably becomes central to the organization with many clients written in many different languages and with many different frameworks.

ORM is destined to be hated, because the people who love databases don’t love the practical albeit leaky abstraction offered by ORM, and people who hate databases will resent how much effort the better ones require to be used properly.

Some thought experiments

Consider this hypothetical scenario. You run Facebook, and you have all the software and all the data. A catastrophe occurs and you lose everything, and due to a mistake in the way backups were made, you can choose to restore the software or the data, but not both. (Somehow, restoring the software will destroy the data, and restoring the data will destroy the software). Which one do you choose?

Of course, this scenario is unlikely, but it should serve to demonstrate that to the rest of the organization (the non-programmers), the code is secondary to the data it manages. You can rewrite the code, but you can’t always recreate the data. When you look at our database backup, migration and refactoring tools and compare them to what we have for source code management, it’s clear that we spend more time worrying about the code than the data. That’s not inappropriate (we work on code all day) but it can lead to a myopic view of the importance of the data.

Another thought experiment posed by jshen on HN points out that data validity is secondary to monetization, and that if the business finds a way to increase monetization while causing a low rate of data corruption, it may be worth sacrificing validity. This is a fair point, and I think it illustrates why NoSQL is a winning proposition for many companies. If scalability and speed are more valuable, then they can represent a better choice—especially if the data is cheap or can be recreated without too much trouble (or on demand).

(This article was derived from this comment I left on HN).

May 15, 2012






The cat’s out of the bag; the reason why there are fewer women in computer science has finally gotten out. And what a predictable reason: rampant, uncontrolled, pervasive sexism. The only thing surprising to me about this is the sheer number of episodes. Something like four incidents leading to news stories on high-profile sites like Reddit and Hacker News.

Pleasantly, there have been a lot of people saying the right things. Predictably, there have been a few trolls. But one nagging element keeps coming up: what if this is just part of our corporate culture? If you’ve got a really free-speech, open-air environment in which dynamic people get up and speak their minds, aren’t you going to somehow destroy that beautiful and unique culture by constraining it?

No.

And I’ll give you three reasons, because it takes me three pages to say what James Hague says in three sentences.

1. You are programmers

What makes you programmers? That you program. That’s it. Your biases have nothing to do with it. Since you’re a programmer, you probably feel like everything about you is rational and can be argued and defended. Not so. And because you’re a programmer, the reason you come to work is—to program, not to tell dick jokes, not to see which one of you can make the rest of the world the most uncomfortable. If you had to choose between actually programming and telling dick jokes, and you’d honestly choose telling dick jokes, or somehow argue that your ability to program would be compromised by not being able to tell dick jokes—newsflash, you’re not a programmer.

2. Your workplace isn’t a frat

Contrary to what you must have dreamt up with your buddies, a workplace is a workplace, startup or not. The appropriate time to discover if somebody fits with your “corporate culture” is the interview. Not six months later, in the middle of a big project, when you finally let your guard down and start flying your freak flag. Your workplace is governed by the same laws that govern every other workplace, be it an office, the DMV, the doctor’s office, whatever. There is no “bu-bu-but we’re a magical startup!” clause. If you’re making life hard on an employee of yours with your environment, that’s a hostile environment, period. So when someone says to you or your supervisor, hey, you need to back off on the dick jokes, that means you have to back off the dick jokes, not whine about how you were misunderstood or that you meant it ironically or that someone should have brought it up with you directly.

3. Freedom of speech is not relevant to programming

I love America, and I love freedom of speech, and (obviously) I make wide use of it. You are absolutely permitted to make all the dick jokes you want. You just have to not make them in front of other employees. If you need to be able to do that to write your code, you are not a programmer. Opine and whine all you like—just not while your coworkers are trying to do their jobs.

An Upheaval of Sorts

I don’t know anyone outside a family-owned business or programming who actually gets away with the kinds of unprofessional bullshit we do. It’s completely absurd, and I’m happy to see it improve. Let’s stop being complacent. Let’s choose to be professionals in this field because we love what we do, rather than choosing to kill our own profession by keeping it uninhabitable.

March 29, 2012






James Hague wrote a wonderful, vaguely controversial essay, “Free your technical aesthetic from the 1970s.” I’m refusing to comply, for two reasons:

  1. Those who forget the past are doomed to repeat it
  2. I hate unnecessary optimism

I think James would agree with the notion that it is too easy to get complacent. Computing, after all, did not emerge from Babbage’s garage with an Aqua look n feel. The road leading to today had a lot of bends, twists, blind drops and dead alleys. James is more interested in where computing will be tomorrow. I respect that. For whatever reason, I am much more interested in where computing was yesterday.

Aside on textbook pipe usage

It’s easy enough to debunk his attitude toward Linux: there’s plenty of interesting stuff going on there. Linux is hard to love as a Unix, though, which is one reason I promote BSD; if the goal is small commands that you glue together like plumbing, it helps if the commands are actually small. I find it tremendously useful to be able to pipe commands around, and it isn’t a textbook exercise at all. For example, I’ve done all of these in my recent ~/.history:

ps auxw | grep firefox
awk -F, '{ print $2 }' /tmp/projects.txt  | sort -u > legacy-pcodes.txt
grep 'Hibernate: ' precaching.txt | sed 's/.*\(from [^ ]*\).*/\1/g' | sort | uniq -c
xzcat mcsql-widarmon.sql.xz  | pv > /dev/null

Some of these are mundane (finding Firefox’s pid so I can kill it); others are a bit more interesting. The second is a bit of very light CSV parsing, the third shows me what Hibernate is spending its time on, and the fourth lets me see the uncompressed size of an xz file. Probably there are better ways to do some of these, but after so many years of Unix, it’s just faster and easier for me to do a one-off like this than to spend 30 seconds consulting the documentation. Heck, even when I do consult the documentation, it sometimes looks like this:

curl --help | grep -i header

Yup, I didn’t want to have to read through all those options. Do these look like textbook usages of the pipe? I have no idea, because I’ve never seen a textbook of Unix pipe usage. All I know is what I’ve seen in various blogs and what I’ve seen my friends do. It came naturally, I guess.

Obscure alternatives

Yet even BSD is not so small anymore. Using Perl to damn Linux is also a bit of preaching from the porn. Rob Pike, after all, was one of the greats behind Plan 9, the current record holder for world’s weirdest and most obscure operating system you can run on a modern computer. The dream of Unix is alive in Plan 9, so to speak.

Plan 9 is very much the answer to the question, what would Unix be like if it could be rewritten with no regard for backwards compatibility? Everything is a file, all the old stupid file formats are made a little more sane, and so forth. You can’t do this anywhere else:

  • cat /dev/keyboard: it’s what you’re typing right now
  • cat /dev/mouse: it’s screen coordinates for your mouse cursor
  • the display is a flat file, the window manager arranges for each window to think it owns it
  • cat sound.wav > /dev/audio: plays it

These may not seem like much, but in Plan 9:

  • any file can be transparently mounted across the network to any other location: if I want your mouse to control my screen, we can mount your mouse over mine in the filesystem; similarly for all my other hardware
  • to use the network gateway, mount its network card over yours
  • want to write a network server? use the 9p library and instantly get remote access to your app’s internals

Obviously it isn’t all roses, and I think James and lots of other people would blame the “aesthetics” of Plan 9 for a chunk of that, maybe even the main chunk of that, but we must not let advances be forgotten just because they don’t pan out.

Forgotten futures

Between 1968 and 1970, there was a fantastic AI demo at MIT, called SHRDLU. When this program ran it displayed some shapes in a small 3D world. It then allowed the user to move the blocks around and create shapes as well as make queries.

When I write, “allowed the user to move the blocks around,” it’s easy to imagine that I mean the user clicked on a block and moved it somewhere else. That’s not what happened, though. Instead, the user wrote complete English sentences, which resulted in SHRDLU creating a plan of action which it would then execute by moving things! On top of that, the queries were also complete English sentences! This was over 40 years ago!!

Person: Is there anything which is bigger than every pyramid but is not as wide as the thing that supports it?
Computer: Yes, the blue block.
Person: Does a steeple
Computer interrupting: Sorry, I don't know the word "steeple".
Person: A "steeple" is a stack which contains two green cubes and a pyramid.
Computer: I understand.
Person: Are there any steeples now?
Computer: No.
Person: Build one.
Computer: OK. (does it)

This is not a fantasy, this is an actual program with actual source code you can download today. Look at everything that happened there:

  • Someone gave a very complex query and got an exact answer
  • They then defined a new term declaratively
  • They asked the computer to figure out how to do something based on that definition—which it then actually did.

I have a hard time understanding why people are not shocked and amazed constantly at stuff that we used to do. I have an app on my iPhone called Shazam, which is easily the most amazing app I have ever had on my phone and the handiest. What it does is, I click a button, it listens for 15 seconds and then tells me what song is playing in the background. This was easily the most convincing piece of artificial intelligence anybody had on their phone for about four years.

SHRDLU took a declarative definition from plain English and converted it into a plan of action, which it then executed. Hmm… sound familiar? Sound like, I dunno, Siri?

Amazingly, SHRDLU comes in at under 20K lines of code. No corpus, no statistics.

Prolog versus the world

I imagine that by virtue of having taken a compilers class I’m substantially ahead of the curve. In that class, we started out talking about regular languages, then context free languages, then briefly we talked about context sensitive languages. The general attitude was, these things are hard to do, so we’re not going to do them, and we retreated back to context free languages, one in particular (Tiger) which we implemented for the rest of the semester, and that was that.

The general attitude this instilled is, well, you can do amazing things with context free languages, and that’s all the power you’ll ever really want, so stick with that and go about your life.

A classic example of something a context free grammar can’t do cleanly (short of duplicating rules for every combination of features) is agreement in natural language. If you want to be cheesy, it’s not too hard to write a piece of Prolog that will parse a sentence:

sentence --> noun_phrase, verb_phrase.

noun_phrase --> article, noun, !.

verb_phrase --> verb, noun_phrase, !.
verb_phrase --> verb.

article --> [the].      article --> [a].

noun --> [cat].         noun --> [cats].
noun --> [mouse].       noun --> [mice].

verb --> [eats].        verb --> [sleeps].

?- phrase(sentence, [the,cat,sleeps]).
true.

?- phrase(sentence, [the,cat,eats,the,mouse]).
true.

Of course, this permits insanity as well:

?- phrase(sentence, [the,cats,eats,the,mouse]).
true.

The cats eats the mouse? Bullshit! But it turns out, this can easily be rectified by passing along the number:

sentence --> noun_phrase(Num), verb_phrase(Num).

noun_phrase(Num) --> article, noun(Num), !.

verb_phrase(Num) --> verb(Num), noun_phrase(_), !.
verb_phrase(Num) --> verb(Num).

noun(s) --> [cat].         noun(pl) --> [cats].
noun(s) --> [mouse].       noun(pl) --> [mice].

verb(s) --> [eats].        verb(pl) --> [eat].
verb(s) --> [sleeps].      verb(pl) --> [sleep].

?- phrase(sentence, [the,cat,eats,the,mouse]).
true.

?- phrase(sentence, [the,cat,eat,the,mouse]).
false.

?- phrase(sentence, [the,cats,eat,the,mouse]).
true.

Now we can keep on taking this further. What about this nonsense?

?- phrase(sentence, [the,cats,sleep,the,mouse]).
true.

What? Sleeps doesn’t take a direct object!

verb_phrase(Num) --> verb(trans, Num), noun_phrase(_), !.
verb_phrase(Num) --> verb(intrans, Num).

verb(trans, s)   --> [eats].        verb(trans, pl)   --> [eat].
verb(intrans, s) --> [sleeps].      verb(intrans, pl) --> [sleep].

?- phrase(sentence, [the,cats,sleep,the,mouse]).
false.

?- phrase(sentence, [the,cats,sleep]).
true.

?- phrase(sentence, [the,cat,sleeps]).
true.

?- phrase(sentence, [the,cat,eats]).
false.

?- phrase(sentence, [the,cat,eats,the,mouse]).
true.

Look at this: we have a four-line grammar that handles transitive and intransitive verbs separately and ensures number agreement. This is considered a very hard problem, and yet here is this tidy little Prolog program that just does it. The dream of the ’70s is alive in Prolog.

In the real world, this kind of program would never fly because we expect natural language processors to be fuzzy and handle the kinds of mistakes that humans make when talking and writing, and something like this is very rigid. As a result we’ve married ourselves to statistical NLP, which means huge corpuses, which means lots of memory, which means heavy programs with a lot of math behind them.

It’s strange that we live in such a bifurcated world, that on the one hand we have astronomical demands for a certain class of program, but for the kinds of programs we use day-to-day, they’re shockingly procedural. The user is expected to learn all these widgets and how to use them and what they mean (intuitive my ass) and then whenever they need to do something they just do it repeatedly. At my work one of the last things to be produced before an observation is a script that runs the telescope. The script is written in Python, but it’s completely generated by a set of objects that live in a database, which were composed, painstakingly, by a user using a point-and-click web GUI. Wouldn’t it have been more efficient to just learn to program? Or, perhaps, would it have been more efficient to build a large, complex manual NLP system like this that could parse a plain-text English specification and produce the script? We’ll never know.

Melt-your-brain debugging

The standard debugging procedure for a Smalltalk app is:

  1. Something goes awry: message not understood, or some such
  2. The app pops up a dialog with the debugger window
  3. You enter some code and hit proceed
  4. The app continues like nothing happened

How does this pan out in a web app? Similarly, only this time between 2 and 3 there is a 2.5: you VNC to the Smalltalk image running your web app on the remote server.

This sounds awesome enough, right? But apparently it gets better. You can replace that “pop up a dialog” machinery with “serialize the entire current context to a log.” If you’ve ever debugged a Java web app, you are used to rooting around in the logs reading exception traces. This is like that—except, instead of reading an exception trace, you get to load the actual moment the exception was triggered into your local debugger and pick up from there as if it just happened locally! Have you ever had a problem you couldn’t debug because you just didn’t know how to reproduce the bug locally? That problem isn’t even meaningful in Smalltalk!

Cocoa doesn’t have this. C# doesn’t have this. Nothing else has this.

Unnecessary Optimism

If you’re like me, you read the foregoing commentary and thought, wow, how awesome are Plan 9/Prolog/Smalltalk! Ninety-five percent of what I see is the same old fucking around with different clothes. Now you get whooshy effects when you drag n drop, but it’s still drag n drop. Now you can put multiple fingers on your browser window, but you’re still clicking on drop boxes to put in your birthday.

An even bigger letdown for me is the whole NoSQL situation. The proponents of NoSQL aren’t even aware of how they are repeating history by recreating the hierarchical databases of yore. E.F. Codd was explicitly trying to rectify those mistakes when he formulated relational database theory 40 years ago, and now we’ve come so far along that we’ve wrapped around and apparently must re-learn them a second time. The morons promoting this technology have no clue; they see themselves as the new sheriffs in town, putting their scalability law above all else.

It’s easy to forget that your forebears were once wild young bucks as well. The difference, of course, is that Codd understood math and was not blown around by the winds of popular opinion, and much time passed between the refinement of his idea and the first relational database that could perform. Since then everyone seems to have forgotten why they were a good idea in the first place: no duplication, true data integrity, and proper handling of concurrency. Buckle up, it’s going to be a rough ride.

Conclusion

Whether or not we like it, we are products of what came before. I would like to look at the recent stuff coming out of Apple with the same optimism as James Hague. There is some interesting stuff, but how hard it is for me to forget all the things I’ve seen. And I may have rose-tinted glasses; after all, I wasn’t there when it happened. Let me say this, though. If you want to blow your mind, go to your CS department’s library and look through the books printed before 1980. You’ll be surprised how much diversity, hope and optimism there was, back when our technical aesthetics were completely enslaved to the 1970s.

February 22, 2012






If you took calculus in school, you probably remember it as a blurry mishmash of baffling equations, difficult algebra, many seemingly irrelevant “real-world” examples, and overall confusion. The blame for this is usually laid squarely on the academic establishment, who are certainly at fault, but there is more to it than that.

What is calculus?

The word “calculus” is actually a generic word for a mathematical system. There are lots of other calculi; computer science has the lambda, pi and join calculi. So why do we call Newton’s calculus “calculus” and not something more specific? Looking at history, you find that “calculus” used to be referred to as “infinitesimal calculus.” This is an accurate and appropriate name, because calculus is the foundation of studying rates of change and summations over time. These are things you need in Newtonian physics, because you’re interested in saying: if I have this function of distance, what is its rate of change, in other words, the velocity? Alternately, if I have a function of velocity, what is the total distance?

Newton conceived infinitesimal calculus as merely the tool he needed to solve certain general physics problems like the above. Velocity is defined by direction plus change in position over an interval of time, but there could be such a thing as an instantaneous velocity; everybody with a car is familiar with this notion: it’s your speedometer. The total distance, likewise, is the sum of all of the individual distances moved, instant to instant, along this function.

The tools provided, the derivative and the integral, happen to be incredibly general. This leads to the first problem of calculus instruction: lack of clarity about what calculus is. Frankly, this is a lot like the confusion in Haskell with monads. They’re not a difficult concept; they’re actually quite simple, so simple that the applications are unlimited. So then you get unlimited examples, which wind up creating more confusion. Instead of teaching math, metaphor gets taught, and we pretend that distilling the commonality between six metaphors is easier than just understanding the math.

The second problem with calculus instruction is that we no longer use infinitesimals! No wonder the word has fallen out of use; how hard would it be to understand infinitesimal calculus without infinitesimals in it!

Now, Newton and Leibniz managed to make great use of calculus, but they never improved on their own informal definition of what an infinitesimal was. This kind of thing leads to great discomfort in mathematicians. Other mathematicians (Bolzano, Cauchy) found a way to skirt the issue by defining the limit formally without using infinitesimals. From there, all the rest of calculus can be rebuilt. Thanks to Karl Weierstrass, this is the formulation used by textbooks today: a formulation without actual infinitesimals, but retaining Leibniz’s notation that uses them.

It would probably be impossible to understand or use calculus without an intuitive grasp of infinitesimals, yet all the textbooks for calculus instead use the epsilon-delta definition of the limit as their basis. This leads to absurdities like learning the formal definition of the limit after two weeks of computing limits with various formulas and heuristics, which my wife is enduring now. Furthermore, the restatements of the derivative and the integral in terms of limits are less intuitive. This should be obvious, since Newton and Leibniz started with derivatives and integrals respectively, not limits, which are completely tangential to all of the things they actually wanted to calculate. The end result is that students are expected to perform derivatives and integrals using a formalization that bears no resemblance to the intuition they’re being taught—which they need to understand what they’re doing. And the book must also include tons of “real world” examples in order to get into print and into schools.

Infinitesimal calculus with infinitesimals

The one fundamental problem was that Newton and Leibniz depended on an informal tool to build their formal system. The solution Bolzano found was to eliminate the use of that tool. There is another solution, found in the 20th century by Abraham Robinson, which is to formalize the concept of the infinitesimal.

Robinson defines a new set of numbers, the hyperreals: the real numbers plus the infinitesimals. An infinitesimal is a number greater than zero but less than every positive real. He then provides algebraic laws for manipulating infinitesimals: an infinitesimal times itself is again infinitesimal, an infinitesimal plus itself is again infinitesimal, and so on. He then defines a relationship, “infinite closeness,” which can be used to discard infinitesimals at the end of a computation.
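In symbols, roughly:

0 < \varepsilon < r \quad \text{for every positive real } r,
\qquad x \approx y \iff x - y \text{ is infinitesimal}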

After learning this fairly simple and somewhat intuitive notion, you can proceed directly to taking derivatives, just like Newton did. You can then proceed to taking integrals, the inverse operation, by summing an infinite number of infinitesimally-thin strips to calculate area under a curve. In both cases, you’ve really just learned a small extension to algebra and are doing algebraic expansions to do calculus. You’ve built up the two fundamental operations of calculus without introducing the limit or the clumsy reformulation of those operations in terms of it.
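For a taste of how this goes, differentiate f(x) = x^2 the way Newton would have: perturb x by an infinitesimal \varepsilon, do ordinary algebra, and discard the leftover infinitesimal at the end:

\frac{f(x+\varepsilon) - f(x)}{\varepsilon}
  = \frac{(x+\varepsilon)^2 - x^2}{\varepsilon}
  = \frac{2x\varepsilon + \varepsilon^2}{\varepsilon}
  = 2x + \varepsilon \approx 2x

so f'(x) = 2x, with nothing fancier than algebra and the rule that lets you drop the trailing \varepsilon.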

How does this help me?

If you’re interested in knowing calculus, rather than just passing the class, I suggest you read the book Elementary Calculus: An Infinitesimal Approach (or buy the new edition). This book follows exactly this approach. I’m going through it now myself, and I’m finding the process really enlightening. There is also research showing that the infinitesimal approach leads to greater comprehension of the material.

This method is called “non-standard calculus” and is considered heretical by some mathematicians, notably constructivists and intuitionists. The details of their interpretation of mathematics are interesting, but not particularly relevant apart from the one detail that constructivism is related to computability. In any case, I see Juha Ruokolainen has developed a constructivist version of nonstandard analysis (even without infinity) that addresses the complaints, and besides, the constructivists themselves are considered heretical by other mathematicians.

Moreover, learning the infinitesimal approach gives you a general problem-solving tool that can extend beyond calculus. I have a hard time imagining what such a thing might look like, but I find the idea very appealing, in part because nobody has given me seventy-five “real world” examples, but you get the gist.

February 1, 2012 math






The very first item in Dan Ingalls’ esteemed Smalltalk introduction “Design Principles Behind Smalltalk” is this:

Personal Mastery: If a system is to serve the creative spirit, it must be entirely comprehensible to a single individual.

Most of the other points Dan makes in this essay are well-served by modern Smalltalks, and apart from a few glaring outliers, even by other modern systems that are not remotely connected to Smalltalk, but this one item in particular stands out to me as the most criminally absent.

In order to place it first, I would imagine Dan felt that this item is crucial. He didn’t want it lost in the shuffle in the middle of the essay. And yet, even small, modern Smalltalk implementations contain thousands of classes. That pasty abomination Morphic is by itself a great demonstration of how not to build software: one class with several hundred methods. (I think Smalltalk suffers from not showing all the methods in one file; at least in Java when a class exceeds a couple thousand lines you know there is a problem, but there’s no way to know that about a class in Smalltalk).

Earlier today I was looking at the source code for J and generally thinking about J and how odd it is that I like two languages so radically different that they almost have no similarities at all. In Smalltalk, readability is paramount; in J, readability is “something that will come in time.” In Smalltalk, the goal of the code is to communicate intent; in J, it is simply to process the data. In Smalltalk, one talks about objects and their responsibilities. In J, one talks about arrays and the functions that manipulate them.

As an aside, J is object-oriented, but I have trouble imagining this faculty is heavily used, if for no other reason than to use advanced features of J one must first use J, and there isn’t a lot of that going on either.

In any case, the source code for J is famous for being written in a C that has been heavily contorted to look and act like J itself. For example, drawing a paragraph of code from a random file in the distribution, you see this kind of thing:

static A jtmemoput(J jt,I x,I y,A self,A z){A*cv,h,*hv,q;I
c,*jv,k,m,*mv,*v;
RZ(z);
c=AC(self); h=VAV(self)->h; hv=AAV(h);
q=hv[0]; mv= AV(q);
q=hv[1]; jv= AV(q);
q=hv[2]; cv=AAV(q); m=AN(q);
if(m<=2**mv){A cc,*cu=cv,jj;I i,*ju=jv,n=m,*u;
v=ptab; while(m>=*v)++v; m=*v;
RZ(jj=reshape(v2(m,2L),sc(IMIN))); jv= AV(jj);
GA(cc,BOX,m,1,0);                  cv=AAV(cc);
for(i=0,u=ju;i<n;++i,u+=2)if(IMIN!=*u){
k=HIC(x,y)%m; v=jv+2*k; while(IMIN!=*v){v+=2; if(v==jv+2*m)v=jv;}
cv[(v-jv)/2]=cu[i]; cu[i]=0; v[0]=u[0]; v[1]=u[1];
}
q=hv[1]; AC(q)=1; fa(q); AC(jj)+=c; hv[1]=jj;
q=hv[2]; AC(q)=1; fa(q); AC(cc)+=c; hv[2]=cc;
}
++*mv;
k=HIC(x,y)%m; v=jv+2*k; while(IMIN!=*v){v+=2; if(v==jv+2*m)v=jv;}
cv[(v-jv)/2]=raa(c,z); v[0]=y; v[1]=x; 
R z;
}

I saw this and I thought to myself, damn, that’s totally unreadable, but then upon reflection a couple other things occurred to me.

Users who Program, and Other Yetis

Proposition 1 of “Design Principles Behind Smalltalk” is predicated on the notion that if we make programming easy enough, users will take it up. And yet, as we have seen, users do not, and Smalltalk remains obscure, heavily used in a handful of industries but mostly kept alive by a medium-sized group of passionate, die-hard fans. Interestingly, APL and J are known for having a clutch of equally die-hard fans, many of whom know no other languages:

If APL were a specialist, complex language, it would only attract the “Boy Wonders” of IT, those with A-Grades in everything, whose horizons are limited by bits and bytes.

So it is paradoxical that the great majority of IT people have never really understood APL. Those who have used it successfully have very often not been computer-literate, or have only a slight knowledge … and they have frequently learned APL in isolation. That is to say, the language serves anyone prepared to explore it without prejudice.

This got me thinking. In addition to not knowing non-programmers who are interested in programming in any capacity, I also know of no users who use Unix with proficiency and have no interest in programming. And the crux of Plan 9 was to return Unix to its pure roots, where everything is a file and everything is done in the classic Unix way by stitching together small programs and leaving the heavy lifting to C programs. So it seems interesting that these systems with an emphasis on simplicity—in the name of hypothetical casual users—seem to enter obscurity without having been wildly adopted by any of these casual users.

Programmers Like To Program

It’s widely bemoaned that programmers like writing programs, even when doing so is unnecessary busy work. When Tom Lord discovered distributed version control, we didn’t just use arch; we all had to write our own DVCS, and today we live in a world with six or seven that have essentially the same featuresets. Back in the late 1990s, everybody had to write their own window manager. Now, in the early 2010s, it seems like everyone is writing their own “NoSQL” database.

Since programmers like to program and there weren’t really any non-programmer users of Smalltalk, it makes sense that Smalltalk got complex and forgot its roots. A programmer is not all that intimidated by being presented with a huge system of astonishing complexity, as long as he can just dive in and use it; if you had to understand everything first, you’d never write any code. So keeping things comprehensible never occurred to anyone; we were too busy writing abstractions to keep things simple for ourselves!

The Questionable Importance of Readability

Let’s stick with J and Smalltalk even though Perl and Python would probably be better examples. Ignoring the rounding errors, say J and Smalltalk have similarly-sized user communities. Obviously there must exist a category of people for whom the initial readability is not a key concern. Indeed, looking at Haskell and Ruby web frameworks, I find that even if you have a system of aesthetics based on simplicity, your advanced users will bend them to increase power. This is why non-Yesod users balk at Yesod’s use of Template Haskell. The fact that Template Haskell is well-understood and well-supported doesn’t make it aesthetically pleasing. Snoyman can put all the work he wants into making it simple and easy and pleasant, but until Template Haskell is core-curriculum stuff it’s going to be off-putting to new users.

So looking at the J source code, it’s easy for me to hold my nose and say: that’s totally unreadable garbage; how can that be maintained? But at the same time, it’s not my place to maintain it. Imagine if it were written in the cleanest, most beautiful C possible. I might be able to dupe myself into thinking I could maintain it, but it would be a lie! Is it so bad that complex projects like J have complex code? If it were a complex Java program instead, I’d still need substantial time to learn it before I would stand a chance at modifying it. Making it J-like means I am required to understand J to change the source code. Wouldn’t I have to understand J to change it anyway?

Does readability give us a false sense of maintainability?

A Thought about Power

If I were to tell you you should learn J because it is powerful, mind-bending, and high-performance, most programmers would say, yeah, maybe. But what if instead I told you, I have this language which is extremely terse, in which the output in some ways resembles and in some ways completely contradicts the input, but which is of high utility in a particular class of problems. At first it sounds like I’m trying to trick you into liking J, but I’m not—I’m describing regular expressions.

Think about it. Without batting an eye, I can write a regular expression to (say) parse an auth log line from this computer:

Nov 16 12:46:07 7gf sshd[20009]: Invalid user oracle from 200.216.162.113

A regex might look something like this:

^... \d\d \d\d:\d\d:\d\d 7gf sshd\[\d+\]: Invalid user ([^ ]+) from (.*)$

If you use regular expressions, it’s probably quite clear to you what that does, despite:

  • control sequences mixed in with literal text
  • the same character doing radically different things depending on escaping and context

What’s the difference between ^ at the beginning and ^ in brackets? What’s the difference between bracket and backslash bracket? Why do some things have to be escaped and not others? What’s the difference between + and *? These are all hurdles we have to get over to learn regular expressions, but once we know them, it’s not hard to remember how they work.

On top of all that, regular expressions do nothing to aid readability: they look nothing like English, nor do they look like instructions for how to match text. Despite this fact, nearly all of us programmers learned them, either on the job, in school or on the side, and we all use them with great frequency. We defend them, to the point that Larry Wall is possibly the first person in 30 or 40 years to suggest we could actually have a better syntax with which to do this. I know of just one program that matches text in a radically different fashion (I’m talking about txr) and even it lets you fall back on them.

Let’s consider a Smalltalk-like API for defining that same regular expression. Would it look like this?

| captures |
captures := TextPattern new
    atStart;
    skip: 3;
    space;
    digits: 2;
    space;
    digits: 2;
    special: ':';
    digits: 2;
    special: ':';
    digits: 2;
    space;
    literal: '7gf sshd';
    special: '[';
    oneOrMore: (TextPattern digits);
    special: ']';
    special: ':';
    space;
    literal: 'Invalid user ';
    capture: (TextPattern oneOrMore: (TextPattern space not));
    literal: ' from ';
    capture: (TextPattern zeroOrMore: (TextPattern any));
    endOfLine.

Does that seem like an improvement? Maybe if you don’t know regular expressions. But you do. This is the crux of the APL/J argument. Where’s the flaw?

Afternote

I just ran into this document about the proceedings of the APL 2010 conference discussing OOP. In it you see some real gems, like:

I tried to learn SmallTalk once, but it was not a success.

and:

J does not have instances [in the OOP sense] properly speaking. They are locales and locales are not assignable (like in APL) so you cannot have arrays of instances or classes (which are also locales).

Interestingly, there seems to be some truck with Haskell users. I have seen mention of Haskell both on this page and on the J Programming Forum, but not really of other FP languages.

Tentative Conclusions

  1. It is more honest to see programming paradigms, development tools and language diversity as effects of programmers’ need to program, rather than of technical or market needs.
  2. Technical decisions promoted in the name of “programming users” are lies.
  3. Readability as a concept may be a lie.

Addendum: Lack of Visi.ion

I just saw this Visi.io project linked from Reddit. I want to say it’s admirable what this guy is doing, and he has quite a pile of code already written. But even his version of history is quite damning: HyperCard is dead and Excel is rarely understood. Taken with the preponderance of other evidence, this project will likely go the same way as all the other well-intentioned user-liberation schemes. We’ve been trying to make programming accessible to non-programmers for the last fifty years. Has it worked? Are the missing ingredients really type inference, JSON and monoids? It wasn’t that long ago that HTML was invented specifically to be readable to humans and machines, and here he is dismissing it because it sucks compared to native apps. Well, spreadsheets also suck compared to native apps, and so did HyperCard stacks. Is the problem really that those tools lacked a sufficiently expressive language, and not that users would rather have higher-quality tools that do not require programming?

November 17, 2011