Friday, December 26, 2003

We likes our TrackPoint caps, yesss

The IBM TrackPoint is by far the best integrated pointing device available for notebook computers. It completely blows away trackballs and touchpads (or, at least, the paltry miniaturized incarnations of the latter that are integrated into notebook computers). The absence of TrackPoints from Apple's notebook line may be the single most compelling reason not to get a PowerBook or iBook.

However, TrackPoints do have one minor deficiency: the caps wear out. After a little over two years of heavy use, my index finger had finally worn down the bristly coating on the last spare cap that came with my notebook. A little Googling led me to IBM's TrackPoint replacement page, which reveals that the traditional "Classic Dome" style --- shaped like a pencil eraser and coated with a bristly "cat's-tongue" texture --- is no longer the only option. (If you've drooled with envy over a colleague or friend's shiny new ThinkPad lately, then you've probably seen the "Soft Dome" variety, which I believe now ships standard on all ThinkPads.)

Anyway, the alleged "mechanical advantage" of the "Soft Rim" cap sounded cool to me, so I ordered a batch from IBM's suggested supplier. My report: Soft Rim TrackPoint caps rule. The cap really does feel more responsive because of the shape. Furthermore, the absence of textured coating has two benefits. First, there are no prickly bristles to irritate the pad of your finger. Second, Soft Rim caps don't have the Classic Dome's problem of wearing out their coating in less than a year of heavy use. This wear problem was the reason I had to replace the caps in the first place. Since the Soft Rim style uses shape instead of texture to provide finger traction, it seems likely that this cap will last significantly longer than the Classic Dome caps.

Now, if they only came in more colors... Apple, are you listening?

P.S. IBM's USER lab has been exploring some interesting variations on the TrackPoint theme.

Microsoft Word is a terrible program

L. Menand "speaks the truth to power":

Microsoft Word is a terrible program. Its terribleness is of a piece with the terribleness of Windows generally, a system so overloaded with icons, menus, buttons, and incomprehensible Help windows that performing almost any function means entering a treacherous wilderness of pop-ups posing alternatives of terrifying starkness: Accept/Decline/Cancel; Logoff/Shut Down/Restart; and the mysterious Do Not Show This Warning Again. You often feel that you’re not ready to make a decision so unalterable; but when you try to make the window go away your machine emits an angry beep. You double-click. You triple-click. Beep beep beep beep beep. You are being held for a fool by a chip.

Like all humanities geeks, Louis should really learn to use LaTeX and XEmacs. I learned to use XEmacs as a CS undergrad, and I learned LaTeX during my first year of grad school. At first it was kind of odd, but now I actually find it astounding that any human being can stand to edit large quantities of text using Word. Blech. And LaTeX has excellent citation support.

Saturday, December 13, 2003

The long shadow of the Yankee Puritan tradition (and others)

Kos guest DHinMI points to a fascinating idea by D. H. Fischer:

In 1989 the great American historian David Hackett Fischer published Albion's Seed: Four British Folkways in America. Fischer shows that in just about everything, from home design to what we eat for holiday dinners, from the names we give our children to the ways we pronounce our words and experience time, marriage, aging and death, there are discernable continuities between certain regions of seventeenth and eighteenth century Britain and certain regions of the contemporary U.S.. The Puritans who settled New England, the Anglican Royalist elite and their servants who settled Virginia, the Quakers from Wales and the Midlands who settled the Delaware Valley, and the poor English-speakers from Scotland, Northern Ireland and the north of England who settled the backwoods south each brought with them very different conceptions of liberty; as Fischer writes, "the problem of liberty cannot be discussed intelligently without a discrimination of the libertarianisms which must be made in historical terms." Fischer attributes American regionalism largely to the continuing influence of these four British migrations, and the religious, ethical and political societal norms their descendents have carried with them across the continent.

We can see this effect by looking at upstate NY, the Great Lakes states, and western WA and OR, all of which were initially settled largely by Yankee descendents of the Puritans. The Puritans established communities with greater civic participation, tolerance of or desire for government intervention in business and community life and adherence to the law, and lower rates of violent crime than those founded in the regions settled by the other British immigrant groups. This pattern has held for over three hundred years, even as new immigrant groups merged with the Yankees in New England and as the Yankees settled the areas to the west. The other side of this phenomenon is that later immigrant groups have conformed to the norms of the regions in which they have settled; Germans who immigrated to the Great Lakes states largely adopted the communal folkways of the Yankees, but Germans who immigrated to Texas were likely to adopt the more individualistic and anti-authoritarian folkways that originated on the frontier of eighteenth-century Britain.

Ransom Love has a cooler name than you.

When I was an undergrad, back around 1999 or so, I attended a Linux evangelism session in New York, run by a little company I'd never heard of, called Caldera. One of the speakers demo'd a new product called Caldera OpenLinux on a Sony Vaio laptop. During the presentation, he simply stuck an OpenLinux CD into his laptop, and a few button-presses later the laptop rebooted into a slick graphical installation program. He selected some packages and pushed the installation button. It was easy. It had pictures and icons and all the usual trappings of a graphical interface. It had a Tetris game to play while you waited for the operating system to finish installing. A little while later, the process was complete, and he brought up the KDE 1 desktop.

It was astounding: you could get this entire operating system, which looked quite comparable to Windows or MacOS, which were selling for hundreds of dollars at the university computer store, for free (or, if you wanted a box and manuals, for forty dollars or so). And you could get all the source code, not only for the kernel but for nearly all the applications. And it came with programming tools, like an industrial-strength compiler, right out of the box --- a major plus for computer science students. And, thanks to Caldera's work in packaging it all, it was accessible to people who'd never run Unix before.

My friend PSP and I snagged a bunch of free OpenLinux CDs on our way out. A few months later I installed Linux on my new machine --- the first one I ever assembled from parts --- and no computer of mine has been without Linux since. (I'm posting this from a laptop running Fedora.) Since then, other vendors like Red Hat and SuSE have surpassed Caldera, but nevertheless Caldera provided my first introduction to a Linux that I could install and use.

The evangelist's name was Ransom Love, and he was the CEO of Caldera Systems, Inc. He was sharply-dressed, with a real, styled haircut, and his personality exuded an odd mixture of geek and suit. He was charismatic and cool and at least acted like he was genuinely excited to be showing off Linux for a bunch of undergrad computer science geeks.

In the years since those heady days, the tech bubble burst, Caldera bought/merged with SCO, and the resulting monster has become a company that will live in infamy. It's sad.

Anyway, Love departed well before the current SCO/IBM debacle began. He's now at Progeny, the Linux company founded by Debian creator Ian Murdock (hence probably still on the side of Good). And Love still has some interesting stuff to say about his former company and its relationship with IBM.

UPDATE: For further fascinating reading, check out this ZDNet article on the history of Caldera: Love and his associates were working at Novell, trying to convince upper management that it had to move to Linux, way back in 1994. First of all, this demonstrates astounding foresight, if you think back to where Linux was in 1994. Second, it's a bit eerie/ironic, because if you flash forward a decade, Novell has bought both SuSE and Ximian, two of the Linux community's leading lights, and has finally admitted that Linux is the best bet for the company's future.

Thursday, December 11, 2003

Making a mix tape for someone special?

Don't force the recipient to ferret out obscure subtexts. Make it simple.

A good monospace font is hard to find

Anonymous (2001) is a TrueType version of Anonymous 9, a freeware Macintosh bitmap font developed in the mid-90s by Susan Lesch and David Lamkins. It was designed as a more legible alternative to Monaco, the mono-spaced Macintosh system font.

I still swear by Lucida Console, but this isn't bad. If you're a programmer, you probably spend several hours a day staring at monospace fonts. Give Anonymous a try.

I'm glad we cleared that up.

From the ATF's FAQ:

(A29) Are "potato guns" or "spud guns" legal?

"Potato guns" or "spud guns" generally consist of sections of PVC plastic tubing and fittings and are designed to launch a muzzle-loaded potato (or other similar-size projectile) using hair spray or other aerosol vapor as a propellant. The propellant is ignited by means of a barbecue grill igniter or other similar ignition system.

...

ATF has previously examined "potato guns" or "spud guns" as described above and has generally determined that such devices using potatoes as projectiles and used solely for recreational purposes are not weapons and do not meet the definition of "firearm" or "destructive device" in either the NFA or GCA. However, ATF has classified such devices as "firearms" and "destructive devices" if their design, construction, ammunition, actual use, or intended use indicate that they are weapons. For example, ATF has classified such devices as "firearms" and "destructive devices" if they are designed and used to expel flaming tennis balls.

Dammit! Now I have to dismantle my flaming tennis ball gun!

Sunday, November 30, 2003

Deconstructing Wilco

The song "Heavy Metal Drummer" on Wilco's Yankee Hotel Foxtrot has the following lyric, which, if you're moderately familiar with deconstruction, will just about make your head explode:

I miss the innocence I've known
playing Kiss covers
beautiful and stoned

The speaker's putative pre-fallen "innocence" reveals itself in the very same line as something that is "known". Just as the fruit of knowledge of good and evil (which led to the archetypal ex-Edenic Fall) itself grew in the Garden of Eden, the innocence of the speaker itself harbors the decidedly fallen (influence-polluted or "knowing") act of "playing Kiss covers". The artist's original voice of innocence is always already fallen, in the fully Derridean sense.

But the really striking thing about this passage is that the speaker's conception of innocence strikes us as completely banal. A century and a half ago, Wordsworth and the Romantics sought out their original voices in pristine nature, seeking a sublimity that transcended the merely human. In their own, arguably more complicated way, so did Emerson and the Transcendentalists. Of course, however majestic its artistic achievements, the project was doomed --- when we seek out transcendence, we are chasing an inevitably human idea (deconstruction does get that much right). As Wilde famously remarked, Wordsworth "found in stones the sermons he had already hidden there". But nevertheless the binary opposition between nature and artifice persisted as a powerful feature of consensus reality well into the twentieth century.

Denying (or deconstructing) that opposition once had a certain noteworthy frisson. Indeed, the pulsating ebb and flow of a character's illusory escape from (and reintroduction into) social constructions of identity is the engine that powers much great High Modernist fiction, from A Portrait of the Artist as a Young Man to Invisible Man.

No more. [*] The contemporary construction of innocence no longer even bothers to wear the fig leaf of freedom from a priori influence. Innocence is knowingness, so much so that we don't even find it remarkable when people conceive of youthful innocence in terms of playing Kiss covers.

(Unless, that is, you're a former English major who occasionally falls off the wagon of literary theory abstinence and compulsively overanalyzes some random pop cultural artifact.)

In any case, Wilco's album kicks ass. And no, I am not stoned (or beautiful).

[*] This is a placeholder to indicate the spot where, if I were a real journalist, I'd probably insert a formulaic Matrix reference to give my article more editor-pleasing topicality.

As a member of the middle class...

Paul Ford on class and money:

As a member of the middle class I buy things. But the rich do things with their money. I've watched wealthy men and women turn a million dollars into respect, a partnership, a new business. They convert their funds into opportunities and relationships, translate cash into power, amplify their ideas into businesses, summer cottages, and tax shelters.

...

I've tried to learn the language of money, to fake the speech of the financiers, but I've had to accept that money is simply not a medium in which I can work. I am not a native speaker, and my middle-class accent betrays my ignorance.

(Via Everything Burns.)

UPDATE: Ah, what the hell, read this too.

Saturday, November 29, 2003

Ah, misspent youth...

Via MeFi comes an incredible 11-minute complete playthrough of Super Mario Bros. 3 (18MB Windows Media file).

I've been wondering why watching this movie is so satisfying, and I've reached the conclusion that it's rather like knocking back your first double espresso after a lifetime of sipping Coca-Cola.

Here's an interesting question: why is playing Super Mario Bros. 3 so much more fun than, say, balancing your checkbook in a spreadsheet, which also involves manipulating little electronic symbols on a screen with a sequence of button presses? It's because SMB3, like all great games (video or otherwise), delivers a constant stream of little sensory rewards to the user for completing certain actions --- the jingle that plays when you earn a 1UP, for example, or the "plick!" sound (accompanied by the visual kick of seeing Mario double-jump) when you step on a Koopa, or the resounding "ding-ding-ding-ding" of grabbing a line of coins. The simple aesthetic pleasure that you experience when you receive these rewards makes the game rather like a drug: each little victory is like a "hit" of the drug that keeps you playing in order to score another one. Which is why the ultimate compliment one can bestow on a great game is to call it "addictive".

When you string together a bunch of these victories --- making Mario leap through a sea of coins, bounce off a string of enemies, and then fly into the sky on his raccoon tail --- the "hit" becomes a sustained high, the dime-store cousin of watching a virtuoso performance in music or professional sports. Watching someone play through an entire game flawlessly is like mainlining a clean pure gram of the drug that you've only tasted previously in watered-down and adulterated form. (The experience is only slightly diminished in this case by the fact that this movie was produced by playing back a recorded series of button-presses on a SNES emulator.)

As I've said before, I've basically quit playing video and computer games; but this movie brought back fond memories.

UPDATE: As I was writing the above, I had a dim memory percolating around my brain about something I'd read that discussed reward structures in games, but I couldn't pin it down. I've finally tracked down the reference: see Daniel Cook's thoughts on "reward schedules" and John Hopson's article on behavioral game design.
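
(For the programmers in the audience, here's a tiny toy sketch of the fixed-versus-variable reward schedule idea those articles discuss. It's my own illustration, with made-up numbers, not code from either article. The variable schedule is the classic slot-machine trick: same average payout, but you never know which press will be the lucky one, and that's exactly what keeps you pressing.)

    import random

    def simulate(schedule, presses=30):
        """Return the presses (numbered 1..presses) on which the player gets a reward."""
        return [i for i in range(1, presses + 1) if schedule(i)]

    def fixed_ratio(i):
        # A coin every 5th press: predictable, and quick to start feeling like a chore.
        return i % 5 == 0

    def variable_ratio(i):
        # Each press has a 1-in-5 chance of paying off: same average rate as above,
        # but you never know which press will be the lucky one.
        return random.random() < 0.2

    random.seed(42)
    print("fixed-ratio hits:   ", simulate(fixed_ratio))
    print("variable-ratio hits:", simulate(variable_ratio))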

Tuesday, November 25, 2003

Vollmann completes mammoth meditation on violence

W. T. Vollmann's Rising Up and Rising Down is finally available for ordering from McSweeney's. An excerpt was published in McSweeney's No. 9, and it was totally compelling reading: a record of Vollmann's visits to the Paris catacombs and a Chicago autopsy room, and his correspondence with a friend in Sarajevo. For me, the prose in that piece has the rare and magical quality of being simultaneously discursive and magnetic; it tugs me along even as it spirals and loops unpredictably around its unbearable subject like a plot of the Lorenz attractor.

Rising Up and Rising Down stands a long way from the snarky po-mo pop-culture in-jokes on McSweeney's Internet Tendency. Vollmann's book is a serious piece of work; and given its seven volumes (weighing 20 lbs. in total), and its undoubtedly minuscule audience, it's a nearly heroic effort on the part of the McSwys crew to publish it.

I'll definitely be ordering a set, however thinly it stretches my shabby grad student finances. That is, if the limited print run hasn't sold through by tomorrow.

Academics, children, gender

Two posts at Crooked Timber worth reading: on the childlessness of women vs. men academics; on the increasing duration of the professorial track in recent decades.

Off-the-cuff reaction #1: If you're a woman academic who wants to have children and get on the tenure track at a competitive research university, you should look for a man who's likely to stay home and take care of the kids. With all that entails. Unfortunately, most women's libidos are wired by culture to respond most strongly to men who are at least somewhat aggressive, ambitious, and dominating. Alas, these traits do not correlate strongly with the tendency to become a stay-at-home husband.

(Off-the-cuff reaction #1.5: Perhaps women who find their libidos wired in this tragic fashion could resolve this conundrum the way that men have traditionally resolved the converse conundrum. Let the sensitive house-husband raise the kids; cheat on him periodically with aggressive, dominating men. In case the "converse conundrum" for men is not obvious: most men are wired, by culture, to respond to perky taut-bodied young women who hang on their every word, which does not usually describe a wife after she's had a few children. The traditional male solution has been to cheat on the wife with a younger woman. I leave it to the reader to puzzle out whether I am seriously suggesting that women academics cheat on their model-father husbands, or whether I am suggesting that women academics should reconcile their libidos with their circumstances. I am not sure which of these suggestions is more offensive, but they seem like the obvious alternatives. Of course, the statistics tell us that most women academics are not having children, and hence probably not married to homemaker Dads, so this hypothetical situation isn't exactly representative of reality anyway.)

Off-the-cuff reaction #2: The inescapable conclusion of the second CT post is that, because of the asymmetry in male vs. female fertility, the current academic system is structurally biased towards men. The 22-34 window, during which you have to get your Ph.D. and then immediately work your ass off to get tenure, overlaps with most of a woman's prime childbearing years. This sounds utterly obvious but it has never before hit me with the same force.

We should move to a system in which junior faculty can take parental leave for a few years, sometime before the tenure review. Or, we should remove the stigma attached to getting your Ph.D. in your 30's instead of your 20's; and make it easier for grad students to take a few years off from the program after quals. Or all of the above. These reforms would have major benefits for men as well: not everyone fits into the classic 12-year career path anyway.

UPDATE: Inky points to a Nature article about a European Commission conference whose subject is promoting women in science.

Wednesday, November 19, 2003

Puget Sound Orcas in decline

Some activists are suing to get them placed on the endangered species list; the activists' science sounds dodgy ("The [activist] groups ... say the Puget Sound orcas are genetically distinct from other killer whales and therefore should be on the list.") but it would nevertheless be sad if our local pods were to disappear:

The Puget Sound whales include three pods of orcas and about 84 individual whales. That is down from more than 120 in the 1960's, before they were captured in large numbers for display at marine parks.

Monday, November 17, 2003

Hot or not, watch out

Here comes Rate My Kitten. It sounds so dirty, and yet it's so very very clean and wholesome! I can't stand it! I think my head is going to implode!

Sunday, November 16, 2003

Intolerable Cruelty

Saw Intolerable Cruelty last night with SL. I can't figure out why the reviews have been so mixed; I think it's the Coens' second funniest movie (after, of course, the insuperable Big Lebowski), and probably the funniest movie I've seen, period, since Adaptation.

The Coens have toned down their distinctive style quite a bit, and yet this movie's getting mediocre reviews from the same critics who used to complain about their allegedly stupid stylistic tics. How conventionally does a movie have to be shot before it makes these people happy?

And I don't understand --- indeed, I have never understood --- the chorus of perennial complaints that the Coens indulge in too much reflexive irony, i.e. they ask us to laugh at their characters, rather than with them. Maybe it's a generational thing? To me (and, I suspect, virtually everyone who's roughly my age --- I've literally never spoken to anyone in their 20's who doesn't like Lebowski), there's no bright shining line between irony and sincerity; the two bleed into each other like yin and yang. So what if you're giggling throughout Miles Massey's Faustian confrontation with his boss Herb Myerson? So what if you guffaw when Walter flubs the disposal of Donny's ashes? Does the laughter really subtract from the pathos, or does it sharpen it?

Plus, comparisons to old classics like The Lady Eve and Trouble in Paradise remind me of those Baby Boomers who are perpetually going on about how Bob Dylan's the greatest bard of the century and all the rock music today just can't ever measure up. Hey, guys and gals, maybe there's actually something a little different going on here? Granted, not completely different, but at least something worth judging on its own terms?

And, finally: George Clooney is awesome. Can any other living actor so effectively combine leading-man charisma with dead-on comic timing and an unwavering willingness to be vulnerable or silly or just plain dumb? If he were a little less incredible, straight men the world over would have to hate him for being so perfect. As it is, I can only admire him for being even better than perfect.

Saturday, November 15, 2003

Best of this month's Crypto-Gram

It's the 15th again, and you know what that means, kiddies: a new Crypto-Gram. My favorite links:

  • How to find hidden cameras (258K PDF). If you venture anywhere in public, or even private spaces that are owned by businesses, you're probably in the sights of one of these cameras. The paper's got lots of fascinating details on camera design, common hiding places, and even countermeasures (most of which, alas, are beyond the budget and expertise of the casual citizen). Stuff like this makes me seriously want to buy a portable EMP device, walk into a Wal-Mart carrying it in my backpack, pulse the place, and walk out.
  • ID numbers: it turns out that your driver's license number may contain a bunch of encoded information about you. They don't just assign them consecutively or randomly. Also some information on credit card numbers, social security numbers, etc.
  • GrokLaw on security of Microsoft vs. Linux.

Teflon Considered Teratogenic?

Time to buy a cast-iron skillet, maybe? Caveats:

  • The EPA is still reviewing the subject and has not issued any warnings about Teflon products.
  • It's not clear that the reported birth defects are statistically significant.
  • The 20/20 transcript is maddeningly vague on the exact levels of C-8 found in the blood of humans who use Teflon. Is it one tenth of carcinogenic levels? One millionth? Nor are they very clear on the differences in exposure between DuPont factory workers and ordinary people who cook with Teflon pans or wear Gore-Tex.

On the other hand, the "Teflon flu" is definitely real, by DuPont's own admission. Don't leave your Teflon pans on the range too long. (Once again, the 20/20 transcript is infuriatingly vague on this subject: exactly how long, on a typical home range, do you have to heat a pan before it reaches the 554-degree point where particulates come off the Teflon?)

I'll be keeping my eyes open for the EPA's report. Most of my cookware is Teflon-coated.

Thursday, November 13, 2003

"Think I'm in love/Probably just hungry"

L. Helmuth reports for ScienceNOW on two fascinating presentations from the November 11 meeting of the Society for Neuroscience. ScienceNOW requires a non-free subscription but some excerpts follow.

First up: A recent study by H. Fisher (of Rutgers), A. Aron, D. Mashek, G. Strong, and L. L. Brown provides some insights into the neurochemistry of love; from Helmuth's article (emphases mine):

College students participating in the study of romance had been with their One True Love for between 2 to 17 months and they displayed all the classic, feverish, delusional symptoms: obsessive thinking about their partners, sleeplessness, euphoria when things are going well. ...

These lovebirds --- seven men and 10 women --- then went into a functional magnetic resonance imaging scanner ... Regions of the brain involved in the motivation and reward system lit up in response to the loved one, including parts of the caudate nucleus and the ventral tegmental area. ...

These results differ from those of a previous study ... which imaged the brains of people who'd been in relationships for more than 2 years, on average, and found lots of activity in emotional areas such as the insula and anterior cingulate. Fisher's team reexamined their data and found that the subjects in relatively longer-term relationships also activated these emotion centers when viewing their loved ones.

So --- watch my cynicism swing into action --- the first flush of infatuation, with butterflies in your stomach and palpitations in your heart, has much more in common with basic physical urges like hunger or arousal than with genuine emotions. Spiritualized had it right all along. For infatuation to become an emotion, rather than merely an urge, you have to wait for at least a few months. Maybe a year or more (hard to tell from the articles and the abstract; this work has not yet been published as a complete, peer-reviewed paper).

Furthermore --- putting on my mad scientist hat --- maybe this research points to a solution to the problem of diminishing passion in long-term relationships: we just need a pill that stimulates the caudate nucleus and the ventral tegmental area. Whatever the hell those are. I suggest we call it Caudela---

Not enough spice in your marriage? Spouse doesn't send shivers down your spine anymore? Does your heart not burn with longing every moment that you are apart? Do you no longer feel that pulsating stream of joy every day that you go to bed and wake up next to each other?

Ask your doctor about Caudela™

DISCLAIMERS: Use only as directed. Caudela™ may not be suitable for all patients. Side effects may include insomnia, shortness of breath, cardiac arrhythmia, loss of balance, mood swings, stupidity, and selective blindness. Desperate singles and ovulating women should exercise extreme care when using Caudela™. Severe withdrawal symptoms have been observed from discontinued use. Some patients require psychological counseling when beginning or ending treatment. Older subjects, on the other hand, may experience sensations of relief when Caudela™ treatment ends and they can settle down to their saner, duller lives.

In other news, Helmuth reports on a study on orgasm in the fairer sex:

A brain-imaging study shows that, during orgasm, women's brains have about the same pattern of activity as men's. ... Compared to clitoral stimulation alone, orgasm caused greater activation in several parts of the brain, including the same reward region tickled by romantic love, the ventral tegmental area. The main difference between the sexes was a deep brain area called the periaqueductal gray. It's also the sine qua non of the female sexual response in cats, rats, and hamsters; if it's damaged, the animals don't assume a mating position. Other than that, the brain activity "is very much the same as during ejaculation in males," says Holstege.

Most men have, at one time or another, suspected that orgasm might somehow be better for women. Now we know that, well, it's basically the same. I don't know whether to be relieved or disappointed.

Anyway, this stuff is all cool. Sometimes I wish I'd been a neurologist. Alas, that I have but one life to give.

Alternate links for the Fisher et al. study on the neurology of infatuation:

Tuesday, November 11, 2003

"Content" may go a progress through the guts of a worm...

...but otherwise it is none-too-kinglike. A couple of years ago it was fashionable in "new media" circles to proclaim that "Content is King" --- i.e., that the Internet would be dominated by big media companies that owned the music, movies, and other stuff that people wanted to download.

More recently, people like Jack Valenti are going up in front of Congress and making noise about how piracy is stifling the growth of broadband. The argument goes like this: broadband can only grow if there's demand; only Big Media (music, movies) can produce that demand; Big Media won't provide Content unless they can be sure it won't be pirated. Valenti and his cohort would therefore have us all hand control of our computers and networks over to Big Media. Or, in other words: "Kill the Internet to save broadband!"

The lynchpin of this whole argument is, again, that Content is King.

Well, let AT&T Labs researcher A. Odlyzko disabuse you of these notions. Interpersonal communication, not broadcast, has always been the primary bandwidth consumer of communication networks, even before the Internet. When given networks that are capable of point-to-point communication, people want to communicate with other people, far more than they want to passively consume content. AT&T's old ad campaign had it right: what people really want is to "reach out and touch someone."

Monday, November 03, 2003

Misery is happiness. Ignorance is bliss.

Spent last week at a research conference in smoky LA with my research group. One night, after SL and I had a few drinks at the hotel bar, we ended up talking once again about the nature of happiness. He was claiming that we make all our choices because we believe those choices will make us happy; i.e., that happiness is the ultimate value. Therefore, when we make sacrifices that make us unhappy in one way, we're really doing it because it makes us happier in some other way. Sounds reasonable enough, but my friends who read this blog already know that I disagree completely.

We arrived at this subject while talking about ambition and why we're bothering to finish our Ph.D.'s. Suppose you're an ambitious person who believes that a person's career reflects strongly on his or her value as a human being. You've got a "successful" job, which brings you prestige and intellectual satisfaction; but that job takes a lot of your time, and getting that job entailed sacrifices --- like going to grad school for six years instead of getting a job in industry, making lots of money, and finding a serious girlfriend. Suppose you also know that you might get more pleasure out of life with a much less demanding career: you could take longer vacations; you could spend a lot more time with your future kids; etc. In this scenario, SL would say that, if you choose a more prestigious career over a more hedonic life, you're making that decision because it makes you more truly happy.

That's fine, as far as it goes, and in particular cases it might be true. But SL made the further (and in my opinion erroneous) claim that this example demonstrates a larger principle: Everything we do, we do for happiness. So, for example, if our principles lead us to make great sacrifices that ultimately make us miserable, we're still taking those actions because on some deeper level, it's really making us happy.

I think this claim implies a rather odd definition of happiness. I define happiness as a sensation of joy and contentment --- a definition that clearly distinguishes happiness from non-happiness, and also matches most people's casual intuition. SL's definition has neither of these virtues: it's both tautological (happiness is whatever you choose, because you choose that which (you hope) makes you happy) and counterintuitive. For example, if some dim and dusty corner of your conscience knows that you're doing "the right thing" with your life, but you're nevertheless miserable in your day-to-day experience --- if you're so emotionally tormented that you wake each morning with a palpable lancing pain shooting through your heart, and that pain doesn't go away until you fall asleep at night --- then I would not say that you are "happy" by any usual definition of the word. Yet people knowingly make choices that lead them into this position.

So, to convince SL that he was wrong, I presented him with a dilemma, in the spirit of my three thought experiments: Suppose I could offer you foolproof brain surgery that would make you perfectly content to sit in a corner drooling for the rest of your life. Would you accept this surgery?

"No," he said, "but only because you could never convince me that it would work. What if you make a mistake?"

I said: "Okay, a million people have received this surgery, and every single one of them has reported absolute bliss. Once a month they wake up from their drooling stupor and say, 'Man, I feel so fucking happy! Having the surgery was the greatest decision I ever made.' And then they start drooling again."

He smiled and thought for a moment, then said: "Well, then I'd say yes."

"Most people would not make that choice," I said. "And anyway, you're lying. I don't believe you'd make that choice, if I really offered it to you."

Ultimately, he agreed that he wouldn't accept if I really made the offer. He claimed this didn't contradict his framework, because accepting would make him extremely unhappy in the present. Hence, even though the unhappiness would be short-lived, the intensity of that unhappiness would be so great that it would outweigh the lifetime of happiness that awaited him.

At this point we finally reached our hotel room, so the conversation ended. (We were walking from the elevator --- and if you think this is a long conversation to have whilst walking from the elevator to the hotel room, you haven't been to the Anaheim Hilton.) However, if we'd had more time, I would have said that I wasn't satisfied by his answer. I think his explanation --- that present unhappiness counterbalances future happiness --- does not suffice to explain his rejection. Suppose I could arbitrarily increase both your longevity and the intensity of your happiness --- suppose it would be a continuous, eternal orgasm of bliss. There's only so much unhappiness that the human form is capable of experiencing in a finite time period. If you're really being honest with yourself, there's no way your momentary unhappiness prior to the operation would exceed a million-year orgasm.

So, the notion that happiness is the ultimate value requires convoluted reasoning, and it leads to some suspicious conclusions. Isn't it simpler and more elegant to say that happiness is one among many values, and that we choose among those values based on the form of our characters?

In my opinion, the happiness-maximization doctrine reeks of the fallacy of the Rational Human Being (a close cousin of Economic Man): by making happiness the ultimate value, we can pretend that all our actions stem from rational maximization of the Happiness Utility Function. This gives us the comforting illusion that we're sensible people in control of our destinies.

I don't believe we're rational actors at all. In fact, I don't believe we're even really decision-making entities in the form that people usually assume. I think that our brains long ago evolved to provide so much surplus capacity that mind viruses have hijacked the extra space, in much the same way that a rainforest provides so much surplus biomass and energy that it's home to billions of competing species besides the trees themselves. These mind viruses take many names --- aesthetics, religiosity, curiosity, etc. --- and when you make a decision, it's usually only because one of these viruses has momentarily prevailed over the others.

Your brain is not an organism; it's an ecosystem. Does a rainforest rationally act to optimize its proportion of red ants versus black ants? No; it's just a battlefield where sometimes the red ants win, and sometimes the black ants win. Searching for a motivation behind the victories is meaningless. Ask, instead, for the reason.

Your character has a shape. Your life is the trajectory that this shape carves through the ether of the world, just as a leaf and a rock will carve different trajectories through the air depending on their shapes. Conscience or ambition or any number of things may lead your path away from happiness. Is this a tragedy? Not necessarily. It depends on what you value.

Monoculturalism replacing racism?

Interesting comment from one of M. Yglesias's comments threads:

I'm from the South, and I don't agree that racism is the only thing behind the rise of the GOP and the decline of the Democrats. Racism is on the decline in the South, I believe. What is replacing racism is something that I'd called "monoculturalism" (an invented antonym of "multiculturalism"). Southerners (like people everywhere) want there to be cultural constants that they can assume everyone shares: constants such as belief in God, two parents (one of each sex), love of football, pride in Southern heritage, love of barbeque, etc. Southerners are uncomfortable around people who are too different---such as Muslims, Hindus, homosexuals or vegetarians. Racial diversity is not very important anymore, as long as people of different races can learn to act the same and support the same football teams.

Question: Is this actually an improvement?

Sunday, October 26, 2003

The Happiness Paradox

My friends know that I've been preoccupied lately with the nature of happiness. This morning, my Powell's Review-A-Day subscription brought me a review of The Happiness Paradox by Ziyad Marar. The meat of the argument:

Underlying the question of whether I am happy are two more fundamental questions: "what do I really want?" and "how ought I to live?". Marar takes the first of these to be about freedom; the second has to do with morality and what Marar regards as the basis of morality, namely "justification" -- ie, approval, trust and love.

The paradox of happiness is that we want both freedom and justification, but the freer we become the less we depend on the approval of others, and the more we want the approval of others the less free we become. This would suggest that freedom and justification are mutually exclusive, and that happiness is elusive precisely because it poses this apparently intractable dilemma. ...[deletia]... the desire for happiness consists in the desire for freedom and for approval and this is why happiness is paradoxical. We can live with this paradox only by having the courage both to face the judgement of the audience and the courage to be independent of its judgement and risk humiliation. But courage does not make us happy: it is merely the motor which keeps us going. Happiness is not a goal, but a process: "It's a retreat from security and an advance towards risk, while being a retreat from risk and an advance towards security -- a perpetual oscillation".

All interesting enough; and superficially similar to stuff that I've been thinking and writing recently.* However, I'll be completely unfair to Marar (by rushing to judgment without reading the book) and say that I find this explanation deeply unsatisfying, for the same reason I find nearly all philosophy unsatisfying: namely, it's extraordinarily arbitrary.

OK, sure, we want freedom, and we want justification, and there's a conflict between these urges, but why should this conflict be the key to the nature of happiness? Why should the central problem be the conflict between those two particular urges, and not, say, the desire for sex versus the desire to avoid death? Why not the desire for pleasure versus the desire for power? Why not the desire to have a slice of cheesecake versus the desire to lose weight? I can pick nearly any two desires and observe that they come into conflict. What makes one desire more fundamental than any other?

If you really want to know about the nature of happiness, neurology is a far better place to look than philosophy. Why is happiness elusive? The answer's terribly simple: happiness is a brain state caused by certain stimuli; our brains are built out of neurons; and neurons are learning devices that learn to become accustomed to any given stimulus. We therefore become acclimated to feelings of approval or independence or anything else. If we hacked our neurons not to become acclimated, we might very well be entirely satisfied with our current level of approval, and independence, and everything else too. (We'd probably also be extraordinarily lazy.)
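
(For the morbidly curious, here's how little machinery acclimation needs. This is a toy model of my own, cartoon math rather than real neurology: the "felt" response is just the raw stimulus minus a baseline that drifts toward whatever level the stimulus has been holding at. Crank the stimulus up and hold it there, and the thrill decays toward zero all by itself.)

    def felt_response(stimuli, adaptation_rate=0.3):
        """Toy hedonic-adaptation model: what you 'feel' is the raw stimulus
        minus a baseline that drifts toward whatever the stimulus has been lately."""
        baseline = 0.0
        feelings = []
        for s in stimuli:
            feelings.append(s - baseline)
            baseline += adaptation_rate * (s - baseline)  # the neurons acclimate
        return feelings

    # A big, permanent jump in "approval" (or money, or independence):
    # the first hit feels enormous, and then the feeling decays toward zero
    # even though the stimulus never goes away.
    print([round(f, 2) for f in felt_response([0, 0, 10, 10, 10, 10, 10, 10])])
    # roughly: [0.0, 0.0, 10.0, 7.0, 4.9, 3.43, 2.4, 1.68]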

One might claim that, compared to the plain truths uncovered by neurology, Marar's explanation has more respect for the mysteries and confusions of the human condition; but I think this claim is basically a form of masturbatory, pseudointellectual mystification. Keats was wrong; unweaving the rainbow doesn't destroy its charms, unless you're so small-minded that the only charm you can appreciate is the veil of your own ignorance.

And, as for courage:

Daniel Gilbert, Professor of psychology at Harvard, calls the gap between what we predict and what we ultimately experience the "impact bias" --- "impact" meaning the errors we make in estimating both the intensity and duration of our emotions and "bias" our tendency to err. The phrase characterizes how we experience the dimming excitement over ... any object or event that we presume will make us happy. ... Gilbert has noted that these mistakes of expectation can lead directly to mistakes in choosing what we think will give us pleasure. He calls this "miswanting."

... [deletia] ...

[George Loewenstein] goes on to describe the "empathy gap", the difference between how we behave in "hot" states (those of anxiety, courage, fear, drug craving, sexual excitation and the like) and "cold" states of rational calm. This empathy gap in thought and behavior --- we cannot seem to predict how we will behave in a hot state when we are in a cold state..."These kinds of states have the ability to change us so profoundly that we're more different from ourselves in different states than we are from another person." ... He also adds that a better understanding of the empathy gap --- those hot and cold states we all find ourselves in on frequent occasions --- could save people from making regrettable decisions in moments of courage or craving.

This doesn't mean we ought to ignore every moment of courage or craving, but to me this sort of empirical research says far more about the real nature of courage than most philosophy (and certainly more than the gloss on Marar's philosophy rendered in the review quoted above).

* I think the difference between Marar's thesis and what I wrote earlier is that I think happiness is an instrumental good which motivates you to achieve your various ultimate goals. Marar assumes that happiness is the ultimate good, and furthermore that achieving this state requires an oscillation between two particular instrumental goals (freedom and justification).

Thursday, October 23, 2003

Why we need elective calliagnosia

The Chronicle of Higher Education notes that good-looking professors get better evaluations. The article ranges from hilarious...

Mr. Lang has always earned high marks from his students at Assumption College, but he doesn't consider himself a "Baldwin" (for the clueless, that's a term for a hot guy, popularized by the movie Clueless). Apparently, though, some of his students do. More than one of them has made comments about his "buns" on student evaluations.

Now the assistant professor of English says he's self-conscious about his looks and his teaching. "I work very hard at my teaching," he says, "and I am a little disturbed at the possibility that students are evaluating my courses based on such a superficial criterion." He wonders if he's as good a teacher as he thought he was, and he's afraid to turn his back to his classes to write on the chalkboard.

...to surprising...

Some male professors also may be dismayed about another finding of the study: "Good looks generated more of a premium, and bad looks more of a penalty, for male instructors," say Mr. Hamermesh and Ms. Parker in a paper about their findings, "Beauty in the Classroom: Professors' Pulchritude and Putative Pedagogical Productivity." According to their data, the effect of beauty (or lack thereof) on teaching evaluations for men was three times as great as it was for women.

I'll do what the Chronicle does not, and link to the original academic paper: summary at NBER; alternate link.

BTW the title of this post comes from a term used in Ted Chiang's marvelous story "Liking What You See: A Documentary", which appears in his collection Stories of Your Life and Others (available in hardcover and paperback).

Tuesday, October 21, 2003

David Foster Wallace on Infinity

I'm sad to report that Quicksilver was a bit of a disappointment (maybe one of these days I'll write up why), but today I found something new to look forward to: David Foster Wallace's latest, Everything and More, a discourse through the history of the concept of mathematical infinity.

DFW's pretty much my favorite author these days. He's not only an excellent writer; he's uncommonly perceptive in thinking about writing: what makes writing work, the relationship between writers and readers, and the place of writing in our larger contemporary culture. Also, as an undergrad majoring in philosophy, he specialized in mathematical logic and semantics; maybe it's because I double majored in English and computer science, but there's something about the way his mind turns around in his writing that really clicks with me.

Of course, reading the new Wallace book is yet another claim on my copious free time. Sigh.

J. B. DeLong weighs in on globalization

A while back, I was quite surprised by leftist George Monbiot's series of articles ([1], [2], [3]; originally from The Guardian) defending the WTO. Now, it looks like Brad DeLong is weighing in on the side of globalization, with a long quote from a Nation article by Doug Henwood:

As the results of the ministerial show, the WTO was never really the institution its critics said it was. From the outset, it wasn't really dominated by big capital in the rich countries. It's a one-country, one-vote system, like the UN's General Assembly. The rich countries, especially the United States, don't like this arrangement. They prefer the Security Council, with its big power vetoes. The United States is especially fond of the structure of the International Monetary Fund and World Bank, where votes are weighted roughly by GDP, giving the United States a 17 percent share of the vote and an effective veto. The rich countries finance the various institutions in revealing ways. At the Bank and Fund, both salaries and headcounts are high. The WTO has a small staff that's engaged in industrial action over pay and working conditions. As Columbia University economist Jagdish Bhagwati points out, the WTO's entire budget is smaller than the IMF's travel budget.

What might a weaker WTO mean? There was no sign of disappointment coming from the Bush Administration: US Trade Representative Robert Zoellick was quite optimistic after the talks collapsed. Zoellick hopes to induce a regime of what he calls "competitive liberalization," with countries eager for access to US markets fighting among themselves to please Washington. The US government is happy to negotiate separate deals with individual countries; it's always going to be the stronger party in any bilateral conversation. A weaker WTO will only stimulate the Bush Administration's unilateralist lusts. One of the organizers of the Cancún demonstrations told me people in the streets knew that what they were doing would strengthen the United States, but they wanted to damage the WTO regardless.

DeLong himself adds:

At one level, I want to tell Doug that although he is very welcome, he is about four years late to this particular party. Back when he was having his exhilarating time in Seattle, the protesters included:

  1. Hollywood workers who objected because NAFTA did not prohibit the Canadian government from subsidizing Canadian culture.
  2. NGOs that argued that Mexico's urban poor should under no circumstances be allowed to buy cheaper tortillas made from Iowa corn.
  3. U.S. steelworkers who argued that Brazilian steelworkers needed to lose their jobs, now.

With no vision of what a better world would look like, the "anti-globalization" movement was from its birth doomed to become the puppet of whatever particular bunch of special interests catches their fancy--whether it is U.S. steelworkers who want Brazilians and Koreans to lose their jobs, subsidized Korean farmers who want to keep Filipinos and Indonesians poor, Louisianians who are upset by imports of Vietnamese catfish, or whoever.

DeLong's last paragraph overstates the case against the anti-globalization movement; but as of now, I am officially an "off the reservation" liberal w.r.t. globalization and the WTO. Global trade is a good thing. No Brazilian or Vietnamese company is going to beat Intel or AMD at making computer chips anytime in the foreseeable future; they won't even come remotely close. Ergo, the only way for such nations to acquire a competitive information technology infrastructure is global trade. The same holds for countless other kinds of goods and services.

Given that global trade must exist, some international entity must govern it. The alternative is a power vacuum which the most powerful player (viz., the USA) will fill. Given that some governing entity must exist, it may as well be the WTO as anything else --- as Monbiot points out, the IMF and the World Bank are fundamentally less egalitarian by design; and the anti-globalization forces haven't put forth a better alternative, unless you think anarchy is a better alternative. The right goal is to reform the institution, not to tear it apart.

Sunday, October 19, 2003

Notes on Kill Bill

Saw Kill Bill last night with SL and J (coincidentally, it looks like MS did too). The advance buzz has mostly been about the excessively gory, relentless, and conscience-free violence, which I did find annoying and cheap --- in a culture where a murderous robot can be elected governor, it doesn't take much imagination or courage to make a movie that merely depicts even more trivialized violence --- but I thought the real failure was that the action simply wasn't shot that well. Yes, I know this seems like a ridiculous assertion, given Tarantino's obvious visual fluency in general, but bear with me.

Tarantino shares the misapprehension, disappointingly common among American blockbuster directors, that the point of cinematic action is its result: he's quite careful to convey that the table gets knocked over and smashed up, or that so-and-so's body part gets hacked off, or that the blood gets splattered against the walls. But he fails to portray effectively the motion of the human body, which, for me, is what it's all about. Action is a form of dance. Well-executed action has a certain rude grace, and for me it's only satisfying when the film captures that grace visually.

It's instructive to compare the fight scenes in Tarantino's film with those in The Matrix or Crouching Tiger, Hidden Dragon. Of course, those other movies had quite different aesthetic aims, but all three share the same choreographer; and the Wachowski brothers and Ang Lee were both, in their own ways, vastly better at showing you the action than Tarantino. When Morpheus and the Agent are fighting on top of the 18-wheeler, you never doubt for a moment exactly how the Agent knocks Morpheus over the side. You grasp every instant, every block and punch and reversal, and you can appreciate the impact of each individual motion. Likewise with Jen Wu and Shu Lien's first fight, following the rooftop chase sequence; this fight was additionally masterful because the specific motions of the fight/dance were actually an embodiment of the differences between their characters. Swashbuckling, romantic rebel Jen Wu repeatedly tries to take flight, and is pulled back to earth by Shu Lien, whose misguided attachment to tradition later dooms her own love. Compare these to Kill Bill's climactic showdown between O-Ren Ishii and Black Mamba: the swords clash (you can't tell how they clash; you see steel flashing and you hear the clang, but you don't know what's really going on); Uma Thurman falls; Uma gets up; the swords clash again; Lucy Liu gets scalped; game over. The difference between Ang Lee and Tarantino is like the difference between a comedy with genuinely funny, sharply-written dialogue, and one where the dialogue makes no sense but you know it's supposed to be funny because of the laugh track.

In short, Tarantino's movie depicts not so much action itself as the idea of action. Tarantino makes all the right manneristic cinematic gestures associated with action, and he positively relishes the gory result of action, but ironically he doesn't have much sensibility for the action itself. He's better at showing gushing blood than bodies in motion.

Now, all that said, Kill Bill wasn't exactly a bad movie. The music was great, and the film's packed with visual style. There's a cool anime sequence, in which the action's actually quite effectively conveyed (which is ironic, since there's much less motion in this sequence than in, say, the big swordfight in the club). But in a movie with so little in the way of dialogue or character, the action's all there is; and if that's not spectacular, then the movie can't be entirely satisfying.

Saturday, October 18, 2003

Song lyric of the day

We communicate more and more
In more defined ways than ever before
But no one has got anything to say
It's all very poor
It's all just a bore

--- Stereolab, "The Seeming and the Meaning".

You'd almost think they're talking about the Internet, except this song dates back to 1992.

Monday, October 13, 2003

Bestest post office on Earth to become train station

Maybe I'm just a sentimental expatriate New Yorker, but: *sob* they're taking away the post office on 33rd and 8th! This was the only public post office I knew in the metro area where you could send packages through USPS really late at night. Also, ever wonder how that line about "neither snow nor rain" came to be associated with the mail?

The building, which stretches across two city blocks, with a grand sweep of granite stairs rising to a Corinthian colonnade, will forever be linked to postal lore because of the engraving that runs above its 280-foot frieze: "Neither snow nor rain nor heat nor gloom of night stays these couriers from the swift completion of their appointed rounds." The quotation, inspired by Herodotus, was selected by the building's architect, William Mitchell Kendall, and over time became the postal service's unofficial motto.

As if Republicans weren't evil enough --- it's their obscene grave-dancing national convention that's providing the wedge to get the post office out.

Sunday, October 12, 2003

Wage discrimination and gender

Alas, a Blog's recent series on wage discrimination and gender is quite informative; I hope someday the Alas folk will package this series up as a PDF, like David Neiwert's Rush series. The point that hits closest to home for me, as an apprentice scientist:

What the Nature study did was examine productivity (measured in terms of publications in scientific journals, how many times a person was a "lead author" of an article, and how often the articles were cited in scientific journals) and sex. Publication in peer-reviewed scientific journals is often considered to be the most objective and "concrete" sign of accomplishment in the sciences. These factors were then compared to how an actual scientific review panel measured scientific competence when deciding which applicants would receive research grants. Receiving grants like these are essential to the careers of scientific researchers.

The results? Female scientists needed to be at least twice as accomplished as their male counterparts to be given equal credit. For example, women with over 60 "impact points" - the measure the researchers constructed of scientific productivity - received an average score of 2.25 "competence points" from the peer reviewers. In contrast, men with less than 20 impact points also received 2.25 competence points. In fact, only the most accomplished women were ever considered to be more accomplished than men - and even then, they were only seen as more accomplished than the men with the very fewest accomplishments.

It probably wouldn't surprise the average layperson to learn that computer science is an overwhelmingly male profession. A 9-to-1 male to female ratio is a typical ballpark figure at all levels, from undergrad major enrollment through junior faculty, though it gets worse as you go up the ladder. Furthermore, my own specialization (programming languages and tools) is even more overwhelmingly male. When I go to the top conferences*, there will typically be a mere handful of women in a room of two hundred researchers.

I don't think these population numbers reflect disciplinary sexism. In fact, I think my profession's less sexist, on average, than society as a whole; and certainly the individual women researchers with whom I'm familiar are quite well-regarded. But the sobering studies cited by Alas do reflect disciplinary sexism (in science in general), and should lead us all to question the way we approach women and their work. Sexism in this context won't generally be a conscious act --- it could be a mere statistical differential in the probability that we'll cite particular papers, chat with particular individuals at conferences, or discuss particular people's work with others (the latter two are quite important for generating "buzz" around a researcher's work).

This leads to an interesting question: should scientists practice a form of "personal affirmative action" for women researchers, whereby we consciously make an effort to pay extra attention to women and their research?

* Note for any curious scientists in other fields who may be reading this: conferences are a much bigger deal in computer science than in most other sciences. The year-or-more lag time for reviewing journal papers means that cutting-edge research is basically never published in journals. If you want to keep up, you must attend conferences, and publish your best research in them. As a result, the bar for conference publications is also much higher than in other fields: the top conferences in programming languages are well-known for rejecting even quite strong submissions because the competition's so intense.

Once you get a paper into a good conference, you may flesh it out with all the details and tedious proofs and attempt to publish the extended version in a journal; but this step's not strictly necessary to build a reputation. A fresh Ph.D. could get a faculty job at a top-ten institution without a single journal paper.

Cult of Shirky holds forth on future of file sharing

The latest NEC-list post discusses the probable future of file-sharing behavior. NEC (Networks, Economics, and Culture) is a low-traffic mailing list authored by Clay Shirky, who's something of a cult guru in new media circles. When I was an undergrad, I worked for a company that did an earlier iteration of his website; he was regarded by others with a curious reverence that nobody in our company could quite understand or justify. Anyway, he's interesting enough, and the list low-traffic enough, that you have no good reason not to subscribe to it.

Saturday, October 11, 2003

On Lost in Translation

Saw LiT last night, with SL and others. If you follow film at all, you've probably heard all about it by now, so I'll just write that, frankly, I don't see the big deal. It's a good movie, sure, but I've seen plenty of better movies in the past twelve months. The film's pacing and narrative drive seemed not only slow (I have no problem with slow movies; see the links in the previous sentence) but lazy. Many reviewers have found ways to excuse this quality, but I find their assessments dubious. Take, for example, Salon's Stephanie Zacharek:

The picture's muted intensity isn't just a vague mood -- it's a subtle but very specific type of narrative drive. Coppola (who also wrote the screenplay) is a stealth dramatist: Instead of unfolding in precise pleats, her movies unfurl like bolts of silk. There are no handy place markers between scenes to help us tick off how many minutes are likely to pass between this or that point of conflict and the denouement. Revelations don't click into position; they swoop down, seemingly from nowhere, and settle in quietly, like a bird coming to roost.

To some people, this is a maddeningly diffuse type of filmmaking, but I'd argue that Coppola's precision is simply the sort that's measured in sine waves, not milliseconds.

That's pretty much all bull, but the last sentence is especially vacuous --- Zacharek's straining for a metaphor whose vagueness underscores the very point she's trying to refute: namely, that the movie's fat and fuzzy around the edges.

A film about messy, vague feelings does not itself need to be messy and vague. Steven Soderbergh would have pared this movie down to its sharp and devastating bone, and given you a boatload more visual pleasure to boot. One might hesitate, at first, to compare Sofia Coppola's second film with the best works of an accomplished master like Soderbergh. But with critics like Zacharek and Roger Ebert (among many others) calling this movie an instant classic, it seems only fair to ask: how does it stack up against great movies that received similar, or even lesser, praise? And the answer, when considered dispassionately, is that it suffers by the comparison.

It's better simply to say that Lost in Translation's a good movie, of the sort that, absent the present groupthink hype, would normally make a splash on the festival circuit and fade away gracefully. Bill Murray and Scarlett Johansson are both terrific, as usual. Plus I'm a sucker for picturesque nighttime cityscapes, and the Tokyo presented here fills the screen admirably. The film offers many pleasures, but those pleasures are muted and diffuse, like the film itself.

Thursday, October 09, 2003

In case you doubted the Republican Party's insanity...

...K. Drum will set you straight.

Does rejection hurt? (Literally?)

In Science 302:290-292, N. Eisenberger, M. D. Lieberman, and K. D. Williams have shown that social rejection stimulates the same part of the brain as physical pain (journal abstract; HTML journal article (subscribers only); free 131KB PDF at UCLA):

A neuroimaging study examined the neural correlates of social exclusion and tested the hypothesis that the brain bases of social pain are similar to those of physical pain. Participants were scanned while playing a virtual ball-tossing game in which they were ultimately excluded. Paralleling results from physical pain studies, the anterior cingulate cortex (ACC) was more active during exclusion than during inclusion and correlated positively with self-reported distress.

Little did the reindeer understand how deeply they were scarring R. (allusion via)

(via today's Science Now; subscribers only)

UPDATE: New Scientist has further coverage.

Wednesday, October 08, 2003

Video games, ugly buildings, and whores...

...all get respectable if they last long enough (apologies to N. Cross). The Times is paying attention to video game music, although, predictably, they focus on the recent big-budget productions (which, after all, provide the news "hook") and overlook older, smaller gems that made the medium's limitations into virtues. My musician friends would no doubt gag at the following litany, but I'll list a few of the game soundtracks that rang my personal chimes...

First, there's the shockingly diverse music from Star Control 2 (a game which I loved when I was a teenager, and which some cognoscenti class among the greatest games of all time). This bizarre alien-planet bachelor pad music spans and explodes nearly every genre of techno, from ambient to house to breakbeats to weird-ass shit that doesn't even have a name.

Then there was the (much smaller) Privateer soundtrack (alternate links: [1], [2]), whose best pieces were essentially dime-store ambient knockoffs of the soundtrack Vangelis composed for Blade Runner. Call me crazy, stupid, and deaf, but I think there's something genuinely evocative about these minimalist pieces, something that speaks of loneliness and vast distances.

Finally, there's Grim Fandango (unofficial site), which had, among other things, some terrific jazz numbers. Grim Fandango was also a superb game in so many other ways --- the distinctively surreal visual style, the memorable characters, the well-crafted story and dialogue --- that it's one of the few games I've ever played that I'd cite as an example of the storytelling art, on a par with a film or a novel. It's essentially a comic picaresque --- a genre whose "lightness" may lead people to underestimate the work's brilliance --- but it's a superb example of the genre. And, if I recall correctly, at least one other historical medium first discovered its voice in the picaresque.

While I'm at it, I suppose I should also point to OverClocked ReMix, whose existence indicates that at least some electronic musicians were even more taken with computer and video game music than I was.

Anyway, I've basically quit playing video and computer games --- my life's too crowded these days --- and, objectively speaking, I have to admit that if I'd spent those hours of my youth doing something more productive, my life would have been qualitatively better. But, really, that's hardly less true of most people's childhoods. As a kid, you spend too much time watching teevee, or reading junky genre fiction, or playing video games, or (these days) on instant messenger or blogs. The best you can hope for is that you find some accidental beauty along the way.

Monday, October 06, 2003

Speaking of happiness, here's some music

Got my brick of CDs from cheap-cds.com today. So far I've listened, with great pleasure, to:

  • Stereolab's Peng!: truly brilliant. For context, so you know where I stand on the Stereolab canon: IMO Sound-Dust good, Dots and Loops not-so-good, contrary to the opinions of most critics.
  • Spiritualized's Let It Come Down, also insanely beautiful. Nobody mixes krautrock, gospel, noise-rock, orchestral strings, and epic crises of faith like Spiritualized frontman J. Spaceman; from I Didn't Mean to Hurt You:

    I love you like I love the sunrise in the morning
    I miss you like I miss the water when I'm burning.

    When he sings it, it doesn't sound corny; it sounds glorious. It even gets to me in spite of the fact that I'm usually rather hostile to religion, which is one of J. Spaceman's major themes.

Shiny happy transhumans holding hands

In my previous posts on happiness, I said that people only value happiness if it's achieved in a fashion consistent with their ultimate values. To tie this back thematically to transhumanism --- this is yet another reason that some people will choose not to undergo radical alterations, regardless of how happy or powerful the transformed beings become.

Some people value their humanity, however "irrational" that attachment, and any future transhuman society must respect the choice of some people to remain "merely" human, or "merely" anything else. I'm pretty open to radical alterations --- I once told my friends that I thought it would be cool to have my consciousness uploaded into a star for a couple of billion years (yes, that's right, a star, as in a gargantuan flaming ball of plasma; assuming, of course, that you could build a computational substrate capable of supporting consciousness in such a medium) --- but even I have limits. I do not want to be forced to exceed those limits, either by direct coercion or by competition for the resources I need to survive. (Incidentally, the last of these requirements implies that an ethical transhuman society must, to some extent, provide a welfare state for ordinary humans; as I've noted previously, it seems probable that transhumans will outcompete humans in every field of endeavor, making direct economic competition ruinous for humans.)

Foolproof, reversible no-baby treatment for men

Finally. (Alternate link at ABC Australia; also look for this result in a forthcoming JCEM, though the relevant issue doesn't appear to be online yet.)

Well, it'll be some years before this treatment becomes "productized", but it's still heartening. By the time it's been through the development process, I also expect they'll have the implant and shot on synchronized schedules, so you'll only have to go to the doctor 4 times a year instead of 7.

Obviously, it doesn't do anything for STDs, requires advance planning, and likely has a relatively large up-front cost compared to more casual methods like condoms. Therefore (warning: layperson's wild-ass speculation coming up), it probably won't much reduce the societal incidence of unwanted pregnancy --- most people would only use this as the primary contraceptive in the context of a long-term relationship, in which case they'd generally already be using something else if this weren't available. But, it will make couples' lives easier, reduce the pressure on women to be "the responsible one", and give men a near-foolproof veto on unwanted paternity, all of which are good things.

Also, female chemical contraceptives have always had unpleasant side effects for a significant minority of women, probably because a woman's hormonal cycle and reproductive apparatus are simply more complex than a man's. Extended studies will probably uncover side effects for some men, but I'm willing to bet it will be a smaller fraction, with less serious side effects on average, than for women on the Pill or Norplant.

My main question: why did it take so damn long? The treatment doesn't sound terribly complex, and biologists figured out the Pill decades ago. Is the delay a product of social forces or scientific limitations?

UPDATE: New Scientist has further coverage, including the following:

But Anna Glasier, at Edinburgh University's Centre for Reproductive Biology, says it remains to be seen how acceptable couples find such treatment. She and her colleagues carried out an international survey of men and women's attitudes to male contraception.

"We found that the majority of men would prefer a pill [rather than injection or implant], but testosterone cannot currently be made in this form. So, I am not sure how successful the Australian treatment would be," she told New Scientist.

Odd. I can see the appeal of not having to go to the doctor, but the sheer convenience of only dealing with contraception four times a year seems really attractive to me.

Sunday, October 05, 2003

Anything Else

Saw Anything Else last night with SL, AM, and TM. I've seen most of the features Woody Allen's made in the past eight years (except: [1], [2], [3]), and this is the best --- which perhaps isn't saying much, so let me give it higher praise: it's genuinely funny, and although it's vastly inferior to his two late-70's masterpieces, it echoes them with a certain satisfying symmetry.

Unfortunately, the movie wasn't marketed properly, and hence has been a disaster at the box office. It didn't help that many of the reviews have been mysteriously vicious, which I will attribute to some kind of hype-driven critical groupthink: these days it's just not cool to have any opinion about Allen movies, besides "Oh, yeah, well, he's really lost his touch, hasn't he? Oh, and there's his embarrassing personal life."

Wednesday, October 01, 2003

Dorktober report

Spent the evening attending the fall's first Dorkbot Seattle meeting, at CoCA. Tonight's program, from the dorkbotsea-announce list:

This month, Seattle artist and weaver LAURA MACCARY and her father and collaborator, Spokane-based sculptor, poet and electronics expert LAWRENCE MACCARY will be talking about the hows and whys of their "Dialectric" series of interactive artworks, which all consist of an electronic component woven of conductive or resistive material and a circuit designed around the weaving (http://www.maccary.com/ ); TOBY PADDOCK will explore "Magnetic sensors for the unwashed and lazy:  using Hall Effect devices without learning very much".  inside a magnetic sensor is a lot of science, but outside they're small, cheap, rugged, have few wires and are easy to get along with (http://www.seanet.com/~tpaddock/).  Local experimental music duo PINKY & REX will discuss their re-contextualizing of thrift-store toys, toy instruments and electronic oddities to create symphonic walls of sound.

IMO Laura MacCary's weaving-based sculptures were very cool. Pinky and Rex's musical performance was intermittently interesting, though it dragged at times (inevitable, perhaps, with improv techno performed on toy electronic guitars). The rest of the program, so-so. Anyway, dorkbot's weird and unique enough that I think I'll be going to future meetings.

Tuesday, September 30, 2003

Humans: Prepare to meet your strange new peers

(More bookmark cleaning.) Hans Moravec asks: When will computer hardware match the human brain? (Interested readers should definitely read the commentary as well.) The interesting thing about Moravec's analysis is that it's immune to a couple of the standard objections:

  • Standard Objection 1: "AI researchers have been promising imminent breakthroughs for the past 50 years. This is just another wildly optimistic prediction." Moravec notes that, although AI research has been conducted for 50 years, the funding available to AI researchers decreased dramatically over that period, with the result that AI researchers had a nearly constant amount of hardware power at their disposal until fairly recently. He also argues that AI strongly tracks advances in hardware --- the AI software you write for a 100-MIPS computer is not just a bigger, faster version of the software you write for a 1-MIPS computer; it's designed fundamentally differently. Multi-million-MIPS computers will open up fundamentally different approaches, enabling quantum leaps in AI power.
  • Standard Objection 2: "Neurons are far more complex than is popularly believed. Your estimates of brain complexity are far too conservative." This is irrelevant to Moravec's argument, which is based on a functional unit of the nervous system, not on raw neuron counts:

    More computer power is needed to reach human performance, but how much? Human and animal brain sizes imply an answer, if we can relate nerve volume to computation. Structurally and functionally, one of the best understood neural assemblies is the retina of the vertebrate eye. Happily, similar operations have been developed for robot vision, handing us a rough conversion factor.

    ...

    It takes robot vision programs about 100 computer instructions to derive single edge or motion detections from comparable video images. 100 million instructions are needed to do a million detections, and 1,000 MIPS to repeat them ten times per second to match the retina.

    The 1,500 cubic centimeter human brain is about 100,000 times as large as the retina, suggesting that matching overall human behavior will take about 100 million MIPS of computer power.

    In Moravec's analysis, it's irrelevant how computationally complex individual neurons are. The volume of neurons in the eye performs a visual processing task that can be simulated by 1,000 MIPS. Regardless of how these neurons accomplish this task, it appears that 1,000 MIPS is roughly adequate to replace the function of that volume of neural matter. It stands to reason that other functional units of the brain would require similar amounts of computing power per unit of volume.

    Of course, neurons may be more densely packed and interconnected in the brain proper than in the retina, but probably not by more than an order of magnitude or two. Assuming continued exponential growth in computing hardware, a mere order of magnitude signifies only a few years' difference: perhaps it will take fifty-three years instead of fifty. Hardly a basis for pessimism.

I don't entirely buy Moravec's analysis (there are some obvious objections, though none fatally conclusive), but it's an interesting read nonetheless.
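
For the arithmetically inclined, here's a back-of-the-envelope version of the retina-scaling argument in Python, using only the numbers quoted above; the 18-month hardware doubling time in the last step is my own round assumption, not a figure from Moravec.

    # A minimal sketch of Moravec's retina-scaling arithmetic, using only
    # the figures quoted in the passage above. The 18-month doubling time
    # at the end is my own round assumption, not a number from Moravec.
    import math

    instructions_per_detection = 100     # robot-vision cost of one edge/motion detection
    detections_per_frame = 1_000_000     # about a million detections per retinal "image"
    frames_per_second = 10               # the retina refreshes roughly ten times per second
    brain_to_retina_ratio = 100_000      # the 1,500 cc brain vs. the retina, by volume

    retina_mips = (instructions_per_detection * detections_per_frame
                   * frames_per_second) / 1e6
    brain_mips = retina_mips * brain_to_retina_ratio
    print(f"retina-equivalent: {retina_mips:,.0f} MIPS")    # 1,000 MIPS
    print(f"brain-equivalent:  {brain_mips:,.0f} MIPS")     # 100,000,000 MIPS

    # If hardware doubles every 18 months, how many extra years does one
    # stray order of magnitude cost?
    extra_years_per_order_of_magnitude = math.log2(10) * 1.5
    print(f"~{extra_years_per_order_of_magnitude:.0f} extra years per order of magnitude")

Running it just reproduces the quoted figures --- 1,000 MIPS for the retina, 100 million MIPS for the whole brain --- plus roughly five extra years for every order of magnitude the volume-scaling estimate happens to be off.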

Happiness Considered Merely Instrumental

In comments for my earlier post, Inky asks:

Then what is the ultimate aim? Surely you're not postulating that the ultimate aim of *most* people is to lead meaningful, productive lives of personal integrity beyond mere happiness?

Since Haloscan appears to trash old comments after a while, I'll reply here, because I want to record this indefinitely.

As I state in that comments thread, I don't believe that the things people value besides happiness are necessarily noble. They can equally well be craven, or egotistical, or founded on irrational hate. But the point is that people do have attachments to values that would not be called "happiness" under any usual definition of the term. People want happiness, but they find it valuable only insofar as it is obtained in a manner consistent with their ultimate values.

For example, all human beings could become "happy" by giving themselves a lobotomy. Most humans would not, however, choose to become happy in this fashion, because the thought of being a happy drooling idiot is somehow unattractive --- even though that drooling idiot would have no conception of the inadequacy of his/her happiness. This observation doesn't paint humanity as particularly noble, or possessed of very much integrity, but it does illustrate that the experience of happiness alone is not the ultimate aim.

So what is the ultimate aim? Perhaps it is to be happy for the right reason. But in this case we can simply drop the happiness as an end-in-itself, and understand happiness as a motivating force that drives us to bring about those right reasons. In Aristotle's classic categorization, happiness is therefore an instrumental good, not an intrinsic good.

Interestingly, this conclusion has some vague, poetic resemblance to recent findings in neuroscience (via Brain Waves):

George Loewenstein then explains: ''Happiness is a signal that our brains use to motivate us to do certain things. And in the same way that our eye adapts to different levels of illumination, we're designed to kind of go back to the happiness set point. Our brains are not trying to be happy. Our brains are trying to regulate us.''

As usual, there's a couple of related Wikipedia articles: Goodness, Happiness.

Decline of Magazine Cover Design

Weep for the art of magazine covers.

(Cleaning out the bookmarks again; I believe I got this through Plastic or MeFi a long time ago.)

Women admit more partners when on "lie detector"

Dove into Google today to look up an article I remembered reading a while back, and came up with...

The way this story got spun by the different news outlets is as interesting as the result itself. Compare the New Scientist's relatively sober headline ("Fake lie detector reveals women's sex lives") with that of the finger-pointing Washington Times ("Women said more likely to lie about sex details").

Incidentally, during my Google-trawling, I also turned up an initially rather counterintuitive study on self-esteem and intercourse in teenage boys vs. girls (plus more interesting readings at The Psychology Student).

Twilight for the Orangutans?

I had no idea that orangutans were in so much trouble. Their loss would be, aesthetically speaking, a great tragedy --- as C. McGrath writes:

Anyone who has watched much nature television knows that orangutans are by far the handsomest and smartest-looking of the great apes. They're literal highbrows, with wide, soulful eyes and broad expressive foreheads. They're covered not with bathmat fur, like so many apes, but with what amounts to a couture pelt -- red hair so long and fine it seems blow-dried. It's true that orangutans drag their knuckles when they walk, but how else are you going to get around if your arms are longer than your legs? For creatures so large, they are uncommonly graceful, not to mention sweet-natured, so it's gratifying to learn that a team of scientists, writing in the journal Science, has recently certified them as ''cultured'' as well.

Well, I wouldn't go quite so far as to call them the smartest-looking. We're great apes too. OTOH they're probably smarter than baboons...

When baboons hunt together they'd love to get as much meat as possible, but they're not very good at it. The baboon is a much more successful hunter when he hunts by himself than when he hunts in a group because they screw up every time they're in a group. Say three of them are running as fast as possible after a gazelle, and they're gaining on it, and they're deadly. But something goes on in one of their minds-I'm anthropomorphizing here-and he says to himself, "What am I doing here? I have no idea whatsoever, but I'm running as fast as possible, and this guy is running as fast as possible right behind me, and we had one hell of a fight about three months ago. I don't quite know why we're running so fast right now, but I'd better just stop and slash him in the face before he gets me." The baboon suddenly stops and turns around, and they go rolling over each other like Keystone cops and the gazelle is long gone because the baboons just became disinhibited. They get crazed around each other at every juncture.

But I digress. (The Sapolsky link is just way too cool to pass up.) I don't seriously think that online petitions are likely to do anything, but maybe the Orangutan Foundation International (donation link; Charity Navigator report), the International Primate Protection League (donation link; Charity Navigator report), or BOS-USA (no report available) has some better ideas.

UPDATE: In comments, Inky points to the world's most endangered animals, plus another link concerning culture for those who want more than McGrath's article provides.