Tuesday, December 30, 2008

Margin of Error

I've started investigating the predictions made by GW believers, and in particular how the predictions have changed over the years. It turns out that the predictions, as I expected, are all over the map: the climate will get hotter in the next 100 years, but who knows by how much, and who can say what'll happen? This isn't of course how it's pitched. But it's what the numbers are telling us. Allow me to explain.

The IPCC (Intergovernmental Panel on Climate Change) predicted a rise of mean global temp by 9 to 12 degrees Celsius in 1990, then 0.8 to 3.5 degrees Celsius in 1996, then 1.4 to 5.8 degrees Celsius in 2001. There are a couple of points here. One, the absolute differences between the three predictions are huge, a fact that ought to worry anyone who's invested much of their intellectual energies into believing that we know what we're saying. Two, the relative differences (the low to high predictions for each year) are huge, as well. In the 2001 predictions, converting to Fahrenheit gives us a range of roughly 2.5 to 10.4 degrees. What the heck? This effectively says nothing; even simpleton GW skeptics like me can see that an eight degree range allows for vastly different weather scenarios.
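
For what it's worth, converting a temperature range from Celsius to Fahrenheit needs only the 9/5 scale factor; the familiar +32 offset cancels out when you're converting a difference rather than an absolute reading. A quick sketch (my own, just to check the arithmetic):

```python
def delta_c_to_f(delta_c):
    """Convert a temperature *difference* from Celsius to Fahrenheit.

    For differences (not absolute readings) only the 9/5 scale factor
    applies; the +32 offset cancels out.
    """
    return delta_c * 9.0 / 5.0

low, high = 1.4, 5.8  # the 2001 IPCC range, in degrees Celsius
print(round(delta_c_to_f(low), 2), round(delta_c_to_f(high), 2))  # prints: 2.52 10.44
```

So the 2001 spread works out to roughly 2.5 to 10.4 degrees Fahrenheit.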

We might forgive the blatant variations in these predictions by noting that they all point to some warming trend (although, after my unscientific survey of this debate, I'm inclined to believe that the climate will be cooling, not warming, in the next 100 years -- but who knows?). True. But I wouldn't plan a picnic around these numbers, because if you look at them, they tell a clear story: who knows?

Source: http://weathersavvy.com/Q-Climate_Global_PredictionsAccurate.html

Monday, December 29, 2008

Ms. Buttu Says All

This Hamas Israel thing. On CNN Rick Sanchez interviewed first the Israeli ambassador to the UN, then former PLO legal advisor Diana Buttu, to get both sides, you know. American above-the-fray media. Wow. These folks know how to bicker. Screw the facts. Ms. Buttu, in response to Sanchez's question "But don't the Israelis have a right to defend themselves?", makes a couple of interesting points that I'll take the time to dissect.

One, the rocket attacks from Hamas didn't have "explosive heads", unlike the Israeli rockets.

Oh, yes, Ms. Buttu, you went to law school to say that? Nice. As if Hamas were mindful of Israeli lives, pulling the warheads off the rockets before lobbing them willy-nilly into civilian neighborhoods, soccer fields, etc. Wouldn't want to unduly injure anyone. Is she serious? I'm pretty sure if Hamas had a Number 2 pencil with a nuclear tip they'd figure out a way to smuggle it into an Israeli grade school in hopes of exterminating some Jewish children. Give me a break.

The reality is, Hamas is firing missiles into civilian areas of Israel in hopes of killing anyone. And Ms. Buttu, you know it. Shame on you. I'll be nice and merely give you the dumb ass comment of the year award.

Two, this lobbing of missiles into civilian neighborhoods is justified, because Israel has been waging a Nazi-like war against Palestinians, with military missions into Gaza, having the effect of cruelly denying Palestinians their freedom (how does that work?). Man, it sounds bad. But let me sum it up for those uninitiated into the perpetual Palestinian-Israeli conflict: the Jews intend to live here! And they have a military! And they use it when we try to slaughter them! Damn Israel! Damn them when they strike back!

Great. So the best I can tell from Ms. Buttu's comments is that, after her barrage of legalistic, emotive words, the reality is that Israel is actually trying to be a sovereign nation in the Middle East, and it patrols the borders of the Gaza Strip, and actually intends not to perish but to try to ensure the security of its citizens.

This conflict, if an alien were to come down and hear both sides of it dispassionately, would so far skew in favor of Israel's targeted strikes in reaction to the mayhem-intending Hamas actions that "hearing both sides" would become a joke. There's no equivalence, and the world knows it. Hamas hates the Jews. And how dare the Jews try to live near Hamas (or in the Middle East generally). Sanchez did his best to hear both sides; in the end, what I've said here is just exactly what both sides said. Sans perhaps the legalese.

Sunday, December 28, 2008

Saltless in Seattle

Seattle, and the Pacific Northwest in general, have been deluged this year with snow. The enviro wizards in Sea-town managed to dump tons of sand all over the streets to create traction for hapless motorists, a questionable tactic motivated largely by the desire to avoid using salt. Too bad sand is worse for the environment than salt (but doesn't it seem that salt should be, well, worse than sand?). The experts have proclaimed that salt "degrades marine life", without offering details. In the meantime, the six thousand tons of sand dumped over Seattle roadways are choking out the insect populations, negatively affecting local streams. Damn.

2008, Bummer for GW Believers

The planet cooled this year, compared with the last eight (which makes it accurate to proclaim "it's the coolest year in the 21st century!"). This is fine for a punch line. But what's the deal? The Guardian article cites a team of researchers from Kiel University that predicted, back in March, "...that natural variation would mask the 0.3C warming predicted by the Intergovernment Panel on Climate Change over the next decade. They said that global temperatures would remain constant until 2015 but would then begin to accelerate."

Great, but problem is, I've been Googling around and finding lots of cock-sure predictions by GW believers that the warming is already accelerating, not going into a flat period before it unleashes its fury sometime later (type in "global warming accelerating" for about a thousand assurances from the GW "experts" that we're screwed). So, are we leveled off until 2015, after which we'll begin our Warming Acceleration? Or are we accelerating right now, and 2008 is some weird anomaly, to be replaced by a warmer 2009, and an even warmer 2010, and so on? What the heck's going on? I'm sure the experts can explain.

The reality is, Global Warming is B.S. (oh, I mean IMHO it's B.S.), and the point will be made clear enough in the years that follow by nature itself. My guess is that we're headed for cooler temperatures this century. I could be wrong (of course), and given that I'm trying to maintain some degree of epistemic humility, I'll hold off, for now, launching a Web site dedicated to shaming everyone into investing in technologies that warm our planet and shield us from the new Ice Age to come...

Monday, December 22, 2008

Global Warming!

I have a puzzle about Global Warming! (I'm now including the exclamation to further capture the added drama that typically attends the phrase). It's the observation that CO2 levels have, in times past, been high, yet the climate then was actually cooling. In other words, the temp graph was trending down as the carbon dioxide levels were trending up. The GW folks have some ready-made explanations for this, mostly centered on the catch-all "it's complicated" dismissal (translation: you stupid skeptics, you're either not scientific or just plain crazy!), with perhaps some additional whiz-bang, sciency-sounding stuff about how A, B, C, and sometimes D can vary with levels of This, That, and The Other. Don't worry about all of this, however. Just remember that it's complicated. And butt out. (Of course, "it's complicated" should stick to the GW believers as much as to the skeptics. "It's complicated" is double-edged, after all.)

Whenever I've had enough of the Dark Knight (will this movie ever end?) and I'm thinking about something that fires everyone up, but that smells like a three-day-old fish, I always end up back at Global Warming! And when the discussion ping-pongs back and forth long enough to exhaust the easy points and counterpoints, we inevitably end up back at "But what are the costs of inaction?" (no, this isn't the runup to the Iraq War all over again). As if we'll all be for coal plants and China and pollution and Hummers unless we believe that we can predict the future of the weather.

I will end my post with this, however: if anyone can tell me what the weather will be like in a few decades, I'd like to discuss the stock market. You might just be my best friend.

The Harley Dudes

I was driving on 183 a couple of days ago, and this gaggle of Harley bikers rumbled past me, leather jackets and babes on the back. I was doing maybe 70 in a 65 zone, and so the bikers must have had their hogs up to 75 or 80. It occurred to me, with the vibrations of their engines pulsating through the door of my Toyota and the Bon Jovi or whatever anthem rock I'd cranked up temporarily drowned out, that I never see these guys get pulled over for speeding. When has anyone ever seen a bunch of Harley dudes parked to the side of the highway, doing that give-me-my-ticket-so-I-can-leave shame thing? What's the deal? My theory is that they're too damn harley, to make the noun an adjective for present purposes; it's not in the fabric of things to have these guys getting written up by un-cool Johnny Law. They're only popped if things escalate, like a knife fight in Vegas. Or if something goes down in Sturgis.

But I love these guys anyway. I just wish they wouldn't drown out my anthem rock when I'm pulling gears in my 6 cylinder Tacoma. We all need those harley moments.

Thursday, December 18, 2008

The Neanderthal Project

The NYT recently reported that DNA sequencing of the woolly mammoth genome is now possible, using two fossilized hair samples, recovered from mammoths that died 20,000 and 60,000 years ago. The NYT reports that scientists are now discussing how to modify DNA in the mammoth's closest living relative, the African elephant, so that it resembles the woolly mammoth. The elephant genome, according to Stephan C. Schuster and Webb Miller at Penn State, will need to be modified at about 400,000 places to make it resemble its hairier cousin. As the thinking goes, once modified at these locations, the elephant genome will be, effectively, a woolly mammoth genome, which can then be brought to term in a female elephant. The elephant would bear a woolly mammoth. This clever technique makes moot the prior thinking that a mammoth genome would need to be synthesized in the laboratory. No need to do this (and we can't anyway), because we've got the elephant's cell, and with the mapping of the mammoth's DNA, we can translate the one to the other.

So far so good, but there's (or was) a hitch: 400,000 changes are a lot of changes, and the process will likely be arduous to the point of infeasibility. Enter the "454 machines", which automate a revolutionary new sequencing technique that, in effect, lets biologists do the genomic modifications in batches. According to George Church, genome technologist at Harvard Medical School, about 50,000 "corrective DNA sequences" can be injected into the cell at one time. In this case, with only a few iterations (eight batches of 50,000 would cover the 400,000 modifications), the machines could inject the entire set of necessary changes, making the science-fiction-like scenario a reality.

The cost estimate for the woolly mammoth project is about $10,000,000, which, while not chump change, is a figure that guarantees that someone with deep pockets and an interest in our archaeological past will see things through.

As if this isn't zany enough, there are efforts underway to regenerate the Neanderthals, a hominid race closely related to Homo sapiens (us) that lived approximately 200,000 to 45,000 years ago, inhabiting Europe, and possibly coexisting with our direct Cro-Magnon ancestors. No one knows, conclusively, why the athletic, possibly dim-witted, Neanderthals died out those thousands of years ago. We don't know whether they could talk, or to what extent they created a culture similar to early humans (there is evidence that they drew paintings, suggesting an ability to communicate abstractly). What is certain is that, if the sequencing techniques work on the woolly mammoths, there will be no scientific reason that they can't likewise be applied to generating Neanderthals, if (or when) the extinct species' full genome is recovered.

Work on The Neanderthal Project is well underway. Svante Paabo of the Max Planck Institute for Evolutionary Anthropology, for instance, has been diligently reconstructing the DNA of Neanderthals using bone fragments discovered in Eastern Europe. With the help of the new "454" sequencing machines, he -- along with a similar project at Lawrence Berkeley National Laboratory -- expects to get a complete Neanderthal genetic blueprint. In this case, just as with the use of elephants to birth mammoths, a Neanderthal could be delivered from a human female, or (perhaps less ethically questionable) a chimpanzee.

As evolutionary biologist Hendrik Poinar notes, “The reality is it will happen,” ... “Twenty to 30 years is the span people are talking about.”

And what then? When a creature so like us -- but so different -- walked again among us, what then? Dartmouth College ethicist Ronald M. Green's comment is as creepy as it is probing:

“This was a species we competed with,” ... “We would not want to recreate a situation of two competing advanced hominid species.”

We may just find out.

A Cool Trillion

The front page of the Austin American Statesman sports this, in bold block letters:

A 1,000,000,000,000 plan?

I'll admit that I had a hard time parsing the digits at first. Turns out it's a cool trillion. A trillion dollar plan? Did I miss something? I know the unemployment rate is, what, 6.7%, but the breadline is hardly stretching around the city block. Maybe I suffer from a failure to predict the something economically-wicked that this way comes. Fine. But, a trillion dollars? (I love how it's a nice even number, too, like the calculating masterminds in the back gave up on more precision: "Hell, just make it a trillion. That oughta do it.") We're gonna break the freakin' printing press...

The only bright spot in this trillion dollar plan scenario came, for me, when Speaker Pelosi went on record wanting something like 200 billion in tax cuts. We've got bipartisan consensus now (FWIW) that reducing tax burdens helps stimulate the economy. Conservatives have been shouting this from the mountaintops for a very long time.

Wednesday, December 17, 2008

Edgar Allan Poe

I'll do this from memory, but I saw it originally years ago in some book about Edgar Allan Poe. Legend has it that it came originally from the morbid maestro himself, inscribed on the wall of a pub somewhere in Massachusetts:

Fill with mingled cream and amber,
I will drain that glass again.
What hilarious thoughts do clamber
Through the chambers of my brain.
Quaintest thoughts, queerest fancies,
Come to life, then fade away.
What care I how time advances?
I am drinking Ale today.

Save the Planet

Gore thinks he's saving the planet. I love this. The hubris. Most of us can't save our sandwich from getting stale on the corners. But Gore's got us frothed up about saving the planet. Well, hell, let's do it, Al (say this in a slow Southern drawl). Let's save it, mo fo. What to do first? How about, with every last penny, mobilizing our forces to scare the beejeesus out of China, until they stop building those 1950s coal plants. That'd do a heckuva lot. Failing that, I'll put in those energy saving bulbs (I do use these, actually, but only because it makes good financial sense). How many fluorescent bulbs does it take to cancel out a coal plant? Have to start somewhere, I guess.

Harry Reid

Like Mission Accomplished, Harry Reid delivers. At least President Bush later expressed regret about the "bring 'em on" bravado and the Mission Accomplished banner (Bush might instead have fallen back on parsing words: "No, I meant the particular mission to take Baghdad in the first weeks of battle..."). Harry Reid, the [insert your favorite moniker], managed to proclaim to all the world, just before the surge, that the war is lost. Which war, Harry? Vietnam?

Gettin' it Straight

The Credit Crisis. Auto bailouts. Stock market in turmoil. What does it mean, and who has the answers? Let's start with some clarification of terms. Define your terms, as philosophers teach us. So I'll try to do just that, and I'll pick as my subject matter a set of terms that we constantly use, that have different meanings though it's common to treat subsets of them as essentially the same (or at least as substantially overlapping), and that are particularly germane given our current circumstances. The terms are:

Geek
Nerd
Dork
Tool
Jerk
Asshole
Prick
Douche Bag
Dumb ass
Pinwheel

Let's get started. In what follows I'll define the term, then offer some exemplar from popular culture to tack down the definition. I'm confident that the conflations will melt away, to the edification of all.

First, a geek. Contrary to popular opinion, a geek is not a nerd. Don't conflate them, folks. A geek is someone who drills into a particular subject with a zest that borders on the maniacal. But (and this is important), he does his drilling at the expense of, say, hygiene, or social skills. The classic geek is the computer geek. Moooove. But geeks find homes in other technical disciplines as well. Nicolas Cage's character in The Rock, Stanley Goodspeed, was a geek.

The nerd. Ah yes, the nerd. Nerds are boring, unathletic types that tend to fall into nerdy routines (like getting up at the same time, having coffee, listening to some soothing music, leaving the house at the same time, etc.). Nerds are terrible with the opposite sex, tend to be abstemious (being too "smart" for vice), and are by definition socially unattractive to non-nerds and in particular to members of the opposite sex who are non-nerds. In other words, they're smart, with nothing else. Nerds. Classic nerd is George McFly, from Back to the Future. Also, Ross from Friends (though a borderline case, since Ross had a greater than zero chance that a woman would find him attractive).

Dorks. Dorks are nerds with less native intelligence. A dork looks and sounds like a nerd but talks about his car, or his recipe for Jalapeno macaroni salad. Dorks tend to watch a lot of sports on T.V., and may wear sports insignia, especially on dates, or to nice restaurants.

Tool. Interesting type, the tool. Tools are smart, mostly successful, with something more to offer than can be managed by geeks, nerds, or dorks. A tool will be at least one of: attractive, athletic, or sociable. Tools are tools essentially because they follow the rules. A tool made it into a good law school, dresses nice, may have an attractive mate, and refuses to buy beer for the neighbor kid (though he's known him for years). A tool may inform on classmates for cheating. He rarely speeds. Tools generally end up running things. (Don't worry, however, because they're still tools.)

Jerk. Low class, mean spirited, don't give a damn types. Chet from Weird Science. You can almost substitute "jackass" for jerk, though there is some small semantic difference.

Asshole. Jerks that have made something of themselves. Colonel Nathan R. Jessup from A Few Good Men. 'Nuff said.

Prick. Pricks are assholes with an innate and ineradicable sense of entitlement. They're high society about their assholiness. It's an important distinction. You can't call a prick an asshole without a palpable degree of imprecision: "No, he's a prick, Bob. Get it straight". Hardy Jenns, from the 1987 Some Kind of Wonderful, was a classic prick.

Douche Bag. The douche bag. This is generally a lower management type that makes everyone in the room (or office) go mum when he walks in. Like a Pez dispenser, he pops out company lines with a shitty smile, pouring cold water all over your once hot but now extinguished conversation about the weekend's activities, or the new girl in office 101. The douche bag would be a standard issue nerd or dork, only there's an additional moral deficiency with the DB; the bag wants to climb up the management ladder, and at your expense. Go mum when you spot the Douche Bag. He's only gonna cause you pain. (The silver lining, however, is that DBs tend to get their comeuppance, having more desire for Machiavellian conniving than actual ability, and tending always to repel all things cool. Exemplar? Hard to find. Best that comes to mind is Carter Burke, the character played by Paul Reiser in Aliens. Douche bag. But anyway, in spite of the dearth of DBs in popular culture, I know several from past jobs. I bet you do too.)

Dumb ass. Nerds and even dorks may have something to say within their sphere of expertise, but dumb asses, by definition, always come up short. Dumb asses speak, and every non-dumb ass starts an imaginary stop watch, waiting for the cessation of dumb ass sounds. The "Oh" guy from Office Space is a classic dumb ass.

Pinwheel. A pinwheel is someone who may or may not be smart about something, but seems drawn, like a moth to a flame, to sound off about other subjects about which he has only enthusiasm without accompanying expertise. Pinwheels come out of the woodwork when discussions turn to politics. Classic pinwheel? There are lots. Clooney can be a pinwheel, as can "green" actors like DiCaprio. In fact, pretty much any Hollywood actor with strong views on political issues is sure to adorn him or herself with fluffy pinwheel attire. Ashton Kutcher sounds more than a little pinwheely at times. These pinwheels, the Hollywood variety (a common strain of pinwheel), all suffer from the false belief that their sheer attractiveness somehow promotes their opinions to bedrock truth. Maybe, but only for other pinwheels (this is key). Also, fourteen year olds.

So there you have it. In the midst of these dire times, a get to the point, hard hitting, good ole' fashioned linguistic analysis of the nouns whose referents we're seeing more and more of these days. Let's get it straight.

Friday, December 12, 2008

Stuck on a Hill

In artificial neural network (ANN) research, there's a well known problem of local minima (or maxima). I've worked a bit with ANNs but much more with a (superior) learning algorithm, Support Vector Machines (SVMs). Unlike SVMs, ANNs require heuristics to "converge" on an optimal solution over some very large error surface. The heuristics, to simplify a bit, are intended to get the algorithm to converge on a global, not local, solution. Local solutions are unwanted because they can appear to be global (good) solutions, given some snapshot of the error surface, but in fact are very bad solutions when one "zooms out" to see the larger picture. This phenomenon is perhaps best illustrated with geographical imagery. If I am walking up and down hills, en route to a very large mountain, a local maximum might be the top of a foothill. But it would hardly be a global maximum, like the top of the mountain. The point is that ANNs can converge on the foothills, telling us that it's the mountain.
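
To make the foothill picture concrete, here's a toy sketch (mine, not from any ANN package) of plain gradient descent on a one-dimensional surface with two valleys. Start the walker on one side and it finds the deep valley; start it on the other and it settles, quite contentedly, into the shallow one:

```python
# Toy illustration of getting stuck on a "foothill": plain gradient
# descent on f(x) = x**4 - 3*x**2 + x, which has a deep valley near
# x = -1.3 and a shallow one near x = 1.1.

def grad(x):
    # derivative of f(x) = x**4 - 3*x**2 + x
    return 4 * x**3 - 6 * x + 1

def descend(x, lr=0.01, steps=2000):
    # follow the negative gradient downhill from the starting point
    for _ in range(steps):
        x -= lr * grad(x)
    return x

print(descend(-2.0))  # settles near x = -1.3, the deep (global) valley
print(descend(+2.0))  # settles near x = 1.1, the shallow (local) valley
```

Neither run "knows" which valley it landed in; each only sees that every local step leads uphill.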

ANNs notoriously suffer from this limitation, but the "blindness problem" is endemic to all statistical learning algorithms, including SVMs (though, at least in theory, you can get an optimal solution with an SVM). Using such algorithms to learn from past experience (i.e., training data), you generate an approximation function for the underlying distribution you're trying to model. The function is worth the time it took to gather the training data, select the features, and train only if it approximates the underlying target distribution reasonably well. And you can tell that it does if it keeps getting things right, and you don't have to keep making excuses for it (say, by saying that "it's complicated").

Anyway, we can view ANNs, SVMs, and other statistical learners as essentially inductive systems, in the sense that, given a set of prior examples, they learn a rule (classifier) that allows us to predict something about new, unseen examples. They generalize, in the sense that unseen examples that match a profile (learned from the training data), even if not an exact fit, may still be classified correctly. It's not a simple one-to-one match. Hence, generalize.

The problem with the generalization performance of all such systems is twofold. One, they're limited by the set of features that were chosen (a person selects features that seem relevant, like "is followed by 'Inc.'" for classifying organization mentions in free text). Two, even given optimal feature selection, the generalization performance of inductive systems is always hostage to the information in the training data. We collect examples to train the system, and, in the end, we hope that the training data was adequate to model the true distribution (the thing we really want to predict). Problem is, whatever hasn't occurred yet in this true distribution can't possibly be collected from past examples, and so the entire approach is hostage to the fact that things change in real-life environments, and the larger "pattern" of the true distribution as it unfolds in time may not be captured in the training data. Whenever this happens, the approximation function does not model the true distribution. Bummer.
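
Here's a toy version of being hostage to the training data (a made-up example, not any particular learning package): learn a simple threshold rule from past examples, then let the relationship between the feature and the label drift, and watch the rule quietly rot:

```python
import random

random.seed(0)  # make the sketch repeatable

def sample(shift=0.0, n=500):
    # each example is (observed feature x, label); the true boundary sits
    # at 1.0 + shift in observed terms, so `shift` models the world
    # drifting out from under a rule learned earlier
    data = []
    for _ in range(n):
        x = random.gauss(1.0, 1.0) + shift
        data.append((x, 1 if x - shift > 1.0 else 0))
    return data

def learn_threshold(data):
    # brute force: try each training point as a cutoff, keep the best
    best_t, best_acc = None, -1.0
    for t, _ in data:
        acc = sum((x > t) == (y == 1) for x, y in data) / len(data)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

def accuracy(data, t):
    return sum((x > t) == (y == 1) for x, y in data) / len(data)

t = learn_threshold(sample(shift=0.0))  # train on the world as it was
print(accuracy(sample(shift=0.0), t))   # near 1.0: same distribution
print(accuracy(sample(shift=1.5), t))   # much lower: the world moved
```

The learned threshold was a perfectly good "rule" for the distribution that generated the training data. Nothing in that data could warn it about the shift.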

Now, a feature in particular of such inductive systems (we can substitute "supervised statistical learning systems" if it sounds more fancy) is this local minimum or maximum worry, which I introduced with regard to ANNs, but which is really just a handy way of introducing the general problem of generalizing from past cases to future ones writ large. And it is a problem. Consider time sequence prediction (as opposed to, say, sequence classification, such as the well-known document classification task in IR research). In time sequence prediction, the goal is to take a sequence of elements at time t, and predict the next element at time t+1. Applying this multiple times, you can predict a large sequence of elements through some time n.

And this is where the inductive problem comes in, because if the data you're using to predict the next elements came from some set of prior elements, it's possible that the prior elements (your training data) gave you a model that will get you stuck on a foothill, or will see the top of a small hill as the bottom of a valley, and so on. You can't be sure, because you don't have the future behavior of the true distribution. And this is why, in the end, induction--however fancy it gets dressed up in mathematical clothing--not only can be wrong, in theory, but often is, in practice.
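
The time-sequence version in miniature (again a toy of my own, with least squares done by hand): fit a straight line to the early stretch of a series whose true process bends later, then extrapolate. The model is confidently, and increasingly, wrong:

```python
def fit_line(ys):
    # ordinary least squares for y = a*t + b over t = 0 .. n-1
    n = len(ys)
    mx = (n - 1) / 2.0                     # mean of t = 0 .. n-1
    my = sum(ys) / n
    num = sum((t - mx) * (y - my) for t, y in enumerate(ys))
    den = sum((t - mx) ** 2 for t in range(n))
    a = num / den
    return a, my - a * mx

series = [0.1 * t * t for t in range(30)]  # the "true distribution": a parabola
a, b = fit_line(series[:10])               # train on the early, flat-ish stretch
pred = a * 29 + b                          # extrapolate out to t = 29
print(round(pred, 2), round(series[29], 2))  # prints: 24.9 84.1
```

The first ten points really did look like a gentle line. The training data simply didn't contain the bend.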

We can't see into the future, unfortunately. If we could, we could fix the inductive problem by simply adding in the additional information about the true distribution that we're missing in our training data. But in that case, of course, we'd hardly need the approximation.

Wednesday, December 10, 2008

Rule Following

"This was our paradox: no course of action could be determined by a rule, because any course of action can be made out to accord with the rule."
Ludwig Wittgenstein, Philosophical Investigations

"If the rule you followed brought you to this, of what use was the rule?"
Anton Chigurh, No Country for Old Men

One of the great myths of modern society is that we're following rules to obtain outcomes. I mean rules, roughly, in the sense described here, although I'll also feel free, for these purposes, to equivocate a bit between plans and rules. No harm should be done for now.

So the myth of rule following. We see it in software development (no one seems to notice, or if they do, they dare not mention, that the "rule" was changed a thousand times between conception and completion of the project), we see it in the economy, the social sciences, and indeed everywhere that the veneer of science and technology and the almost pathological need for certainty manages to obscure deeper truths about the fragility of our capacities.

Retrodiction, not prediction, is what we're best at, though it is unfortunately and for obvious reasons of little interest. And for various psychological reasons that I'm neither qualified nor interested in researching directly, we're strikingly good at painting failure after failure to predict what comes next with ex post facto explanations that make things just so. Political science is perhaps paradigmatic. It was common in the 1950s to prognosticate about how the (now defunct) USSR would be the preeminent superpower by the 1970s. France (yes, France) was widely thought to be emerging in the 1970s. Japan in the 1980s. China of course today. Our ability to keep proclaiming, generation after generation, our cock-sure predictions about the future state of human societies in the next year, five years, decade (or God forbid, century), is simply amazing, and defies logic. Yet we keep doing it. And we will keep doing it, in spite of all evidence of consistent failure to the contrary.

The psychology of rule following tells us that there's a rule (or a set of rules) that we followed to get to a result (or that will allow us to predict a future result). And when we achieve the result, we tend to confirm the application of the rule, when in fact (chances are) we've made innumerable on-the-fly judgements to get to our result, and then we've tidied things up after the result was achieved by giving credit to the rule. So everything fits. Feels like progress.

On the other side of the coin, when a result is not achieved, instead of recognizing the general problem of using rules, we tend to assume that the particular rule (we claimed to) use, was in fact not adequate. And we set about looking for a new rule, which will of course not be adequate in many contexts too. Such is the nature of our (unexamined) selves. In a deeper and more honest sense we might someday admit that progress (at least in messy, complex situations that we're immersed in), is mostly a function of insights, adaptive thinking as the environment changes, and, well, luck. But we don't see it this way. It doesn't sound like something an expert would say.

So I think that in complex systems (like the weather, or any system where human choice can enter in), our capacity to formulate generalizations that tell us how things will be at time t+n, when we refer to them at time t, is effectively a chimera (whenever n is large enough, which depends on features of the system). Things are constantly new, and different. We formulate plans, and rules, and they guide us, but very loosely, because the environment is constantly in flux. Rules we've grabbed onto "work", only because we keep adjusting things to make them seem to work. The real driver is rather our own wits and insight. And with these much more powerful tools, software does get developed. The Surge in Iraq works. The Space Shuttle (mostly) arrives at the Space Station. And when I get correct directions and follow them, I typically get where I'm going (even if unexpected snags happen). And on and on.

Anyway, in some other post I promise to explain in more depth exactly how rule-following is a mirage, which I haven't yet done (I've asserted mostly only that it is). To be continued. Until then, rest assured that our rules are grains of salt. They just masquerade as so much more.

The Sobriquet

She remained phlegmatic about her sobriquet long after her lover had bestowed it, and then began using it in earnest. But something changed. She noticed first that she was anxious to discuss it when around the table with her girlfriends, and later became horrified at the prospect of its slipping to third parties in public or semi-public moments. She was not a woman prone to obloquy, particularly against her lover, but in almost febrile moments she began to fret that he had drawn her into a miasma of his silly, redolent, refulgent phrases that he thought coruscated their union and she just his style. She didn't know any more. She thought maybe she didn't like her sobriquet. Maybe she hated it.

Exiguous complaints, she countered. And then the frustration would grow until it burst out in fissiparous fragments, and she'd retreat to lying motionless. Unthinking.

In other moments a more psephological mood would emerge, and she'd poll her constant and contradictory thoughts for some majority that might bring solace. Or decision.

Decision. Dissimulation, is all. Dissimulation. He doesn't care, despite his near ubiquitous plaudits. Well, nor then does she. It was, she later realized, the codicil that granted her immunity from that death which awaited the other. She was phlegmatic no more.

Tuesday, December 9, 2008

The Donald

Donald Trump gets interviewed, pretty frequently, by cable news for sound bites on what's happening with the economy, business, banking, this kind of thing. You know, The Donald topics. Well I really do love these Donald moments. This is a guy who once ran for President, promoting himself to the country with memorabilia like "I've had a great time here" (what the heck does that mean? Is this what-happens-in-Vegas campaigning? Sounds like a Chicago politician, with perhaps less criminality and more just standard-issue louche).

So he's back on cable news now, giving his wisdom on the auto bailout. The problem with The Donald is that he's so opulently wealthy, so New York Cosmo privileged, that he can't manage the relevant distinctions for us plebeian viewers. When asked if he'd buy an American car, he responds that he's got several. In fact, he says, he just had one of his workers buy a Dodge Ram truck for him. He likes Cadillacs, too. Buick makes a good car, he assures us.

The interviewer, Greta Van Suster-whatever, asks if he buys foreign as well, and she should have known better, because (say it with me) of course he does! He buys them all. See, it's just about Donald Trump being really rich. But strangely I still like the guy. He embodies that rarefied world of gaudy New York real estate tycoons. He's (weirdly) innocently just that. He tells us with a straight face that he helps the struggling economy by buying expensive things (he's got, strictly speaking, something of a point here). No time for philanthropic B.S., The Donald's making deals. He just scored another gold statue for one of his homes. Beautiful. Great time to buy 'em.

Final thought: The Donald was on I think MSNBC a while back, and was asked about buying real estate. I kid you not, he's on point with a story about how he just picked up a piece of real estate for pennies on the dollar. 112 million is all. It's a great time to buy, he concludes (as if we should all rush out and grab up some resorts in Miami for cheap). The interviewer is slightly exasperated at this point and reminds Mr. Trump that many Americans are struggling to buy a first home, or pay the mortgage on an existing one. Unfazed. The deals are out there, he says. And they are. If you're The Donald.

Monday, December 8, 2008

The Sea, Part Two

The undulations of the sea lofted him up, and rolled him helplessly down, into the chaotic brine and ocean. He choked. He choked more. His eyes went wide as adrenaline surged, his limbs flailing, then a massive push down into the sea. The spots that grew red in his consciousness grew brighter; his mind reeled and he was strangely cognitive, caught in frantic vacillating between drawing breaths of air or clenching shut to stem so much more choking. He didn't get it right. Water came into his lungs with fantastic pain and no air. No air. He drew in on himself. Where was he? Suddenly, and quite unexpectedly, an insight. It should be fine, he thought, surprised at his calmness. Calm. That's what he was forgetting. Calm brought his dreams and his thoughts and then the painless morphinic images where he saw his father. His father was smiling, looking at him. God, his whole life washed over with calm. He was fine. He had always been fine.

The boulder that struck his head got no prize. He had ceased to care, and then he knew no more.

His death lay heavy on the hearts of the living. It was perplexing, and tragic. Perhaps. But perhaps in no one's view in particular, it could be said that his last moments were in fact very much like his first moments, and strangely just so innocent and divine. There was, of course, no one there to say it.

Ode to the 629

I have been traveling since Tuesday and hence no time for posting. But on my return there's my leetle friend, the Smith and Wesson Model 629 .44 Magnum (my wife picked it up from the gun shop while I was gone).

A few comments, in no particular order (actually a false statement; clearly there's a particular order, so hear it here first: all these years folks have been fibbing about the non-particularity of their particular orders).

1. It weighs as much as a kitchen pot half full of water.
2. The chambers are so large that I can get a good portion of my pinky finger into them.
3. Pointing it at yourself in the mirror gives you the heebie-jeebies (even, obviously, when verifiably unloaded).
4. It's barrel-heavy, with the 6" barrel. Tweaks the wrist a bit. The 6" barrel is, I think, the shortest allowable for hunting. Moose. Or bear. It's Palin Friendly, as we now say.
5. The grip is slightly small, although not unreasonably so.
6. It seems for all the world like a gun of such heft that one could make a plausible case that even absent ammo, it could still function as a formidable weapon.
7. My mother-in-law likes it. Really.

Tuesday, December 2, 2008

The Dry Wit of the Brit

Christopher Hitchens, you gotta love this guy. He's on Hardball last night with Salon's Joan Walsh debating the Hillary Clinton pick, a perfect platform to launch his everything-but-the-kitchen-sink diatribes against the Clintons (he thinks very little of the Clintons, I'm now aware). Ms. Walsh, who apparently was there as a cheerleader for Clinton (needed only the pom-poms), and now has a look on her face like Mr. Hitchens is talking about her sister (the camera keeps panning over to her, what high drama), has just about had enough and, without engaging the substance of his charges, lobs a Lebowski at him: "that's just your opinion, Christopher." To which he retorts "...yes, how clever, and look who's saying it? Would you rather I give your opinion?" Walsh just kind of harrumphs, but Matthews couldn't resist a chuckle. Neither could I.

Sunday, November 30, 2008

Forgotten Thymos

We have, mainly, two great axes today upon which modern explanations of social phenomena turn. One, the economic/Marxist reduction; two, the religious/secular dichotomy, the latter a distinction which has carried such a load of late that it is in danger of spilling out and spoiling everyone's complacency.

A third analysis of why we do what we do comes from the German philosopher Hegel, who plucked from Ancient Greece the tripartite conception of the person (reason, desire, and thymos). It is the thymos, he said, that drives history. Thymos is roughly the desire to be recognized. It's what makes us feel shame. Or pride. It is rooted in a conception of human freedom that is, at bottom, the notion that we are not understood completely by desires or by reason (roughly, by Marx or by Enlightenment secularism). The thymotic urge is a warrior's urge, in us all, and finds its basic expression in the willingness to risk one's life to prove that one is free. Animals, when thirsty, just drink. People have the ability to delay desire, and they have the ultimate capacity to demonstrate that they are not animals by risking their lives (in modern times, we can cash this out as "dying for symbols of freedom", like medals, or honors).

Unfortunately I'm travelling tomorrow and have (true to form) hardly packed or even gotten my hotel room yet, so I can't burrow into this to the depth that it deserves.

Suffice it for now to say that the modern mind, trying to fit the perpetrators of the bad things around us into either an economically disadvantaged model (Marx) or a whacky-religious model (Enlightenment), is I think inadequate. What they want, and what makes them feel so alive, and so much better than the desiring "last men" who no longer care about honor but just material comforts before they too perish from the Earth, is recognition. How to change the debate such that they can have it is surely worth talking about.


None, right now. I did purchase a Wired a while back, and in a section titled "Here are three things we got wrong, 1993-2008", they cited Fukuyama's The End of History. Among the more bone-headed claims I've seen in print lately, this gem stands out:

Francis Fukuyama proclaimed that history ended with the demise of the Soviet Union. The future would be characterized not by the literal but only the figurative war of ideas. We believed him.

We were wrong. Wired failed to see how a new generation of fanatical geeks would use the Internet in their effort to take over the world. Instead of ending, history looped back on itself, and we are now confronted by a recrudescent and particularly virulent religious ideology straight out of the Middle Ages.

We recognized a world in transition, but we missed the danger in front of us. We eschewed conventional wisdom, but we couldn't escape it. Takeaway: Be contrarian, and then be contrarian again.

Yes, that's the takeaway, you shiny-paged, over-glamorized pseudo-intellectual gee-whiz nitwits. That's precisely not what Fukuyama said. I'm sure he's philosophical about getting misrepresented in the popular like-to-seem-smart media. If I were him I'd laugh or cry, but kindly suggest to the editor of Wired that they stick to GPS-enabled toilet paper, laser-sighted lipstick, and whatever other techy auto-erotica adorn the pages of this unfortunate 'zine.


The New York Times
The Wall Street Journal
The Austin American Statesman

Whatever shows up on my Yahoo page and in occasional targeted searches


Hast du mein Buch? (Do you have my book?) Here they are for 2007-2008, going back about a year, not including Ph.D. research:

Nonfiction I've read, in no particular order:

1. The Social Life of Information by Brown and Duguid (philosophy of information technology)
2. Deep Survival by Laurence Gonzales (survival accounts with neuroscience analysis)
3. The Metaphysical Club by Louis Menand (historical account of pragmatism in America)
4. The Search by John Battelle (biography of Google)
5. The End of Faith by Sam Harris (critique of historical religion)
6. The End of History by Francis Fukuyama (political philosophy)

Nonfiction I've read at least half through (and mostly plan to finish):

1. Bowling Alone by Robert Putnam (political science)
2. Free Culture by Lawrence Lessig (legal issues in technology)
3. A Good Hard Kick in the Ass by Rob Adams (some good hard boring business book)
4. The World is Curved by David Smick (economics)
5. The Necessity of Experience by Edward Reed (philosophy)

Nonfiction I've read before and returned to re-read large chunks again:

1. After Virtue by Alasdair MacIntyre (political philosophy)
2. Beyond Good and Evil by Nietzsche (who's Nietzsche?)
3. Consilience by Edward O. Wilson (philosophy of science)
4. The Mind Doesn't Work That Way by Jerry Fodor (philosophy of mind, Artificial Intelligence)
5. Nonzero by Robert Wright (philosophy of history)
6. Blown to Bits by Philip Evans and Thomas Wurster (business, technology)

Nonfiction I've started on, but haven't made much progress on yet (maybe <= 25%):

1. Personal Knowledge by Michael Polanyi (philosophy of science)
2. Making Globalization Work by Joseph Stiglitz (politics, economics)
3. Chances Are by Michael and Ellen Kaplan (probability theory)
4. Linked by Albert-Laszlo Barabasi (technology, science)

Nonfiction I've referenced pretty extensively:

1. Anarchy, State, and Utopia by Robert Nozick (political philosophy)
2. The Portable Nietzsche by Walter Kaufmann (who's Nietzsche?)
3. The Conscious Mind by David Chalmers (philosophy of mind)

Fiction I've read:

By Michael Crichton
1. Next
2. State of Fear
3. Airframe
4. Timeline
5. Prey
Started on:
6. Disclosure (won't likely finish)

Other fiction I've sampled:

7. Short Stories of Hemingway (read, notably, The Short Happy Life of Francis Macomber and The Snows of Kilimanjaro, and a few others)
8. Zen and the Art of Motorcycle Maintenance by Robert Pirsig (a reread)
9. War and Peace by Leo Tolstoy

Books I'm currently reading:

1. Churchill by Roy Jenkins (biography of Winston Churchill)
2. The World is Curved by David Smick (referenced above)
3. The Black Swan: The Impact of the Highly Improbable by Nassim Nicholas Taleb (just started)

Questionable Ethics

There's an interesting article in the Austin American Statesman today about a University of Texas at Austin astronomy professor, John Lacy, quitting work on a NASA project that seeks to "determine the chemical makeup of objects light years away." His problem? The instrument Lacy developed, along with a telescope, is affixed to a modified Boeing 747 that will be flying about four times a week, 8 to 12 hours each time, according to the article. It's too much pollution for Lacy.

It's true, those who travel by jet are serious CO2 emitters; scientists attribute "up to 3 percent of all carbon dioxide that contributes to global warming to plane engines." Planes are especially troubling to GW proponents because their deposits of pollution in the upper atmosphere have a more severe warming effect. (To the rich, who've taken up GW as their cause: stop flying around telling us that the planet has a fever! What's next, I visit restaurants, chain-smoking, telling people about the horrors of secondhand smoke?)

Anyway, Lacy's out. Of course he's been supported (publicly) by other scientists, though curiously with lukewarm language. Nicholas Veronico, head of the project, gave his obligatory plaudits to Lacy in the article, then pointed out that there's an average of 28,537 commercial flights handled by air traffic control each day. The effect, he says, will be minimal. That's an understatement, Mr. Veronico.
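Veronico's "minimal" can be checked with a quick back-of-envelope calculation using the figures in the article (the project's roughly four flights per week against his 28,537 commercial flights per day); the project's share of weekly air traffic comes out to around two thousandths of one percent:

```python
# Back-of-envelope, using the article's numbers:
# ~4 project flights per week vs. ~28,537 commercial flights per day.
project_flights_per_week = 4
commercial_flights_per_day = 28537

commercial_flights_per_week = commercial_flights_per_day * 7
share = project_flights_per_week / commercial_flights_per_week
print(f"Project share of weekly flights: {share:.6%}")
```

(That ignores flight duration and altitude, which cut both ways; even generous adjustments leave the share vanishingly small.)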

Lacy's argument seems a little like a scientist who refuses to perform research on developing better prosthetic limbs because it requires use of a certain non-degrading plastic compound. Sounds pretty ethically first-rate, perhaps, until one realizes that the compound is used in thousands of other projects and products, and the percentage of the total amount used for improving prosthetics is minuscule. Or, insert your favorite argument here.

As much as fellow scientists said kind things about Mr. Lacy, I think most reasonable people should be skeptical of the value of his decision. I hope I don't hear that he's flying home for Christmas any time soon.

Saturday, November 29, 2008

BCS Confusion

Texas Tech beats Texas in a Hail Mary pass to Crabtree with a second left. They then implode against the Sooners, and the surging Sooners now beat OSU in the "Bedlam" matchup in Stillwater, 61-41. Now, a three-way tie for the Big 12 Championship game. BCS rankings decide, with the Longhorns holding a slim advantage over the Sooners. Tech, who struggled against Baylor today, winning by 7 (Tech 35, Baylor 28), is probably out of contention for the national title, and with the Sooners victory in Stillwater they are definitively out of the Big 12 Championship.

So, the BCS rankings decide between Texas and Oklahoma. Although there is no rule against it, it is nonetheless an unwritten "rule" that teams who fail to secure conference titles are not voted into the BCS title game. So, if the Sooners pull ahead of Texas later today, the Longhorns are likely out of the national title hunt. Unless, perhaps, the Sooners drop the Big 12 matchup to Mizzou. But don't count on it. If that unlikely event happened, my guess is that no team from the Big 12 South would go. Maybe then the winner of the SEC championship (Alabama or Florida) would end up matched against... USC?

Let's hope the voters keep in mind that Texas beat the Sooners this season, on neutral ground, 45-35. Hook 'em.

Homicide Bombing (or, could you repeat that?)

I really wish Fox News would stop using "homicide bomber" to describe suicide bombings. For one, it's redundant. If we know that a bomber kills 12 in Iraq, we already know that there was a homicide. Just say "Bomber kills 12 ..." and be done with it. Two, the "suicide" modifier adds information about the nature of the bombing; it tells us that in fact the bomber was also and intentionally killed by his detonation of the bomb. That tells us something new. Fox, please, stop munging the language in a dreadfully obvious attempt to place emphasis on those killed by the bomber. No such linguistic machinations are required. I'm sure your viewers will not be tempted, even remotely, to sympathize with the bomber, at the expense of the innocent people brutally killed. You can go ahead and use the language correctly; it'll be all right.

Friday, November 28, 2008

A Human Life

It is possible, today, that someone willing to throw himself in harm's way (or worse) to stop an early term abortion might feel hardly a whisper of moral discomfort at the loss of more than 4,000 men and women on the field of battle in Iraq. The emphasis, in the former case, is on "innocent", as in "an innocent life." It's of course true that a fetus is an innocent life. But the thought of men and women shipped off to a foreign land not to confront a grave, imminent danger but rather to execute an elective war decided by politicians hardly creates a clear distinction between the two cases. Aren't these soldiers, in context, innocent too?

It's troubling to me that this concept, this abstraction of a "human life", with its inherent value, so dominates domestic discussions like abortion, while casualty numbers in foreign wars seem almost actuarial, as if "the human life" in battle is simply a discrete, countable entity. Many of these men and women have been eviscerated in combat in ways that we could scarcely watch without vomiting, and can hardly bear to discuss even shielded from graphic details.

Of course, the 4,000 men and women who've perished wearing the American uniform in Iraq are heralded using the same language brought forth in all wars: valor, courage, patriot. It is no doubt true. But what animates and gives moral clarity to war is the notion of sacrifice for a worthy cause. Our troubling question today is whether in fact this standard has been met in Iraq. It is not for the soldiers to answer, but for their leaders.

There is, I think, something very noble about leaders who are willing to face true crisis with resolve. Winston Churchill, during the early days of the German invasion of Europe, when the fall of France was imminent, ordered French military ships destroyed to prevent their later use by Germany against Britons. It took gut-wrenching resolve. He was a great lover of France and a frequent visitor officially and in private, but he understood the stakes of the burgeoning German threat, and as immediate threat so often does, it trumped high-minded diplomacy. Later in Parliament, as if it were possible anyway to misunderstand his opposition to Hitler, he nonetheless left no doubt: "If this long island story of ours is to end at last, let it end only when each one of us lies choking in his own blood upon the ground." Gotcha.

And that is what I want of my leader. Give me WSC. But I do not want an elective war that bleeds away the very nobility and moral urgency of the grave conflicts that make heroes not just of our soldiers, but also of our leaders. America should find its soul again. Because, as we so often profess to believe, there is an inherent dignity to each human life.

Wednesday, November 26, 2008

The Terrorists

Muslim militants are at it again. Shocker.

101 people dead and counting. Here we go again. And in America, the conspicuous disappearance of any public discussion about terrorism for some time now. As if, it's a Bush topic. Somehow, some way, in our domestic self-absorption, I think we've managed to connect the unpopular President Bush with the idea of terrorism, as if, ridding ourselves of the one, the other will somehow go away, or just be a non-factor. But of course it won't go away, as we've seen tonight.

I confess to shedding Oprah Tears (or, I mean, getting stricken with Obamamania). But now the hard realities. Political campaigns, Democrats back in power, none of this represents a solution; we haven't solved anything yet. Obama hasn't solved anything yet. He doesn't even have the job yet (yet somehow the world has seemed a safer, better place, hasn't it?). And what a job he will have. The militants in Mumbai, apparently, checked passports to round up the Americans and Brits. Nice to confirm that we're still the evil ones. And I'm sure we'll retain the title for some time, whether our new President has a Southern drawl or not. They don't care. The President-Elect can sound like Albert Einstein (actually, that might be a tad too Teutonic, but you get the picture). He's still representing the Evil Nation, as far as extremists are concerned.

So now the blood on the streets, and the fires, and the frantic shouts are with us once again. And we hear once again that they were targeting the U.S. and Great Britain. It's I think safe to reject the notion that the Mumbai attackers didn't tune in to CNN recently enough to know that Barack Obama is our President Elect. They don't care.

Poor Rachel Maddow. She's been yuck yucking it up lately, feeling the Obamamania oh-so-much. And now to see her somber face, talking off of the teleprompter about terrorists targeting Americans, doing her level best to keep up the light-spirited Bush-is-a-dummy banter in the rest of her show. Damned breaking news. Yes, damned indeed. I do hope that I'm wrong. But those dark days of September 11 never left us. We just wished them away.

Erik's Too Much Time On My Hands List of B.S.

1. Bigfoot. Not real.
2. The Loch Ness Monster. Not real.
3. Alien abductions. Never happened. Ever. To anyone.
4. Aliens flying in spacecraft, visiting Earth. Also B.S.
5. This CEO salary thing. Successful Fortune 500 CEO, 10 million a year. Oprah Winfrey, 275 million a year. What the heck? Why isn't Oprah 27.5 times more evil? B.S.
6. Bill Maher. Joker. Rich joker.
7. The idea that the "war on terror" is over. What, we're so giddy in Obamaland that we think the Evil Doers have term limits? B.S.
8. GW. I mean "Great White", the 1980s rock band. For the fire in the nightclub deal.
9. Teetotalers. Unless you're someone who imbibes and ends up in a tree 50 miles from home, with no recollection of how you got there. And you're surrounded by cops. (Actually, I think this happened to me once. But it wasn't cops, it was midgets, or "little people". Or whatever.)
10. Certain aspects of holidays. Maybe, The Holiday Season.

Do NOT Write a Ph.D. Thesis

If you're thinking of doing it, don't. Three reasons:

1) Your project will be to write dry, important-sounding prose. That's basically it. This is so because you'll need to write something to please your committee, who of course had to write something that pleased their committees, and now demand the same from you.

2) You'll end up expatiating on and on, much more than you really need to make your argument (and you'll end up using words like "expatiate" rather than "ramble"). If you find, for instance, that you've made a convincing case with 120 pages, you'll end up going back to figure out ways to add 5 or 10 pages to each chapter. Why? Don't ask. That's the deal.

3) You'll be in real danger of thinking you're really smart when you're done, when you still don't know squat. You just jumped through all the hoops. Yes, of course, you'll know a heckuva lot about your subject, but your subject likely wasn't to solve world hunger, or figure out cold fusion. It was a little arcane pin prick of erudition that you drilled into in a kind of Colonel Kurtz-like fashion. You're just now an expert in that. Out in the world, most of what'll make you worth listening to will come from what you learned on your own. Because you read. And listen. And discuss.

So, as a public service announcement, please reconsider. Find some other way to feel accomplished. Write the great American novel, for instance. Someone might actually read it.

And, yes, I just finished a draft of my Ph.D. thesis. Mea Culpa.

The Puzzling Economy

What is the relation between economic theory and prosperity? It damn perplexes me. But I'm no economist. Still, consider: many of us are convinced that FDR's Keynesian approach helped pull us out of the Great Depression (though NYT luminary Paul Krugman has argued recently that FDR botched it by not doing enough). And many of us are convinced that Reagan's supply-side changes helped usher in the prosperity of the late 80s and 90s. Two radically different policies. So I ask: what's the real connection between economic theory and prosperity? I suspect it is tied very closely to circumstances. Very complicated circumstances.

On the extremes, things get easier to figure out: really bad economic policy is highly correlated with really low levels of prosperity. But somewhere in the middle (say, with regulated markets, like in the U.S.), it's hard to figure out what worked before, and pray tell, what will work again. As Nassim Nicholas Taleb suggests in his eminently readable "The Black Swan: The Impact of the Highly Improbable", maybe our best strategy is just to have smart, open-minded people keep tinkering in hopes that the right recipe will emerge. Trial and error. And when something finally works, we'll take credit for it, ex post facto, as if it was just obvious that X was the way (someone-very-close-to-me tells me that these Latin phrases are distracting and pedantic, like this long parenthetical, Gödelian remark).

Anyway, such is the nature of things. We need to keep up the illusion that we're on it. And when the next big innovation comes, the markets will free up, and prosperity will (again) be ours. And a politician will take all of the credit.

Filling your Head

I have discovered that when I read the Wall Street Journal, I end up feeling worried about the state of things; when I read the New York Times, I end up feeling pretty good. In the WSJ, Obama's health care plan will exacerbate a large (and growing) deficit incurred already by federal Medicare and Medicaid spending. The problem only gets worse! More debt! Heading for disaster!

In the New York Times, a highly competent team of economic experts has been selected by the sagacious yet pragmatic President-Elect Obama, and when Bush gets out of the way (and get on it, says NYT columnist Gail Collins), we'll be on our way to sunnier days. It will be hard, sure, but with the dummy out of the way, we'll get there. Whew!

It occurred to me this morning that, a few years ago, maybe it was vice versa. Gloom and doom in the NYT, and talk about how the surge was working in the WSJ, and how the war was near to won. Anyway, I'm continuing this roller coaster of reading both. From elation to despair.

Tuesday, November 25, 2008

Wheat from Chaff

Okay, after getting sucked into the vortex of the damnable Global Warming debate, I’ll attempt a few additional points (see “Global Cooling” comments for the latest salvo of belief vs. skepticism).

One, by “Global Warming” (or just “GW”), I’m referring not to the obvious notion that the climate is changing but to the current view that the average measured temperature of the Earth will continue to rise to the point where we witness massive, even catastrophic changes in our climate. Now the “Al Gore” version of GW—which seems most popular—is the following. One, the ice caps melt. Two, the ocean levels rise. Three, human habitation (especially in coastal areas) is threatened by widespread flooding from the rising oceans. (Add into the mix other effects, such as an increase in extreme weather events.)

Okay, I’m sure the GW folks can add to this, or correct it in any number of ways they see fit. I think it’s a reasonable stab at the basic scenario that lights the we-must-do-something-now fire under GW proponents.

Now, there is I think a lot of confusion out there about science; to be more precise, about our ability to predict the future using the tools of science. We'll have to dig into some details to cash this out. First, the Enlightenment did indeed give us powerful predictive abilities that apply to what I'll call classical systems; systems that are not inherently complex. The standard classical examples involve the prediction of celestial events (when will the comet appear in the night sky, when will the shuttle approach the moon, etc.). Our mathematics works really well for these types of systems. But in messier, complex systems (the so-called "nonlinear" systems that can't easily be modeled by differential equations, but enough of this), our predictive powers are positively paltry.

Not convinced? Consider: We can't give 7-day weather forecasts to save our backsides (in Central Texas, we can hardly get 2-3 day forecasts), we can't tell where the hurricane will make landfall (the predictive models give us 25 different trajectories, and whadya know, one of them was close!). We can't tell whether the Northeast will have heavy snow in 2009, and on and on. This is the dirty little secret about complex systems. We can't see into their future. And the global weather patterns of the Earth, folks, are a complex system.
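The point about complex systems isn't just rhetorical; it can be illustrated with a toy example. The logistic map is a textbook stand-in for chaotic systems (it is not a weather model, just a sketch of the general phenomenon): two trajectories that start one millionth apart soon bear no resemblance to each other, which is why small measurement errors swamp any long-range forecast.

```python
# Sensitive dependence on initial conditions, via the logistic map
# x -> r * x * (1 - x) in its chaotic regime (r = 4.0).
# Two starting points differing by one part in a million diverge
# until the gap is as large as the system's whole range.
r = 4.0
x, y = 0.400000, 0.400001  # initial gap: 1e-6

for step in range(1, 51):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    if step % 10 == 0:
        print(f"step {step:2d}: gap = {abs(x - y):.6f}")
```

The gap grows roughly exponentially until it saturates; after a few dozen steps the two runs are effectively unrelated. A 7-day forecast of a chaotic system faces exactly this arithmetic.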

So, at this point, the GW folks get red-faced, and tell me that I'm screwing up the planet (sometimes they tell me that I'm almost ethically suspect, a sure sign that politics is wrapped too tightly around science). And here's my rejoinder: we don't know, and I suspect that you know that we don't know, but you're terrified that the knuckle draggers around you won't act quickly enough unless you make the issue seem dire, and requiring immediate attention. You trade the healthy skepticism about our predictive powers for some assurance that our kids won't inherit a world with Mel Gibson in it (I mean, in The Road Warrior). And so, over and over I hear: well, maybe we're wrong, but the cost of us not acting as if we're right is too great. It's Pascal's Wager (to which William James once remarked that God should throw out of heaven the Pascal's Wager believers first...).

And my response is: look, reducing carbon emissions is a classic case of over-determination (as philosophers like to put it). We all want clean air, clean water, parks to take our kids to, a reduction in dangerous foreign oil, and so on. We have so many reasons to aggressively pursue alternative energy sources, who needs to leap headlong into (I think) suspect claims about our newfound abilities to predict the future behavior of complex systems? Who needs the almost religious it-must-be-right fervor about some theory that is hotly debated and which, in spite of the GW rhetoric, is really still murky (I think)? Let's get all the benefits of a cleaner future while maintaining a healthy skepticism about our ability to see into the future. Because, with complex systems, we can't.

So, sorry Al (Gore), but the world doesn't need more fear-mongering. We need less. And we can make the world a better place without it.

Sorry, Mr. Pickens, Loan Denied

One of the (many) unfortunate consequences of the Credit Crisis is its stalling effect on some very sensible alternative energy plans. The Pickens Plan, for one, which I support. The first phase of PP is wind energy; unfortunately Mr. Pickens, who wrote a book titled "The First Billion is the Hardest", now can't get financing for the project.

The New York Times reports also that Centrica, a British company, has halted expansion of its offshore wind farms project due to economic woes.

And with the price of crude plummeting, the sense of urgency for energy reform is all but gone. The OPEC yo-yo continues.

Global Cooling

Geophysicist Phil Chapman argues that the Earth is actually cooling.

Jens Bischof, author of "Ice Drift, Ocean Circulation And Climate", thinks we're due for a cold snap as well.

What the heck? Should we get sunscreen or parkas?

Monday, November 24, 2008

The Sea

People were scattered around the beach, laughing, drinking, warming themselves with daytime fires as he walked along the shore and into the rocks, around a bend, and then out of sight. The sun was beginning to set out across the expanse of the Pacific; the orange horizon bent with the curve of the Earth. It is beautiful, he thought.

The sea surged over partially submerged crags, charging forward quickly toward him, swirling with turbulence and a power that crashed onto the shoreline boulders he walked among, only to retreat again in swirls and froth into the depths. It is not safe, he thought. It is beautiful.

It washed up over his feet, barely, and he stood and breathed deeply. But the rocks, now slippery, failed him, and he slipped. He hadn't considered this; his only precaution was not to get too close, to avoid getting knocked down by the surging sea. But he just slipped. And into this monster of sea he went.

When the currents pulled him out, like flotsam, his chest heaved as he flung himself back toward shore. He thought to cry out but decided instead to swim. Swim. It was just panic. It was just panic. The shadows on the shore had grown larger and the thought of struggling in this immense leviathan of blackness approaching from above and below was too much. It was just panic.

When he reached one of the boulders, partially submerged, he felt lucky that it protruded for him momentarily above the surging water. He climbed onto it, heart pounding, gasping. Out of the sea he grew calmer, and then he almost laughed, realizing that he'd not (until that moment) thought of the horrific maw of large sharks coming up to him from beneath. He had been consumed with not drowning. A triumph of sorts. Moments later the sea came again, and his white bloodless hands slipped off of the surface as he was pulled again into its awesome power. It drove him violently forward toward the boulders, and helplessly he was flung onto the jagged rocks. But this did it. In this moment he grasped a protruding crag and clung fast as the sea turned back again in its endless cycle. He was alive.

His boots were gone. He walked, wet, shaken, barefoot, back to the beach. A large camp had stoked to life a bonfire, and grinning faces circled it, laughing, talking. He walked up, and smiled, and said hardly a word all that good, long night. It was beautiful.

Mr. Magoo

If it turned out that the entire edifice of human thought and indeed the fate of the free world depended on a guy named Mr. Magoo, would we still listen to him? I would. But it would bug me.


A little theory. About generalizations. On Monday.

Think of a "generalization" as a rule or conditional of the form "If X then Y", where X and Y can stand for everyday things like "If you need eggs then go to the store" or for scientific things like "If a fixed amount of gas is kept at a fixed temperature, then pressure and volume are inversely proportional". In the latter case we might also use the word "law", as in scientific laws.
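A generalization like the gas law can even be written down as a one-line function. Here's a minimal sketch in Python (the function name and sample numbers are mine, purely for illustration):

```python
def boyle_volume(p1, v1, p2):
    """Boyle's law: at a fixed temperature and amount of gas,
    pressure and volume are inversely proportional (p1*v1 == p2*v2),
    so the new volume is (p1 * v1) / p2."""
    return (p1 * v1) / p2

# Doubling the pressure on 2.0 L of gas at 100 kPa halves its volume:
print(boyle_volume(100.0, 2.0, 200.0))  # 1.0
```

The appeal of a law is exactly this: the rule applies anywhere its antecedent holds, with no reference to the particulars of the system.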

A whole lot more can be said about generalizations, laws, and the like, but this admittedly cursory intro will do for present purposes.

Now, something we've discovered in the history of science is that the power of generalizations reduces drastically when applied to complex systems. In contrast, generalizations--say, the inverse square law in physics--have enormous predictive power when applied to very large systems where details can be ignored (to "the very large" as Hawking puts it). It's interesting that generalizations work wonderfully also for the really small; say, with quantum mechanical explanations of subatomic phenomena (where the generalizations are statistical in nature, but still general, powerful, and well-defined). But what's common to successful generalizations in either case is the lack of complexity in the systems in which they apply. Celestial mechanics ignores quite a lot. We want to know how long it will take for one body to orbit another, but we don't inject millions of other possible interactions (say, possible meteorites) to perform the calculations. Likewise, we isolate photons or other quantum phenomena in order to use quantum mechanics to predict outcomes.
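To make the orbit example concrete: Kepler's third law gives the period of one body around another from just two numbers, ignoring everything else in the sky. A minimal sketch (constants rounded; the function name is mine):

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # mass of the Sun, kg

def orbital_period(semi_major_axis_m, central_mass_kg):
    """Kepler's third law: T = 2*pi*sqrt(a^3 / (G*M)).
    Meteorites, other planets, solar wind: all ignored."""
    return 2 * math.pi * math.sqrt(semi_major_axis_m ** 3 / (G * central_mass_kg))

earth_days = orbital_period(1.496e11, M_SUN) / 86400  # Earth's orbit
print(round(earth_days))  # 365
```

Two inputs, one output, centuries of accuracy: that is what a generalization buys you when the system is simple enough.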

Sure, classical mechanics -- Newtonian, and we can include Einstein's theories of relativity for these purposes -- is composed of really beautiful, powerful generalizations. So strange, then, that they are so irrelevant to prediction in everyday experience. The location at some time t + n, given the location at time t, of an entire planet is knowable given our classical theory. But something seemingly simple--the movements of a particular cubic inch of water in a mountain stream--is not. Why is the world like this?

We use other generalizations for predicting outcomes in complex systems. Mostly, however, we don't use laws but past experience. This is true of course with people (we rely on our knowledge of past events to make plausible inferences about future ones), and with computers, where models of complex systems invoke observed prior cases and relevant features (where "relevance" is added by the human) to generalize to likely future outcomes given unseen data. Laws don't do the predictive work in messy systems (we may assume, of course, that laws governing the relation between pressure, temperature, and so on all continue to apply in such systems nonetheless).
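That kind of past-experience prediction is easy to caricature in code. Below is a minimal nearest-neighbor sketch (the names and the toy "weather" data are mine, assumed for illustration): no law anywhere, just stored prior cases and a similarity measure over human-chosen features.

```python
def predict(history, features, k=3):
    """Predict an outcome for unseen `features` by majority vote
    among the k most similar past cases. Each history entry is
    (feature_vector, outcome); 'relevance' was baked in by whoever
    chose the features in the first place."""
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(history, key=lambda case: distance(case[0], features))[:k]
    outcomes = [outcome for _, outcome in nearest]
    return max(set(outcomes), key=outcomes.count)

# Toy data: features are (pressure trend, humidity %)
history = [((-1, 80), "rain"), ((1, 30), "clear"), ((-2, 90), "rain"),
           ((2, 20), "clear"), ((0, 60), "rain")]
print(predict(history, (-1, 70)))  # "rain"
```

Notice that nothing in the code knows anything about weather; change the stored cases and it "predicts" something else entirely. The inference rides on past observations, not on laws.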

Humans, of course, make use of generalizations in everyday experience. They constitute "heuristics" or "rules of thumb". These are generalizations that no one expects will always apply. We know they admit of exceptions, but still they capture correlations between events of certain types that make them useful. Don't get into a car with a stranger, I tell my children, knowing full well that there are scenarios where that is exactly what they should do (say, to save them from a maniac on the street).

But now things start to get messy. In everyday experience, humans are what I call tightly coupled to changing circumstances--to facts--in a way that classical generalizations are expressly designed to avoid. We don't care about details of the celestial bodies when computing their trajectory through space. We care about a few fixed features (their mass and velocity, mostly). We do care about these details when navigating through life. A person is a large object, and the slight raising of an eyebrow is a relatively small change in this object. But it might matter to someone (matter a lot), depending on some context or other.

So, this connection we have to changing circumstances implies that our cognitive or inferential abilities must be constantly, highly sensitive to details. This is, of course, exactly how it is with us. And an interesting consequence of this feature of everyday thinking is that our use of generalizations is circumscribed; they're not doing the inferential "heavy lifting" (what is?). Indeed, our stock of generalizations about the everyday world constantly become relevant, then irrelevant, and relevant again depending on context. (Some, like our belief that we won't float away into space, remain robust, though equally useless.) It's interesting that the world is like this. I think a whole lot follows from it, and I'll try to find some time to spell this out more later.

Sunday, November 23, 2008

The Happy Gadget

Years ago, scientists from several countries collaborated on a well-funded effort to engineer an "experience machine" (it later came to be known as the "Happy Gadget"). Ethicists around the world squawked, initially, in protest, then were silenced when public support grew, and politicians around the world endorsed what became known as "The Happy Revolution", or just "The Happy".

The Happy Gadget gave wings to generations of parents who wanted only to see their children be happy. In times past, the failure of offspring to make of their lives a success (to become "healthy, wealthy, and wise", as pre-Happys once remarked, steeped as they were in ignorance) was bemoaned by all. It caused great consternation, and by degrees, greased the axles for a new social theory and shortly thereafter the political will to usher in The Happy.

In 2015 the Happy Gadget was tested on a small group of children. Measurements of happiness were, in the words of one examiner, "through the roof". In a press statement released by Happy Industries shortly after the test, results read as follows: "...subjects experienced consistent, non-degrading feelings of happiness while connected to Happy [that is, Happy 0232i, the first Happy Gadget]. 90% reported wishing to stay connected to Happy indefinitely." [One outlier reported feelings of guilt, and later anxiety, at not having "done anything", in his words. He was released from further experimentation; his whereabouts now are not known.]

By 2025, Happy was the rage. Parents who, before, had "just wanted their kids to be happy", would report that levels of happiness were markedly increased; children, hooked to the Happy Gadget, self-reported happiness levels never seen before. By 2027, according to official reports at the time, 93% of all children under 16 were Happy. The project was vindicated, and proclaimed a success. It seemed a new era was at hand.

And then it happened. In the summer of 2029, they came to our Eastern shores, and by the winter of that same year the Great War was upon us. Soon after, the standing government closed the Happy experiment. Parents, alas, had realized their professed dream of "just seeing their children be happy", only to see it come to naught. By the following summer, their society was in ruins. Wailing into the night sky, parents were captured on film repeating, endlessly, the Gadget's mantra that their children were finally happy. And later, even their clarion voices would be silenced. The Happy children were made to work, and were enslaved, and of the few documents published during this dark time, it was recorded that happiness fell to levels not ever seen before. It fell, in one definitive account, even below the pre-Happy levels.

It is today of pressing historical and scientific interest how the greatest project in the history of mankind, with the greatest, surely most noble goal--to make our kids happy--ended ignominiously, and with decades of subjugation and war. No definitive theories are forthcoming; a small minority has questioned the Happy Gadget project itself, though no doubt with strong opposition, and a questionable command of the facts. These views, such as they are, are still voiced, though mostly dismissed. There is now no clear path ahead.

Saturday, November 22, 2008

The Warming Planet

The Northeastern United States is currently gripped with unseasonably cold weather. I'm sure the Global Warming converts will have their circumlocutions ready. It's complicated. But we're still dead right. Sure.

Not Clinging to Religion

I've always had a hard time with a literal interpretation of the many claims of the Old and New Testaments. The Old Testament is fraught with such difficulties that I'll set it aside for now. But believing the claims made in the Gospels is hardly an epistemic picnic either. "Jesus is God", that central tenet of Christianity, is just hard for me to believe. I'm no skeptic of the historical Jesus, but God? Really? And then there are the miracles (not just really rare events, but supernatural interventions in the physical world). The list goes on.

In addition to the basic problem of accepting these fantastical claims made in religious texts dating back thousands of years, there's a kind of convoluted quality to attempts by the early Church to make sense of it all (and yet, it's now accepted doctrine, not amenable any more to re-interpretation). For instance, the Trinity. Do we really take seriously the three-in-one divinity idea? It just all seems so fishy.

For an insightful critique of organized religion try Sam Harris' The End of Faith. Even if you disagree, it's a fresh angle on the place of ancient religion in modern society, and it avoids the acerbic condescension of nit-wits like Bill Maher.

Finally, so I'm not misunderstood, and contra Harris, I'm not an atheist. Also, I'm not against the Church, whatever that would mean. I'd be horrified if religious faith ever "ended" in Harris' sense. I like the insights in his critique, but just not his conclusion. For one, organized religion provides much non-government public benefit; it's a strong "mediating institution" that stands between the State and individuals. Two, it's community-based, in the "Bowling Alone" sense that it discourages the slide toward selfish individualism. It brings people together. Three, the concept of the Divine, whatever the doctrinal details, is I think right-headed and, for lack of a better word, inspirational. I've never understood how Carl Sagan types find such beauty and purpose in matter and energy. I'm no materialist.

Okay, I think that's enough for now. I'll leave things with a few anecdotes from our political past. One, our own Benjamin Franklin, a lifelong supporter of the church, friendly with clergy and all things ecclesiastical, nonetheless would spend Sunday "catching up on work" rather than in the pew. Two, the great Winston Churchill, who helped save the free world from the tyranny of fascism, once famously remarked that he was not so much a pillar of the church as a buttress, supporting it from the outside. Now that makes good sense to me.

OU Tech game

OU's O Line steps up, Bradford shows that he's still in the Heisman hunt, and the defense finds an answer to Crabtree. OU by 10.

Wednesday, November 19, 2008

Clinging to Guns

I plan to purchase a Smith & Wesson Model 629 .44 Magnum revolver with a 6-inch barrel. It's the Dirty Harry gun. Why? Something of romantic overkill, sure, but also self-defense, the much-maligned concept that nonetheless persists with so many as practical and, well, human.

Classic scenario, if not over-used: when I'm travelling, my wife is home alone with our two children (8 and 7 years old). We currently have a firearm in the home, which I taught her to use for the unthinkable situation where someone attempted to assault them. Now, living in the country, I can estimate (without claiming precision) that it would take the local police several minutes, likely upwards of ten, to respond to a 911 call. In that time, as I'm sure we can all imagine, so many horrors could be visited upon my wife and kids that I'd rather not continue.

It works like this: people who live in major cities see major-city problems with guns. For those of us who live in the suburbs (read: in the country), the thought of having responsible access to deadly force is not problematic but comforting. I'm sure my neighbors have firearms as well. Bully for them.

It's difficult to see how the State could argue our comfort away; on what grounds? I, for instance, am not a felon, have handled firearms since a kid, taken gun safety courses, hunted my entire life, and on and on (same too, mostly, with my spouse). And so, the argument to deny me a firearm is...?

As to inner-city gun violence, I'm well aware. My point isn't a blanket argument about guns but rather a question of who should get them. I'm inclined to make the law strict in this regard. Apply for your firearm. If you qualify, wait a period, receive it. I'm not opposed to this.

Shifting gears, on a purely theoretical level, it's incomprehensible to me that somehow our relationship with the State should be such that the State controls, exclusively, the means of deadly force, and denies categorically the same to its citizens. Again, if organized law enforcement can't be everywhere, all the time, why shouldn't well-meaning citizens provide for their own protection? We are not, after all, children. The law provides (indeed, it's in the Constitution, not by accident) for citizens to have access to firearms. We're not children.

Finally (to munge many points together here), "self defense" for me includes also backpacking trips in Montana, Idaho, Washington, Wyoming, where the chances of encountering a brown or black bear are non-trivial. Self-defense. I'm not interested in testing theories about playing dead; I'd rather my kids see me return home. Hence, a large-caliber handgun.

Tuesday, November 18, 2008

Inconvenient Truths

What happened to Global Warming? To the planet "having a fever", as Gore in his arguably more inconvenient than truthful oratory impressed upon us? Apparently we have bigger fish to fry now (with, of course, non-carbon-emitting heat sources). The economic meltdown, for instance. More powerful than a war on terror, which, apparently, wasn't enough to keep the feverish planet out of the limelight. Not so the Credit Crisis.

So, perhaps, it's a good time to revisit what I've long suspected is an entirely suspect political/scientific cause du jour.

Plenty of facts to bore everyone to tears, but let me instead explain the type of argument that Global Warming seems to inspire. It goes a little like this...

First, we have scientific data. We have charts. We make the case. But next, inconveniently, some really smart people (e.g., professors at MIT) disagree. They have different data, or see the existing data differently, and draw, with MIT rigor, entirely non-Global Warming conclusions.

Now, when confronted with these naysayers--and for no apparent political reason do they naysay (unless, perhaps, they wish to disintegrate their careers for some unknown future benefit)--the GW crowd goes democratic: "Well, most, indeed nearly all, of the scientists cut the political cake our way. There's consensus. We voted, and we won." More scientists said what they wanted them to say.

That's fine, but science is hardly a show of hands. It's evidence-based. Just one well-informed person (an expert) with something to say ought to be heard, whether contradicting the bandwagon or not. Maybe he'll get less research money. But if he or she is doing science, and if GW is so obviously right, it ought to be easy to rebut the dissenting view. Just prove the irritating dissenter wrong, if the theory is so obviously right. No need to vote. That sounds like politics. We didn't vote on General Relativity, after all. It just is, as any physicist will tell you.

The GW debate is so far from over, it just astounds me that its near-religious proponents want so desperately to slam the lid shut. I've heard namby-pamby justifications for this non-scientific attitude like "Well, even if we're wrong, and there's not imminent catastrophic climate change, we should do something about our emissions." Yes, yes, yes. Of course. We should also import less foreign oil. Sounds like a win-win to me too, but let's do science as science. It's not pragmatism or policy, after all; it's supposed to be the search for truth. Let's find it, if we can.

At any rate, and to return, the Gore-scare seems to be largely over, for now, replaced by the Credit-scare. And so, if something needed to be done now to fix Global Warming (I mean Gore-now, right now!), we're either completely insane (worrying about this credit crisis crap), or less susceptible to b.s. than some would hope. An inconvenient truth, perhaps.

Monday, November 17, 2008

Running Stop Lights

I was driving down I think it was Lakeline a few weeks ago, and noticed a camera pointing down at me at an intersection. I stopped a little past the line, listening to music and I suppose not doing my best driving. At any rate, I had some spell of worry after driving away, as if, somehow, the technology would record not my running the light, but my stopping past the line. After a while I recovered from my Orwellian paranoia; glad to say I haven't received a ticket from this camera yet.

Later I talked with a friend about the privacy issue, and in particular the issue of whether cameras should catch red light runners regardless of whether any human (police officer) witnessed the event. My friend took a reasonable position, saying in effect that, look, you've run the light or you haven't. Why require that a cop be present at every light? Why not use technology, if it works? After all, we don't have existential angst about police having well-calibrated radar guns to catch speeders. Who argues with this?

It was a reasonable point, I suppose. But, after puzzling about it for a while, I smelled a rat. My question was simple: suppose we had a technology that was even better than cameras, or radar guns. It represented the completion of all such technologies. It rode along with us, and, if you broke a law, it would simply record the infraction, and send the ticket. So, if you run a stop light, you get a ticket. If speeding, a ticket. If you happen to be risking more serious infractions by driving after having too many, you'll be apprehended later, sure as the sun rises, and will spend the night in jail. And on and on. The perfect completion--just the logical extension--of cameras at stop lights. Anyone for this?

My friend recoiled at the suggestion. I asked why, and after some mumbling and fumbling, just declared that it wasn't right. Right.

What's interesting is how, seemingly, more security makes such good sense, to a point, and then suddenly it seems positively dystopian. We're all for better security when it doesn't feel like the State sucks away our autonomy. But, as my thought experiment suggests, once we do feel this way, we recoil.

It's worth thinking about the distinction between the two cases. Just decent (not omniscient) technology like cameras at stoplights sounds great. But better and better technology, and suddenly we're holed up in Montana with guns. So where does the issue really stand? I wonder. Flipping the coin, consider it this way. Why should we be allowed to break the law, ever, and get away with it, just for lack of technology? We still broke the law. Why, then, recoil when law enforcement just works better?

Saturday, November 15, 2008

Camille Paglia

Philosopher Edward Feser writes about libertarianism, and in particular its philosophical compatibility with Conservatism. It's a little dated now (2001), but still contains references to celebrity pseudo-intellectual libertines like Bill Maher. I wasn't shocked to see Maher depicted by Feser as a "self-styled 'libertarian'" celebrity "whose libertarianism amounts to little more than an enthusiasm for legalized abortion and homosexual chic...", but was surprised to see Salon contributor Camille Paglia on Feser's list (that is, of people we should not take too seriously when discussing libertarianism, and I think safe to say political philosophy generally).

Maher is, after all, a clever and cleverly misogynistic "shock" celeb with enough intellect to be dangerous; he never manages (or, I imagine, cares) to scrape too far below the thin veneer of sardonic humor while somehow staying in the vicinity of an actual point.

But Paglia? I keep seeing her on Salon (which, in spite of some invective I'll occasionally throw its way, I will read). Her latest is some longish article with a section on why she likes Palin. For someone who professes to having voted twice for President Clinton, her latest is quite a departure, it seems. Among the juicier tidbits: Pro Life feminists like Palin (does she mean Pro Life females?) will "be the next big shift in feminism", and "So she doesn't speak the King's English -- big whoop!".

I know very little about Paglia, other than she's been lampooned on the Internet as a light-weight masquerading as an intellect, and that Feser placed her on a list with Maher as examples of how-not-to-be a serious Libertarian (of Conservative fusionist stripes or not).

Salon, at any rate, must think something of her.

Hold the Phone

For anyone who remembers the foundering hopelessness of Democrats after President Bush's reelection in 2004, after the simultaneously stentorian and soporific John Kerry failed to deliver, we'd all be wise to take a chill pill before expatiating on the death of the Republican party.

American politics is cyclical, like democratic politics generally; back and forth between our two parties we always go. No need to dig deep into history tomes to get this, just read Wikipedia entries for a few minutes about past elections and it'll become clear. The elephant will return; I hope thoroughly modernized and with a clear, inclusive message that resonates and especially inspires.

At any rate, even if President-elect Obama is a two-termer (of course way too early to prognosticate about this), it's hardly that long in the wilderness for those roaming Dinosaur Republicans, when put in historical context. So, enjoy the moment, Salon. But chill. You don't want some new era of journalistic gloaters to dig up those now-embarrassing backlogged 2008 articles, injecting them with glee into future discussions about Republican reemergence. Back and forth we go.

Friday, November 14, 2008

Man on Main Street

Sometime ago, I was on Main at dusk, standing facing a man I'd just met, smoking a fag. He was engaged about local political issues, which I knew little about at the time, but he was quite amicable, and before long I think we were mutually gay. I asked for one of his smokes (looked a bit like a cigar), and received first a queer look, then my (perhaps dubious) prize. I suppose his confusion stemmed from my prior proclamation of quitting; he laughed when I offered Mark Twain's famous quip that quitting was easy, as he'd done it at least a thousand times. At any rate, I'm sure he wanted to avoid appearing niggardly, with our chat going along so well.

Later, my new bitch trotted up to us, a Labrador. The man petted her briefly, then turned away to gawk, with me, at a passing jackass; perhaps it was a mule. It footed along the road in front of us, plodding a bit with the weight of several bulky burlap sacks. Strange.

We puzzled together at this for a few minutes, when a dick accosted us, pencil and paper in hand. He wanted to know about the jackass; apparently the owner suspected it had been let loose deliberately. After a curt and somewhat painful exchange the man jerked his thumb up Main Street, and the dick thanked us perfunctorily and went to apprehend the wayward ass.

I was, at this point, quite pooped, and so thanked the man with whom I had had such a gay time (in spite of the dick), and we parted ways. I saw the man just on one other occasion, standing out on Main, smoking a fag, animated in discussion about some island called Bali. I passed by without saying hello, but overheard some of his descriptive ejaculations, in particular his frequent use of the word "cock", embedded, it seemed, in a larger narrative about fighting. Balinese cock fighting, I later learned. What a man. A learned and I think good man. Queer, but smart, and gay as hell. Here's to you, man on Main Street. I hope someday we'll cross paths again.

Thursday, November 13, 2008

The Problem of Eric Rudolph

The supermax incarcerated abortion clinic bomber, in his words:

"I'm here today to be sentenced for my actions on January 29, 1998. On that date I detonated a bomb at an abortion mill here in Birmingham, killing the abortion mill's security guard and injuring one of the abortion mill's employees. I had nothing personal against either of these individuals, Sanderson and Lyons. I did not target them for who they were - but for what they did. What they did was participate in the murder and dismemberment of upwards of 50 children a week."

"My actions that day were motivated by my recognition that abortion is murder. Because it is murder, I believe that deadly force is indeed justified in an attempt to stop it. I do not claim this as a right but rather consider it the moral duty to come to the defense of my fellow man when he is under attack. This is an essential concept embedded in Western Civilization - that we are our brother's keeper."

Crazy, of course. But from a philosophical standpoint, troubling. The problem is that his logic makes a certain sense, given the assumption that life begins at conception. Of course his violence is reprehensible, but it's a troubling case, because his words are bluntly consistent, given a literal interpretation of life at conception. I think it's his lunacy that ironically helps us search for sanity in the abortion debate. We'll find the lunacy on the Pro-Choice side as well.

So here goes, let's jump into abortion...

If life really begins at conception, isn't then abortion really murder? Think it through. Pro Life proponents say it's just so: life begins at conception. But they don't really believe this, because even hard core Pro Life types tend to make exceptions (e.g., for victims of rape). But if it's murder, it's murder, whether the life began with an act such as rape or not. The rhetoric isn't consistent with the policy. This is what I call the strange logic of Eric Rudolph, his (superficial) clarity. They kill defenseless persons. I kill them. He took the message on face value. Bully for consistency. But you're still crazy, Mr. Rudolph. We don't, in fact, treat a woman having an early stage abortion on a par with someone committing infanticide, or just killing someone else in the world. So something's gone wrong. Rudolph was wrong.

Now Pro-Choice types have their own problem: if it's just all about a woman's right to choose, if the life or possible life of the fetus is trumped by a right to privacy (poof!, our Roe v. Wade result...), then why not partial birth abortions? Why get weak-kneed in the third trimester? It's easy (and suspect) to say just that the right to privacy diminishes. Why? Because the cells in your body gained more mass? But why should this matter? Sounds like losing your nerve, mother-to-almost-be. Be strong. Privacy. Consistency. Eric-Rudolph-it.

Think of it this way: what's really the difference between an organism (I use this "person neutral" language just to get at the point) that exists, one moment, in its mother, and the next minute, in the world? A few minutes? That's the difference? What wondrous changes could have happened through the birth canal? C'mon. This too makes hardly any sense at all.

So, Pro Choicers supportive of third term procedures must, somehow, ignore weighty considerations of life, of human life, in their bodies. Strange that otherwise normal people can manage this particular moral magic trick. A judicially created right to privacy seems much too flimsy, obviously too flimsy, to cancel the empirically obvious fact of a human life. And their logic, like Mr. Rudolph's, leads them quickly into troubling waters. If they see no problem with third trimester abortions (especially in cases where the fetus could live outside the womb, if delivered), they ought not to balk at infanticide, also. Because, again, what's the difference? If the baby is alive now, the fetus was alive two minutes ago. (Running the argument the other way, if the fetus had no right to life a few minutes ago, how did the baby suddenly acquire it?)

Of course, the "infanticide" result is crazy too, just like Rudolph's "abortion is murder" interpretation. So we've never figured out this debate. It's horribly incomprehensible, when consistent, on both sides. Best left inconsistent? Hardly comforting.