A little theory. About generalizations. On Monday.
Think of a "generalization" as a rule or conditional of the form "If X then Y", where X and Y can stand for everyday things like "If you need eggs then go to the store" or for scientific things like "If a fixed amount of gas is kept at a fixed temperature, then pressure and volume are inversely proportional". In the latter case we might also use the word "law", as in scientific laws.
A whole lot more can be said about generalizations, laws, and the like, but this admittedly cursory intro will do for present purposes.
Now, something we've discovered in the history of science is that the power of generalizations drops drastically when applied to complex systems. In contrast, generalizations--say, the inverse square law in physics--have enormous predictive power when applied to the very large (as Hawking puts it), where details can be ignored. Interestingly, generalizations also work wonderfully for the really small--say, quantum mechanical explanations of subatomic phenomena (where the generalizations are statistical in nature, but still general, powerful, and well-defined). What's common to successful generalizations in either case is the lack of complexity in the systems to which they apply. Celestial mechanics ignores quite a lot. We want to know how long it will take for one body to orbit another, but we don't inject millions of other possible interactions (say, possible meteorites) into the calculation. Likewise, we isolate photons or other quantum phenomena in order to use quantum mechanics to predict outcomes.
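And just to show how little the orbit calculation actually needs, here's a back-of-the-envelope sketch in Python (an idealized circular orbit of a small body around a much larger one, rounded textbook values, everything else in the universe ignored):

# Orbital period of a small body circling a large one, with all other
# possible interactions (meteorites and the rest) left out of the model.
import math

G = 6.674e-11   # gravitational constant (m^3 kg^-1 s^-2)
M = 1.989e30    # mass of the Sun (kg)
r = 1.496e11    # Earth's mean orbital radius (m)

# Kepler's third law in its Newtonian form: T = 2*pi*sqrt(r^3 / (G*M))
T = 2 * math.pi * math.sqrt(r**3 / (G * M))
print(T / 86400)   # roughly 365 days

Mass and distance are essentially all it takes.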
Sure, classical mechanics--Newtonian, and we can include Einstein's theories of relativity for these purposes--is composed of really beautiful, powerful generalizations. So it's strange that they are so irrelevant to prediction in everyday experience. Given our classical theory, the location of an entire planet at time t + n is knowable from its location at time t. But something seemingly simple--the movements of a particular cubic inch of water in a mountain stream--is not. Why is the world like this?
We use other generalizations for predicting outcomes in complex systems. Mostly, however, we don't use laws but past experience. This is true of course with people (we rely on our knowledge of past events to make plausible inferences about future ones), and with computers, where models of complex systems invoke observed prior cases and relevant features (with "relevance" supplied by the human) to generalize to likely outcomes on unseen data. Laws don't do the predictive work in messy systems (we may assume, of course, that the laws governing the relations between pressure, temperature, and so on continue to apply in such systems nonetheless).
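Here's a minimal sketch of the kind of prediction-from-prior-cases I have in mind (Python; the cases, features, and "relevance" weights are invented stand-ins, and the weights are exactly the part the human supplies):

# Predict an outcome for unseen data from the most similar remembered case.
# No law is consulted; only past cases plus human-chosen feature weights.
past_cases = [
    # ([temperature, humidity], observed outcome)
    ([30.0, 0.80], "rain"),
    ([25.0, 0.30], "clear"),
    ([28.0, 0.75], "rain"),
]
weights = [1.0, 50.0]   # "relevance": the human decides humidity matters more

def distance(a, b):
    # weighted squared distance between two feature lists
    return sum(w * (x - y) ** 2 for w, x, y in zip(weights, a, b))

def predict(new_features):
    # nearest neighbor: generalize from the closest observed prior case
    _, outcome = min(past_cases, key=lambda case: distance(case[0], new_features))
    return outcome

print(predict([27.0, 0.70]))   # -> "rain", by similarity to past cases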
Humans, of course, make use of generalizations in everyday experience. We call these "heuristics" or "rules of thumb": generalizations that no one expects to apply always. We know they admit of exceptions, but they still capture correlations between events of certain types that make them useful. Don't get into a car with a stranger, I tell my children, knowing full well that there are scenarios where that is exactly what they should do (say, to escape a maniac on the street).
But now things start to get messy. In everyday experience, humans are what I call tightly coupled to changing circumstances--to facts--in a way that classical generalizations are expressly designed to avoid. We don't care about the details of celestial bodies when computing their trajectories through space; we care about a few fixed features (their mass and velocity, mostly). We do care about such details when navigating through life. A person is a large object, and the slight raising of an eyebrow is a relatively small change in that object. But it might matter to someone (matter a lot), depending on some or other context.
So, this connection we have to changing circumstances implies that our cognitive or inferential abilities must be constantly and highly sensitive to details. This is, of course, exactly how it is with us. And an interesting consequence of this feature of everyday thinking is that our use of generalizations is circumscribed; they're not doing the inferential "heavy lifting" (what is?). Indeed, our stock of generalizations about the everyday world constantly becomes relevant, then irrelevant, then relevant again depending on context. (Some, like our belief that we won't float away into space, remain robust, though equally useless.) It's interesting that the world is like this. I think a whole lot follows from it, and I'll try to find some time to spell it out more later.