29 October 2008

Illogically InQuined

I've now given my Theory of Knowledge presentation, and I think it went well. The other students seemed to think I did too good a job, thereby setting the bar a bit higher than they wanted. Mine was the first student presentation. Amusingly, the other students semi-pushed me into going first, since I'm the only one registered in the class as a grad student. Not a grad student in philosophy, admittedly, but I just shrugged and went for it. I read W. V. Quine's article "Two Dogmas of Empiricism" and presented an overview of it.

A bit of background. There's been a fairly consistent dichotomy in theory of knowledge between what Hume called "relations of ideas" and "matters of fact." Different names have been applied to the two sides over the centuries (Leibniz's "truths of reason" and "truths of fact," Kant's "analytic" and "synthetic"), and some will argue that the different names don't always describe exactly the same ideas. Nonetheless, they can be thought of as two broad categories. In one category are the things that we know from observation and experiment (synthetic). In the other are the things that we know independently of observation and experiment (analytic). How we know those things is also a matter of debate.

Quine argues, essentially, that this is a false dichotomy; he sees a continuous variation rather than an either/or. Given the meanings of the logical particles, he would agree that "all bachelors are bachelors" is true independently of the meaning of "bachelors," but "all unmarried men are bachelors" is not equally clear. It depends on "synonymy," and synonymy depends on definitions, which are determined by usage, not by the words themselves. So Quine argues that there's an empirical component any time a truth depends on what a word means (when that word is not a logical particle with a well-defined function).
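To make the contrast concrete, here's a rough first-order rendering (the notation and predicate letters are mine, not Quine's):

    \forall x\,(Bx \rightarrow Bx)   % "all bachelors are bachelors":
                                     % true under any interpretation of B;
                                     % logical form alone does the work

    \forall x\,(Ux \rightarrow Bx)   % "all unmarried men are bachelors":
                                     % true only if U ("unmarried man") and
                                     % B ("bachelor") are synonymous, and
                                     % synonymy is a fact about usage

The first is true no matter what "bachelor" means; the second smuggles in an empirical claim about how English speakers actually use two words.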

He doesn't think that paring down to a simplified language lacking in synonyms makes things any better. Simplified languages typically start with a list of "semantic rules," and these rules tell us which statements of the language are analytic, but they get us no closer to figuring out what it is, in a general language, that makes a statement analytic. In fact, it's circular to try to use the simplified language, since the semantic rules are usually constructed so that they preserve the "analyticity" of statements between the languages. So you need to know which statements of the original language are analytic even to construct the simplified language.

Quine's last attempt to decipher the analytic is to apply the "verification theory" of meaning to it. Verification theory is the idea that a statement's meaning just is the method one would use to verify it. So the meaning of "the mail has come" is the procedure of going out and looking in the mailbox to see whether I'm right. However, this seems to presuppose the analytic/synthetic distinction, in that the analytic statements, presumably, are not in need of empirical verification. Or are they? Any that depend on "synonymy" are, according to Quine, in need of empirical verification.

The problem is that there is no particular reason that "king" means "male who is in charge of a certain area of land and all within that land" instead of meaning, for instance, "miter saw." That "king" has the meaning it does is an "accidental matter of fact," and word usage changes constantly.

Quine eventually concludes that the unit of empirical significance is not the individual statement but the language, indeed the whole of science, taken together (raising issues of translation, and of how it's even possible for two people to communicate in the same language), but I think the lesser point is valid on its own. At one level, within the context of a language semi-frozen at a moment in time, there will be "analytic" statements, but they depend on both the language and the time-frame. At another level, once the ideas behind the words are understood, the logical relations will hold regardless of the words, but there's no reason to think the logical structure is fixed, either. There are multiple logics that are useful in different instances; quantum logic, sketched below, is one example.
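To illustrate with a standard textbook case (my example, not one from the presentation): in quantum logic, where propositions are modeled as subspaces of a Hilbert space, the distributive law of classical logic fails. For a spin-1/2 particle, let p = "spin along x is up," q = "spin along z is up," and r = "spin along z is down." Then q \vee r spans the whole space (spin-z is always up or down), while p \wedge q and p \wedge r are both the zero subspace (no state is a simultaneous eigenstate of the incompatible observables), so:

    p \wedge (q \vee r) \;=\; p \wedge \top \;=\; p
    (p \wedge q) \vee (p \wedge r) \;=\; \bot \vee \bot \;=\; \bot
    \therefore\; p \wedge (q \vee r) \;\neq\; (p \wedge q) \vee (p \wedge r)

Classical propositional logic guarantees exactly the distributivity that fails here.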
