Back to Part I

My apologies for dropping out of this discussion here - it deserved more time than I could give it 'til now, and Grim gave me a not-at-all easy reference to look over - which I quite failed to grasp. (I've read Part II and am joining in that one separately.) I want to return to a part of Part I. Grim was reexplaining Kant's problem in terms of a believer, like Chesterton, who claimed to have pieced together evidence from throughout his life that brought him to believe in God.
Let's say that someone has encountered a number of phenomena that they believe demonstrate the existence of God. One counterargument to their reasoned belief in God would be to point out that they have misconstrued the causes of the phenomena...Yet he has come by his knowledge in the same way we come by knowledge of anything that is outside of ourselves in the world.
Absolutely, boss. But the quality of that evidence is the thing I always want to examine. (Chesterton makes it impossible because, after a book of build-up, he won't even say what that evidence is. But that is another story.) Putting it that way blurs the distinction between evidence of different quality (per Chesterton again, between the kind of man who doubts the existence of God and the kind who doubts the existence of cows).
So your objection, and Tom's after a fashion, is that you want to say that 'well, we can't have perfect knowledge of things outside of us, but we can have approximate knowledge' -- knowledge on a scale, as Tom put it. The problem is that doesn't get off the ground. Everything you think you know about the outside world is phenomenal (Kant is arguing). Every experience, every sensation, every fact you think you know is actually just a fact about your own internal thoughts...
Not so. The perceptions I get are evidence about the external world. "Direct" in the legal sense; "indirect" the way you say Kant's using it. The things I experience are consistent in such a way that they back each other up, and are evidence for each other. I see what looks like a fire; I feel the heat from it; I touch it and get burned by it; I hear and read about it. This is all evidence that such a thing as fire exists. It would be different if I lived in a world where I saw things that looked solid, but my hand passed through them when I tried to touch them; or things that looked just like fire sometimes burned and sometimes didn't for no apparent reason; or I felt my skin was crawling with bugs but everyone else said I was suffering from delusional parasitosis. Those situations would be evidence that my senses were not reliable and that the knowledge I got from them was not so useful.

I'm not in a world like that. The evidence I get runs the other way - within limits.[1] Yes it is possible that this is all a great self-consistent illusion of the brain-in-vat variety. But, I have to say, so what? What difference does this make to anything I have to do? Why paralyze myself by claiming, "This evidence isn't perfect; it could be all wrong without my knowing, so I'll declare all my knowledge completely nonexistent, without value, not knowledge at all?" It's the only evidence I've got and I'll take it as far as it seems to get me. Any map that I carry is not the same thing as the land it represents. It's only an indirect representation, and by its nature imperfect. Do I throw it away? Declare it's no map at all?

Stephen Donaldson's Thomas Covenant - the protagonist of a few trilogies he wrote - was in a similar situation. He kept being transported to a fantasy world, which included a villain named Lord Foul, and (at least in the first two trilogies, which I read) never seemed certain whether he was really visiting another world or dreaming the whole thing. But, he figured, whichever way it was - he was going to fight Lord Foul. I don't understand any other approach.

[1] I'm partly color-blind and accept there are things you can see that I can't; I can be fooled by optical illusions and know that what I think I'm seeing isn't always quite right.

Weren't We Just Talking About This?

A biologist writes about culture in terms that will seem quite familiar:
RICH AND SEEMINGLY BOUNDLESS as the creative arts seem to be, each is filtered through the narrow biological channels of human cognition. Our sensory world, what we can learn unaided about reality external to our bodies, is pitifully small.
He gives a litany of examples, to which we might add:  and all that's without the problems of apperception.

More on Bullies, From President Obama

Well, then-aspiring-author Obama, rather.  Dr. Althouse is reading more around the dog-eating tract that has gotten so much attention.
The man pulled the blade across the bird’s neck in a single smooth motion. Blood shot out in a long, crimson ribbon. The man stood up, holding the bird far away from his body, and suddenly tossed it high into the air. It landed with a thud, then struggled to its feet, its head lolling grotesquely against its side, its legs pumping wildly in a wide, wobbly circle. I watched as the circle grew smaller, the blood trickling down to a gurgle, until finally the bird collapsed, lifeless....  Later, lying alone beneath a mosquito net canopy, I listened to the crickets chirp under the moonlight and remembered the last twitch of life that I’d witnessed a few hours before. I could barely believe my good fortune.
Why does the boy — as remembered by the man — connect the killing of the bird to his own good fortune? Is it some elemental realization that simply to be alive is amazing, the bird being dead? Or is he excited to be in this new place with lots of thrilling new activities like beheading a bird and shortly thereafter eating it? Or is it the connection to the father figure, who's so eager to show the boy what life is really about and so easily overcomes the reticence of the mother? The next thing that happens in the book is that Lolo teaches him how to deal with bullies: Don't cry over the lump where he hit you with a rock; learn boxing. Lolo buys boxing gloves for him and teaches him to "keep moving, but always stay low—don’t give them a target." Good advice!  And it's on the very next page that Lolo teaches him to eat dog (and snake) meat.... The point is:  Life was a big adventure. And meat was part of the adventure — meat from real animals that lived and died.
Doubtless the President would agree with our advice, then:  the way to deal with bullies is to teach them to fear your own strength, not to whine, and to learn to fight smarter and better than they do.

Also, never to believe in clean hands.

Not your grandfather's DNA

If some science fiction writer doesn't pick up on this idea for a story about really alien forms of life, he's missing a good bet.

All life we know of on Earth depends on RNA or DNA, the long, ladder-like molecules that hold the sequences of three-letter words (each spelled with the four-base alphabet A-G-C-T) that serve as code for the 20 amino-acid building blocks of our proteins.  No one knows how such a code developed in the first place, or why all earthly life uses essentially the same code.  No one knows why all life strings the code along a ladder built of the sugar called ribose (the R in RNA) or its slightly altered cousin, deoxyribose (the D in DNA).  Was it just the structure that fell into place first, like the QWERTY keyboard, and everyone kept using it from then on?  There's no obvious reason why the ladder couldn't be built of other sugars.  For that matter, there's no obvious reason why the alphabet employed by our genetic code for protein synthesis couldn't choose other letters from the unknown number of potential nucleotide bases, of which our familiar A, G, C, and T make up only four (five, if you count the U that substitutes for T in RNA).  Further, there's no obvious reason why the code should limit itself to three-letter words and a resulting vocabulary of 4 to the third power, or 64 words.

Obviously a 64-word vocabulary is sufficient to spell out some amazing complexity.  There's no limit in principle to the sentences you can form with 64 words.  You don't even need four letters to spell a lot of words, as binary computers attest.  The point is, the code our DNA uses is not the only way to skin a cat.  For instance, the amino acids that line up like pearls on a string to form our long-chain protein molecules number 22 in total, but of those only 20 are assigned a three-letter "code word" in our DNA.  (The other two get added in by separate enzymes at a later stage in the protein synthesis process not directly controlled by a DNA transcription.)  What's more, a number of other amino acids have specialized uses other than as beads in the protein string, such as for neurotransmitters or steps in metabolic pathways, but they are not assigned a three-letter DNA code word.  So our genetic code has more words than it needs for the amino acids we use, but it uses up the redundancy in synonyms for some of them, while having no word for others.
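For anyone who wants to check that arithmetic, here is a minimal back-of-the-envelope sketch in plain Python. It uses only the numbers already given above -- a four-letter alphabet, three-letter words, 20 directly coded amino acids out of 22 -- and nothing from the linked article:

from itertools import product

def vocabulary(alphabet, word_length):
    # Every possible fixed-length "word" that can be spelled from a given alphabet.
    return ["".join(letters) for letters in product(alphabet, repeat=word_length)]

codons = vocabulary("AGCT", 3)
print(len(codons))        # 64 three-letter words from a four-letter alphabet
print(len(codons) - 20)   # 44 words to spare once 20 amino acids each get one -- the redundancy spent on synonyms

# A binary alphabet would also suffice, just with longer words:
# the shortest word length L with 2**L >= 22 is 5 (32 possible words).
L = 1
while 2 ** L < 22:
    L += 1
print(L, 2 ** L)          # 5 32

So the counting alone doesn't force a four-letter alphabet or three-letter words; it just has to yield at least as many words as there are things to name.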

So, back to the article I linked to above.  The guys who fool around with this stuff are beginning to synthesize genetic molecules they call "XNA." These are still identifiable as nucleic acids, using a sugar and a phosphate for the ladder backbone and the familiar bases A, G, C, and T for the rungs, but they use different sugars from the usual ribose or deoxyribose.  Some of the alternative sugars turn out to be more structurally sound, standing up unusually well, for instance, under the stress of voracious enzymes and extreme pH levels.

Intrepid experimenters are even adding a couple of new bases to the usual A-G-C-T quartet, thus vastly expanding the code's vocabulary.  I'll be very interested to learn how (and if) the surrounding cell mechanism learns to "read" the new words.  I've never quite been able to understand even how the old words are read.  Some sources I've read suggest that the shapes of the A-G-C-T code words are in some way physical cookie-cutter templates for the corresponding amino acids, but my impression is that that part is not well understood, and in any event I certainly don't understand it.  It's one of my favorite mysteries.
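Just to put a number on "vastly expanding": the same counting exercise as above shows how fast the vocabulary grows with each added base. This is arithmetic only -- whether the cell's machinery can actually read the new words is exactly the mystery I just admitted to:

# Three-letter vocabulary as the base alphabet grows -- counting only, no biology:
for bases in (4, 5, 6):
    print(bases, "bases ->", bases ** 3, "possible three-letter words")
# 4 bases -> 64, 5 bases -> 125, 6 bases -> 216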

All this work still sticks pretty close to Earth-style genetic molecules, of course, using a sugar-phosphate ladder backbone with bases for rungs.  And yet sugars surely aren't the only way to construct a ladder, nor ladders the only possible structure on which to string a series of letters, nor a linear string of letters the only way to express and preserve a code.  What works here needn't be what works best under different conditions.  So I'm really curious to see how experiments in synthetic genetics come out.  Because of my abiding interest in the origins of life, I'd love to find out more about how ordinary molecules could possibly have developed into active metabolism from dead-end equilibrium, and from there into replicating systems that take resources from the outside world and use energy to restructure them according to their own pattern.  If nothing else, I'd like to see a better understanding develop of what kind of proto-molecules could possibly have developed into RNA, which, as primitive as it may be, is still an extremely complex structure and very, very far removed from the kind of chemical gunk you can generate from experiments designed to mimic primordial conditions.

I often hear casual statements to the effect that conditions on such-and-such a planet are "too extreme" to support life.  I don't find a statement like that meaningful.  Even on Earth in recent decades, we've found microbes thriving in extremely hot, cold, or poisonous conditions we'd confidently have called impossible until they were discovered.  The assumption in the 1950s that life originated in shallow seas is giving way to the notion that it may have started in deep-sea thermal vents or in venues sporting other extremes of heat and pressure.  We'd have to know considerably more about how life originated here before we could make any sensible statements about what it needs to get started universally, or about what sorts of forms life might take besides the ones we're used to.

Bullies

Bookworm has a post up about bullying.  I found myself trying to recall how it was handled when I was a kid, but honestly I can't remember a single example.  Did I grow up in some kind of pacifist's paradise?  Now and then kids would be mean to other kids, ostracizing them, forming nasty little cliques, but I can't remember anyone being beaten up or physically terrorized.  Sometimes the outdoor play in the neighborhood got a little rough as kids experimented with projectiles and informal combat, but it seemed to be equally rough on everyone who got in the way, not directed at any special scapegoats or victims.

Did any of you grow up in similarly bully-free schools or neighborhoods?  If you grew up among the bullies I read about all the time, did people fight back?  Get the adults involved?

The Unity of Consciousness, Part II

In the first part, we discussed a problem about how we could know the world, which is related to another problem about how we can share knowledge (and therefore test its validity).  Let's take a step backward to see how the problem arose.  As mentioned, Kant's model came to him while he was crafting a response to Hume.  Hume denied the Aristotelian model of knowledge, which had underlain scholarship for hundreds of years.  Kant's problem was that Hume's attack was a challenge to the doctrine of cause and effect; but it also challenged the Aristotelian concept of what it meant to have knowledge.  We're going to look at the Aristotelian model that Hume was challenging.

What we finally came to in our last discussion was an idea (from Tom) that knowledge isn't an internal mental state -- rather, it is a kind of relationship between you and the thing you know about.  There's a contemporary school of philosophy that believes just that; but it is also true of the ancient position.

As Aristotle explains in De Anima and elsewhere, knowledge comes to be in us via a process that starts when we encounter the unknown thing.  First we must perceive the thing through our senses.  Either the sense itself or (in cases where more than one sense is involved) our "common sense" will present us with an image of the thing in our minds.  This image in our minds is very similar to what Kant was calling our representation, but for Aristotelians it is not knowledge.  Knowledge comes after we use our imagination:
To the thinking soul images serve as if they were contents of perception (and when it asserts or denies them to be good or bad it avoids or pursues them)....  The faculty of thinking then thinks the forms in the images, and as in the former case what is to be pursued or avoided is marked out for it, so where there is no sensation and it is engaged upon the images it is moved to pursuit or avoidance. E.g. perceiving by sense that the beacon is fire, it recognizes in virtue of the general faculty of sense that it signifies an enemy, because it sees it moving; but sometimes by means of the images or thoughts which are within the soul, just as if it were seeing, it calculates and deliberates what is to come by reference to what is present; and when it makes a pronouncement, as in the case of sensation it pronounces the object to be pleasant or painful, in this case it avoids or pursues and so generally in cases of action. 
In other words, we take our initial image and use our imagination to add or subtract qualities.  In this process, we sort out what it is that makes the thing that kind of thing -- the purpose, or function, which Aristotle calls the 'final cause.'  The beacon can be lit or not; and in sorting out the difference we learn what it is that makes it a beacon and not just a fire (i.e., that it is lit only when the enemy is coming; and thus its final cause is to warn us of invasions).  A chair can have two legs or three or four; or it can be blue or red.  None of these things causes it to stop being a chair.  However, if it is too small, or broken, it cannot be a chair (though it might, in the first case, be a toy chair).  A bird can be bigger or smaller (and even flightless!), but it serves a purpose (its own purpose, that is:  it sustains itself as a bird, and is involved in the production of more birds of that type).

At this point, we have knowledge.  In Aristotle's terms, the final cause is normally also the formal cause -- that is, it is the form of the thing.  The form of the chair or the bird comes to be in our minds.  That is real knowledge, without mistake:  we possess the form.

There are a couple of problems with this approach.  It will jump out that Aristotle is using at least one and possibly two invisibles of the type that the West has come to fear since Ockham.  "Form" isn't visible except when expressed in matter; the form in our minds is visible only as an image in our minds.  Likewise, Aristotle puts all this down to the working of the soul.  It seems like we could simply say that he's using "soul" where we would use "mind," but that's not right:  the soul turns out to be another form.  In fact it is our form, the organizing principle that makes us who we are and gives us our purpose (which, for Aristotle, is to seek understanding through rational activity; but you can take the more pedestrian view that our purpose, as with any animal, is merely to sustain ourselves and produce others like us).

The other problem is that Aristotle has a difficulty with how the form could come to be in our minds.  In the Physics, he gives an account in which any sort of motion is a movement of a thing from potential to actual (or a falling away:  a house can move away from being a house by collapsing, so that it is again only a potential house).

So if the form comes to be in our minds, it must have already existed there potentially.  That's a very interesting claim, but it is a claim that makes sense of the idea that there is a relationship between us and the world.  It's a much brighter picture than that which comes from Kant, because we really have knowledge -- the actual form of the actual things -- and it makes sense that we can convey that knowledge to others.

But then you realize that this means that all forms must exist in our minds potentially -- how could that be the case?  (The claim is not as shocking as it sounds at first:  if you think it through, you realize that it really must be true that, if we can have knowledge of X today, we must have had the potential to know X yesterday.  Thus, it follows that you now potentially know everything that you could actually know.)  It makes a kind of sense on something like an externalist picture:  we are part of the world, not separate from it, and thus we are related to the world in certain ways.  One of those ways could be having a mind shaped for knowledge of the world.

There is another problem, though, which is that we can also obtain knowledge through contemplation alone:  for example, we can come to knowledge of mathematical truths simply by thinking.  We are never encountering an actual form in an actual thing; yet we are coming to knowledge all the same.  That means not only that we must have the potential for the knowledge in our minds, but that we need an account of where the actual form is that we are grasping.

Aristotle's solution is to posit an "Active Intellect," which is to say a kind of universal consciousness in which all human minds participate.  This is a surprising solution, very much unlike Aristotle -- it's almost Platonic, and very similar to what the later neoplatonists will suggest.  This Active Intellect contains all the forms in an actual way, and thus this explains how our minds can obtain knowledge through contemplation alone.

The modern urge is to do away with "forms" as invisible or mystical, but remember what forms are:  they're organizing principles that structure matter in a particular way.  These things certainly exist:  this is what DNA does, for example; or, if you like, the difference between hydrogen and helium is the way in which their matter is ordered and structured.  So forms are real enough; and they do exist in an actual way, and come to be in our minds when we grasp them.

Here the problem is the opposite one we had before.  There are large parts of this picture that really work, and are highly satisfying; but there remain some troubles we have to sort out.  Let's stop here and talk it through.

Continuing scandal dogs embattled Secret Service

This time they actually let someone get hurt. Investigations are underway to determine whether alcohol and underage hookers were involved. Speaking of which, I'm unable to give this image proper attribution because my husband got it off of some wargamers' site:

A Couple from Hoyt Axton

Not a well-known name these days, but a man with a strong voice.





The Kingston Trio did some of his pieces.

For The Women Who Love Us:

Waylon Jennings has a famous song about a good-hearted woman, who loves a man who may not be good enough for her.  What isn't as well known is that he has another song on the subject, almost to the same tune but with a slower tempo, and a more intimate tone.



All of us who have been long from home will understand:  and all of you, whose men have long been gone.

Setting the Bar:

It is very rare that a man should be at once under consideration for both the Medal of Honor, and sainthood.

There seems to be only one other example, which I owe to Deltabravo's posting in the comments of BLACKFIVE.  Even looking outside the United States, to honors on a par with the Medal of Honor, there are perishing few.  Joan of Arc, Alfred the Great -- a saint only to some Catholics -- perhaps Olaf or Edwin of Northumbria, perhaps St. George, and not many others.

It's a rare company.

On a Father's Love

We didn't say anything here about the infamous controversy of Ms. Samantha Brick, which probably most of you noticed a few weeks ago (I would guess so, since it pervaded even the parts of the internet that I normally visit). There wasn't much to say about it except that most of the negative reactions were unjustified, since no amount of inflated self-esteem could account for the drinks regularly bought for her and the other attentions that generally do accompany beautiful women. However she might have appeared to the multitude who wrote to insult her, to those men at those times she plainly was a joy, and her presence an honor to which they wanted to pay tribute.

She has written a followup piece, though, that probably deserves comment. It is about her father, and what his constant love did for her.

 This piece, far more than the other, is a thing worth conveying to all who might hear it.

A Moment of Unity:

I don't think we've mentioned Atrios here for most of the decade he's been blogging -- looks like once in 2003 and once in 2005 -- but he was significant to the left side of the blogosphere at the beginning.  He's celebrating his ten year anniversary this week and, a few minor disagreements on tone aside, it's hard to take issue with him on this point.  If there has been a less insightful and more overrated writer in the major media than Tom Friedman, I can't think who he would be.

Happy birthday, Duncan Black.

While we're celebrating this moment of comity, is there anything at all to object to in the following segment?  The Breitbart boys are trying to blow it up, but generally I think there's a lot of sound advice in it.  You can put that down to broken clocks and twice a day, or to whatever else you like, but all things considered there's a serious issue at work here.



I saw Richard Cohen say today, of Paul Ryan's budget, that it was:
...an Ayn Randish document whose great virtue is a terrible honesty. (We are indeed going broke.)
If you think through the consequences of that, Mr. Farrakhan's warning has a different sound.  If we have reached the days when the bipartisan blinders can't keep out that fact any longer -- that fact and all the consequences for our society that it portends -- he might not be too far wrong.

Modest WaPo Dislikes Spotlight, Wide Circulation

You know how it is. You just want to publish a newspaper to a few like-minded citizens, without all the fuss that comes from robust circulation numbers and other unwelcome attention. Then your news desk gets a story suggesting that the Affordable Care Act will increase the budget deficit by $340 billion (or even as much as $527 billion) instead of reducing it by $132 billion, as the President previously had claimed. Your editors get together and decide that it's sort of news, but it doesn't deserve to be highlighted; you put it below the fold on page A3. You make sure it is "prominent on the home page for only a short time."

But the unruly public refuses to go along with your expert judgment of the story's unimportance. Brash bloggers turn on the high beams. Before you know it, Drudge links to it under the headline "ObamaCare Explodes Deficit." (You hate it when they call it "ObamaCare." It's "derisive.") Next thing you know, conservative and liberal bloggers are abuzz with citations to your story and arguments about how the budget impact should be calculated.

The wise old heads at your publicity-shy news desk all recognize a familiar futile attempt by the unwashed masses to determine truth and falsehood. With their superior sophistication, the news desk professionals grasp that
The truth is that every complex law change, every annual federal budget, is a risk. They’re all based on assumptions and forecasts that may or may not come true. And when they don’t, Congress and the president have to adjust.
Just because someone points out that a vast budget impact, which was widely reported and heavily relied on in the process of getting the law passed, was transparently based on double-counting (the Medicare "doc fix"), and is off by the better part of a trillion dollars, doesn't mean it's news. It's just all part of the inevitable world of forecasts and assumptions that may not come true. Happens all the time. Nothing to see here. Move along.

But it's too late. The Washington Post's ombudsman sadly acknowledges that the paper gets a "frisson of pleasure" from the attention that a hot story attracts -- but they're above all that. They're more interested, apparently, in pushing their favorite agenda. So they really wish people would let them give unfavorable stories a quick, decent burial below the fold on page A3 after running for an eye-blink.

Something's really got to be done about making the new media shut up.

Colombian Coup

Not that kind, the PR variety.

The Unity of Consciousness, Part I

Joseph W. asked for a separate and new thread to discuss this subject, which arose in our discussion of problems of creation.

Even a summary of this problem -- indeed, even a book-length summary -- would necessarily compress a massive amount of careful argument.  What I am hoping to provide here is more like a sketch of a summary of the problem; to tackle the problem with the seriousness it deserves is the work of years, not a few hours.  The basic problem is twofold:  how can I have knowledge about the world, and how can I communicate regarding knowledge of the world with other minds in a useful way?

Note that this is different from the question of "how/why did communication between intelligent beings arise?"  One can accept an evolutionary response to that kind of question:  it arose because, when 'tried' by animals who happened into it, it proved valuable.  This is a different question, which is about how (and indeed whether) it is possible for such a thing to be at all.  If evolutionary utility were the only criterion, why do animals not teleport themselves or engage in other sorts of fantastic behavior?  They do not because they cannot.  They do this because they can:  but why can they?  It's a very difficult problem.

Let's start with Kant's idea of transcendental unity of apperception.  He was responding to some difficulties raised by Hume -- Hume is still today a powerful source for difficulties -- about how the mind can work.  Kant argues that when we take our sense perceptions -- sight, hearing, touch, and so forth -- we must mentally mold our various senses into a single object that can serve as an object of thought.  This is called representation (that is, we are re-presenting the sense data as an object of thought rather than as data per se).  It's not just the object that has to be represented as a whole, though:  we must also represent all of our disparate experiences as a kind of unity, the unity we take to be ourselves (for what are we if not the sum total of our experiences?).

One consequence of this approach is that we end up being unable to have any knowledge at all about anything in the world.  Those things are not what our minds represent to us:  the unity imposed upon them is artificial, for one thing.  Thus, what we have "knowledge" about is only our representations, not the things themselves.  Kant calls these things "noumena" and our representations "phenomena," and argues that noumena are completely unknowable by human beings.

That's going to be a problem for communication about the world -- for science, say.  We think that we are engaged in learning about the world through the scientific method, which involves experiments, measurements, and then communication of our results to see if others can reproduce them.  If Kant is right, no part of that approach works the way we think it does.  Our experiments are not of the world, but of mental phenomena that are different from the world in ways we not only cannot know but cannot conceive.  Our measurements are likewise.  Our theories about the meaning of these results are thus doubly disconnected from reality, because they are theories about theories about what things are really like.  That's problematic enough, but now I need to convey them to you for you to try to reproduce.

You've got your own set of representations.  Since neither you nor I have access to the things in the world, but only our individually constructed representations, we have absolutely no way of knowing if we are talking about the same objects.  When I communicate my ideas to you, what I think I'm saying to you is being filtered as sound impulses and then re-presented by your mind to you according to your own unity of apperception:  thus, I have no idea what you're hearing when I tell you something.

We might be satisfied to say, "Well, my own unity will represent all input in a coherent way, so while I don't really know if you're agreeing with me or not, it will appear to me that we agree on the basic facts."  That would make sense, but it doesn't explain why science appears to give us increasing new capacities to do physical things:  we can work together to produce rockets that fly to the moon, for example.  That's a capacity that suggests that we really are cooperating:  there's nothing in our pre-existing unity that should suggest it.  It is a capacity that arises from this cooperation, which suggests that the cooperation is real.

We might say, "Well, let's stick with the evolutionary explanation.  Our brain structures are similar enough that we can 'understand' each other to a certain degree because similar structures produce similar representations."  Even if this were fully adequate, which it isn't, it doesn't make sense of the problem of why we can understand things that aren't like us.  I usually use horses as a model for examining the question of a unitary order of reason across species (an idea also rooted in Kant, via Sebastian Rödl's explorations); but we have a similar capacity with animals of any kind.  We seem to be able to distinguish between animals that are reacting to a pre-programmed instinct versus those which seem to have a capacity to reason and learn, for example, even if we don't share much evolutionary history with them.

The explanation is also inadequate because it simply doesn't answer the depth of the problem.  Kant's argument gives us a world in which we can have no knowledge whatsoever of the reality around us, including the minds of others.  To argue that our brain structures are 'mostly similar' is thus to argue facts not in evidence.  We can't know any facts about the structures of our brains, only about the phenomena of the structures of our brains -- and these are likely being represented according to a pre-existing internal order that makes them accord to some degree with what we expect from them.

It also just doesn't make sense to leap from "it is impossible to have any knowledge whatsoever about the things themselves" to "nevertheless, we seem to do a pretty good job."  You can't jump from "impossible" to "a pretty good capacity" in the same way that you can't build a line out of points.  The points have no extension, so no number of them added together will give you an extended line.  Likewise, no amount of phenomena can be combined into a noumenon:  no phenomenon contains any noumenal content.

This has led people to question, well, everything:  it has led otherwise serious people to wander around speculating about Zombies (which set of arguments, by the way Joe, is very similar to the ones you cited to me re: whether AIs would have real consciousness); or mad scientists keeping our brains in a vat.

Or it has led people -- particularly practical-minded people -- simply to ignore the problem and pretend it doesn't exist.  This science stuff seems to work; why worry too much about why it works?

I suppose I will stop here, and call this "part one," because there remains a great deal to be said about what I think the right way to resolve the problem happens to be.  For now, though, maybe we should stop and take a moment to appreciate the problem.

Economics & Medieval Norwegian Coins

Studies of medieval coins in Norway suggest a more complex economy than is commonly pictured:
The trick is in the coins’ metallic body – a mix of copper and silver that makes them much less sturdy than coins from present times. Medieval coins were easily frayed by everyday use, and by studying the degree of this wear and tear, Gullbekk was able to come up with rough estimates of how many hands the coins have seen in their lifetime. 
Gullbekk explains that if one knows the time period certain coins were used, one can make a well-informed guess of the coin’s circulation velocity in the years it was used as currency.
I bet this trick would work for the period of Anglo-Saxon coins in England, as the government was apparently successful in forcing everyone to turn in their coins to be melted down and re-struck periodically.  Thus, when we do find coins from a particular period, it's in hoards whose age can be estimated fairly precisely.  You could take those coins and have a very reasonable estimate of their life in circulation.  (The only problem is the relative rarity of such coins, since -- as mentioned -- the government was fairly successful at collecting up old coins and melting them down.)
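A minimal sketch of the sort of estimate this seems to involve, with entirely made-up numbers -- the wear-per-handling rate and the years in circulation below are my assumptions for illustration, not anything from Gullbekk's study:

def circulation_velocity(total_wear, wear_per_handling, years_in_circulation):
    # Rough "hands per year" estimate: cumulative wear implies a number of handlings,
    # and dividing by the coin's working life gives how often it changed hands.
    handlings = total_wear / wear_per_handling
    return handlings / years_in_circulation

# Hypothetical: a coin that lost 8% of its weight, at roughly 0.01% per handling,
# over an estimated 40 years between striking and burial in a datable hoard:
print(circulation_velocity(0.08, 0.0001, 40))   # about 20 handlings a year

The whole estimate stands or falls on knowing the wear-per-handling rate and the coin's working life, which is why the well-dated Anglo-Saxon hoards would make such good material for it.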

H/t:  Medieval News.

A Wise Notion

The Guardian describes Dr. Terry Eagleton's new position on literary theory.  His old position was to declare that there was no quality or set of qualities that could define "literature."
Eagleton has not reneged on scepticism: he is just sceptical about it.
That strikes me as very wise.  There is nothing that should more stimulate us to be skeptical than skepticism.

Dangerous choices

Here's something I like to see: states trying something new with the public schools on a large enough scale that we might be able to draw some conclusions. Louisiana Governor Bobby Jindal handily won a second term with a campaign that leaned heavily on education issues. He put together impressive bipartisan support for an education reform bill that will put a lot more choice in parents' hands, using vouchers, additional charter schools, and tenure reform. These reforms expand on a tiny trend begun as a crisis response in the wake of Katrina:
Only in New Orleans, where devastation from levee breaches during Hurricane Katrina led to an extreme makeover of schools, have results been dramatic. Although there were bright spots, city schools as a whole were among the worst-performing in the state before the disaster.

Since the state took over most schools post-Katrina, that is changing. Recovery School District students, including charter and traditional campuses, posted their fourth consecutive year of improvement last year. The proportion of students scoring at grade level or above grew to 48 percent in 2011 ­-- more than double the percentage in 2007.

That progress has come as most city schools became public charter schools, a concept that the governor's legislation would expand statewide.

Some opponents of the reform legislation have tried to make charter schools seem like a questionable experiment and point to the failure of some schools. But there are highly successful, stable charter schools in New Orleans. And the fact that some unsuccessful schools have been closed down is a sign that the system is working.
Grim and I sometimes argue about the value of the free market. He is skeptical of its tendency to monetize values that should be beyond monetization. I in turn am drawn to its way of putting choices in the hands of the recipients of goods and services. The advantage of competition is not that someone wins and someone loses. The advantage is that customers can gravitate to what succeeds and abandon what does not. The "losers" in this contest aren't doomed to bleak lives in hovels after their customers withdraw their resources and support. They can always adopt the winning strategies if they like, and quit losing. What they can't do is force their customers to keep coming back to hear a new set of excuses for failure. Parents don't have to agree or disagree with any of the excuses. They can simply go to another school, which is getting better results with a different approach.

Does this approach protect us against parents who make poor choices? Of course not. Making up for bad parents is beyond the capacity of a public school system, as failing schools are always telling us.

Scale

Don't miss this XKCD graphic about ocean depths.

Engagement

From a not particularly snide N.Y. Times story about the famously combative Andrew Breitbart, an anecdote from his wife at his funeral: “I came home one day to our first apartment to find a couple of Jehovah’s Witnesses,” she wrote, “trying to wrap up the conversation and get out.”

Stuffed with Stuff

Bookworm is talking this week about being oppressed with stuff. I have a few hoarding genes, nothing too extreme. Not like my poor aunt, who barely can stand to let me throw away a tissue when I visit the tiny, crowded hospital room where she still languishes, now quite close to the end, and who still obsesses over the possessions she had to leave behind when she moved into an assisted-living facility ten years ago. Certainly nothing like the sad souls who immure themselves into houses jammed to the ceiling in every room. I'm more a case of too much laziness to sort through and dispose of what I don't need or use. I do notice, though, that fatal internal message that says, "Don't throw this out, even though you haven't used it in five years. Someday you may want it," which is the siren song of the hoarder.

When I was young and unencumbered by possessions, I used to love it when relatives decided to shed their excess stuff. I had nothing but cheap, utilitarian, boring new stuff and coveted their funky old objects. When I could afford it, I would shop at antique stores for the same kind of thing. Every middle-aged or old person with too much stuff should have a niece like me, to accept and cherish their wonderful old things. My bedridden aunt doesn't so much miss possessing her old things as worry about them, as if they were puppies that need to be adopted into a loving and appreciative home. The problem is, these concerns extend not only to nice old furniture but to boxes of empty jelly jars. I can't even clear out her greeting cards unless I agree to take them to some church group that plans to cut them up and use them in crafts. Part of this is Depression-era thrift, of course, but the rest is just anxiety and alienation. She's about to cast off a lot more than her stuff. She's going through a door she can approach only empty-handed.

My taller half is considerably more orderly than I, and gradually has converted me to an appreciation of unclutter. Not that I achieve much unclutter, but now I do at least aspire to it and occasionally take lurching, partial steps in that direction. We managed to scrape off quite a few barnacles when we moved here six years ago. It's time for another wave. Anything that's still useful needs to go to the local thrift store, and the rest to a landfill.

Maybe I'll find my missing dulcimer.