Thursday, May 15, 2008

Useful link of the day

How to secure your Windows computer and protect your privacy online.

A very useful compendium aimed at the non-expert, describing the most common threats to security and privacy, and listing a wide variety of software tools, most of them free, for avoiding and fixing problems.

Good stuff.

WTF is he blathering about? Does he even know?

David Brooks has a meandering, self-contradictory column up at the Times today that makes me wonder if he's capable of rational thought at all.

Here we go:

To these self-confident researchers, the idea that the spirit might exist apart from the body is just ridiculous. Instead, everything arises from atoms. Genes shape temperament. Brain chemicals shape behavior. Assemblies of neurons create consciousness. Free will is an illusion. Human beings are “hard-wired” to do this or that. Religion is an accident.

In this materialist view, people perceive God’s existence because their brains have evolved to confabulate belief systems. You put a magnetic helmet around their heads and they will begin to think they are having a spiritual epiphany. If they suffer from temporal lobe epilepsy, they will show signs of hyperreligiosity, an overexcitement of the brain tissue that leads sufferers to believe they are conversing with God.

That's right. Back in the 60's, those awful materialists suggested that religious experience may be the result of specific states of the brain, that it can be mapped and measured materially. Oh, how silly! How foolish those arrogant materialists were!

Fast forward a few years....

The brain seems less like a cold machine. It does not operate like a computer. Instead, meaning, belief and consciousness seem to emerge mysteriously from idiosyncratic networks of neural firings. Those squishy things called emotions play a gigantic role in all forms of thinking. Love is vital to brain development.

Researchers now spend a lot of time trying to understand universal moral intuitions. Genes are not merely selfish, it appears. Instead, people seem to have deep instincts for fairness, empathy and attachment.

Scientists have more respect for elevated spiritual states. Andrew Newberg of the University of Pennsylvania has shown that transcendent experiences can actually be identified and measured in the brain (people experience a decrease in activity in the parietal lobe, which orients us in space). The mind seems to have the ability to transcend itself and merge with a larger presence that feels more real.

Let's work through this. The brain does not work like a machine, but consciousness emerges from neuron firings. Emotions play a role in thinking (which says nothing about the physical mechanism underlying either). Genes are not merely selfish--well, I'm not sure what that bit of anthropomorphism is supposed to mean, other than the suggestion that natural selection may have selected for those traits in social species such as ourselves, due to a survival advantage gained by them. Nothing mystical there.

And transcendent experience can be identified and measured in the parietal lobe. Sounds pretty materialist to me. His statement that the mind has the ability to "transcend itself and merge with a larger presence that feels more real" describes the subjective state of feeling merged with the universe, with God, whatever, that is frequently described by meditative practitioners of various faiths. In other words, the mystical experience can be linked to specific brain activity in specific brain regions. Again, very materialist. The subjective feeling of something larger being there doesn't mean there is something there, but he overlooks that.

So, to summarize: Back in the 60's, the bad materialists said that religious experience was probably a function of brain activity. Today, the good scientists are finding that it is, and describing it objectively. This is a repudiation of the earlier view.... how?

But wait, it gets better.

In unexpected ways, science and mysticism are joining hands and reinforcing each other.
In his mind, maybe.

Orthodox believers are going to have to defend particular doctrines and particular biblical teachings. They’re going to have to defend the idea of a personal God, and explain why specific theologies are true guides for behavior day to day. I’m not qualified to take sides, believe me.
The last sentence, at least, is correct.

The "profound insights" Brooks cites as evidence that science is "proving" that a quasi-Buddhist mysticism is objective truth are a rehash of second-year philosophy:

First, the self is not a fixed entity but a dynamic process of relationships. Second, underneath the patina of different religions, people around the world have common moral intuitions. Third, people are equipped to experience the sacred, to have moments of elevated experience when they transcend boundaries and overflow with love. Fourth, God can best be conceived as the nature one experiences at those moments, the unknowable total of all there is.
The first has been known even to materialist social-psychologists for years. The second 'insight' is still a matter of some debate and not considered proven. Third, yes, people can feel something, they can experience something... and we know what parts of their brains do that. Which he railed against in the first few paragraphs, then trotted out as profound later on. The fourth? Well, define "God" however you want. That definition is one that's been bandied about. His point that there's no evidence for a personal, biblical, magic daddy in the sky is of course correct. But in an act of moral cowardice, he refuses to carry that thought through. Why should we believe in any god at all, even one that is more process than personality? Indeed, if we can show that the experience of god is just a function of brain activity, doesn't that suggest the lack of anything non-material undergirding it? Doesn't that reinforce the "radical atheism" he deplores so much?

Tuesday, May 13, 2008

Conversation with a friend

An email exchange with a friend of mine, regarding this article, and this reply. His comments are in Times Roman, mine in Arial.

A midwife for one's thoughts; another for whom one can provide the same service: Basis of a good conversation.


Here's my major beef with the article (and it's a point that the replier gets to as well, at least obliquely). The article writer says:

"Above all, these changes would require looking with fresh eyes on the landscape of academic disciplines, and noticing something surprising: The great wall dividing the two cultures of the sciences and humanities has no substance. We can walk right through it."

To which I say: funny that your idea about walking through the science/humanities wall involves turning the humanities into a science. I don't mean to sound all boundary-policing for the sake of boundary-policing here, but I do think the article is guilty of arguing for a merger of science and literary criticism where the exchange of methods seems to be entirely one way: from science to the humanities, never the other way. He makes a good case for WHY we should study literature (one I totally agree with), but he doesn't really say what -- if anything -- the study of literature might offer scientific study.

I'm not sure how much of it is just that that's not the point he's making right now; i.e., that's another article for another day. And he led off with the observation that hmmm, the sciences seem to be obtaining new knowledge, new insights into our nature and condition, and moving forward in a way that literary criticism isn't; what are they doing that we're not?

And going a bit beyond what he said... If literary criticism is barely surviving on its own, if it doesn't have much to offer itself, well...what DOES it have to offer scientific study? (Though you touch on some of this below.)

His basic point about how most lit. critics view the human brain is true too. This, I think, is the biggest current divide between humanities and the "hard" sciences. (Or at least it's what my neuroscientist friend Stephen and I end up screaming at each other about over drinks every few months . . . ) The sciences now favor an almost entirely biological account of human cognition/development/ability ("It's biological, it's genetic, it's physiological, etc."), while from at least the late '60s forward, English-speaking literary academics have generally favored a social constructionist approach. (The middle ground here would be the '70s feminist articulation of the difference between biological sex [do you have a penis or a vagina?] and socially constructed gender [Having a vagina means you must wear dresses and play with dolls . . . ], while the extreme of this position, an extreme which I myself am uncomfortable accepting, is Judith Butler's explosion of the sex/gender binary, where she argues that nothing exists before or outside of language and that EVERYTHING is gender, EVERYTHING is socially constructed and there is no biological "sex" that's not already caught up in culturally constructed meanings.)

If she's going to make that argument, then she has to explain why the vast majority of societies throughout history have associated having a penis with physical strength and aggression. Is a preference for blue rather than pink culturally constructed? Almost certainly. Does it therefore follow that NOTHING is biological? I, for one, am far from convinced.

Part of the reason for the biological emphasis is that with advances in technology, we can answer questions we couldn't even ask a few years ago. So yes, it seems like everywhere you turn around there are new biological insights. Look at what happened after Darwin published. His central idea was that species could develop and change over time, through a natural process of selection; that the environment could make the same kinds of decisions familiar to any livestock breeder; and the implication that all life descended from a common ancestor.

This was a revolution in biology, and there was a rush to apply Darwin's ideas EVERYWHERE, including places they didn't apply--thus Social Darwinism, etc. With time, the excesses were worked out, and scientists recognized that this application is valid, that one is just silly, and the other is a useful metaphor that shouldn't be taken too far.

We're currently in the early explosion stages of a biological revolution. There are things we've thought of as social that may turn out to have very strong biological components. There may be some things that go the other way. But we haven't mapped out the limits of the new knowledge yet.

(At one time, 'everyone knew' that ulcers were caused by stress....until the role of Helicobacter pylori, a bacterium that lives in the gut, was discovered. Stress still has an effect, but only because stress tends to weaken the immune system, and in America, stressed-out people are more likely to eat foods that H. pylori thrives on, and smoke too much, thus producing plenty of extra stomach acid to irritate already-irritated tissue, etc.) Likewise, 'evolutionary psychology' has given us some useful insights into how seemingly anti-survival traits such as self-sacrifice, altruism, generosity, etc., actually lead to better chances of group survival. It's also made some risible claims that can't be taken seriously, and aren't, even within the field.

And for me? Personally? I've been struggling most of this year with Butler's position, the full adoption of which is still very much in vogue in literature departments. I think Butler's wrong. Mostly because I don't think that my friend Stephen is an idiot. And I don't think that the advances or observations of Stephen's field are entirely bunk. That said, I am ENORMOUSLY suspicious of strict, biological (or genetic) essentialism. This gets us back to that other article that Sullivan linked to that we discussed recently, which you said (and I agreed) veered a little close to "some people are simply, inherently unfit to learn, so let's not let them go to college and oh isn't it funny how all the people who are fit to learn and go to college all look white and male, just like me."

I agree with you here. Yes, biology influences us, in ways we're not aware of and don't fully understand. That doesn't mean biology is destiny, though. Antidepressants affect the levels of serotonin in the brain. But SO DOES COGNITIVE-BEHAVIORAL THERAPY. Yes, ultimately it's all encoded in the physical matrix somehow (otherwise you're postulating a source of thought external to the body), but that doesn't mean the physical level is the most convenient (or ethical) place to intervene. Perhaps love can ultimately be explained in terms of biochemistry and neurons firing--but as anyone who's been in love can tell you, it's not *just* that.


I think biological essentialism (the body or someone's physiology or genetics being read as their unalterable, inescapable fate) is a bad, bad, bad, super scary thing. And me thinking that isn't based in namby-pamby, soft humanities pseudo-science. It's based in the empirical evidence of history. I mean, slavery comes to mind ("These dark skinned people have smaller brains and therefore smaller mental capacity, oh and they also have higher pain tolerance, so obviously they were meant for heavy labor . . . ") as does the Holocaust ("These people have the wrong eye color/hair texture/skin color/nose shape/whatever and are therefore deserving of elimination.")

The problem is, simple indicators have been used to stand in for other traits, or generalizations have been applied without regard to the possibility of exceptions. Suppose we do extensive research and find, lo and behold, that the IQ of the average black person really is 2 points lower than the average white IQ. Do we therefore stop giving scholarships to blacks, since "they're not as smart as whites?" Of course not. Group averages say nothing about individual talents or temperaments, and (as in this example) the magnitude of the difference is insignificant. The relationship between brain size and IQ is poorly understood, if it exists at all. Not only has there been an overreliance on science, there's been an overreliance on BAD science. In the long run, science is self-correcting--but in the short run, it can be misused as much as anything else.
And so science's hard turn to serious biological essentialism scares the fuck out of me. (As Eve Sedgwick, one of those evil postmodern namby-pamby literary scholars that the article bemoans, says: You ever notice that there are all these scientific studies to find the 'cause' of homosexuality? But none to find the 'cause' of heterosexuality? What else do we try to find the 'causes' of? Hmm . . . the common cold, liver cancer, leukemia, ALL THESE THINGS THAT ARE PROBLEMS, THAT WE WANT TO ERADICATE.)

Then explain, please, why so many psychologists (particularly those coming at it from an analytic viewpoint) working on the origins of homosexuality are themselves gay. Do they want to eradicate themselves? Is it the stereotype we used to joke about in social work, that you become a therapist because you want to figure yourself out? You're right to point out the effect of heteronormativity, that the default normal state of human beings is hetero and anything else is a deviation from that, so we should find out what causes that divergence. I'm not quite as willing to assume the motives are hostile, though the effects can be just as pernicious.
Or, to put it another way, I guess what bothered me about the article was its blind, uncomplicated faith in empiricism and the scientific method. And to make my case why that's a problem, I'll actually turn to a scientist, not a humanist. I'm thinking here of Stephen Jay Gould's THE MISMEASURE OF MAN, specifically the part where he goes back and re-does all of that 19th-century Philadelphia scientist's studies on the cranial capacity of various races, and finds out that the guy was wrong. That he fudged his data to make it fit the racist worldview that he wanted to argue for. And of course, Gould points out that he himself might be making mistakes that are fudging his own data to try and support a non-racist, multicultural worldview that's currently in vogue.

From the '50s through the '60s, ecologists & biologists looking at animal behavior theorized that aggression had a large survival payoff, that the most aggressive animals would tend to have the best survival characteristics. The field was, at the time, dominated by men. As more women came into the field, they started asking questions about cooperation as a survival strategy, about whether it might be equally valid or even superior in some cases. After all, many species show cooperative strategies: wolves hunt in packs, primates are very social, etc. And lo and behold, they found evidence to support that. The men who asked the questions earlier weren't stupid, weren't corrupt, weren't evil. Their worldview led them to ask certain questions, and as we both know, the way you ask the question has a large influence on what answer you find. Someone with different assumptions asked different questions...and found new results.

There are any number of studies from the 50's and before, comparing the various races & so forth, and finding that southern Europeans are flighty and undisciplined, Africans are a bit simple and childlike, etc., always with the amazing coincidence that the northern European or Scots-Irish (depending on who was doing the ranking) was the most advanced group, with all other groups inferior in some ways.

Today, of course, it's impolite to suggest that there might be ANY difference of ANY kind. I'm not sure which extreme is more annoying. (Is it just a coincidence, for example, that cultures in climates with a short growing season have a much greater sense of time urgency, of 'don't put it off, do it now,' than parts of the world with a more benign climate? That cultures where it's colder, so people are usually dressed in more layers, have a stronger nudity taboo? And what if it turns out that members of certain ethnic groups really are slightly more outgoing, or whatever, than some other groups. Explain how that means I should treat the person across from me any different, please.)

Over the long run, science is self-correcting. Mistakes get flushed out, re-examined, discarded. The same principle applies in the humanities as well. Case in point: The romanticized view of plantation life exemplified by "Gone with the Wind" was based largely on a scholarly historical study of the life of southern blacks before the Civil War. (The name of the study escapes me at the moment, but it was considered THE standard work in its day.) It wasn't a polemic, it was good, solid, primary-sourced scholarship. And it strongly indicated that most plantation slaves really didn't have it all that bad, and though there was a lot of hard work involved, they were essentially well treated.

Well. Some other historians looked at this later. And noticed that the work was indeed primary-sourced with original documents--almost all of them from slave holders or slave dealers. So they applied the scientific method and asked what other evidence there might be for that proposition, or what counter-evidence might exist. And in the process uncovered lots of other documentation suggesting the earlier view was wrong.

That was an application of the scientific method to the humanities, with good results. (More on this later.)

I guess what reading Gould taught me was EXTREME skepticism, a wariness of how our cultural programming creeps its way into even what appears to be objective, detached, rational analysis.

And THAT, I think, is what the humanities have to offer the sciences. Experience in looking at the non-technical assumptions, of looking at what's being asked and what's NOT being asked, about what's being taken for granted.

So do I think we need to abandon science or rationality or empiricism? No. Not at all. That way madness lies. Or at least that way George W. Bush and his administration lie. (Which is pretty much the same thing as madness, but that's another rant.) We've got to strive for empiricism and rationality . . . but we also need to be aware of the dangers of assuming that "empirically" staged experiments are perfect. Or that things conducted under the name of "science" are, unquestioningly, capital T "TRUTH."

The difference between science and fundamentalism is that science regards truth as contingent, provisional, mediated by evidence, and refutable by a single counterexample. As long as you remember that, you maintain enough humility to be careful in your conclusions and your statements about Ultimate Cosmic Truth.
I think we desperately need empiricism in the world right now, and literary studies could damn sure use more of it. But the article Sullivan linked to worried me because it seemed ready to worship at the altar of empiricism without acknowledging any of the valid pitfalls/complaints that some scholars in the humanities have raised about it.

Does that make any sense?

Your comments make perfect sense, and I think we agree more than we don't. Now, if I can get up my soapbox for a bit:

I stand by my earlier statement that one of the best preparations for graduate study in the humanities would be a standard first-semester calculus course. NOT because you're going to need to apply the definition of a derivative or prove the chain rule, but because it teaches rigorous thinking and analysis, the habits of thought that you'll need later on.

It's not unlike some of my comments when you've sent me your papers and I've looked them over, and I've had to remind myself that standards of proof are different in the humanities. And the article touched on that, or at least pointed toward it. In lit-crit, it's apparently OK to say, this is what I think, and here's an example, and here's an example, and here's an example. So having established that, let's move on to the next point.

In the sciences, of course, that wouldn't fly. Do I expect that anyone will analyze and comment on Foucault with mathematical rigor? Of course not, though I'd like to see them try. But there are questions the scientific method forces you to ask, that don't seem to get asked in lit-crit:

  • Are there other explanations for these observations?
  • Do these observations really imply what I think they do? Or could they also be interpreted to support a different hypothesis?
  • Are there counterexamples out there that imply something else? (This, I think, is the most important question lit-crit leaves unasked. My own thesis, just a lowly master's thesis at a middle-of-the-road university, spent about a third of its effort looking at possible counterexamples or other explanations and trying to rule them out.)

Math training drills into you, again and again, that a set of examples is all very nice, but one counterexample makes them all irrelevant, no matter how many you have or how convenient they are the rest of the time. It teaches you the difference between a theorem and a conjecture, well enough that you never ever confuse them again, and you understand how to tear apart an argument and check it for validity.

No, lit-crit will never ever be at the same level of rigor as mathematics, and it shouldn't try. But as long as it views logical rigor as a symptom of dead white european male hegemonism (I'm sure there are much bigger words to express the same idea more snottily), it's going to be stuck in its own little backwater.