No, not really, but I got your attention, yes? On the other hand, these are precisely the words used by PZ in a recent post, aimed at criticizing Michael De Dora’s observations about a recent debate in Knoxville, TN on the wording of a biology textbook.
Let me start with a full disclosure: Michael is a friend, and of course one of the contributors to this blog. But this post has little to do with that; it deals with the substance and the tone of PZ’s remarks, both of which are highly relevant to the quality of discourse within the atheist community (currently, pretty low), something I deeply care about.
First the form. PZ’s post reads like it was written by an intemperate teenager in the midst of a hormonal rage. Among other things, he calls De Dora “witless,” “wanker,” “wishy-washy,” and “sloppy-thinking”; he accuses Michael of engaging in “cowardly intellectual dishonesty” and of using a “quisling” approach. So that we are crystal clear on just how low these ad hominem (a logical fallacy!) attacks go, let me refresh your memory about the dictionary definitions of some of these terms:
Quisling = a traitor who collaborates with an enemy force occupying their country;
Wanker = a person who masturbates (used as a term of abuse);
Wishy-washy = feeble or insipid in quality or character, lacking strength or boldness;
Witless = foolish, stupid, to such an extent that one cannot think clearly or rationally.
If PZ thinks that this sort of language belongs within any thoughtful writing about rational discourse, he really needs to look up the dictionary definitions of rational, thoughtful and discourse. Then again, it is precisely this sort of theatrics that apparently makes him so popular, as nothing gets people’s attention on the internet so much as shouting as LOUDLY as possible, regardless of the vacuity of what one is actually saying.
And speaking of content, what was so witless, wanky, and wishy-washy about De Dora’s post? Oh, he dared question (very politely, and based on argument) one of the dogmas of the new atheism: that religious people (that’s about 90% of humanity, folks) ought (and I use the term in the moral sense) to be frontally assaulted and ridiculed at all costs, because after all, this is a war, and the goal is to vanquish the enemy, reason and principles be damned. Michael had simply noted that the recent controversy in Tennessee was a bit less clear cut than usual: while of course creationism doesn’t have a leg to stand on, and of course biology textbooks should teach evolution without apologies, De Dora also noted that using the word “myth” when the book refers to the biblical story of creation was an uncalled-for breach of the principle of separation of Church and State (if invoked in the context of a biology class in a public school). Therefore, on that narrow technical ground, and on that ground only, the creationist who complained had, in fact, a point.
Contrary to PZ’s invective, acknowledging this point is in no way a cowardly act of intellectual dishonesty. On the contrary, it is a paragon of intellectual honesty because one is able to maintain the nuance that is necessary in distinguishing positive science education from gratuitous religion bashing. (And please, do note that I’ve got plenty of credentials in the department of religion bashing, but I try to do it in what I consider the appropriate manner and context.)
James “the Amazing” Randi is an icon of skepticism. The man has done more — over a span of several decades — to further the cause of critical thinking and to expose flimflammery of all sorts than arguably anyone else in the world, ever. That is why I was struck with incredulity and sadness yesterday when I read Randi’s latest take on global warming. He begins by stating that, contrary to scientists’ own self-image as almost preternaturally objective human beings, “religious and other emotional convictions drive scientists, despite what they may think their motivations are.” Well, true, to a point. Many philosophers and sociologists of science have said that before (and documented it), but your baloney detector should go up to at least yellow alert when someone starts a commentary on global warming with that particular observation.
The following paragraph is perhaps one of the most astounding I have ever seen penned by a skeptic. It reads in part: “some 32,000 scientists, 9,000 of them PhDs, have signed The Petition Project statement proclaiming that Man is not necessarily the chief cause of warming, that the phenomenon may not exist at all, and that, in any case, warming would not be disastrous.”
Wow, Randi fell for the old “thousands of scientists are against science” trick! First off, I’d like to see the 32,000 signatures (there is no link from the essay). Second, last time I checked, in order to be a career scientist you have to have a PhD, so how come only 9,000 of the signatories did? Did the rest not manage to finish graduate school? But more importantly: were the 32,000 climate scientists? Because if not, then it doesn’t matter how many of them signed the petition. I can easily get thousands of medical doctors (are they “scientists”?) to sign a petition to the effect that evolution doesn’t occur, or an equivalent number of assorted PhDs to express doubts on quantum mechanics, and so on. Having a PhD in a particular field provides no expertise whatsoever in another field, and Randi, of all people, should have known this.
“History supplies us with many examples where scientists were just plain wrong about certain matters, but ultimately discovered the truth through continued research” continues the essay. Another logical fallacy. Yes, the history of science has documented many blunders made by scientists, which usually are redressed by the built-in self-correcting mechanisms of science itself. But to imply that therefore the idea of human-caused global warming is another of these mistakes is like saying “Van Gogh was a great artist and he died penniless; I am penniless, therefore I am a great artist.” It is a non sequitur.
I like Penn & Teller, the magicians and debunkers of pseudoscience and general inanity. I regularly use clips from their show in my critical reasoning class, despite cringing every time Penn indulges in his “fuck this” and “motherfucker that” exercise in free speech (it distracts the students from the real point, not to mention the always lurking possibility of an administrator asking me about the appropriateness of foul language in a philosophy class). Heck, I even recently went to Vegas to see them in person, had a photo taken with Teller, and managed to tell him (to his surprise) about how my students enjoy stimulating discussions triggered by the duo’s antics.
But as we have learned recently from the Atheist Alliance / Dawkins Foundation / Bill Maher fiasco, “skepticism” is sometimes too broad a label, as someone can be properly skeptical in politics but not about pseudoscience (Maher), while someone else may be great at debunking astrology and magnetic therapy, and yet also unable to shed some huge blinders when it comes to politically charged issues. The latter is, unfortunately, P&T’s case, as made excruciatingly clear by the 2008 (season 6) episode “Being Green” of Bullshit! I just watched it last night, and I found myself wanting to call up Penn to let loose a few expletives of my own. Fortunately, I don’t have his phone number.
P&T have been very good at showing that just because one is concerned about the environment it doesn’t mean that one can think critically or act rationally. Their demonstration of well meaning environmentalists signing up to ban the “dangerous and ubiquitous” chemical known as dihydrogen monoxide (i.e., water) is priceless. In “Being Green” they pull off some of the same useful cautionary tales by showing how easily people can be duped by “green guilt” into all sorts of nonsense, like walking around with gravel (for which they paid real money) in their pockets in order to feel “connected with the earth.” Even more disturbingly, the episode raises some serious questions about large scale exploitation of pro-environment sentiment by web-based companies selling “carbon offsets” that are calculated in ways which the companies themselves have a hard time explaining.
But you know even our smart debunkers are running out of arguments when they choose to introduce former Vice President and Nobel winner Al Gore as an “asshole.” Again, there may be some legitimate criticism of Gore’s arguments and even tactics, but to give him the same treatment Penn & Teller usually reserve for real assholes, like con artists who sell snake oil to gullible people, just seems the kind of ad hominem attack that reflects badly on the attacker.
Unfortunately, many people blatantly ignore Hume’s advice, moving that bar so low that banal coincidences suddenly count as “miracles,” reinforcing their preexisting supernaturalist view of the world. One such instance took place in the q&a session after a nice talk I attended a few days ago at the Brooklyn Society for Ethical Culture. The talk was by Lawrence Bush, author of Waiting for God: The Spiritual Reflections of a Reluctant Atheist.
Bush gave an eminently sensible talk, starting out with the common observation of coincidences to which human beings attribute special meaning (a secular version of Carl Jung’s discredited idea of “synchronicity”). As Bush wryly commented at one point, while it is a good idea to pause and reflect on what happens to us in life, it is rather egomaniacal to imagine that the universe is sending us messages (often through catastrophes, personal or affecting others) just so that we can learn from our experiences.
Perhaps not unexpectedly, given the somewhat new-agey flavor of some (but by all means not all!) chapters of the Society for Ethical Culture, the q&a was as irritating as Bush’s talk had been level-headed. One questioner in particular related a touching story of his adoptive grandmother being diagnosed with cancer and given a life expectancy of six months. The grandson reacted constructively to that abysmal prediction, using the remaining time to travel with his grandma to places where she had always wanted to go. Turns out the woman lived three years, which allowed for more travel and what I’m sure are indelibly good memories.
But then the grandson went back to the doctor and pointedly asked: “You said six months, she lived three years. What are the chances of that?” To which the doctor apparently replied with a no-nonsense (if a bit insensitive, assuming things really went that way) “One in seven hundred.” The conclusion of the story is that the questioner asked “What is the difference between 1/700 and a miracle?” strongly implying that his grandmother had of course been the beneficiary of a miracle.
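The doctor’s “1 in 700” figure actually undercuts the miracle interpretation, and a back-of-the-envelope calculation makes the point. Here is a minimal sketch; the number of similar prognoses per year is a hypothetical assumption I am introducing purely for illustration, not real medical data:

```python
# Base-rate check: how often should a "1 in 700" outcome occur somewhere?
# The patient count is a hypothetical assumption for illustration only.
p_survival = 1 / 700          # the doctor's stated odds of living three years
similar_prognoses = 500_000   # assumed comparable prognoses per year (hypothetical)

expected = p_survival * similar_prognoses
print(f"Expected long-odds survivors per year: {expected:.0f}")
```

With those assumed numbers, roughly 714 patients a year would beat the odds by chance alone: vanishingly unlikely for any given person, yet a statistical certainty for someone. That is precisely Hume’s point about where the bar for a “miracle” ought to sit.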
Richard Dawkins doesn’t usually strike me as being naive, but one has to wonder when Dawkins abandons himself to the following sort of writing about his favorite topic these days, the incompatibility between science and religion, on his web site:
“If they’ve [the creationists] been told that there’s an incompatibility between religion and evolution, well, let’s convince them of evolution, and we’re there! Because after all, we’ve got the evidence. … I suspect that most of our regular readers here would agree that ridicule, of a humorous nature, is likely to be more effective than the sort of snuggling-up and head-patting that Jerry [Coyne] is attacking. I lately started to think that we need to go further: go beyond humorous ridicule, sharpen our barbs to a point where they really hurt. …You might say that two can play at that game. Suppose the religious start treating us with naked contempt, how would we like it? I think the answer is that there is a real asymmetry here. We have so much more to be contemptuous about! And we are so much better at it. We have scathingly witty spokesmen of the calibre of Christopher Hitchens and Sam Harris. Who have the faith-heads got, by comparison? Ann Coulter is about as good as it gets. We can’t lose!”
Oh, really? There is so much wrong with these few sentences that a whole book could be written about them, but since I am no Stephen Gould (who was famous for being able to magically turn a short essay into a book length manuscript, provided the right economic incentives), a blog post will have to do. First, though, some background. Dawkins is commenting on a recent essay by evolutionary biologist Jerry Coyne, who in turn was criticizing Eugenie Scott and her National Center for Science Education. While both Dawkins and Coyne profess admiration and respect for Scott and her organization (and so do I, for the record), they are upset by what they see as an “accommodationist” stance on the question of science and religion.
Scott — who is an atheist — has repeatedly said that one cannot claim that science requires atheism because atheism is a philosophical position, not a scientific one. She leverages the standard distinction between philosophical and methodological naturalism: if you are a scientist you have to be a methodological naturalist (i.e., assume for operative purposes that nature and natural laws are all that there is); but this doesn’t commit you to the stronger position of philosophical naturalism (i.e., to the claim that there really isn’t anything outside of nature and its laws). Years ago, when I first met Genie Scott, I had a Dawkins-like problem with this. I saw the distinction as sophistic hair splitting, and told her so (she was my guest for one of the annual Darwin Day events at the University of Tennessee). Then I started taking philosophy courses, understood what she was saying, and found it irrefutable. I sent her an email apologizing for my earlier obtuseness.
That said, both Genie and I do recognize that science is one of the strongest arguments for philosophical naturalism, and I suspect that in her case, as in mine, a pretty big reason for why we are atheists is because of our understanding of science. Still, the philosophical/methodological distinction is both philosophically valid and pragmatically useful, since it doesn’t serve the purposes of either science or education to fuel an antagonism between a small minority of atheistic scientists and 90% of the world’s population (those taxpayers, on whose good will the existence of science and the stipends of most of said scientists depend).
Jerry Coyne, however (with whom I often disagree, especially on scientific matters), does have a point that Scott and the NCSE should address: if the National Center for Science Education claims neutrality with respect to the relationship between science and religion, then why — as Coyne observes — do they list on their web site (under “recommended books”) a plethora of obviously biased books on the subject? Why does the NCSE feel it is okay to endorse the vacuous writings (as they pertain to the alleged compatibility between science and religion) of pro-religion scientists like Francis Collins, Ken Miller, and Simon Conway Morris, to name a few? Either these books should be ignored, or the NCSE should also recommend the (equally questionable) works of Dawkins, Hitchens, Harris and so on. Either science can neither prove nor disprove gods, or it can; the philosophical/methodological distinction cuts both ways. Genie, what’s up?
Now back to Dawkins. As we have seen, he claims that we would be better off being on the offensive against religionists, because we’ve got the evidence. Oh yes, and because Christopher Hitchens is a better rhetorician than Ann Coulter (though he doesn’t look half as good, unfortunately). The latter is certainly true, but to pick on Coulter is to stack the deck much too obviously on one’s side. The real problem is that, pace Dawkins, evidence has nothing to do with it, because this isn’t a scientific debate. Look, even the most outrageous version of young earth creationism cannot be scientifically falsified. Wanna try? Consider the following: if there is any obvious evidence of the fact that evolution has occurred, it ought to be the impressive and worldwide consistent fossil record. Moreover, using the geological column as a way to date events during the history of the earth predates Darwin (i.e., it was invented by creationists), and we keep discovering new intermediate fossils further documenting evolution every year.
But a staunch creationist will argue (I know this from personal experience) that god simply orchestrated the whole appearance of fossils and intermediate forms to test our faith. As stunning and nonsensical as this “theory” may be, it makes the creationist completely and utterly impervious to evidence: the more evidence you bring up, the more he feels validated in his faith, because faith is belief regardless of, or despite, the evidence. Now Dawkins will say that these people are irrational ignoramuses, and they certainly are. But that misses the point entirely: the lowly creationist has just given the mighty evolutionist a humbling (if unconscious) lesson in philosophy by showing that evidence simply does not enter the debate. If evidence is out, then we are left with sheer rhetorical force. But there too, atheists are easily outmatched: Coulter notwithstanding, there are armies of professionally trained preachers out there who will trump Hitchens — in the eyes of their constituencies at least — even when the latter is perfectly sober. And the important keyword here is “constituency,” since these are the very same people that turn around and elect a creationist board of education, causing endless headaches to Scott and collaborators, headaches that are not in the least helped by Dawkins-style posturing.
And really, look at Dawkins’ prescription here. According to him we should be even more “contemptuous” than the religious fanatics are; we should “really hurt” with our “sharp barbs”; we “can’t lose” because truth is clearly on our side. One almost gets the feeling that if Dawkins had the resources of the Inquisition at his disposal he might just use them in the name of scientific Truth (a philosophical oxymoron, by the way). Thanks for the public relations disaster, Dick!
What are we to do, then? First, learning some good philosophy wouldn’t hurt the likes of Dawkins a bit. That way they would finally appreciate that Genie’s position is not just a matter of pragmatism, and it has nothing to do with intellectual cowardice. Second, and more importantly, we really need to turn to psychology and sociology, the sciences that tell us how and when people change their minds. If we want a cultural change, we need to understand how cultures change. And by the way, let us remember that scientists are most certainly not immune to the same problem of walking around with a mind a bit less open than one would hope. Dawkins may like to think that science is about free inquiry that inevitably leads to people accepting new discoveries and renouncing old ideas based on the weight of evidence and rationality. If so, he hasn’t practiced science in a while (indeed, he hasn’t). As physicist Max Planck aptly said: “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.” Analogously with creationism: changing minds is a painstaking, largely unrewarding, capillary job, which the National Center for Science Education does superbly. Dawkins & co. should simply get out of the way and let them do their work.
[Note: I became aware of this latest much ado about nothing debate through a fairly well balanced post by Paul Fidalgo at the DC Secularism Examiner, where you will find additional quotations from the various parties involved.]
One of the constantly bewildering aspects of living on planet Earth is the assumption that most human beings seem to make that faith (usually, but not necessarily, the religious variety) is a virtue. This bizarre attitude — just to add insult to injury — often comes coupled with the equally strange idea that somehow too much reason is bad for you. Why?
Faith means that one believes something regardless of, or even in spite of, the evidence. This, I should think, is so irrational, and potentially so bad for one’s health, that educators and policy makers would be very worried at the prospect of a nation where faith was praised and encouraged. I mean, suppose I tell you that I have faith in my auto mechanic, but then you discover that the guy knows nothing about cars, can never get one fixed, and on top of that charges me thousands of dollars every time I see him. You would be outraged at him, possibly to the point of calling for legal action against the rascal, and you would pity me for being such a fool. Now substitute any of the words “Preacher,” “Pope,” “Imam,” or even “Guru” for mechanic in the above example, change the care of my car to the care of my soul (whatever that is), and suddenly you get the phenomenon of strong social and legal defense of the concept of organized religion. How nuts is that?
But Massimo, people usually ask me whenever the f-word is brought up, don’t you have faith in anything? Nope, I say, a denial that is immediately met with both bewilderment and commiseration. Don’t I have faith in my wife, for example? No, I trust her because I know her and know that she loves me. What about faith in humanity, considering that I profess to be a secular humanist? No, I have hope for the human lot, and even that is seriously tempered by my awareness of its less than stellar record throughout history.
Ah, but I believe in evolution, don’t I? Yes, I do, but notice the switch between “faith” and “belief,” two words that don’t necessarily mean the same thing at all. A belief is something one thinks is true, but beliefs — unlike faith — can be held in proportion to the available evidence and reasons in their favor. I “believe” in evolution because the evidence is overwhelming. I don’t have faith in evolution.
Okay, then, the irrepressible defender of faith might say, what about your acceptance of things you cannot possibly prove, either logically or empirically, such as that there is a physical world out there (instead of the universe being a simulation in someone’s mind)? Isn’t that faith? Nope, it’s a reasonable assumption that I adopt for purely pragmatic reasons, because it seems that if one rejects it apparently bad things will happen to him (like smashing his brains on the ground while believing that he can fly off of a skyscraper).
The exasperated faithful will then conclude that my life must be devoid of emotions, and that I am — once again — deserving of pity and commiseration more than anything else. But of course this is yet another common confusion that doesn’t hold up to scrutiny: my life is as emotionally rich as anyone else’s, I think, in accordance with both philosopher David Hume’s and neurobiologist Antonio Damasio’s conclusion that a healthy human existence requires a balance between reason and emotion. Without reason, we would not have been able to build our complex civilization; but without emotion we wouldn’t have given a damn about accomplishing anything at all. Still, while faith is obviously emotional, it is not a synonym of emotion; the latter is necessary, the former is parasitic on it.
What about this insane idea that somehow we live in a hyper-rational society which is already too burdened by the triumph of reason? If we are, it is hard to distinguish such a society from a hyper-irrational one dominated by faith. This conceit that too much reason is bad is a leftover from the Romantic reaction to the Enlightenment, the so-called “age of reason” (which lasted much too briefly, and during which time reason was heard, but hardly dominated human affairs). If one wants to have a good measure of how little reason plays into our society, one only has to listen for a day to what most of our politicians say, or to what most of our journalists write, not to mention of course the often surprisingly frightening experience of simply overhearing people’s conversations on the subway or at work.
We are frequently told with a certain degree of smugness that we need to go “beyond reason,” even though that phrase is uttered by people who likely wouldn’t be able to pass logic 101. Now, this isn’t to say that reason is boundless, much less that it is a guarantor of truth. Reason is a tool, fashioned by natural selection to deal with largely mundane problems of survival and reproduction in a specific type of physical and social environment. But it seems to work pretty darn well even when it comes to proving complex mathematical theorems, constructing excellent hypotheses about how the universe got started, and even providing us with decent guidance on how to conduct human affairs while maximizing justice and minimizing killings — at least in theory!
Faith doesn’t bring us beyond reason, as amply shown by the fact that not a single problem — be it scientific, philosophical or socio-political — has ever been solved or even mildly ameliorated by faith. On the contrary, faith has a nasty tendency to make bumbling simpletons of us, to waste our energies, time and resources on pursuits that do not improve the human condition, and at its worst it convinces people to drive planes into skyscrapers, or to mount “holy” crusades to slaughter the “infidel.” Faith is not a virtue; it is a repudiation of one of the few good things human beings have going for them: a little bit of reason.
Academia is notoriously resistant to change, which to some extent is a good thing. It was therefore no surprise that when Wikipedia became a phenomenon most academics scoffed at it as a passing fad, fatally flawed by its very core idea: anybody, and I mean anybody, can become a Wikipedia author and post new entries or edit existing ones. Surely, this will inevitably lead to chaos and complete unreliability, the critics said. But a few years ago a study of a sample of entries compared the accuracy of Wikipedia with that of the unquestionably prestigious Encyclopedia Britannica, and found Wikipedia to be at least as accurate, in some cases more so.
Of course the “open access” model does have its limits and defects, and even Wikipedia has to maintain a certain amount of vigilance and label particular entries as contentious or unreliable if there is too much traffic and a lot of editing and counter-editing (typically concerning political issues or individual politicians). Still, from apparent chaos the system has allowed for the emergence of a reasonably reliable first-look reference source that truly exploits the power of the internet.
It seems that the next case will come from another sacred cow of academia: peer review. This is the system used by modern academics — both in the sciences and the humanities — to evaluate a scholarly paper before it is published, the chief gateway to ensure the high quality of a publication, be it in philosophy, literary criticism, medicine, physics, or what have you. The way it usually works is that an author submits a paper for consideration to the editor of a journal in the appropriate field. The editor makes a first assessment of the manuscript and, if it is deemed suitable for the journal, sends it out to two or more reviewers, chosen from among people actively engaged in research and scholarship in the field addressed by the submitted paper.
A certain amount of time later (an amount of time that can be irritatingly long for the authors), the reviews come back with a thumbs up or down verdict, usually accompanied by (anonymous, and sometimes nasty) comments for the authors — so that they may revise the original manuscript and send it back to either the same journal (if so invited) or to another one. The process repeats itself until either the paper finds its way into a publication or is forever abandoned on the heap of wasted efforts.
The peer review system has its obvious advantages as a gatekeeper for academic publishing quality, but it has equally obvious drawbacks. First of all, the number of reviewers is fairly small, which means that the comments the authors receive may be reflective of the idiosyncratic views of those individuals, and may not necessarily constitute a good assessment of the general value of the paper. Second, often (though not always) the authors don’t know who the reviewers are, but the converse is not true, which leads to the temptation of stabbing a rival (or a rival’s student) in the back.
One can argue that the real peer review actually takes place over a period of years after the paper (or book) has been published, and it is the result of how, in the long-term, the community at large values the scholarship of the authors. Some papers and books are cited often, some become classics in their field, most are never heard of again — which in itself is not necessarily an indication of poor quality, but may be a simple reflection of the fact that too many people publish too much.
What I will call the classic peer review system, the one that relies on a small number of editor-selected referees, however, is increasingly under challenge. In the physics community, for instance, it has been normal practice for years to post pre-publication versions of one’s paper on internet servers, to get feedback from the rest of the community before formal submission. People can now refer others to these pre-prints by hyperlinks, almost as if they were actual publications, thereby blurring the distinction between formal and informal scholarship. Moreover, an increasing number of open access journals now encourages readers’ comments and even rankings to be posted for each paper, occasionally allowing authors to respond and engage in an open dialogue with the community.
This is, I think, a trend that is here to stay, and that will likely completely change the meaning and practice of academic research over the next decade or so. Still, perhaps the most spectacular — if somewhat under-reported — case of open peer review showed how the blogosphere can be a more effective guardian of scholarship than a small number of overworked editors and reviewers.
What happened was that two people affiliated with Inje University in Korea, Mohamad Warda and Jin Han, submitted a paper to the prestigious journal Proteomics. The paper was entitled “Mitochondria, the missing link between body and soul: Proteomic prospective evidence,” something that should have alerted the Editor, Michael Dunn, and the reviewers that something was amiss (a proteomic paper on dualism and the question of the soul?). Warda and Han’s review of the literature was meant as a criticism of the currently accepted theory that the mitochondria (the cellular organelles that are involved in the production of the energy that keeps the metabolism of the organism going) are the result of an evolutionary endosymbiotic event; in other words, that they originated from the engulfment of a bacterial cell by an ancestor of modern plants, animals and fungi.
Warda and Han wrote: “Alternatively, instead of sinking into a swamp of endless debates about the evolution of mitochondria, it is better to come up with a unified assumption. … More logically, the points that show proteomics overlapping between different forms of life are more likely to be interpreted as a reflection of a single common fingerprint initiated by a mighty creator than relying on a single cell that is, in a doubtful way, surprisingly originating all other kinds of life.”
It is difficult to make sense of the badly written phrase (no language editors at Proteomics?), but surely the reviewers should have been a bit surprised by the obviously unscientific phrase “a mighty creator.” Regardless of whether one thinks that concepts like soul and divine creators make any sense at all (I don’t), they surely do not belong in an ostensibly scientific paper. I am not at all suggesting that Dunn or his reviewers are intelligent design creationists: they simply missed the supernatural references, presumably because they were too busy and distracted by the mountain of very technical language surrounding that specific phrase (though how they missed the title is a bit more difficult to rationalize away).
The happy ending to the story is the result of the normal practice at Proteomics, as at many other journals, of posting papers on the journal’s web site before they actually appear in print. According to an article in Reports of the National Center for Science Education, the first to note the oddity of Warda and Han’s paper was Steven Salzberg, a professor of computer science at the University of Maryland, who blogged about it. That led to blog posts by Attila Cordas, Lars Juhl Jensen, and PZ Myers, and eventually to the editor of Proteomics requesting that the authors withdraw the paper, which they did.
Interestingly, the request to withdraw was based not on the creationist claim, but on another problem with the paper that had escaped the editor and referees and that the bloggers had uncovered: the entire body of Warda and Han’s article had been plagiarized from other, already published sources! Apparently, their only original contributions were the really awful English and the references to the soul and the mighty creator.
The moral of the story is that the much maligned blogosphere (“you know, anybody can write whatever they want, and nobody’s checking”) in this case clearly outperformed the official, academically sanctioned system of peer review. My hunch is that this isn’t going to be the last time this happens, and that we are looking at the dawn of a new era of academic practice, in which papers will be scrutinized by thousands of reviewers within hours of publication. If we can harness this tremendous intellectual power in a reasonably ordered fashion, we will make the next leap toward a truly worldwide community of scholars and authors.
This is the year of Darwin (yes, yes, it’s also the year of astronomy, I know), and especially this week — around the date of Chuck’s birth — we are seeing a spike of events, radio and TV pieces, and printed articles. (Expect a second peak in November, for the anniversary of the publication of the Origin of Species.) One of the most schizophrenic treatments of the topic surely is the one published this week by Forbes magazine. They have a number of solid pieces by recognized scientists and science writers (for instance by evo-devo researcher Sean Carroll, philosopher Michael Ruse, and writer Michael Shermer). But they also have four, I repeat four, insanely anti-intellectual articles by pro-ID writers: Ken Ham (the “CEO” of Answers in Genesis and founder of the oxymoronic Creation Museum in Kentucky), John West (the hack author of Darwin Day in America), Jonathan Wells (the infamous author of Icons of Evolution), and my colleague here at Stony Brook, Neurosurgery Vice Chairman Michael Egnor. I will ignore the first three, because I have dealt with them on numerous occasions in the past, and concentrate instead on Egnor.
He begins his piece by stating that “As an undergraduate biochemistry major, I was uncomfortable with Darwinian explanations for biological complexity. Living things certainly appeared to be designed.” That is a bad enough reflection on undergraduate science education in the United States at the time (alas, it ain’t much better today, in this respect), but the fact that Egnor persists in such a naive way of thinking today, as a professor of neurosurgery, is really a shame (for him and for Stony Brook).
Egnor goes on to trot out the same tired old creationist “objections” to evolution. The fossil record has discontinuities (yes, it does, and they have been shown over and over to be perfectly compatible with evolution, considering the time scales involved); biomolecules are so complex that they couldn’t possibly have originated naturally (an argument from ignorance, both in the philosophical sense and in the personal sense that Egnor is obviously ignorant about molecular evolution); the genetic “code” couldn’t exist without design, because only intelligent beings produce codes (an astounding example of taking a metaphor literally instead of looking at the perfectly explicable biochemistry of nucleic acids). Egnor then proceeds to ask what he seems to think are devastating questions for “evolutionists.” Let’s take a look.
“Why do Darwinists claim that intelligent design theory isn’t scientific, when both intelligent design and Darwinism are merely the affirmative and negative answers to the same scientific question: Is there evidence for teleology in biology?” This betrays Egnor’s ignorance of the nature of science. The question of teleology in biology is most certainly not a scientific question; it is a philosophical one. And “Darwinism” is not a negative answer to that question; it is a positive answer to the question of how adaptive complexity originated during the history of life on earth.
“Why do Darwinists — scientists — seek recourse in federal courts to silence criticism of their theory in public schools?” Because the issue is one of government-mandated separation of Church and State, and of school board-regulated criteria for what should be taught in science classrooms. The creation-evolution debate is not a scientific debate; it is a social controversy, and as such it naturally, if unfortunately, involves court challenges.
“What is it about the Darwinian understanding of biological origins that is so fragile that it will not withstand scrutiny by schoolchildren?” Are you kidding? Schoolchildren do not understand plenty of other solidly established science either. For instance, many children (and a good number of adults) seem to think of the world in terms of Aristotelian, not Newtonian (let alone relativistic), physics. Should we ban Sir Isaac from the science curriculum as a result?
Egnor ends his piece with a long whine about how he has been vilified on the internet (well, join the club, dude), and how “fundamentalist atheists” have called for him to be fired. I don’t know how good a neurosurgeon Egnor is, but I assume he is good enough to have earned his post at Stony Brook. As such, he should retain it. But if he were in my department (Ecology & Evolution), I would most certainly call for him to be booted out immediately, on the grounds that he doesn’t understand the basic foundations of the science in which he is supposed to carry out scholarship and which he should be able to teach to students.
This isn’t a matter of “ostracism” or “intolerance” (rather ironic terms when they come from creationists); it is a matter of intellectual honesty. I don’t subscribe to the Dawkins-style attack on creationists (amply quoted by Egnor, of course) as “ignorant, stupid, insane … or wicked.” Most creationists are none of the last three (though ignorance often does play a role; then again, I’m just as ignorant of neurosurgery). But Egnor, Ham, Wells, West, and especially the editors of Forbes should understand once and for all that evolution is to biology what relativity or quantum mechanics are to physics, what the big bang is to cosmology, or what the atomic theory is to chemistry. Evolution is a scientific fact as solid as they come, and a scientific theory as well established as any other. Creationism and its cousin intelligent design are primitive ideas that were reasonable enough in a pre-scientific society, but they no longer have a respectable place at the table of intellectual discourse. It’s time to get used to it.
I just got back from a trip to Las Vegas, where the highlight was attending a Penn & Teller show. They are the magicians who have an entire TV series devoted to debunking the paranormal, appropriately called Bullshit! As a skeptic, one of the most annoying questions I get (and I’m sure P&T do also) is “why spoil other people’s beliefs? What’s the harm? Why are you so cynical?” (Note: skepticism is most emphatically not the same thing as cynicism, either in its English meaning or in terms of the original Greek philosophical traditions.)
Well, ask the young woman who, a couple of weeks ago, was seized by some of her neighbors in Papua New Guinea, stripped naked, bound, gagged, and set on fire on suspicion of being a witch. She died a horrible and senseless death. This is not an isolated case in that part of the world (or in Africa). According to the local police, more than 50 people were killed in the past year in two Papua New Guinea provinces on suspicion of practicing sorcery. Anthropologist Bruce Knauft of Emory University has conducted a study finding that, over the past four decades, local families have seen fully one third of their adults killed violently, with 90% of the deaths connected to superstitious beliefs about witchcraft and the like.
Papua New Guinea is one of four Asian countries afflicted by an AIDS epidemic, but many villagers think it is witches, not HIV, that spread the disease (again, a position held by many people, and even some governments, in Africa). Superstition is an easy “explanation” when reality is either too difficult to comprehend or too hard to accept, but people are literally dying as a result of it.
But that’s the third world, right? Yes, but does witchcraft really sound that different from the practices of, say, snake handlers and speakers in tongues right here in the good old U.S. of A.? Do you remember Sarah Palin being blessed by a witch doctor? Moreover, plenty of people in the Western world die or become seriously ill because they take homeopathic “remedies” (i.e., water and sugar) to treat serious conditions. And there is, of course, the psychological (and more often than not financial) pain experienced by people whose grief and hopes are exploited by those who sell them instant Jesus cures, or who tantalize them with the possibility of once again communicating with their loved ones.
That is why the work of the skeptic is not simply a matter of enjoying the intellectual challenge of exposing frauds, or even the educational challenge of raising the world’s critical thinking abilities by a notch or two. It is work that helps reduce the exploitation of people’s fears for financial gain, power, or prestige. And it is work that may eventually save lives like that of the young woman who died in Papua New Guinea, yet another innocent victim of ignorance and stupidity.
No, this isn’t a headline from The Onion; it’s the latest turn in the “atheist buses” controversy in England. As you probably know, the British Humanist Association has endorsed an idea by comedian Ariane Sherine, who was annoyed by Christian advertisements on British public transport that threatened eternal damnation. Sherine thought it would be nice to give people a bit of metaphysical relief by writing on buses and subways: “There’s probably no God. Now stop worrying and enjoy your life.”
To Sherine’s utter surprise, her campaign quickly raised £140,000, which has made it possible to run the advertisement on 800 buses across England. Not at all unexpectedly, this has generated an angry response from some religionists, despite the fact that church attendance in that country is among the lowest in the world. And here is the kicker: Christian campaigner Stephen Green and others have actually filed formal complaints with the British Advertising Standards Authority (ASA) on the grounds that the atheists are violating “guidelines on taste and decency.” According to Green, “If you’re going to put out what appears to be a factual statement then you have to be able to back it up. They’ve got to substantiate this proposition that in all probability, God doesn’t exist.”
Oh really? Talk about a spectacular example of the pot calling the kettle black! Let me get this straight: a statement that supernatural entities probably do not exist is, in the minds of Green and his loony friends, less obviously substantiated than a statement that there is such a thing as everlasting punishment in hell? To put it another, perfectly parallel, way: claiming that Santa Claus (probably) doesn’t exist would also be less “tasteful, decent, and factual” than claiming that he really does deliver presents to the world’s (Christian) children every 24th of December. If you think I’m joking, you should read the excellent “Santa Lives! Five Conclusive Arguments for the Existence of Santa Claus” by Ellis Weiner (the arguments are: ontological, causal, from design, experiential, and moral — sound familiar?).
Now this hilarious insanity has put the ASA in the rather awkward position of having to rule on a long-standing metaphysical dispute. If the agency lets the atheist campaign go on, it will implicitly be telling the British public that it is in fact reasonable to state that god probably doesn’t exist; if, on the other hand, Sherine and the British humanists are found to be at fault, the ASA will in effect be taking the position that there is sufficient evidence for the existence of hell, and that Christian groups are therefore not violating its advertising standards. Philosophers and theologians the world over will surely be following this one with the utmost interest!
By the way, I have to note that the only atheist who has (partially) objected to the campaign is our good old lovable curmudgeon, Richard Dawkins. He doesn’t like the word “probably” in the ad. This is because Dawkins, as I have pointed out before, insists on maintaining the indefensible position that science can disprove the existence of (all) gods, though he is a bit wishy-washy about this even in The God Delusion, where he says both that he is not absolutely certain of god’s nonexistence and that science can disprove such a ridiculous notion anyway. The reality is that science cannot disprove the supernatural, but a philosophical argument informed by sound science can, in fact, reduce the likelihood of the supernatural to the very, very improbable indeed. That’s why Sherine and the British Humanist Association got the wording of their campaign exactly right. So now go on and enjoy your day without fear: there (probably) is no hell.