Wednesday, February 27, 2013

Why it is so hard to distinguish science from pseudo-science

Fifteen hundred years before the birth of Christ, a chunk of stuff blew off the planet Jupiter. That chunk soon became an enormous comet, approaching Earth several times around the period of the exodus of the Jews from Egypt and Joshua’s siege of Jericho. The ensuing havoc included the momentary stopping and restarting of the Earth’s rotation; the introduction into its crust of organic chemicals (including a portion of the world’s petroleum reserves); the parting of the Red Sea, induced by a massive electrical discharge from the comet to Earth; showers of iron dust and edible carbohydrates falling from the comet’s tail, the first turning the waters red and the second nourishing the Israelites in the desert; and plagues of vermin, either infecting Earth from organisms carried in the comet’s tail or caused by the rapid multiplication of earthly toads and bugs induced by the scorching heat of cometary gases. Eventually, the comet settled down to a quieter life as the planet Venus, which, unlike the other planets, is an ingénue at just 3500 years old. Disturbed by the new girl in the neighbourhood, Mars too began behaving badly, closely encountering Earth several times between the eighth and seventh centuries BCE; triggering massive earthquakes, lava flows, tsunamis and atmospheric fire storms; causing the sudden extinction of many species (including the mammoth); shifting Earth’s spin axis and relocating the North Pole from Baffin Island to its present position; and abruptly changing the length of the terrestrial year from 360 to its present 365¼ days. There were also further shenanigans involving Saturn and Mercury.
If this story makes you feel even the slightest stab of recognition, you’re probably at least fifty years old, because it’s a summary of the key ideas in Immanuel Velikovsky’s Worlds in Collision. Published in New York in 1950, the book is now almost forgotten, but it was one of the greatest cultural sensations of the Cold War era. Before it was printed, it was trailed in magazines, and immediately shot onto the American bestseller lists, where it stayed for months, grabbing the attention and occupying the energies of both enthusiasts and enraged critics. The brouhaha subsided after a few years, but the so-called Velikovsky affair erupted with greater violence in the late 1960s and early 1970s, when the author gathered a gaggle of disciples and lectured charismatically (and at times incomprehensibly) to large and enraptured campus audiences. Velikovsky’s story was chewed over by philosophers and sociologists convinced of its absurdity, some trying to find standards through which one could securely establish the grounds of its obvious wrong-headedness, others edgily exploring the radical possibility that no such standards existed and reflecting on what that meant for so-called demarcation criteria between science and other forms of knowledge.
Worlds in Collision was Velikovsky’s blockbuster – I haven’t found exact sales figures, though there is an estimate from the late 1970s that millions of copies had been sold, with translations into many major languages – but there were follow-up volumes through the 1970s, fleshing out the basic astronomical-historical picture and offering ingeniously reflexive accounts of the developing controversies over his theories. By the late 1960s and 1970s, Velikovsky’s books must have been in most American college dorm rooms. Other countries were not nearly as besotted, but neither were they immune: in 1972, both the BBC and the Canadian Broadcasting Corporation produced respectful documentaries on the man and his views. Velikovskianism had gained so much traction in America that in 1974 there was a huge set-piece debate over his views at the annual meeting of the American Association for the Advancement of Science. His scientific opponents reckoned he was ‘quite out of his tree’, while some of his acolytes – and these included an assortment of scientists with appropriate credentials – were of the opinion that Velikovsky was ‘perhaps the greatest brain that our race has produced’.
Velikovsky appeared in American culture pretty much as a man from Mars: he was almost unknown to the intellectual communities whose expertise his book most directly engaged. Born in Vitebsk (now in Belarus) in 1895 to well-off Jewish parents, Velikovsky studied a wide range of subjects at Montpellier and Edinburgh before taking his medical degree in Moscow in 1921. Emigrating to Berlin, then to Vienna and Palestine, he learned psychoanalysis under Freud’s pupil Wilhelm Stekel and practised as a psychiatrist before escaping the Nazis in 1939 and living first on the Upper West Side of Manhattan and later in Princeton. From that point on, he never had an academic appointment or regular salaried employment, apparently supporting himself with money inherited from his father, a bit of practice as a shrink and, later, book royalties and fees for speaking engagements. When you look up Velikovsky online, he’s most often described as a psychiatrist or psychoanalyst. Despite the celebrity of his astronomical-historical stories, and despite the fact that he almost entirely gave up the couch when he became a celebrity, that’s quite right.
Between the wars, Velikovsky turned himself into one of those then common Central European scholars of enormous intellectual range, always seeking the Big Unifying Idea. His interest in planetary astronomy was a late development: it was psychoanalysis and Jewish history that were the keys to the story in Worlds in Collision. A Zionist, though not notably religious, Velikovsky was infuriated by Freud’s last book, Moses and Monotheism (1939), which claimed that Moses wasn’t actually Jewish but a runaway Egyptian priest from Pharaoh Akhenaten’s monotheistic sun religion, later murdered by the Israelites, who ended up fabricating a syncretic deity from an Egyptian sun god and a Midianite volcano god called Jehovah. Subsequently, the idea of a Messiah was concocted as an expression of guilt for father-murder, a sense of guilt which has been handed down to the Jews as a common psychological inheritance. The historical account Judaism offered of itself was therefore, according to Freud, a form of dream-work, a collective repressed memory needing skilled decoding by modern interpreters. To Velikovsky, all this was yet another manifestation of Freud’s Jewish self-hatred. How dare he impugn the Old Testament story about who the Jews were and how they came to be Chosen? But Freud’s methods in Moses and Monotheism nevertheless signalled a productive new way of interpreting human history, one in which psychoanalytic techniques could effectively expose the true meaning of the world’s dream-myths.
At the same time, Velikovsky was convinced that the Old Testament, decoded in this way, was an overwhelmingly reliable historical account, that the Jewish records could be used as a standard to calibrate archives of dream-myths – from the Egyptian and Greek to the Chinese and Choctaw – and that, once this radical reinterpretation of world religions was achieved, we would have an accurate account of the physical events that had occurred in historical times and were encrypted in the dream-myths.
Although Worlds in Collision was a pastiche of comparative mythology and planetary astronomy, its major purpose was a radical reconstruction of history. Velikovsky had worked through the annals of myth and ancient history, which substantially supported each other and told the same historical stories; the Jewish story and its chronologies could be used reliably to gauge all the others. The apparent datings of events did differ, but a wholesale recalibration of ancient chronology was both possible and necessary. The ancient historians had got their dates badly wrong, and so too had the astronomers, biologists and geologists, who now needed to understand that spectacular cosmic catastrophes had happened and that historical methods of interpreting ancient texts could be used to establish radically unorthodox scientific stories. Properly understood, Jewish history not only laid bare the inaccuracy of scientific accounts, it securely established the reality of natural events and processes which scientists assumed could not possibly have happened.
It was American scientists who went ballistic over Velikovsky, not historians, and one purpose of Michael Gordin’s probing and intelligent The Pseudoscience Wars is to ask why they responded to Velikovsky as they did. Putting that sort of question is a sign of changed times. Passions have cooled; circumstances have altered. Almost all previous books about Velikovsky and the affair have been for or against, celebratory or accusatory, justifying the way the scientific community handled the business or criticising them for handling it badly. There’s no evidence that Gordin considers Velikovsky’s theories anything but nutty, yet affirming and identifying their nuttiness is a non-barking dog here. Gordin is a disengaged and dispassionate historian of science – much of his work has been about Russian science and the science and politics of nuclear weapons in the postwar period – and the questions he poses about Velikovsky are meant to illuminate the condition of American science in ‘the postwar public sphere’ and to figure out what has been meant by the notion of ‘pseudoscience’. The Velikovsky affair was at once a long-running episode of surpassing strangeness and, Gordin says, ‘ground zero’ in a series of Cold War era ‘pseudoscience wars’. Understanding the pathological is here meant to encourage a new perspective on the normal.
Scientists in the years after World War Two were upset by Velikovsky because, Gordin argues, they felt insecure, uncertain of the new authority and influence they had apparently gained by building the bomb and winning the war. Enormous amounts of government money had been dumped on them and government agencies designed to ensure the support of even basic research had been established, with unprecedented arrangements allowing the recipients of government largesse to determine its distribution. Yet there were reasons to be fearful, and in Cold War American culture there was more than enough fear to go around. Some forms of fear specially afflicted scientists. First, there was concern that political support might translate into political control. There were the Marxists – not all that many, of course, in America – who had actively worked for the organised planning and direction of scientific research, and there was the cautionary tale of genetics in the Soviet Union, especially after 1948, when Stalin had decreed, against the canons of ‘Western bourgeois’ Mendelian genetics, that the ideas of the charlatan Ukrainian agronomist Trofim Lysenko about the inheritance of acquired characteristics should count as dogma. Lysenkoism seemed to show how vulnerable orthodox science might be to the fantasies and ideologies of those who weren’t scientists at all or whose scientific credentials had been burnished by the political powers. And there were the McCarthyite witch-hunts, some of which targeted distinguished scientists. How much autonomy did American scientists actually have? How vulnerable was that autonomy to the dictates of politicians and to the delusions of popular culture? No one could be sure. In 1964, Richard Hofstadter brilliantly described the ‘paranoid style’ of American politics: your opponents weren’t simply wrong, they were conspiring against you, mobilising dark forces to suppress free and rational thought. The joining up of psychiatry and history was in the air, like the UN’s Black Helicopters over the US – or perhaps in the cultural water, like fluoride dumped in reservoirs by alien agents.
Velikovskianism belonged to the intellectual genre known as catastrophism, the notion that sudden and massive changes, not just gradual ones, have occurred in the natural world and that the more or less uniform natural processes now observable do not constitute all the modes of change that have historically shaped the world. Darwin was a notable uniformitarian, and Velikovsky opposed Darwinism for that reason, but there is nothing inherently unscientific about catastrophism, nor did Velikovsky’s catastrophism invoke divine intervention. It was bizarre, but it was offered as a scientific (not a religious) theory about natural objects, natural events and natural powers. At a theoretical level, the objections orthodox scientists had about Velikovskianism mostly had to do with celestial mechanisms: his assertions about the insufficiency of gravitation and inertial motion to account for planetary behaviour and related claims about the significance of electromagnetic forces. The problem at a factual level was that these spectacular catastrophes were supposed to have happened quite recently, while orthodox science recognised no evidence that they had.
The greatest ingenuity of Velikovsky’s thought lay in its merging of naturalistic catastrophism and psychoanalytic theory. This allowed him to account at once for the annals of comparative myth and religion and for scientists’ resistance to his scheme, and that is the reason Worlds in Collision was offered, in Gordin’s phrase, as ‘a dream journal for humanity’. The key was what Velikovsky called ‘collective amnesia’. The catastrophes let loose on Earth by the Venus comet had so scarred the human mind that memories of them had either been erased or, more consequentially, encoded in allegory. Just as with Oedipal father-killing and mother-mating, amnesia and suppressed memory were coping mechanisms, and so a proper interpretation of ancient myth would decode the allegorical forms into which traumatic memories had been cast. At the same time, what was the violence of scientists’ opposition to Velikovsky’s ideas but a persistence of that same tendency to deny the catastrophic truth of what had happened to the human race, how very close it had come to obliteration? The fact that the scientists were leagued against him was precisely what Velikovsky’s theories predicted. It was further evidence that he was right. What the scientists needed, indeed what the culture as a whole needed, was therapy, a cure for collective amnesia.
Here are the reasons for the enormous appeal of Velikovsky’s theories to Cold War America, and, specifically, to the young, the angry and the anxious. Lecturing to campus audiences, Velikovsky told the students what they already knew: the world was not an orderly or a safe place; Armageddon had happened and could happen again:
The belief that we are living in an orderly universe, that nothing happened to this Earth and the other planets since the beginning, that nothing will happen till the end, is a wishful thinking that fills the textbooks … And so it is only wishful thinking that we are living in a safe, never perturbed, solar system and a safe, never perturbed past.
Alfred Kazin, writing in the New Yorker, understood that this was part of Velikovsky’s appeal, and tellingly linked the great pseudoscientist with the Doomsday warnings of orthodox atomic scientists: Velikovsky’s work ‘plays right into the small talk about universal destruction that is all around us now’, he said, ‘and it emphasises the growing tendency in this country to believe that the physicists’ irresponsible scare warnings must be sound.’
The counterculture emerging in 1960s and 1970s America was born from fear and bred to hope. It feared nuclear catastrophe; it was disposed to think that the military-industrial-academic complex had scant regard for preventing catastrophe or even that it was conspiring to bring it about. (In 1962, the war-gamer Herman Kahn suggested that we should begin to ‘think the unthinkable’ and work out how to fight and win a nuclear war, and in 1964 Stanley Kubrick’s Dr Strangelove made Kahn’s vision box-office.) The counterculture expressed whatever optimism it had about the future in a characteristically American psychotherapeutic idiom. So did Velikovsky. Humankind could save itself if it confronted its irrationality and the collective amnesia that was responsible for all forms of racial, social and military violence: ‘Nothing is more important for the human race than to know our past and to face it.’ Velikovsky offered both diagnosis and treatment. And if his theories were not, in themselves, religious, they so clearly pointed to political and moral consequences that one disciple cited his Velikovskianism to the draft board as a way of getting out of the Vietnam War: pacifism flowed from planetary astronomy. (The reluctant soldier happily failed his physical, not his metaphysical.)
When Velikovsky’s bizarre story about planetary hi-jinks was so energetically puffed up in 1950, the American scientific establishment was presented with a choice, a choice endemically faced by orthodoxy confronted by intellectual challenges from alien sources: do you ignore the heterodox? Do you invite it to sit down with you and have a calm and rational debate? Do you crush it? There were scientific voices counselling Olympian disdain but they were in general overruled. Still, pretending to take no notice of Velikovsky might have been the plan had Worlds in Collision not been published by Macmillan, a leading producer of scientific textbooks, and packaged not as an offering to, say, comparative mythology or as popular entertainment, but as a contribution to science. Elite scientists, notably at Harvard, reckoned that they might be able to control what Macmillan published when it was represented as science. A letter-writing campaign was organised to get Macmillan to withdraw from its agreement to publish the book; credible threats were made to boycott Macmillan textbooks; hostile reviews were arranged; questions were raised about whether the book had been peer-reviewed (it had); and, when Worlds in Collision was published anyway, further (successful) pressure was exerted to make Macmillan wash its hands of the thing and shift copyright to another publisher. The editor who had handled the book was let go, and a scientist who provided a blurb and planned a New York planetarium show based on Velikovsky’s theories – admittedly not the sharpest knife in the scientific drawer – was forced out of his museum position and never had a scientific job again.
From an uncharitable point of view, this looked like a conspiracy, a conspiracy contrived by dark forces bent on the suppression of free thought and different perspectives – and the Velikovskians took just that view. An establishment conspiracy centred on Harvard had sought to control scientific thought; the conspirators had closed minds and wanted to close others’ minds; they refused to engage with Velikovsky’s ideas at the level of evidence, to show exactly where he was wrong. When Velikovsky made specific predictions of what further observation and experiment would show, his enemies declined to undertake those observations and experiments. This was the way the Commies behaved, Velikovsky’s allies suggested. Analogies were drawn from the history of science seen as the history of martyrs to dogma. Velikovsky figured himself as Galileo and his opponents as Galileo’s critics, who wouldn’t even look through the telescope to see the moons of Jupiter with their own eyes. ‘Perhaps in the entire history of science,’ Velikovsky said, ‘there was not a case of a similar violent reaction on the part of the scientific world towards a published work.’ Newsweek wrote about the spectacle of scientific ‘Professors as Suppressors’ and the Saturday Evening Post made sport of the establishment reaction as ‘one of the signal events of this year’s “silly season”’. Some scientists who were utterly convinced that Velikovsky’s views were loopy had qualms about how the scientific community had treated him. Einstein, in whose Princeton house Velikovsky was a frequent visitor, was one of them. Interviewed just before his death by the Harvard historian of science I.B. Cohen, Einstein said that Worlds in Collision ‘really isn’t a bad book. The only trouble with it is, it is crazy.’ Yet he thought, as Cohen put it, that ‘bringing pressure to bear on a publisher to suppress a book was an evil thing to do.’
The Velikovsky affair made clear that there were radically differing conceptions of the political and intellectual constitution of a legitimate scientific community, of what it was to make and evaluate scientific knowledge. One appealing notion was that science is and ought to be a democracy, willing to consider all factual and theoretical claims, regardless of who makes them and of how they stand with respect to canons of existing belief. Challenges to orthodoxy ought to be welcomed: after all, hadn’t science been born historically through such challenges and hadn’t it progressed by means of the continual creative destruction of dogma? This, of course, was Velikovsky’s view, and it was not an easy matter for scientists in the liberal West to deny the legitimacy of that picture of scientific life. (Wasn’t this the lesson that ought to be learned from the experience of science in Nazi Germany and Stalinist Russia?) Yet living according to such ideals was impossible – nothing could be accomplished if every apparently crazy idea were to be given careful consideration – and in 1962 Thomas Kuhn’s immensely influential Structure of Scientific Revolutions commended a general picture of science in which ‘dogma’ (daringly given that name) had an essential role in science and in which ‘normal science’ rightly proceeded not through its permeability to all sorts of ideas but through a socially enforced ‘narrowing of perception’. Scientists judged new ideas to be beyond the pale not because they didn’t conform to abstract ideas about scientific values or formal notions of scientific method, but because such claims, given what scientists securely knew about the world, were implausible. Planets just didn’t behave the way Velikovsky said they did; his celestial mechanics required electromagnetic forces which just didn’t exist; the tails of comets were just not the sorts of body that could dump oil and manna on Middle Eastern deserts. A Harvard astronomer blandly noted that ‘if Dr Velikovsky is right, the rest of us are crazy.’
By 1964, some of Velikovsky’s scientific critics were drawing a different lesson from the affair: the nuclear chemist Harold Urey was concerned ‘about the lack of control in scientific publication … Today anyone can publish anything,’ and it was impossible to tell the signal of truth from the noise of imposters. We must return to the past, Urey urged, when there was a proper intellectual class system and a proper system of quality control: ‘Science has always been aristocratic.’ In a society insisting on its democratic character, that was not a wildly popular position, though doubtless it had appealed to the scientists who tried to prevent the original publication of Velikovsky’s book and who sought to block his later efforts to publish in mainstream scientific journals.
Then there was the tactic of labelling Velikovskianism ‘pseudoscience’. One of the strengths of Gordin’s book is its careful historical unpicking of what scientists had in mind, and what they were doing, when they called something pseudoscientific. Pseudoscience isn’t bad science – incompetent, shallow, containing egregious errors of fact or reasoning. (In those senses, there’s a lot of bad science around which is almost never identified as pseudoscience.) Rather, what postwar scientists meant when they called Velikovskianism pseudoscience (along with contemporary parapsychology, resurgent eugenics, Wilhelm Reich’s orgone energy theory, creationism and the fantastical world ice theory) was that these were bodies of thought that pretended to be scientific, dressing themselves up in the costumes of science, but which were not the thing they pretended to be. Pseudoscientific thought might indeed contain errors of fact and theory, but the orthodox regarded it as fundamentally misconceived.
There were attempts to spell out in exactly what ways Velikovsky had transgressed the rules of scientific method, and, while some critics satisfied themselves that they had identified those errors, there was little if any agreement about what this transgressed method was. For example, Velikovsky did make a series of specific predictions (about the temperature and chemical composition of Venus and about Jupiter as a radio source) which would have permitted his system to be empirically tested, and some of these predictions were eventually advertised as confirmed (even in major scientific journals), but it proved notoriously difficult to disentangle those specific observations – whether supposedly confirming or refuting – from a complex network of claims and assumptions. This ‘network’ character of confirmation and disconfirmation is now generally recognised as endemic to science. Einstein spoke with his usual wisdom when asked how scientists might tell by inspection whether unorthodox ideas were brilliant or barmy. He replied, with Velikovsky clearly in mind: ‘There is no objective test.’ The term ‘pseudoscientist’ is a bit like ‘heretic’. To be a pseudoscientist is to be accused; you don’t describe yourself as a pseudoscientist. (Velikovsky, indeed, was exquisitely cautious about joining a salon des refusés, disinclined to associate his cause with that of the parapsychologists and members of the other pseudoscientific tribes who identified themselves as martyrs to orthodoxy.) So there was a lot of pseudoscience about in the Cold War decades, but the category – not the content – was manufactured by orthodox scientists concerned about maintaining the boundaries of legitimacy but unable to find a stable and coherent way of defining what the category consisted of, other than its violation of valued structures of plausibility.
If pseudosciences are not scientific, neither are they anti-scientific. They flatter science by elaborate rituals of imitation, rejecting many of the facts, theories and presumptions of orthodoxy while embracing what are celebrated as the essential characteristics of science. That is at once a basis for the wide cultural appeal of pseudoscience and an extreme difficulty for those wanting to show what’s wrong with it. Velikovsky advertised his work as, so to speak, more royalist than the king. Did authentic science have masses of references and citations? There they were in Worlds in Collision. Was science meant to aim at the greatest possible explanatory scope, trawling as many disciplines as necessary in search of unified understanding? What in orthodoxy could rival Velikovsky’s integrative vision? Authentic science made specific predictions of what further observation and experiment would show. Velikovsky did too. Was science ideally open to all claimants, subjecting itself to all factual criticisms and entertaining the possibility of radically new theoretical interpretations? Who behaved more scientifically – Velikovsky or the Harvard ‘suppressors’?
Gordin sides with those – like Einstein and a number of modern sociologists and philosophers – who doubt that universal and context-independent criteria can be found reliably to distinguish the scientific from the pseudoscientific. But here is a suggestion about how one might do something, however imperfectly, however vulnerable to counter-instances and however apparently paradoxical, to get a practical grip on the difference between the genuine article and the fake. Whenever the accusation of pseudoscience is made, or wherever it is anticipated, its targets commonly respond by making elaborate displays of how scientific they really are. Pushing the weird and the implausible, they bang on about scientific method, about intellectual openness and egalitarianism, about the vital importance of seriously inspecting all counter-instances and anomalies, about the value of continual scepticism, about the necessity of replicating absolutely every claim, about the lurking subjectivity of everybody else. Call this hyperscience, a claim to scientific status that conflates the PR of science with its rather more messy, complicated and less than ideal everyday realities and that takes the PR far more seriously than do its stuck-in-the-mud orthodox opponents. Beware of hyperscience. It can be a sign that something isn’t kosher. A rule of thumb for sound inference has always been that if it looks like a duck, swims like a duck and quacks like a duck, then it probably is a duck. But there’s a corollary: if it struts around the barnyard loudly protesting that it’s a duck, that it possesses the very essence of duckness, that it’s more authentically a duck than all those other orange-billed, web-footed, swimming fowl, then you’ve got a right to be suspicious: this duck may be a quack.

Fairness is an F-word

There is an opinion piece in the Chronicle Review about fairness. The author suggests that modern-day Americans think the opposite of fairness is selfishness, whereas it ought to be favoritism. The whole article is at http://chronicle.com/article/In-Defense-of-Favoritism/135610/. Here are some of its core paragraphs.
--ww--

Children and parents were taught something very different about envy in the 19th century. Parents taught their children to accommodate negative feelings like envy using stoic resolve. When the educational philosopher Felix Adler analyzed the biblical Cain and Abel parable, in his 1892 The Moral Instruction of Children, he exhorted young people to master and suppress their feelings of envy, or else they would end up like murderous Cain (recall that envy led Cain to kill his brother after God preferentially favored Abel's animal sacrifice). Envy was to be treated with self-discipline. There will always be people better off than you, and the sooner you accept and conquer your envy, the better off you'll be.
The social historian Susan J. Matt argues that all this changed in the 20th century, and by the 1930s a whole new childhood education regarding envy was in full swing. Social workers "praised parents who bought extra gifts for their children. If a son or daughter needed a hat, adults should buy it, but they should also purchase hats for their other offspring, whether or not they needed them. This would prevent children from envying one another."
The phenomenon of sibling rivalry made its way into the textbooks as a potentially damaging pattern of envy—one that is best addressed by giving all the kids an equal fair share of everything. Subduing or restraining one's feelings of deprivation and envy was considered old school, and new parents (living in a more prosperous nation) sought to stave off those feelings in their children by giving them more stuff.
This trend—of assuaging feelings of deprivation by distributing equal goods to children—grew even stronger in the baby-boomer era and beyond. It has also dovetailed nicely with the rise of an American consumer culture that defines the good life in part by material acquisition. "In a consumer society," Ivan Illich says, "there are inevitably two kinds of slaves: the prisoners of addiction and the prisoners of envy." Today's culture tries to spare kids the pains of sibling and peer rivalry, but does so by teaching them to channel their envy into the language and expectation of fairness—and a reallocation of goods that promises to redress their emotional wounds.
If our high-minded notions of retributive justice have roots in the lower emotions of revenge, then why should we be surprised if fairness has roots in envy? I have no illusions and feel entirely comfortable with the idea that fairness has origins in baser emotions like envy. But most egalitarians will find this repugnant, and damaging to their saintly and selfless version of fairness.
The merit-based critique of fairness is well known. Plato spends much of The Republic railing against democracy on the grounds that know-nothing dolts should never have equal political voice with experts (aristoi). Elitism is a dirty word in our culture, but it was not for the ancients.
American hostility to elitism is especially manifest during election seasons, when politicians work hard to downplay their own intelligence and intellectual accomplishments so they might seem less threatening (less eggheadish) to the public. I am in agreement with many of the merit-based critiques of egalitarian fairness. I don't want my political leaders to be "regular guys." I want them to be elite in knowledge and wisdom. I want them to be exceptional.
Our contemporary hunger for equality can border on the comical. When my son came home from school with a fancy ribbon, I was filled with pride to discover that he had won a footrace. While I was heaping praise on him, he interrupted to correct me. "No, it wasn't just me," he explained. "We all won the race!" He impatiently educated me. He wasn't first or second or third—he couldn't even remember what place he took. Everyone who ran the race was told that they had won and were all given the same ribbon. "Well, you can't all win a race," I explained to him, ever-supportive father that I am. "That doesn't even make sense." He simply held up his purple ribbon and raised his eyebrows at me, as if to say, "You are thus refuted."
I don't want my son and every other kid in his class to be told they'd "won" the footrace at school just because we think their self-esteem can't handle the truth. Equal rewards for unequal achievements foster the dogma of fairness, but they don't improve my son or the other students.
The contrast between our fairness culture and China's merit-based preschools is astounding. Imagine your 4-year-old preschooler getting up the nerve to stand in front of her class to tell a story. It's a sweet rite of passage that many children enjoy around the world, and it builds self-esteem and confidence. Now imagine that when your preschooler is finished spinning her yarn, the other children tell her that her story was way too boring. One kid points out that he couldn't understand it, another kid says her voice was much too quiet, another says she paused too many times, and another tells her that her story had a terrible ending. In most schools around the world, this scenario would produce a traumatic and tearful episode, but not so in China, where collective criticism is par for the course—even in preschool.
At Daguan Elementary School, in Kunming, China, this daily gantlet is called the "Story Teller King." American teachers who saw this exercise were horrified by it. But it is indicative of Chinese merit-based culture.
# # #

Matternet

Look! Up in the sky! It's a bird, it's a plane, no, it's the matternet.

From this week's Economist:

The spread of mobile phones in developing countries in the past decade has delivered enormous social and economic benefits. By providing a substitute for travel, phones can make up for bad roads and poor transport infrastructure, helping traders find better prices and boosting entrepreneurship. But although information can be delivered by phone—and, in a growing number of countries, money transferred as well—there are some things that must be delivered physically. For small items that are needed urgently, such as medicines, why not use drone helicopters to deliver them, bypassing the need for roads altogether?
That, at least, was the idea cooked up last year at Singularity University, a Silicon Valley summer school where eager entrepreneurs gather in the hope of solving humanity’s grandest challenges with new technologies. The plan is to build a network of autonomously controlled, multi-rotor unmanned aerial vehicles (UAVs) to carry small packages of a standardised size. Rather than having a drone carry each package directly from sender to recipient, which could involve a long journey beyond the drone’s flying range, the idea is to build a network of base stations, each no more than 10km (6 miles) from the next, with drones carrying packages between them.
After arrival at a station, a drone would swap its depleted battery pack for a fully charged one before proceeding to the next station. The routing of drones and the allocation of specific packages to specific drones would all be handled automatically, and deliveries would thus be possible over a wide area using a series of hops. It is, in short, a physical implementation of the “packet switching” model that directs data across the internet, which is why its creators call their scheme the “matternet”.
Over the matternet, so the vision goes, hospitals could send urgent medicines to remote clinics more quickly than they could via roads, and blood samples could be sent and returned within hours. A farmer could place an order for a new tractor part by text message and pay for it via mobile money-transfer. A supplier many miles away would then take the part to the local matternet station for airborne dispatch via drone.
Mind over matter
Andreas Raptopoulos, the entrepreneur who led the academic team, reckons that the scheme would be competitive with building all-weather roads. A case study of the Maseru district of Lesotho put the cost of a network of 50 base-stations and 150 drones at $900,000, compared with $1m for a 2km, one-lane road. The advantage of roads, however, is that they can carry heavy goods and people, whereas matternet drones would be limited to payloads of 2kg in a standard 10-litre container. But the scheme is potentially lifesaving in remote areas, and might also have commercial potential to deliver small packages in the rich world.
Since the original proposal, however, an ideological disagreement has emerged over how best to implement this drone-powered internet for objects. Two separate groups are now taking rather different approaches. The first, led by Mr Raptopoulos, has formed a company, called Matternet, to develop the drone and base-station hardware, and the software that will co-ordinate them. The company then hopes to sell the technology to government health departments and non-profit groups. Just as mobile phones have spurred development in poor countries, Mr Raptopoulos hopes drone delivery will do something similar.
The second group is called Aria (“autonomous roadless intelligent array”). It believes the matternet should be free, open and based on standardised protocols, just like the internet. It is developing these protocols and building prototypes that adhere to them, and inviting others to follow suit. Aria is not promoting any particular use of the technology, and will not necessarily build or run networks itself. “We understand there will be hundreds of applications, but we are not interested in running such applications,” says Arturo Pelayo, Aria’s co-founder. “We won’t aim for understanding every single geographical and cultural context where the system might be used.”
Both groups have recently started testing their first prototypes. Matternet ran a series of successful field tests of its prototype UAVs in the Dominican Republic and Haiti in September, and met local groups to sell the idea. Meanwhile, Aria also spent the summer testing, and showcased its ideas, such as the use of retrofitted shipping containers for base stations, at the Burning Man festival held in the Nevada desert in August. Flying drones in high winds without crashing into anyone presented quite a challenge.
For the delivery of drugs in developing countries, a rider on a motorbike may be a much simpler and more rugged solution. Maintaining a network of drones—a complex, immature technology—is unlikely to be easy, particularly in the remote areas that Matternet intends to target. It may be that congested city centres in rich countries will prove a more promising market.
And whether in the rich or poor world, any widespread deployment of delivery-drone fleets is bound to raise concerns about safety and regulation. It is undoubtedly a clever idea. But moving packets of data around inside the predictable environment of a computer network is one thing; moving objects around in the real world is, you might say, a very different matter.
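The "packet switching" analogy at the heart of the matternet can be made concrete with a toy sketch. This is not Matternet's or Aria's actual routing software; the station names, coordinates, and 2-D distance model below are invented for illustration. The idea it captures is from the article: stations sit no more than 10 km apart, a drone swaps its battery at each stop, and the network finds a multi-hop route for each package, here via a breadth-first search for the fewest hops.

```python
import math
from collections import deque

# Hypothetical base-station coordinates in km (invented for illustration).
stations = {
    "clinic":   (0.0, 0.0),
    "relay_a":  (8.0, 3.0),
    "relay_b":  (15.0, 9.0),
    "hospital": (22.0, 14.0),
}

MAX_HOP_KM = 10.0  # drones swap batteries at each station, so each leg stays short

def in_range(a, b):
    """True if a single drone hop between two stations is feasible."""
    return math.dist(stations[a], stations[b]) <= MAX_HOP_KM

def route(src, dst):
    """Breadth-first search: route with the fewest station-to-station hops."""
    frontier = deque([[src]])
    visited = {src}
    while frontier:
        path = frontier.popleft()
        if path[-1] == dst:
            return path
        for nxt in stations:
            if nxt not in visited and in_range(path[-1], nxt):
                visited.add(nxt)
                frontier.append(path + [nxt])
    return None  # destination unreachable with the current station layout

print(route("clinic", "hospital"))  # clinic -> relay_a -> relay_b -> hospital
```

A real system would add congestion, weather, battery state, and payload constraints to the edge weights, but the core is the same as internet routing: no single hop needs to know the whole journey, only the next reachable station.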
# # #

Game over, Tech wins

I ran across an interesting article in Eurozine (http://www.eurozine.com/articles/2012-11-16-vargasllosa-en.html) in which sociologist Gilles Lipovetsky debates Nobel laureate Mario Vargas Llosa on points from Vargas Llosa's new book "Civilization of the Spectacle". Ever since C. P. Snow, people have been debating his "two cultures": science versus the arts. The article argues that the debate has to be recast, because it has entered a phase in which technology is dominant. I'm not suggesting that you read the whole article; you've heard all the arguments before. The bottom line is: technology has replaced the fine arts as the force that elevates mankind. Something for technologists to ponder.
--ww--

Lipovetsky: "What was noble culture, high culture, for the Moderns? Culture represented the new absolute. As the Moderns began to develop scientific and democratic society, the German Romantics created a form of religion through art, whose mission was to contribute what neither religion nor science were providing, because science simply describes things. Art became something sacred. In the seventeenth and eighteenth centuries poets – and artists in general – were those who showed the way, who said what religion was saying earlier.

What we observe culture to be in the world of consumption, in the world of the spectacle – what you aptly call the "civilization of the spectacle" – is precisely the collapse of that Romantic model. Culture becomes a unit of consumption. We're no longer waiting for culture to change life, change the world, as Rimbaud thought. That was the task of the poets, such as Baudelaire, who rejected the world of the utilitarian. They believed that high culture was what could change man, change life. Today, nobody can possibly believe that high culture is going to change the world. In fact, on that score it's the society of entertainment, of the spectacle, that's won. What we expect from culture is entertainment, a slightly elevated form of amusement; but what changes life today is basically capitalism, technology. And culture turns out to be the crowning glory of all this."

Paradigm shifts: how scientists really work

The Structure of Scientific Revolutions at Fifty

Fifty years ago, Thomas Kuhn, then a professor at the University of California, Berkeley, released a thin volume entitled The Structure of Scientific Revolutions. Kuhn challenged the traditional view of science as an accumulation of objective facts toward an ever more truthful understanding of nature. Instead, he argued, what scientists discover depends to a large extent on the sorts of questions they ask, which in turn depend in part on scientists’ philosophical commitments. Sometimes, the dominant scientific way of looking at the world becomes obviously riddled with problems; this can provoke radical and irreversible scientific revolutions that Kuhn dubbed “paradigm shifts” — introducing a term that has been much used and abused. Paradigm shifts interrupt the linear progression of knowledge by changing how scientists view the world, the questions they ask of it, and the tools they use to understand it. Since scientists’ worldview after a paradigm shift is so radically different from the one that came before, the two cannot be compared according to a mutual conception of reality. Kuhn concluded that the path of science through these revolutions is not necessarily toward truth but merely away from previous error.
Kuhn’s thesis has been hotly debated among historians and philosophers of science since it first appeared. The book and its disparate interpretations have given rise to ongoing disagreements over the nature of science, the possibility of progress, and the availability of truth. For some, Kuhn was a relativist, a prophet of postmodernism who considered truth a social construct built on the outlook of a community at a specific point in history. For others, Kuhn was an authoritarian whose work legitimized science as an elitist power structure. Still others considered him neither a relativist nor an authoritarian, but simply misunderstood. Kuhn’s work was ultimately an examination of the borders between the scientific and the metaphysical, and between the scientific community and society at large. As he discovered, these boundaries are not always clear. It behooves us to bear this in mind as we take the occasion of the fiftieth anniversary to revisit his book and the controversies surrounding it.
Thomas Samuel Kuhn was born in Cincinnati in 1922. He attended Harvard — where his father, a hydraulic engineer, had also studied — and earned a bachelor’s degree in physics in 1943. After graduating, he became a junior researcher on radar, first at Harvard and then in Europe at the U.S. Office of Scientific Research and Development (OSRD). It was in these jobs that he became close with James B. Conant, who served as both president of Harvard and the head of OSRD. After the war, Kuhn returned to academic life at Harvard, receiving a Ph.D. in physics in 1949, and continuing on to teach the history of science. But the Harvard faculty denied him tenure in 1956, after which he left for Berkeley, where he was eventually made a full professor of the history of science in 1961. He never returned to physics professionally. By 1964, he had made his way to Princeton, and ended his career at M.I.T. as a professor of philosophy, where he retired in 1991. But it was at Berkeley, in 1962, that Kuhn published the work that was to mark his career, and the course of inquiry in the philosophy of science, from that point on: The Structure of Scientific Revolutions.
The earliest seeds that would grow into Kuhn’s famous book were planted when he was a doctoral student in 1947. Conant tasked Kuhn with giving a series of lectures on seventeenth-century theories of mechanics. It was during the preparation of these lectures that Kuhn first began to develop his ideas. He sought to grasp exactly why Newton had discovered the laws of motion, and why it had taken mankind so long to do that, considering that Aristotle’s theories about motion had been so manifestly wrong. Moreover, Kuhn was confused about why Aristotle had been so wrong, when he had gotten much of biology and social science so right.
One summer day, it occurred to Kuhn rather suddenly that Aristotle had been operating from within a completely different framework of physics than the modern understanding. For Aristotle, the growing of a child into an adult was a similar process to that of a rock falling to the ground: each is moving toward its natural end, the place and state where it belongs. Contrary to Newtonian physics, Kuhn later explained in the preface to his 1977 collection The Essential Tension, “position itself was ... a quality in Aristotle’s physics, and a body that changed its position therefore remained the same body only in the problematic sense that the child is the individual it becomes. In a universe where qualities were primary, motion was necessarily a change-of-state rather than a state.” This idea germinated in Kuhn’s mind as he continued his doctoral work, and later formed part of the basis for The Structure of Scientific Revolutions.
The argument of Structure is not especially complicated. Kuhn held that the historical process of science is divided into three stages: a “normal” stage, followed by “crisis” and then “revolutionary” stages. The normal stage is characterized by a strong agreement among scientists on what is and is not scientific practice. In this stage, scientists largely agree on which questions need answering. Indeed, only problems that are recognized as potentially having solutions are considered scientific. So it is in the normal stage that we see science progress not toward better questions but better answers. The beginning of this period is usually marked by a solution that serves as an example, a paradigm, for further research. (This is just one of many ways in which Kuhn uses the word “paradigm” in Structure.)
A crisis occurs when an existing theory involves so many unsolved puzzles, or “anomalies,” that its explanatory ability becomes questionable. Scientists begin to consider entirely new ways of examining the data, and there is a lack of consensus on which questions are important scientifically. Problems that had previously been left to other, non-scientific fields may now come into view as potentially scientific.
Eventually, a new exemplary solution emerges. This new solution will be “incommensurable” — another key term in Kuhn’s thesis — with the former paradigm, meaning not only that the two paradigms are mutually conflicting, but that they are asking different questions, and to some extent speaking different scientific languages. Such a revolution inaugurates a new period of normal science. Thus normal science can be understood as a period of “puzzle-solving” or “mopping-up” after the discovery or elucidation of a paradigm-shifting theory. The theory is applied in different contexts, using different variables, to fully flesh out its implications. But since every paradigm has its flaws, progress in normal science is always toward the point of another crisis.
Kuhn relies heavily on a “particularly famous case of paradigm change”: the sixteenth- and seventeenth-century debate over whether the sun goes around the earth or the earth around the sun. (This had been the subject of Kuhn’s previous book, The Copernican Revolution [1957].) Before Copernicus, Ptolemy conceived of a universe with the earth at its center. The celestial spheres wrapped around the earth like the layers of an onion, although how exactly they rested on each other so smoothly — the theory was that their natural motion in the ether was rotation — remained unknown. Ptolemy and his followers saw that the stars, the planets, the moon, and the sun all appeared to revolve in one direction around the earth in a regular order, and the exceptions — like the occasions when some planets seemed to move backwards in the sky — could be explained away. For over a thousand years, this was the dominant European conception of the universe. The model worked well for most of the questions that were asked of it; it could be used to predict future celestial movements, and as a practical matter, there was little reason to doubt it. In this “normal” stage of science, the mopping-up process was one of refining the data for more accurate predictions in the future.
But there will always be facts and circumstances any given theory cannot explain. “By the early sixteenth century,” Kuhn writes in Structure, “an increasing number of Europe’s best astronomers were recognizing that the astronomical paradigm was failing in application to its own traditional problems” — not to mention outside pressures related to calendar reform and growing medieval criticism of Aristotle. As the unexplainables began to mount, the Ptolemaic paradigm moved into a state of crisis. The Copernican Revolution was the result — a new theoretical framework that could incorporate the contradictory data into a coherent structure by putting the sun at the center of the cosmos. In Kuhn’s view, Copernicus and Galileo were on the tail end of the mopping-up era of Ptolemaic astronomy; Copernicus was not intentionally overthrowing the existing model, but the way he interpreted the data was simply inconsistent with an earth-centered universe. In spite of subsequent efforts by others, such as Tycho Brahe, to synthesize the two theories, they were incompatible.
If a paradigm is “destined to win its fight, the number and strength of the persuasive arguments in its favor will increase.” After a new theory is established, it attracts new supporters, often including younger scientists and perhaps the originating theorist’s students. Meanwhile, Kuhn writes, “those unwilling or unable to accommodate their work” to the new theory “have often simply stayed in the departments of philosophy from which so many of the special sciences have been spawned.” Older scientists have trouble adjusting to the new paradigm, in part because it puts their own work in doubt. Eventually, they are ignored. Kuhn quotes Max Planck, who famously wrote that “a new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”
Over time, there again comes to be almost unanimous agreement on the validity of the predominant theory — it achieves paradigmatic status. Scientists tacitly assume agreement on the meanings of technical terms, and develop a shared and specialized technical vocabulary to facilitate data accumulation and organization. They establish journals dedicated to their scientific field, begin to cross-reference one another, and scrutinize each other’s work according to whether or not it conforms to the theory. Their students, likewise, learn to approach problems in the same way they do, much as an apprentice learns from a master. Normal science has resumed and the cycle begins anew.
It was important for Kuhn that his conception of the history and process of science was not the same as that of scientific progress. He maintained that the process of science was similar to biological evolution — not necessarily evolution toward anything, only away from previous error. In this way, Kuhn was rather skeptical about the idea of progress at all. This was the most controversial aspect of his thesis, the one that most concerned the contemporary critics of Structure, on the basis of which they denounced — or celebrated — Kuhn as a champion of relativism. As University of Toronto philosophy professor Ian Hacking notes in an introductory essay prepended to the new fiftieth-anniversary edition of Structure, Kuhn’s notion that science moves away from previous error
seems to call in question the overarching notion of science as aiming at the truth about the universe. The thought that there is one and only one complete true account of everything is deep in the Western tradition.... In popular versions of Jewish, Christian, and Muslim cosmology, there is one true and complete account of everything, namely what God knows. (He knows about the death of the least sparrow.)
This image gets transposed to fundamental physics, many of whose practitioners, who might proudly proclaim themselves to be atheists, take for granted that there just is, waiting to be discovered, one full and complete account of nature. If you think that makes sense, then it offers itself as an ideal towards which the sciences are progressing. Hence Kuhn’s progress away from will seem totally misguided.
For Kuhn, a paradigm shift is fundamentally not a scientific but a philosophical change, because the incommensurability of paradigms means that there is no external stance from which one can be shown to be superior to another. Kuhn explains, “The men who called Copernicus mad because he proclaimed that the earth moved ... were not either just wrong or quite wrong. Part of what they meant by ‘earth’ was fixed position. Their earth, at least, could not be moved.” To say that the heliocentric model is true and that the geocentric model is false is to ignore the fact that the two models mean quite different things by the term “earth.”
But science has long been understood as a progressive accumulation of knowledge, not a mere shift from one worldview to another, like the gestalt shift between perceiving a duck or a rabbit in the famous diagram that Kuhn liked to use for illustration. And so Structure was received by many as a denial of the existence of absolute truth. If competing paradigms are both comprehensible, yet are incommensurable, can they not both be true? And if they are both true, who is to be the final arbiter of truth?
Many took Kuhn’s thesis to be a reduction of science to power struggles between competing views. Kuhn himself rejected this interpretation — although his attempts to do so sometimes ended up lending support in form to what they rejected in words: The physicist Freeman Dyson recounts in his 2006 book The Scientist as Rebel that he once attended a conference at which Kuhn’s disciples were repeating these exaggerated interpretations of his thesis, and “Kuhn interrupted them by shouting from the back of the hall with overwhelming volume, ‘One thing you people need to understand: I am not a Kuhnian.’”
Structure had taken on a life of its own. As Kuhn stated in a 1991 interview with science journalist John Horgan, “For Christ’s sake, if I had my choice of having written the book or not having written it, I would choose to have written it. But there have certainly been aspects involving considerable upset about the response to it.” As Hacking notes, a number of critics argued that the first edition was terribly vague. One reviewer in 1966 criticized Kuhn for using the word “paradigm” in twenty-one different senses in the book. Hacking also notes the strikingly ambivalent language that Kuhn often employs, using phrases like “we may want to say” and “[this] may make us wish to say” instead of offering assertions outright, leaving him open to criticism that he was unclear or hedging his argument.
Kuhn was also criticized for building a wall between basic science (that is, science conducted for its own sake) and applied science (that is, science aimed at achieving specific, often socially important, goals). Against Bacon’s dictum that the proper aim of science is “the relief of man’s estate,” Kuhn argued that scientists in the “normal” stage must ignore “socially important problems” and should instead just focus on solving puzzles within the paradigm. In other words, problems that must be solved to improve human life but cannot be solved by the methods of a given paradigm are a distraction from the work necessary during the “normal” phase of science. This suggests that scientists must cloister themselves, at least to an extent, in order to make progress within the confines of their paradigm. Moreover, as Steve Fuller, professor of sociology at the University of Warwick, notes in Thomas Kuhn: A Philosophical History for Our Times (2000), Kuhn felt that a paradigm should be “sheltered from relentless criticism in its early stages.” So not only can a paradigm “insulate the community” of scientists from the demands of society, in Kuhn’s words, but scientists must in turn insulate the paradigm from harsh criticism.
Kuhn was left having to do some “mopping up” of his own, which he attempted in the years after Structure was published. For example, in a 1973 lecture (collected in The Essential Tension), Kuhn sought to counter the charge that he was a relativist. He argued that some theories and paradigms are better than others, based on five rational criteria: accuracy, consistency, scope, simplicity, and fruitfulness. Much later, in the 1991 interview with Horgan, Kuhn insisted
that he did not mean to be condescending by using terms such as “mopping up” or “puzzle-solving” to describe what most scientists do. “It was meant to be descriptive.” He ruminated a bit. “Maybe I should have said more about the glories that result from puzzle solving, but I thought I was doing that.”
Continuity in a paradigm is not necessarily a bad thing, Kuhn explained in his later years; indeed, it enables scientists to organize the greater and greater amounts of knowledge that grow through the cumulative process of scientific inquiry.
Criticisms aside, whether Kuhn even deserves full credit for the ideas put forth in his seminal work has rightly been questioned. As early as the mid-1940s, the Hungarian-British scientist-philosopher Michael Polanyi had published very similar ideas about the significance of scientists’ personal commitments to a framework of beliefs and the role of learning by example in scientific training. As Kuhn later admitted, he became familiar with those works during his studies under Conant, and through a talk that Polanyi delivered and Kuhn attended in 1958. Polanyi’s most extensive work on the subject, Personal Knowledge, was published the same year. In the early 1960s, Kuhn explicitly described his own thought as closely aligned with that of Polanyi, but he did not mention his name in Structure, except for a brief footnote in the first edition and an additional mention in the 1970 second edition. When Polanyi struggled to receive recognition for his thoughts independently of Kuhn’s, Kuhn admitted in private correspondence that he might owe “a major debt” to the older scholar. But shortly before Kuhn’s death (and long after Polanyi’s), he revised those concessions and claimed that Polanyi had not in fact had a great influence on him, and that he had delayed reading Personal Knowledge until after finishing Structure out of a fear that he “would have to go back to first principles and start over again, and I wasn’t going to do that.”
Despite the fact that Polanyi’s work preceded Kuhn’s and was more philosophically rigorous, it was Kuhn whose book became a bestseller and whose terminology entered contemporary parlance. Steve Fuller notes that “many Kuhn-like ideas were ‘in the air’ both before and during the time Structure was written,” often from better-known philosophers. Perhaps Kuhn simply hit not only on the right ideas, but more importantly on the right distillation of them, and the right terminology, at the right time.
The reader of Kuhn’s work is struck by his extensive focus on the physical sciences, and the dearth of attention to biology and the social sciences. To some extent, this is hardly surprising, given Kuhn’s background as a theoretical physicist. But it is also true that the public prominence of the physical sciences in the first half of the twentieth century and the early periods of the Cold War provided a unique window into the community of scientists and the patterns by which scientific theory develops.
What Kuhn noticed was that competing paradigms in physics never coexist for very long, and that progress in normal science occurs precisely when scientists work within only one paradigm. But the social sciences are a special kind of science, because they cannot set aside fundamental philosophical concerns as easily as the physical sciences. Moreover, the social sciences are defined by multiple paradigms that are sometimes mutually contradictory. Kuhn pointed out that some social sciences may never be able to enter the paradigmatic stage of normal science for that reason. Unlike physical scientists, social scientists generally cannot in the face of a disagreement revert to an agreed-upon exemplary solution to a problem; their controversies are precisely about what the exemplar ought to be. The social sciences are grounded on competing views of what the world is and should be: certain basic concepts, such as “the state,” “institutions,” or “identity,” cannot be defined by consensus. Competing paradigms — such as those of Marxist, Keynesian, and Hayekian economists — will continue to coexist. So there necessarily will be limits to what the social sciences can achieve, since the lack of unanimity inevitably means that arguments turn on questions of theory, rather than on the application of theory. In addition, since it is more difficult in the social sciences to carry out true experiments and test counterfactuals, the social sciences are inhibited from closely following the model of the physical sciences. And the passage of time is a relevant factor. As social scientist Wolfgang Streeck explains, “What has historically happened cannot be undone — which also means that there can never be an exact return to a past condition, as the memory of what happened in between will always be present. A military dictatorship that has returned after having overthrown a democracy is not the same as a military dictatorship following, say, a foreign occupation.”
Despite these criticisms, many social scientists embraced — or perhaps appropriated — Kuhn’s thesis. It enabled them to elevate the status of their work. The social sciences could never hope to meet the high standards of empirical experimentation and verifiability that the influential school of thought called positivism demanded of the sciences. But Kuhn proposed a different standard, by which science is actually defined by a shared commitment among scientists to a paradigm wherein they refine and apply their theories. Although Kuhn himself denied the social sciences the status of paradigmatic science because of their lack of consensus on a dominant paradigm, social scientists argued that his thesis could still apply to each of those competing paradigms individually. This allowed social scientists to claim that their work was scientific in much the way Kuhn described physics to be.
Disagreements over what counts as science, and how society can hold scientists in any field accountable to a standard of truth, became most heated in the aftermath of a debate between Kuhn and the philosopher Karl Popper. The now-famous debate between Kuhn and the older and far more seasoned Popper took place in London on July 13, 1965. Although no particularly significant exchange between the two took place either before or after this encounter, their disagreement is commonly featured in textbooks and college courses as a major event in the development of the philosophy of science in the twentieth century. The popular view of the conflict, advanced primarily by supporters of Kuhn — the supposed winner of the debate — is that Kuhn was a revolutionary in his field who championed free inquiry, in opposition to the strict empirical and logical standards of the positivists. Popper, on the other hand, is often taken to be a quasi-positivist defender of the authority of science. But, as Steve Fuller argues in his 2003 book Kuhn vs. Popper: The Struggle for the Soul of Science, this popular conception is not only a caricature but an inversion of the truth about these two thinkers.
Popper held science to a higher standard than did Kuhn. Popper’s famous proposition was that a seemingly scientific claim, in order to be actually scientific, must be falsifiable, meaning that it is possible to devise an experiment under which the claim could be disproved. A classic example of a falsifiable science is Einsteinian physics, which made specific, well-defined predictions that could be tested through observation — as opposed to, say, Freudian psychology, which did not make well-defined predictions and proved adept at reformulating its explanations to fit observations, changing the details so as to salvage the theory.
By defining science in terms of rational criteria of empirical observation, Popper seemed to place scientific tools equally in the hands of philosophers of science, skeptics, and common persons who needed some means to question scientists who tried to back their claims by appealing to their own scientific authority. For Popper, novel scientific theories should be greeted with skepticism from the outset. But for Kuhn, one of the key characteristics of the healthy functioning of the community of scientists is its practice of singling out a successful theory from its competitors — without concern for its social implications, and in isolation from public scrutiny.
In a sense, Popper and Kuhn each saw himself as a defender of free inquiry — but their notions of free inquiry were fundamentally opposed. Kuhn’s thesis reserved free inquiry specifically for scientists, by considering legitimate whatever paradigm scientists happened to agree upon at a given time. But Popper, given his longstanding concern for the open society, thought that this idea marginalized the role of skepticism, only regarding it as important at the point of crisis, and that it thus undermined free inquiry as a methodological commitment to truth.
Popper particularly targeted the tendency among some influential social scientists to advance their political and social theories without revealing their philosophical underpinnings. Some of the great catastrophes of the twentieth century resulted from the widespread acceptance of theories that reduced society to a machine that could be steered by competent authorities. Popper’s falsification principle was meant in part to moderate the authority of social science, which — to the extent that it attempted to predict and regulate society — could lead to a passive public and technocratic governance at best, or modern serfdom and totalitarianism at worst. Kuhn himself was hardly a great booster of the social sciences. But the application of Kuhn’s ideas to social science seemed to imply that a theory, however false, should be allowed to dominate the opinion of scientists and the public until it buckles under the weight of its own flaws.
For their part, Kuhn and his followers argued that Popperian falsifiability was an impossible and historically unrealistic standard for science, and noted that any paradigm has at least a few anomalies. In fact, these anomalies are critical for determining which puzzles normal science seeks to solve. Popper’s standard, on the other hand, would seem to require scientists to be forever preoccupied with metaphysical, pre-paradigmatic arguments. But in a sense, this was the point: Popper’s insistence on falsification was precisely meant to sustain the need of the social sciences to focus on questions of first principle, so as to avoid the rise of any new dangerous philosophies falsely carrying the banner of science.
While the physical sciences were the most prominent in the public mind when Kuhn was writing Structure in the early 1960s, today biology is in ascendance. It is striking, as Hacking notes in his introductory essay, that Kuhn does not explore whether Darwin’s revolution fits within his thesis. It is far from clear that Kuhn’s thesis can adequately account for not only Darwin’s revolution but also cell theory, Mendelian or molecular genetics, or many of the other major developments in the history of biology.
The differences between physics and biology — their varying methods and metaphors — matter immensely for the way we understand ourselves and our world. Beginning in the mid-nineteenth century, the assumptions of modern science began to play a much more prominent role in political philosophy. A scientific way of thinking permeated the writings of Auguste Comte and Karl Marx, and by the end of the century, with the work of Max Weber and Émile Durkheim, the era of social science had begun in earnest. Many of the early social scientists came to view society in terms of contemporary physics; they adopted the Enlightenment belief in science as the source of progress, and considered physics the archetypical science. They understood society as a mechanism that could be engineered and adjusted. These early social scientists began to deem philosophical questions irrelevant or even inappropriate to their work, which instead became about how the mechanism of society operated and how it could be fixed. The preeminence of physics and mechanistic thinking was passed down through generations of social scientists, with qualitative characterization considered far less valuable and less “scientific” than quantitative investigations. Major social scientific theories, from behaviorism to functionalism to constructivism and beyond, tacitly think of man and society as machines and systems.
Given the dominance of physics and mechanism in social scientific thinking, the fact that Kuhn based his thesis almost exclusively on physics gave social scientists reason to consider their philosophical commitments legitimate. They saw Structure as a confirmation of their entire approach.
But in the half century since Kuhn wrote his book, biology has taken the place of physics as the dominant science — and so in the social sciences, the conception of society as a machine has gone out of vogue. Social scientists have increasingly turned to biology and ecology for possible analogies on which to build their social theories; organisms are supplanting machines as the guiding metaphor for social life. In 1991, the Journal of Evolutionary Economics was launched with an eye toward advancing a Darwinian understanding of economics, complete with genotypes and phenotypes. The justification for this kind of model is straightforward: one of the biggest difficulties for economists is the dynamism of any given economy. As Joseph Schumpeter rightly pointed out, economies change; they evolve, rather than staying fixed like a Newtonian machine with merely moving parts. Since machines do not change, whereas societies do, it is reasonable to move the study of economics away from the metaphor of systems and toward that of organisms.
A recent paper in the journal Theory in Biosciences perfectly encapsulates the desire for a more biological perspective in the social sciences, arguing for “Taking Evolution Seriously in Political Science.” The paper outlines the deterministic dangers in the view of social systems as Newtonian machines, as well as the problems posed by the reductionist belief that elements of social systems can be catalogued and analyzed. By contrast, the paper argues that approaching social sciences from an evolutionary perspective is more appropriate philosophically, as well as more effective for scientific explanation. This approach allows us to examine the dynamic nature of social changes and to explain more consistently which phenomena last, which disappear, and which are modified, while still confronting persistent questions, such as why particular institutions change.
This shift from a mechanistic to an evolutionary model seems like a step in the right direction. The new model aims less at predicting the future and derives its strength instead from its apparent ability to explain a wide array of phenomena. It may be better equipped than its predecessor to account for the frequent changes in the stability of modern economies. Furthermore, a biological model can correctly recognize humans as purposeful and creative beings, whereas mechanistic models reduce people to objects that merely react to outside stimuli.
Nevertheless, a biological approach to the social sciences is reductionistic in its own way, and limited in what it can explain. Biological sciences, much like physical sciences, have been stripped of philosophical concerns, of questions regarding the soul or the meaning of life, which have been pushed off to the separate disciplines of philosophy and theology. Much of modern biology seeks to emulate physics by reducing the human organism to a complex machine: thinking becomes merely chemical potentials and electric bursts, interest and motivation become mere drives to perpetuate the genome, and love becomes little more than an illusion. Such accounts can become problematic if we consider them the only ways to understand human nature — and not least because our answers to these non-scientific questions are at the foundation of how we view the world, and so also of how we interpret scientific findings.
Every model that social scientists use, whether it is derived from physics, biology, or ecology, embodies certain philosophical assumptions about human nature and about the optimal functioning of a society. Viewing social relations as movements of a clock implies a set of beliefs quite unlike those of perceiving the same relations as functions of a cell. Since the work of social scientists is so closely tied to these philosophical concerns on which we tend to disagree, we usually see a number of models compete for acceptance at the same time. And because these metaphysical assumptions are usually unspoken, they set the stage for the competition between models to take the place of what was once an explicit competition between differing philosophical accounts of the world — only now while largely denying that any philosophical debate is taking place.
Perhaps the greatest limitation in the social sciences is that, however good a theory’s explanatory abilities, it can say very little about whether or not a particular action ought to be performed in order to bring about social change. Since human relations are the object of the social sciences, questions of ethics — about whether or not a change should be induced, who should be responsible for it, and how it should occur — must always be at the forefront. It may be desirable, for instance, to reduce alcoholism; but it does not follow that all actors, such as churches, governments, businesses, public and private mental-health experts, and the pressure of social norms, are equally responsible for undertaking the task, or can equally do so without altering society in other ways. Decisions of this sort inevitably depend on our views of the proper function of institutions and on what constitutes the well-being of society.
Regardless of whether we view society as akin to a physical machine, or a biosphere, or an organism, it remains crucial that we recognize the limitations of each model. But what we learn from Kuhn is that any science that separates itself from its philosophical bases renders itself incapable of addressing such questions even within its own limited scope.
The political philosopher Eric Voegelin, in his 1952 book The New Science of Politics, provides a helpful treatment on this point in his assessment of the fifteenth-century English judge Sir John Fortescue. Long before the current trend toward the biological sciences, Fortescue used a biological metaphor, arguing, as Voegelin writes, “that a realm must have a ruler like a body a head,” and that a political community grows into an articulate, defined body as though out of an embryo. Rulers were necessary because otherwise the community would be, in Voegelin’s words, “acephalus, headless, the trunk of a body without a head.” Yet Fortescue recognized that the analogy between an organic body and a political realm was limited: by itself, it would have provided an incomplete view of both the individual and society. He therefore introduced into his political theory the Christian notion of a corpus mysticum: society is held together not only by a head but also by an inner spiritual bond, a heart that nourishes the head as well as the rest of the body. As Voegelin puts it, however, this heart “does not serve as the identification of some member of a society with a corresponding organ of the body, but, on the contrary, it strives to show that the animating center of a social body is not to be found in any of its human members ... but is the intangible living center of the realm as a whole.”
By extending the analogy in this way, Fortescue went beyond what we now recognize as the limits of biology, and even of political science as such, in the attempt to capture a fuller sense of human nature and of a political body. Neither biology nor political science by itself would have been capable of producing any such holistic image of society. Most significantly, Fortescue understood that his borrowing from biology was merely metaphorical — and so avoided the mistake that plagues the social sciences today, of treating what is really political theory as straightforward scientific truth.
Value judgments are always at the core of the social sciences. “In the end,” wrote Irving Kristol, “the only authentic criterion for judging any economic or political system, or any set of social institutions, is this: what kind of people emerge from them?” And precisely because we differ on what kind of people should emerge from our institutions, our scientific judgments about them are inevitably tied to our value commitments.
But this is not to say that those values, or the scientific work that rests on them, cannot be publicly debated according to recognized standards. Thomas Kuhn’s thesis has often been taken to mean that choices between competing theories or paradigms are arbitrary — merely a matter of subjective taste. As noted earlier, Kuhn challenged the claim that he was a relativist in a 1973 lecture, offering a list of five standards by which we may defend the superiority of one theory over another: accuracy, consistency, scope, simplicity, and fruitfulness. What these criteria precisely mean, how they apply to a given theory, and how they rank in priority are themselves questions subject to dispute by scientists committed to opposing theories. But it is the existence of recognized standards, even if the standards are open to debate, that allows any judgment to be available for public discussion. And we may add that if social scientists recognize the same standards, then debates over their meaning, application, and priority are harder to settle than in physics because the social sciences are intertwined with philosophical questions that are themselves concerned with what our standards of rationality ought to be.
The lasting value of Kuhn’s thesis in The Structure of Scientific Revolutions is that it reminds us that any science, however apparently purified of the taint of philosophical speculation, is nevertheless embedded in a philosophical framework — and that the great success of physics and biology is due not to their actual independence from philosophy but rather to physicists’ and biologists’ dismissal of it. Those who are inclined to take this dismissal as meaning that philosophy is dead altogether, or has been replaced by science, will do well to recognize the force by which Kuhn’s thesis opposes this stance: History has repeatedly demonstrated that periods of progress in normal science — when philosophy seems to be moot — may be long and steady, but they lead to a time when non-scientific, philosophical questions again become paramount.
One persisting trouble with Kuhn’s classic work is that its narrow focus left too many questions unanswered — including the question not just of what science is but of what science should be. Here many other philosophers of science, including Popper, offered not just descriptions of science but powerful prescriptions for it. Kuhn’s work is largely silent on the value of science and the wellbeing of society, and entirely silent on the wrongheadedness of blindly accepting scientific authority and discarding the philosophical questions that must always come first, even when we pretend otherwise.
Although Kuhn, who died in 1996, was sometimes stung by the criticism he received, he understood the importance of all the poking and prodding. In his 1973 lecture, he argued that “scientists may always be asked to explain their choices, to exhibit the bases of their judgments. Such judgments are eminently discussable, and the man who refuses to discuss his own cannot expect to be taken seriously.” Even the great Einstein, who failed to give a full defense for his skepticism of the fundamental randomness posited by quantum theory, became somewhat marginalized later in his career. Kuhn deserves the respect of the rigorous criticism that has come his way. It is fitting that his provocative thesis has faced blistering scrutiny — and remarkable that it has survived to instruct and vex us five decades later.


Matthew C. Rees is a graduate student in International Relations and European Studies at Masaryk University in Brno, Czech Republic.

What we really should worry about

It is time for the annual hadj to http://edge.org/responses/q2013 for the answers to their question of 2013: What should we be worried about? A lot of them are fascinating, if somewhat depressing. Here is one, to tempt you.
--ww--

The universe is relentlessly, catastrophically dangerous, on scales that menace not just communities, but civilizations and our species as well. A freakish chain of improbable accidents produced the bubble of conditions that was necessary for the rise of life, our species, and technological civilization. If we continue to drift obliviously inside this bubble, taking its continuation for granted, then inevitably—sooner or later—physical or human-triggered events will push us outside, and we will be snuffed like a candle in a hurricane.
We are menaced by gamma ray bursts (that scrub major regions of their galaxies free of life); nearby supernovae; asteroids and cometary impacts (which strike Jupiter every year or two); Yellowstone-like supereruptions (the Toba supereruption was a near-extinction event for humans); civilization-collapsing coronal mass ejections (which would take down the electrical grids and electronics underlying technological civilization in a way that they couldn't recover from, since their repair requires electricity supplied by the grid; this is just one example of the more general danger posed by the complex, fragile interdependence inherent in our current technology); and many other phenomena including those unknown to us. Here is one that no one talks about: The average G-type star shows a variability in energy output of around 4%. Our sun is a typical G-type star, yet its observed variability in our brief historical sample is only 1/40th of this. When or if the Sun returns to more typical variation in energy output, this will dwarf any other climate concerns.
The emergence of science as a not wholly superstitious and corrupt enterprise is slowly awakening our species to these external dangers. As the brilliant t-shirt says, an asteroid is nature's way of asking how your space program is doing. If we are lucky we might have time to build a robust, hardened planetary and extraplanetary hypercivilization able to surmount these challenges. Such a hypercivilization would have to be immeasurably richer and more scientifically advanced to prevent, say, the next Yellowstone supereruption or buffer a 2% drop in the Sun's energy output. (Indeed, ice ages are the real climate-based ecological disasters and civilization-enders—think Europe and North America under a mile of ice). Whether we know it or not, we are in a race to forge such a hypercivilization before these blows fall. If these threats seem too distant, low probability, or fantastical to belong to the "real" world, then let them serve as stand-ins for the much larger number of more immediately dire problems whose solutions also depend on rapid progress in science and technology.
This raises a second category of menaces—hidden, deadly, ever-adapting, already here—that worry me even more: the evolved monsters from the id that we all harbor (e.g., group identity, the appetite for prestige and power, etc.), together with their disguised offspring, the self-organizing collective delusions that we all participate in, and mistake for reality. (As the cognoscenti know, the technical term monsters from the id originated in Forbidden Planet.) We need to map and master these monsters and the dynamics through which they generate collective delusions if our societies are to avoid near-term, internally generated failure.
For example, cooperative scientific problem-solving is the most beautifully effective system for the production of reliable knowledge that the world has ever seen. But the monsters that haunt our collective intellectual enterprises typically turn us instead into idiots. Consider the cascade of collective cognitive pathologies produced in our intellectual coalitions by ingroup tribalism, self-interest, prestige-seeking, and moral one-upsmanship: It seems intuitive to expect that being smarter would lead people to have more accurate models of reality. On this view, intellectual elites therefore ought to have better beliefs, and should guide their societies with superior knowledge. Indeed, the enterprise of science is—as an ideal—specifically devoted to improving the accuracy of beliefs. We can pinpoint where this analysis goes awry, however, when we consider the multiple functions of holding beliefs. We take for granted that the function of a belief is to be coordinated with reality, so that when actions are based on that belief, they succeed. The more often beliefs are tested against reality, the more often accurate beliefs displace inaccurate ones (e.g., through feedback from experiments, engineering tests, markets, natural selection). However, there is a second kind of function to holding a belief that affects whether people consciously or unconsciously come to embrace it—the social payoffs from being coordinated or discoordinated with others' beliefs (Socrates' execution for "failing to acknowledge the gods the city acknowledges"). The mind is designed to balance these two functions: coordinating with reality, and coordinating with others. The larger the payoffs to social coordination, and the less commonly beliefs are tested against reality, then the more social demands will determine belief—that is, network fixation of belief will predominate. 
Physics and chip design will have a high degree of coordination with reality, while the social sciences and climatology will have less.
Because intellectuals are densely networked in self-selecting groups whose members' prestige is linked (for example, in disciplines, departments, theoretical schools, universities, foundations, media, political/moral movements, and other guilds), we incubate endless, self-serving elite superstitions, with baleful effects: Biofuel initiatives starve millions of the planet's poorest. Economies around the world still apply epically costly Keynesian remedies despite the decisive falsification of Keynesian theory by the post-war boom (government spending was cut by 2/3, 10 million veterans dumped into the labor force, while Samuelson predicted "the greatest period of unemployment and industrial dislocation which any economy has ever faced"). I personally have been astonished over the last four decades by the fierce resistance of the social sciences to abandoning the blank slate model in the face of overwhelming evidence that it is false. As Feynman pithily put it, "Science is the belief in the ignorance of experts."
Sciences can move at the speed of inference when individuals only need to consider logic and evidence. Yet sciences move glacially (Planck's "funeral by funeral") when the typical scientist, dependent for employment on a dense ingroup network, has to get the majority of her guild to acknowledge fundamental, embarrassing disciplinary errors. To get science systematically moving at the speed of inference—the key precondition to solving our other problems—we need to design our next generation scientific institutions to be more resistant to self-organizing collective delusions, by basing them on a fuller understanding of our evolved psychology.
# # #

Ten most disturbing scientific discoveries

Here is an article from the Smithsonian website. Exercise for the reader: jot down your short list of disturbing scientific discoveries first. Then read on.
--ww--


Science can be glorious; it can bring clarity to a chaotic world. But big scientific discoveries are by nature counterintuitive and sometimes shocking. Here are ten of the biggest threats to our peace of mind.
1. The Earth is not the center of the universe.
We’ve had more than 400 years to get used to the idea, but it’s still a little unsettling. Anyone can plainly see that the Sun and stars rise in the east, sweep across the sky and set in the west; the Earth feels stable and stationary. When Copernicus proposed that the Earth and other planets instead orbit the Sun, his contemporaries found his massive logical leap “patently absurd,” says Owen Gingerich of the Harvard-Smithsonian Center for Astrophysics. “It would take several generations to sink in. Very few scholars saw it as a real description of the universe.”
Galileo got more grief for the idea than Copernicus did. He used a telescope to provide evidence for the heliocentric theory, and some of his contemporaries were so disturbed by what the new invention revealed—craters on a supposedly perfectly spherical moon, other moons circling Jupiter—that they refused to look through the device. More dangerous than defying common sense, though, was Galileo’s defiance of the Catholic Church. Scripture said that the Sun revolved around the Earth, and the Holy Office of the Inquisition found Galileo guilty of heresy for saying otherwise.
2. The microbes are gaining on us.
Antibiotics and vaccines have saved millions of lives; without these wonders of modern medicine, many of us would have died in childhood of polio, mumps or smallpox. But some microbes are evolving faster than we can find ways to fight them.
The influenza virus mutates so quickly that last year’s vaccination is usually ineffective against this year’s bug. Hospitals are infested with antibiotic-resistant Staphylococcus bacteria that can turn a small cut into a limb- or life-threatening infection. And new diseases keep jumping from animals to humans—Ebola from apes, SARS from masked palm civets, hantavirus from rodents, bird flu from birds, swine flu from swine. Even tuberculosis, the disease that killed Frédéric Chopin and Henry David Thoreau, is making a comeback, in part because some strains of the bacterium have developed multi-drug resistance. Even in the 21st century, it’s quite possible to die of consumption.
3. There have been mass extinctions in the past, and we’re probably in one now.
Paleontologists have identified five points in Earth’s history when, for whatever reason (asteroid impact, volcanic eruptions and atmospheric changes are the main suspects), mass extinctions eliminated many or most species.
The concept of extinction took a while to sink in. Thomas Jefferson saw mastodon bones from Kentucky, for example, and concluded that the giant animals must still be living somewhere in the interior of the continent. He asked Lewis and Clark to keep an eye out for them.
Today, according to many biologists, we’re in the midst of a sixth great extinction. Mastodons may have been some of the earliest victims. As humans moved from continent to continent, large animals that had thrived for millions of years began to disappear—mastodons in North America, giant kangaroos in Australia, dwarf elephants in Europe. Whatever the cause of this early wave of extinctions, humans are driving modern extinctions by hunting, destroying habitat, introducing invasive species and inadvertently spreading diseases.
4. Things that taste good are bad for you.
In 1948, the Framingham Heart Study enrolled more than 5,000 residents of Framingham, Massachusetts, to participate in a long-term study of risk factors for heart disease. (Very long term—the study is now enrolling the grandchildren of the original volunteers.) It and subsequent ambitious and painstaking epidemiological studies have shown that one’s risk of heart disease, stroke, diabetes, certain kinds of cancer and other health problems increases in a dose-dependent manner upon exposure to delicious food. Steak, salty French fries, eggs Benedict, triple-fudge brownies with whipped cream—turns out they’re killers. Sure, some tasty things are healthy—blueberries, snow peas, nuts and maybe even (oh, please) red wine. But on balance, human taste preferences evolved during times of scarcity, when it made sense for our hunter-gatherer ancestors to gorge on as much salt and fat and sugar as possible. In the age of Hostess pies and sedentary lifestyles, those cravings aren’t so adaptive.
5. E=mc²
Einstein’s famous equation is certainly one of the most brilliant and beautiful scientific discoveries—but it’s also one of the most disturbing. The power explained by the equation really rests in the c², or the speed of light (186,282 miles per second) times itself, which equals 34,700,983,524. When that’s your multiplier, you don’t need much mass—a smidgen of plutonium is plenty—to create enough energy to destroy a city.
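The arithmetic behind that multiplier is easy to verify, and restating it in SI units shows the scale involved (a minimal sketch in Python; the one-gram figure is an illustrative round number, not anything from the article):

```python
# The essay's "multiplier": the speed of light in miles per second, squared.
c_miles_per_s = 186_282
print(c_miles_per_s ** 2)   # 34,700,983,524

# The same idea in SI units: the rest-mass energy of one gram of matter.
c = 299_792_458             # speed of light in m/s (exact, by definition)
m = 0.001                   # one gram, expressed in kilograms
E = m * c ** 2              # energy in joules
print(f"{E:.3e} J")         # ~8.988e+13 J
```

Roughly 9 × 10¹³ joules from a single gram of converted mass is on the order of the yield of an early atomic bomb, which is why "a smidgen of plutonium is plenty."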
6. Your mind is not your own.
Freud might have been wrong in the details, but one of his main ideas—that a lot of our behaviors and beliefs and emotions are driven by factors we are unaware of—turns out to be correct. If you’re in a happy, optimistic, ambitious mood, check the weather. Sunny days make people happier and more helpful. In a taste test, you’re likely to have a strong preference for the first sample you taste—even if all of the samples are identical. The more often you see a person or an object, the more you’ll like it. Mating decisions are based partly on smell. Our cognitive failings are legion: we take a few anecdotes and make incorrect generalizations, we misinterpret information to support our preconceptions, and we’re easily distracted or swayed by irrelevant details. And what we think of as memories are merely stories we tell ourselves anew each time we recall an event. That’s true even for flashbulb memories, the ones that feel as though they’ve been burned into the brain:
Like millions of people, [neuroscientist Karim] Nader has vivid and emotional memories of the September 11, 2001, attacks and their aftermath. But as an expert on memory, and, in particular, on the malleability of memory, he knows better than to fully trust his recollections… As clear and detailed as these memories feel, psychologists find they are surprisingly inaccurate.
7. We’re all apes.
It’s kind of deflating, isn’t it? Darwin’s theory of evolution by natural selection can be inspiring: perhaps you’re awed by the vastness of geologic time or marvel at the variety of Earth’s creatures. The ability to appreciate and understand nature is just the sort of thing that is supposed to make us special, but instead it allowed us to realize that we’re merely a recent variation on the primate body plan. We may have a greater capacity for abstract thought than chimps do, but we’re weaker than gorillas, less agile in the treetops than orangutans and more ill-tempered than bonobos.
Charles Darwin started life as a creationist and only gradually came to realize the significance of the variation he observed in his travels aboard the Beagle. For more than 150 years, since On the Origin of Species was published in 1859, people have been arguing over evolution. Our ape ancestry conflicts with every culture’s creation myth and isn’t particularly intuitive, but everything we’ve learned since then—in biology, geology, genetics, paleontology, even chemistry and physics—supports his great insight.
8. Cultures throughout history and around the world have engaged in ritual human sacrifice.
Say you’re about to die and are packing some supplies for the afterlife. What to take? A couple of coins for the ferryman? Some flowers, maybe, or mementos of your loved ones? If you were an ancient Egyptian pharaoh, you’d have your servants slaughtered and buried adjacent to your tomb. Concubines were sacrificed in China to be eternal companions; certain Indian sects required human sacrifices. The Aztecs slaughtered tens of thousands of people to inaugurate the Great Pyramid of Tenochtitlan; after sacred Mayan ballgames, the losing team was sometimes sacrificed.
It’s hard to tell fact from fiction when it comes to this particularly gruesome custom. Ritual sacrifice is described in the Bible, Greek mythology and the Norse sagas, and the Romans accused many of the people they conquered of engaging in ritual sacrifice, but the evidence was thin. A recent accumulation of archaeological findings from around the world shows that it was surprisingly common for people to ritually kill—and sometimes eat—other people.
9. We’ve already changed the climate for the rest of this century.
The mechanics of climate change aren’t that complex: we burn fossil fuels; a byproduct of that burning is carbon dioxide; it enters the atmosphere and traps heat, warming the surface of the planet. The consequences are already apparent: glaciers are melting faster than ever, flowers are blooming earlier (just ask Henry David Thoreau), and plants and animals are moving to more extreme latitudes and altitudes to keep cool.
Even more disturbing is the fact that carbon dioxide lingers in the atmosphere for hundreds of years. We have just begun to see the effects of human-induced climate change, and the predictions for what’s to come range from dire to catastrophic.
10. The universe is made of stuff we can barely begin to imagine.
Everything you probably think of when you think of the universe—planets, stars, galaxies, black holes, dust—makes up just 4 percent of whatever is out there. The rest comes in two flavors of “dark,” or unknown stuff: dark matter, at 23 percent of the universe, and dark energy, at a whopping 73 percent:
Scientists have some ideas about what dark matter might be—exotic and still hypothetical particles—but they have hardly a clue about dark energy. … University of Chicago cosmologist Michael S. Turner ranks dark energy as “the most profound mystery in all of science.”
The effort to solve it has mobilized a generation of astronomers in a rethinking of physics and cosmology to rival and perhaps surpass the revolution Galileo inaugurated on an autumn evening in Padua. … [Dark energy] has inspired us to ask, as if for the first time: What is this cosmos we call home?
But astronomers do know that the universe is expanding—and not only expanding but, thanks to dark energy, expanding faster and faster. Ultimately, everything in the universe will drift farther and farther apart until the universe is uniformly cold and desolate. The world will end in a whimper.
# # #


P.S.: One of the most interesting articles that I read in the past year points out that all of the phenomena that dark energy and dark matter were postulated to explain can alternatively be explained by postulating that time fluctuates. For a billion years, it goes by a little more quickly. For another billion years, it goes by a little more slowly. The amount of variation that is needed is small. By Occam's razor, that is a simpler theory than dark matter. On the other hand, we are just now beginning to learn fundamental facts about neutrinos. Check out this fascinating article: http://www.sciencenews.org/view/feature/id/347453/description/Heart_of_the_Matter. The article starts slowly, but by the time you get to "sterile neutrinos", you'll be hooked.
# # #