Long Bets is a project to encourage long-term thinking and outlooks. It's pretty interesting, more interesting to me than, say, Fantasy Football... maybe even more interesting than Fantasy Curling...
I heard about it today in a New York Times article that Cassie sent me. I'm posting it here in its entirety lest it be relegated to the unread, pay-per-view online NYT.
January 30, 2007
Findings
Can Humanity Survive? Want to Bet on It?
By JOHN TIERNEY
Sixty years ago, a group of physicists concerned about nuclear weapons created the Doomsday Clock and set its hands at seven minutes to midnight. Now, the clock’s keepers, alarmed by new dangers like climate change, have moved the hands up to 11:55 p.m.
My first reaction was a sigh of relief. After all, the 1947 doomsday prediction marked the start of a golden age. Never have so many humans lived so long — and maybe never so peacefully — as during the past 60 years. The per-capita rate of violence, particularly in the West, seems remarkably low by historical standards. If the clock’s keepers are worried once again, their track record suggests we’re in for even happier days.
But there’s one novel twist that gives me pause. When the Bulletin of the Atomic Scientists announced two weeks ago in Washington that it was adjusting the clock, it was joined in a trans-Atlantic press conference by scientists at the Royal Society in London. One of them was the society’s president, Martin Rees, a new breed of doomsayer.
Dr. Rees, a cosmologist at Cambridge and Britain’s astronomer royal, doesn’t just issue gloomy predictions. He doesn’t just move the hands of an imaginary and inscrutable clock. (Its keepers have never explained what one of their minutes equals on anyone else’s clock or calendar.)
No, Dr. Rees is braver. He gives odds on doomsday and offers to bet on disaster. In his 2003 book, “Our Final Hour,” he gives civilization no more than a 50 percent chance of surviving until 2100.
Dr. Rees is not a knee-jerk technophobe — he expects great advances as researchers around the world link their knowledge — but he fears that progress will be undone by what he calls the new global village idiots. He’s sure enough of himself to post an offer on Long Bets, a clever innovation on the Web that Stewart Brand helped start with money from Jeff Bezos, the founder of Amazon.com.
Long Bets is a nonprofit foundation that calls itself an “arena for competitive, accountable predictions.” It lets anyone make a prediction and take wagers on it, with the proceeds going to a charity named by the winner. The bets made so far range from $200 to $10,000, on topics ranging from the driving habits of Americans in 2010 to whether the universe will stop expanding. Mitchell Kapor, the software guru, is betting that in 2029 no computer will have passed the Turing test (by conversing so much like a human that you couldn’t tell the difference). The physicist Freeman Dyson’s money is on the first extraterrestrial life’s being found somewhere other than a planet or its satellite.
Five years ago, Dr. Rees posted this prediction: “By 2020, bioterror or bioerror will lead to one million casualties in a single event.” He reasoned that “by 2020 there will be thousands — even millions — of people with the capability to cause a catastrophic biological disaster. My concern is not only organized terrorist groups, but individual weirdos with the mindset of the people who now design computer viruses.”
He didn’t get any takers on LongBets.org, which seems to me a missed opportunity. So I’ve posted an offer there to bet him $200 — not a huge sum, but enough to put both our reputations on the line. I realize that betting on disaster may sound ghoulish, but neither of us will personally profit (if I win, the money goes to the International Red Cross). And I think bets like this serve a purpose.
Besides stimulating public debate, they focus the issue and discipline prophets. No matter how good their intentions, prophets face strong temptations to hype. In the current issue of The Bulletin of the Atomic Scientists, Dr. Rees wryly describes what happened in 2003 when he turned in a manuscript titled, “Our Final Century?”
“My British publisher removed the question mark from the book’s title,” he recalls, “and the U.S. publisher changed it to ‘Our Final Hour.’ Pessimism, it seems, makes for better marketing.”
It doesn’t make for better public policy, though. Heralds of the bioterror apocalypse have actually worsened the problem of bioterror, as Milton Leitenberg points out in a 2005 report for the Strategic Studies Institute of the United States Army War College.
Mr. Leitenberg is a scholar at the University of Maryland who has been studying biological weapons for decades — and debunking wild predictions. Dr. Rees is not alone. Senator Bill Frist called bioterrorism “the greatest existential threat we have in the world today” and urged a military effort that “even dwarfs the Manhattan Project.”
Such rhetoric, Mr. Leitenberg says, has had the perverse effect of encouraging terrorists to seek out biological weapons. But despite the much-publicized attempts of Al Qaeda and a Japanese group to go biological, terrorists haven’t had much luck, because it’s still quite hard for individuals or nongovernmental groups to obtain, manufacture or deploy biological weapons of mass destruction.
Mr. Leitenberg says the biggest threat is of a state deploying biological weapons, and he notes the encouraging decline in the number of countries working on this technology. Meanwhile, though, America has been so spooked by the horror-movie scenarios that it’s pouring money into defense against biological weapons. Mr. Leitenberg says that’s a mistake, both because it diverts resources from more serious threats — like natural diseases and epidemics — and because it could start a new biological arms race as other countries understandably fear that the United States is doing more than just playing defense.
It’s possible, as Dr. Rees fears, that terrorists will get a lot more sophisticated at biotech in the next decade, or that researchers will make some terrible mistake. The technology is getting cheaper and spreading rapidly. But so are the tools for preventing and coping with mistakes.
Whatever happens, I don’t expect biotechnology to pose an “existential threat.” The disaster predicted by Dr. Rees would be horrific, but humanity has survived worse, like the flu epidemic of 1918 that killed tens of millions of people. I know there are fears of new microorganisms or nanobots gobbling up our species, but I’m confident we’d somehow stop the Doomsday Clock from striking midnight.
In fact, the wager I’d really like to make with Dr. Rees is that we’ll make it to 2100. I’ve posted that prediction on Long Bets, and I’d be glad to give him better odds than the 50-50 chance he gives civilization of surviving the century.
I even think one of us might survive to see the payoff, although my techno-optimism has its limits. I hope some version of me will be around in 2100, but I wouldn’t bet on it.