SO FAR, THE SINGULARITY VOLUNTEER FIRE DEPT. HAS BEEN SOUNDING TEN ALARMS WHILE RUSHING AROUND TRYING TO FIND SMOKE

I don’t often experience writer’s block. Sleeping on a topic overnight is nearly always enough to return a free flow of ideas and images. But it was not working that way with this thing called The Singularity. For days, I tried without success to tie a literary bow around a supposition that had fast become a phenomenon that is now on the verge of becoming the first Great Technological Religion. In repeated stare-downs with my computer screen, I lost.

In a moment, I’ll share what finally dissolved the plaque in my creative arteries on this subject, but first I may need to introduce you to the high drama and low wattage of the whole Singularity debate.

The word first appeared in a 1993 essay written by a California math professor, Vernor Steffen Vinge. The full title was “The Coming Technological Singularity.” Professor Vinge was not the first to raise the issue. But he was the first to supply a name worthy of building a whole “end of the world at least as we know it”-fearing movement around this idea: that computer and other technologies are hurtling toward a time when humans may no longer be the smartest intelligences on the planet. Why? Because some kind of artificial intelligence (“AI”) will have surpassed us, bringing an end to the human era.

Dr. Vinge is now retired. But his Singularity idea has become another of those Californications that sucked the air out of intellectually tinged, futuristically oriented salons and saloons faster than a speeding epiphany. The relentless personality under the hood of the Singularity phenomenon is a talented 70-year-old inventor and big-screen-thinking, oft-honored futurist from New York City and MIT named Ray Kurzweil.

Where “My Way” Is the Theme Song
A few years ago, I wrote about The Singularity movement just after it had finished what one irreverent observer had called Kurzweil’s “yearly Sinatra at Caesar’s.” He was referring to the Singularity Summit, the annual conference of the Machine Intelligence Research Institute. As best I can tell, the last one of these events was held in 2012.

Attendees usually listened to, among others, futurist Kurzweil explain how he believed with all his heart that unimaginably powerful computers are soon going to be able to simulate the human brain, then far surpass it. He thinks great, wondrous, positive things will be possible for humanity because of this new capability. If you track Kurzweil’s day-to-day activities and influence, you quickly realize that he’s not so much Singularity’s prophet as its evangelist. His zeal is messianic. And he’s constantly on the prowl for new believers in a funky techno-fringe movement that is definitely showing legs.

Consider these developments:

• Not long ago, no fewer than four documentary movies about The Singularity were released within a year’s time. One, “Transcendent Man,” debuted at the Tribeca Film Festival and was also shown at the AFI Fest in Los Angeles. It features, or rather lionizes, who else but Ray Kurzweil, and is loosely based on his book, The Singularity Is Near. The others were called “The Singularity Film,” “The Singularity Is Near” and “We Are the Singularity.” One admiring critic wrote of “Transcendent Man,” “[The] film is as much about Ray Kurzweil as it is about the Singularity. In fact, much of the film is concerned with whether or not Kurzweil’s predictions stem from psychological pressures in his life.”

• Meanwhile, the debate continues over how soon the first and only coming of The Singularity will be (otherwise it would be named something like The Multilarity or perhaps just The Hilarity). At the Y, PayPal co-founder Peter Thiel once gave voice to his nightmare that The Singularity may take too long, leaving the world economy short of cash. Michael Anissimov, of the Singularity Institute for Artificial Intelligence and one of the movement’s most articulate voices, warned that “a singleton, a Maximillian, an unrivaled superintelligence, a transcending upload”—you name it—could arrive very quickly and covertly. Vernor Vinge continues to say before 2030. (It didn’t arrive on Dec. 21, 2012, bringing a boffo ending to the Mayan calendar, as some had predicted.)

• Science fiction writers continue to flee from the potential taint of being thought to have authored the phrase “the Rapture of the Nerds.” The Rapture, of course, is some fundamentalist Christians’ idea of a jolly good ending to the human adventure: righteous people will ascend to heaven, leaving the rest of us behind to suffer. It’s probably the Singularitarians’ own fault that their ending sometimes gets mistaken for “those other people’s” ending. They can’t even talk about endings in general without “listing some ways in which the singularity and the rapture do resemble each other.”

• The Best and the Brightest among the Singularitarians don’t help much when they try to clear the air. For instance, there is this effort by Matt Mahoney, a plain-spoken Florida computer scientist, to explain why the people promoting the idea of a Friendly AI (an artificial intelligence that likes people) are the Don Quixotes of the 21st Century. “I do not believe the Singularity will be an apocalypse,” says Mahoney. “It will be invisible; a barrier you cannot look beyond from either side. A godlike intelligence could no more make its presence known to you than you could make your presence known to the bacteria in your gut. Asking what we should do [to try to ensure a “friendly” AI] would be like bacteria asking how they can evolve into humans who won’t use antibiotics.” Thanks, Dr. Mahoney. We’re feeling better already!

• Philosopher Anders Sandberg can’t quit obsessing over the fact that the only way to AI is through the human brain, because our brain is the only available working example of natural intelligence. And not just “the brain” in general: it will need to be a single, particular brain whose personality the great, incoming artificial brain apes. Popsci.com commentator Stuart Fox puckishly says this probably means copying the brain of a volunteer for scientific tests, who is usually “a half stoned, cash-strapped, college student.” Fox adds, “I think avoiding destruction at the hands of artificial intelligence could mean convincing a computer hardwired for a love of Asher Roth, keg stands and pornography to concentrate on helping mankind.” His suggestion for getting humanity out of The Singularity alive: “[Keep] letting our robot overlord beat us at beer pong.” (This is also the guy who says that if and when the AI of The Singularity shows up, he just hopes “it doesn’t run on Windows.”)

• Whether there is going to be a Singularity, and when, and to what ends, does indeed seem to correlate closely with the personality of the explainer or predictor, whether it is overlord Kurzweil or someone else. For example, Vernor Vinge is a libertarian who tends to be intensely optimistic and likes power cut and dried, left maximally in the hands of the individual. No doubt, he really does expect the Singularity no later than 2030, bar nothing. On the other hand, James J. Hughes, an ordained Buddhist monk, wants to make sure that a sense of “radical democracy”—which sees safe, self-controllable human enhancement technologies guaranteed for everyone—is embedded in the artificial intelligence on the other side of The Singularity. One has to wonder how long it will take for the Great AI that the Singularitarians say is coming to splinter and start forming opposing political parties.

• It may be that the penultimate act of the Singularitarians is to throw The Party to End All Parties. It should be a doozy, because you don’t have thoughts and beliefs like the Singularitarians without a personal right-angle-to-the-rest-of-humanity bend in your booties. The Singularity remains an obscurity to the masses in no small part because of the Singularitarians’ irreverence. Like calling the Christian God “a big authoritarian alpha monkey.” Or denouncing Howard Gardner’s popular theory of multiple intelligences as “something that doesn’t stand up to scientific scrutiny.” Or suggesting that most of today’s computer software is “s***”. No wonder that when the Institute for Ethics and Emerging Technologies was pondering speakers for its upcoming confab on The Singularity, among other topics, it added a comic book culture expert, the author of New Flesh A GoGo and one of the writers for TV’s Hercules and Xena, among other presenters.

All of the individuals quoted above, plus a lengthy parade of other highly opinionated folks (mostly males) who typically have scientific backgrounds (and often an “engineering” mentality) and who tend to see the world through “survival of the smartest” lenses, are the people doing most of the talking today about The Singularity. It is a bewildering and ultimately stultifying babel of voices and opinions based on very little hard evidence and huge skeins of science-fiction-like supposition. I was about to hit delete on the whole shrill cacophony of imaginings and outcome electioneering that I’d collected when I came across a comment from one of the more sane and even-keeled Singularitarian voices.

That would be the voice of Eliezer Yudkowsky, a co-founder and research fellow of the Singularity Institute.

He writes, “A good deal of the material I have ever produced—specifically, everything dated 2002 or earlier—I now consider completely obsolete.”

As a non-scientific observer of what’s being said and written about The Singularity at the moment, I’d say making a similar declaration would be a great idea for most everyone who has voiced an opinion thus far. I suspect it’s still going to be a while before anyone has an idea about The Singularity worth keeping.
_______________________
The above commentary has appeared in a blog on another of my websites. I’m choosing to recycle it here because I think the points it makes are fascinating and important.
