The notion of a pending technological singularity, first explicitly
named as such by Vernor Vinge, is a modified version of traditional
Abrahamic eschatology. Though it removes the notion of a supreme being
judging all those who have lived, it does promise believers a moment in
time beyond which all will live in perpetual paradise; immortality,
wealth, leisure, sexual prowess, and scholarly success are explicitly
promised to believers and, in some versions of the meme, denied to
others. The primary proponents of the Singularity belief are Raymond
Kurzweil, Vernor Vinge, Damien Broderick, Hans Moravec, and Eliezer
Yudkowsky. Each presents a somewhat different vision, but all are
organized around a common core.
The central tenet of the Singularity meme is the ever-expanding power of
computer processors. Adherents point to charts indicating that the
number of calculations per second per $1000 has been doubling every two
years since the early 1890s and the Hollerith tabulating machine: from
human computers doing arithmetic for a dollar a day, through punchcards
and vacuum tubes, to transistors and integrated circuits, and on to DNA
computers and quantum computing. They say that every time a physical
limit that would stop the exponential curve is approached, a new medium
is discovered that allows the curve to continue unchecked. At some time
in the not-too-distant future this curve goes vertical: ever-expanding
computational capacity will result in ever more powerful technologies
being assimilated into society ever faster, the entire cycle spinning so
fast that we will no longer be able to project what is going to happen
next. This point of no return, when the pace of technological progress
becomes effectively infinite, is the Technological Singularity.
While the main visions of the Singularity all agree on the points
above, they exhibit significant differences. They differ on the
specific technological features of the event, and they envision
significantly different worlds arising from it.
Vernor Vinge
The most moderate of the Singularitarians, Vinge is credited with the
creation of the idea in his novel Marooned in Realtime. He later
expanded upon it in a paper delivered at a NASA conference in 1993. He
argued that at some point in the early 21st century human beings will
create computer software that is more intelligent than we are. Shortly
thereafter this software will produce software smarter than itself. The
human era will be over. Vinge refrains, for the most part, from
speculation as to the structure of civilization following the
Singularity.
Damien Broderick
Broderick calls his Singularity The Spike, and has written a book by
that title. He doesn't focus on a particular technology, but contends
that the whole of science and knowledge is growing at an exponential
rate and that we will see unimaginable advances in every area of our
lives. He sees a rising tide for all of humanity as nanotechnology,
artificial intelligence, and advanced medical technology alleviate
all of humanity's misery.
Broderick's view is the most overtly utopian, promising immortality
through nanotechnology to everyone and beneficent artificial intelligences
acting as stewards of our race, managing everything from traffic to resource
production. Everyone will have the choice of living in a normal human body,
animating a custom biological or robotic shell, or existing as an
uploaded mind in a global computer network. Switching between
the three will be easy and, given the overall wealth of the post-Spike
world, cheap.
Raymond Kurzweil
Kurzweil's vision of the future has been set forth in books
such as The Age of Intelligent Machines, The Age of Spiritual Machines,
and the forthcoming The Singularity is Coming. He envisions a strong
singularity catalyzed by advances in our ability to simulate the activity
of the human brain. As functional magnetic resonance imaging becomes
more powerful he expects that we will completely reverse engineer the
functioning of the brain. Once that is complete, we will be able to
port our consciousness to any hardware we want, thus achieving
immortality.
Like Broderick, Kurzweil sees a soft take-off Singularity. Rather
than a discontinuous, single-day change he sees a process that will
stretch through the whole of the 21st century (although this is still
abrupt by the standards of change in our global civilization). Software
will match human intelligence and ability in first one area, then
another and another. People will interact primarily in virtual
environments; even when several people are gathered physically
together, the advanced display technology they use will render most of
their experience virtual. Medical technology will progress to the
point that people can keep their bodies alive indefinitely, but even if
something terrible, something unfixable, happens to that body it won't matter.
Uploading technology will be so good that immortality is all but guaranteed.
Hans Moravec
A researcher at Carnegie Mellon University, Moravec believes that
the Singularity will come about through the creation of advanced, autonomous
robots. He envisions a hard take-off Singularity where greater-than-human
level intelligence combined with a limitless ability to manipulate the
physical world through sophisticated robots will completely supersede
humanity. Moravec displays a strong believers-only bias in his
Singularity scenario: it will be possible for those with the foresight
to have seen it coming to port themselves to new hardware that can
compete with the computers.
Eliezer Yudkowsky
Founder of the Singularity Institute and author of Coding a Transhuman
AI, Yudkowsky has raised the Singularity to the level of a pure
religious object.
He believes in a particularly harsh version of the hard take-off, postulating
that human-level intelligence will rapidly bootstrap to something much, much
greater. This sudden increase in intelligence will give the growing AI the ability
to create strong nanotech if it doesn't already have it. Our continued existence
will be completely at the whim of this creature, and we are unable in principle
to determine what rules will apply after its arrival.
There is a strong theme of the end of humanity in Yudkowsky's writings.
He makes the argument that a more intelligent entity will be better
able to determine what is good and moral than we humans are since, of course,
it is more intelligent. If such an entity claimed that the proper thing
to do with humans was to destroy all of us, then we should let it.
After all, the fact that we not only do not, but literally cannot,
understand its reasons should not be relevant to whether such a
genocide is the right thing to do.
Given that he views the Singularity as unavoidable, Yudkowsky has
devoted himself to making it happen as soon as possible. He is trying
to create a bootstrapping artificial intelligence and thus trigger his
Singularity.
Bill Joy
Bill Joy is the most visible of the Singularity detractors. While he
believes that it is coming, he doesn't like it and wants it stopped. He
argued this point in a 2000 Wired magazine article, "Why the Future
Doesn't Need Us." He believes, much like Yudkowsky, that coming
technological changes are going to result in the end of humanity. To
avoid this, Joy has, in effect, proposed the creation of an enormous
police state charged with preventing advances in certain fields of
science by whatever means are necessary.
While his written proposals fall short of explicitly sanctioning a
police state and restrictions upon the freedoms of thought and speech,
it is easy to see that this is what he proposes. His calls for global
"relinquishment" of robotic, biological, and nanoscale technologies and
research are unrealizable in the absence of a global police state, or
in the presence of freedoms of speech, press, or assembly (essentially,
in the presence of the freedom to communicate). For Joy's program to be
successful it is not enough that the majority of the world's people
cease research and development in these areas. If even a small number
of people continue to work in these fields, the breakthroughs in our
knowledge that Joy fears so greatly will come. Implementing Joy's
relinquishment strategy would require that these people be identified
and prevented from performing their research.
Like nuclear non-proliferation, this is a policy that is guaranteed to
fail eventually. Once the apple of knowledge has been bitten, no amount
of brutality or repression will be able to undo the effects. Even in
the most brutal police state imaginable, relinquishment is an
impossible goal, and to advocate or pursue it is to endorse the
destruction of countless lives for no long-term benefit.