Thursday, June 04, 2009

Why "The Singularity" Scares the Bejesus Out of Me

More and more, I am seeing research and advancements that suggest that Ray Kurzweil isn't as far off as I would like to believe he is with the whole singularity thing. For those who don't know about this, here is a definition of the technological singularity that Kurzweil promotes.

The technological singularity is a theoretical future point, occurring during a period of accelerating change sometime after the creation of a superintelligence.[1]

In 1965, I. J. Good first wrote of an "intelligence explosion", suggesting that if machines could even slightly surpass human intellect, they could improve their own designs in ways unforeseen by their designers, and thus recursively augment themselves into far greater intelligences. The first such improvements might be small, but as the machine became more intelligent it would become better at becoming more intelligent, which could lead to an exponential and quite sudden growth in intelligence.

In 1993, Vernor Vinge called this event "the Singularity" as an analogy between the breakdown of modern physics near a gravitational singularity and the drastic change in society he argues would occur following an intelligence explosion. In the 1980s, Vinge popularized the concept in lectures, essays, and science fiction. More recently, some prominent technologists such as Bill Joy, founder of Sun Microsystems, voiced concern over the potential dangers of Vinge's singularity (Joy 2000).

Today I came across an article about a joint effort between Google and NASA to pursue exactly this kind of advancement. Here is the article, which reports on the new Singularity University.

NASA & Google Join Forces to Research Singularity - the "Intelligence Revolution" (VIDEO)

It is the best of times. Anyone who complains about science not delivering its promises simply doesn't comprehend how incredible this information age truly is: you can go to the mall RIGHT NOW and buy devices which would have reshaped the world ten years ago, are reshaping it today, and technology isn't slowing down - it's accelerating exponentially. There are incredible innovations just around the corner, and that's the thinking behind the creation of Singularity University.

It's an advanced academic institution sponsored by leading lights including NASA and Google (so it couldn't sound smarter if Brainiac 5 traveled back in time to attend the opening ceremony). The "Singularity" is the idea of a future point where super-human intellects are created, turbo-boosting the already exponential rate of technological improvement and triggering a fundamental change in human society: after the Agricultural Revolution and the Industrial Revolution, we would have the Intelligence Revolution.

Real AI effects are closer than you might think, with entirely automated systems producing new scientific results and even holding patents on minor inventions. The key factor in singularity scenarios is the positive-feedback loop of self-improvement: once something is even slightly smarter than humanity, it can start to improve itself or design new intelligences faster than we can, leading to an intelligence explosion designed by something that isn't us.
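To make the feedback loop concrete, here's a toy sketch in Python. It's purely illustrative - the starting level, growth rate, and number of cycles are made-up assumptions, not anything from the article - but it shows the core dynamic: when each round of improvement scales with current capability, growth compounds instead of staying linear.

```python
def self_improvement(start=1.0, rate=0.1, steps=10):
    """Simulate a positive-feedback loop of self-improvement.

    Each cycle adds an improvement proportional to the *current*
    intelligence level, so a smarter system improves itself faster.
    Returns the level after each cycle (hypothetical units).
    """
    level = start
    history = [level]
    for _ in range(steps):
        level += rate * level  # improvement scales with current ability
        history.append(level)
    return history

# A human-driven baseline, by contrast, improves in fixed increments:
def external_improvement(start=1.0, increment=0.1, steps=10):
    return [start + increment * n for n in range(steps + 1)]
```

After ten cycles the feedback version has grown by a constant *factor* each step (compound growth), while the baseline has only added a constant amount each step - the gap between the two widens every cycle, which is the "explosion" intuition in miniature.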

The Singularity University proposes to train people to deal with the accelerating evolution of technology, both in terms of understanding the directions and harnessing the potential of new interactions between branches of science like artificial intelligence, genetic engineering and nanotechnology.

Inventor and author Raymond Kurzweil is one of the forces behind SU, which we presume will have the most awesomely equipped pranks of all time ("Check it out, we replaced the Professor's chair with an adaptive holographic robot!"), and it isn't the only institution he's helped found. There's also the Singularity Institute for Artificial Intelligence, whose mission is based on the predicted exponential increases in AI. The idea is that the first AI created will have an enormous advantage over all that follow, upgrading itself at a rate they can never match simply because it started first, so the Institute wants to create a benevolent AI to guard us against any that might follow.

Make no mistake: the AI race is on, and Raymond wants us to win.

This is all very interesting and may well produce some technology that improves our lives - but to be honest, this shit scares the bejesus out of me.

As near as I can tell, reaching the singularity puts our society at risk of technological disaster. When you have a few people evolved enough to create that technology (the cutting edge of human development in the intellectual/cognitive line), you also have many more people still at lower moral developmental stages - possibly including the inventors themselves, since there is no reason to believe that cognitive development is paired with moral development - who would use that technology to generate power and wealth, which is necessarily an egocentric drive.

We could seriously end up with an elite class who have access to the technology to extend life, avoid illness, increase intelligence, and so on - but those people won't likely be the most developmentally advanced people, nor the most morally developed or compassionate people. It will be the richest people.

So the downside is that there won't be compassionate use of the technology, but rather egocentric use, which will likely lead to catastrophe.

It's the same principle as allowing tribal/egocentric cultures to acquire nuclear weapons - there is NOT the developmental maturity to handle the technology.

Suppose, as Kurzweil has argued, that nanobots become a reality for healing disease and extending life in the next couple of decades. Who will be able to afford these things? Not me, and likely not you. Only the richest people will have access to this technology. How will this impact our culture?

And suppose that not only do these nanobots prolong life and eliminate disease, maybe they increase intelligence or strength. Do we then have a "super race" of techno-humans, or transhumans? And if (or when) we do, how do we deal with this?

Where is the moral intelligence to deal with the technological advances we see escalating at an ever increasing rate? How do we prevent these advances from being weaponized?

Where is the spiritual intelligence to make sense of this in a world where the divide between rich and poor is greater than it has ever been, and where the balance between egocentric and worldcentric worldviews tips heavily toward the lesser, more power-oriented worldview rather than the greater, more compassionate one?

Is anyone else even asking these questions?


ninjaclectic said...

You're right on.

Technology is becoming invasive and powerful enough to bring humanity to the brink of entirely new levels of destruction...

we are building tools that not only manipulate the external material world, but also the internal universe of mind and consciousness.

It is indeed a wonderful but profoundly scary time.

I thought Susan Blackmore's TED talk was an interesting take on some of the territory we now find ourselves in:

Lodro Rigdzin said...

And all this is why developing bodhicitta is of the utmost importance. To me, then, it is no coincidence that the dharma is spreading at the rate it does.

Dan Allison said...

I see where you're coming from here, but I'm not worried about it in the least. I don't buy the idea that only the rich will have access to these technologies. That may be the case at the very beginning, but not for long. A few years ago, only the Gordon Gekkos of the world had cell phones. Today, cell phones are available even to people in third world countries. Besides, I think, by and large, rich people are more compassionate than you give them credit for. Rich people are humans too, after all. And even if they are just a bunch of egocentric greed muffins, I don't think rich people are smart enough to figure out how to prevent these technologies from eventually reaching the masses. They're only human, after all. And even if they did figure out a way to keep it all to themselves, I don't think poor people are dumb enough to let that happen for very long.

I guess it comes down to the fact that I have faith in the overall goodness of the human race. And even if I didn't, I think that this technological evolution is bigger than us humans. It isn't about us, and I don't think we're intelligent enough (yet) to understand what is really happening here.

Cherie Beck said...

I have spent the better part of the last four months integrating aspects like Kurzweil's Singularity into my worldview, in an attempt to navigate the complex world of now, with an eye toward understanding what to do to shape the future in the face of the massive amount of change we are living through. The worldview I now have is made possible by the last 10 years of exploration... and is built upon a comprehensive set of frames including integral theory, spiral dynamics, and cultural evolution and change. AND it must include a commitment to spiritual embodiment that at a minimum is open-heartedness, and at a maximum is nothing less than total DNA change of our physical, mental and spiritual form.

From this worldview, Kurzweil's singularity is VERY partial. In fact, I highly recommend Nassim Haramein's work for an entire reframing of singularity. His work outlines a compelling understanding that EVERYTHING is a singularity, or a black hole. We are in fact living in one. When singularity is reframed through the principles of creation (as Nassim describes them), you'll be quick to connect the spiritual wisdom of the ages with the singularity.

Technological advancement is changing our intelligence, and so is the dynamic and exponential rate of technological change. Technology is a key to our survival... but a grander technological development than is being claimed in the mainstream media today.

Coherence is the name of the game. Those things (lifeforms and/or technology) that are in coherence with the structure and principles of creation itself, will survive and will become the basis for life on this planet in the future.

I've been told it will be like going through the spin/dry cycle of a washing machine. The feedback loop of our individual, and thus collective, choices will become shorter and shorter. And the shortening of that feedback loop is happening very fast. Like the skater who spins faster by bringing her arms closer to her body. That's the singularity to engage. :)