My Thoughts on “The Singularity Institute for Artificial Intelligence”


The SIAI is strange to me. It is an organization that, on its website, states that it has the following goals:

“Our Goals
For almost a decade, the Singularity Institute has been asking questions on the future of human civilization: How can we benefit from increasingly powerful technology without succumbing to the risks, up to and including human extinction? What is the best way to handle artificial general intelligence (AGI): programs as smart as humans, or smarter?

Among SIAI’s core aims is to continue studying “Friendly AI”: AI that acts benevolently because it holds goals aligned with human values. This involves drawing on and contributing to fields like decision theory, computer science, cognitive and moral psychology, and technology forecasting.

Creating AI, especially the Friendly kind, is a difficult undertaking. We’re in it for as long as it takes, but we’ve been doing more than laying the groundwork for Friendly AI. We’ve been raising the profile of AI risk and Singularity issues in academia and elsewhere, forming communities around enhancing human rationality, and researching other avenues that promise to reduce the most severe risks the most effectively. ”

And in a recent e-mail from them, they wrote:

“Dear Friends of SIAI,

I’d like to share with you my picture of SIAI’s most recent year, of the global situation we are addressing, and of the work we can do with your help.

To put it bluntly, we are a small group of intelligent, ambitious, but as yet mostly inexperienced people who are working to increase the odds of an eventual positive Singularity and to decrease the odds of human extinction.”

Now when I see that, I just think, “What the hell?”. I mean, seriously, it is scary that these people actually believe that we need to worry about human extinction caused by artificial intelligence. Maybe, MAYBE at a MUCH later stage. But at this point in time, a little child with a gun is much more dangerous than any computer on the planet.

“…to increase the odds of an eventual positive Singularity and to decrease the odds of human extinction.” – The Terminator franchise is one of my favourite movie series, and the Sarah Connor Chronicles is brilliant. But, people – the money you raise could be put to many better purposes than ‘trying to stop impending doom from artificial intelligence’.

Readers – please comment. What do you think?


2 thoughts on “My Thoughts on “The Singularity Institute for Artificial Intelligence””

  1. Recently I read more about the brain, computers and intelligence, and I have two things that I would like to point out.

    a) It is currently impossible to build a truly intelligent computer. When I say “intelligent” I mean intelligent as humans are, not merely something with great computational power and speed. Simulating a brain is far more complex than randomly connecting a few neurons together. Computers and brains are fundamentally different.

    b) If, however, we do discover the secret and find a way to build an electronic brain, then humans will be to this new species what rats are to humans. Really, I am not joking.

    Either way, spending time and money to prevent some “impending doom” is a waste.

    But I am still reading… Who knows what the next step of “evolution” will be. As long as they are pretty and I get to run around in one of their “rat”-mazes. 🙂
