The SIAI is strange to me (http://www.singinst.org/). It is an organization that, on its website, states the following goals:
For almost a decade, the Singularity Institute has been asking questions on the future of human civilization: How can we benefit from increasingly powerful technology without succumbing to the risks, up to and including human extinction? What is the best way to handle artificial general intelligence (AGI): programs as smart as humans, or smarter?
Among SIAI’s core aims is to continue studying “Friendly AI”: AI that acts benevolently because it holds goals aligned with human values. This involves drawing on and contributing to fields like decision theory, computer science, cognitive and moral psychology, and technology forecasting.
Creating AI, especially the Friendly kind, is a difficult undertaking. We’re in it for as long as it takes, but we’ve been doing more than laying the groundwork for Friendly AI. We’ve been raising the profile of AI risk and Singularity issues in academia and elsewhere, forming communities around enhancing human rationality, and researching other avenues that promise to reduce the most severe risks the most effectively.
And in a recent e-mail from them, they wrote:
“Dear Friends of SIAI,
I’d like to share with you my picture of SIAI’s most recent year, of the global situation we are addressing, and of the work we can do with your help.
To put it bluntly, we are a small group of intelligent, ambitious, but as yet mostly inexperienced people who are working to increase the odds of an eventual positive Singularity and to decrease the odds of human extinction.”
Now when I see that, I just think, “What the hell?” I mean, seriously, it is scary that these people actually believe we need to worry about human extinction caused by artificial intelligence. Maybe, MAYBE at a MUCH later stage. But at this point in time, a little child with a gun is much more dangerous than any computer on the planet.
“…to increase the odds of an eventual positive Singularity and to decrease the odds of human extinction.” – The Terminator films are among my favourite movies, and the Sarah Connor Chronicles is brilliant. But, people – the money you raise could be put to many better uses than trying to ‘stop impending doom from artificial intelligence’.
Readers – please comment. What do you think?