Reaching Singularity Might Be Self-Destructive

 


Promises of ‘immortality’ and a disease-free life have led many to pin their hopes on artificial intelligence and what is known as the Singularity: the merging of man and machine, the creation of a ‘new species’ — a ‘borg’ of sorts. The subject recently made headlines when a prominent Russian scientist promised the Singularity to the wealthy elite and ruling class by 2045 through the 2045 program, with artificial bodies available as early as 2015.

On the surface it may sound enticing, at least to those willing to trust artificial brains and bodies hooked up to a massive supercomputer that controls their every action (through RFID-like chips).

Even the CEO of the Singularity Institute for Artificial Intelligence, one of the largest and best-known organizations in the field, admits that the boom in artificial intelligence leading up to the Singularity will not go well for humans. He concedes not only that research on artificial intelligence is outpacing the safety research intended to keep it in check, but that the Singularity would effectively make humans ‘prey’ to a ‘super-human’ AI.

In an open Q&A on the community website Reddit, CEO Luke Muehlhauser explained that a superhuman AI would end up ‘optimizing’ the entire globe, using up the resources humans depend on. In other words, the AI would suppress humans, much like the premise of I, Robot and similar works. This is particularly striking given that the 2045 program’s creator has promised artificial bodies and brains first to the wealthy elite, allowing world rulers and the financial elite to achieve ‘immortality’ and, with it, a never-ending rule over the rest of humanity.

Muehlhauser explains how humans would become ‘prey’ to a ruthless ‘super-human’ AI upon completion of the Singularity:

“Unfortunately, the singularity may not be what you’re hoping for. By default the singularity (intelligence explosion) will go very badly for humans… so by default superhuman AIs will end up optimizing the world around us for something other than what we want, and using up all our resources to do so.”

These concerns echo those raised by researchers and analysts who have followed the concept of the Singularity for decades. With its ultimate goal of linking all hyper-intelligent androids into a ‘cognitive network’ of sorts, and eventually of forfeiting physical bodies altogether, the Singularity movement now has even its top supporters speaking out against it in many regards. What is clearer still is that an AI Singularity has no place for humankind, not even in a form of co-existence.

Via: naturalsociety

1 comment to Reaching Singularity Might Be Self-Destructive

  • wishbone.

    the only way to singularity is to look within, to remember i am the alpha, i am the omega, i am that, i am. or i was never born, i will never die, i have always been here in one form or other, i will always be here, i am an expression of the one infinite mind.
