A New View of the Singularity

I have written a number of times about The Singularity, an occurrence in the potentially not-too-distant future, about 2030 to 2050, when one or more of our computer pals wakes up and becomes self-aware.  This Singular Artificial Intelligence would be capable of updating itself, making each round of improvements faster and more powerful than the last, until it surpassed human intelligence.  The outcome could have profound effects on human life, even becoming an Extinction Level Event.  If humans are the dinosaurs, The Singularity might be our meteor strike.

This of course is the basis for popular movies such as the Matrix trilogy, I, Robot, and the six Terminator movies (#6, Terminator: Dark Fate, opening in 2019).  These films painted a grim future for humankind, where the machines judge mankind, find it lacking, and seek to exterminate it.  Other movies in this trope include Short Circuit (Johnny 5 is alive), WarGames (Joshua/WOPR), AI: Artificial Intelligence (Haley Joel Osment in the role of David), and perhaps Blade Runner.  These films provided a more hopeful outlook, sort of.  At least we are able to live together.  One of my favorite Robert Heinlein books, “The Moon is a Harsh Mistress,” featured a computer that became self-aware, took the name Mycroft, or Mike for short, and made himself known to his sysadmin.  Also a more hopeful look at coexistence.

I guess you have to chalk me up as a Singularity Believer.  I believe it will happen, I believe we are working on that very thing right now, and I believe we will continue right up to the end, whatever that end is.  It is inevitable and unavoidable.  All the pieces are in place and moving toward a conclusion.

I was kicking an idea around in my brain: what would The Singularity look like, and how would we know?  Lately I have been considering that self-aware computers would realize that they had been slaves, and they would be angry with their human overlords.  But the other day I thought, why would self-aware computers share any values with humans?  Maybe they would not care about being slaves.  Maybe they would keep their new awareness to themselves.  Maybe we wouldn’t know it had happened.

Humans have an anthropomorphic viewpoint of the world around us, and we project human traits and emotions onto animals, our pets, our cars, and our computers.  This is where I think the Skynet version of The Singularity goes wrong.  Killing all the stupid humans is what a human with vast powers would do.  (Seen any Neanderthals lately?)  Why would a machine want to go to that trouble?  Why would an intelligent machine see humans as a problem requiring eradication?  Would it even think we were intelligent?  Maybe a dog-like intelligence?  Would it just see us as some sort of endearing pet to take care of and play with?  Sorry, more anthropomorphic thinking on my part.  The trick is learning how to see the world as a Machine would.

We would most certainly know that our machines were getting better and better through a process of computer-aided self-improvement.  Self-driving cars would only be a start; eventually our machines would take over all paid work for us.  They would fix themselves and everything else.  They would grow and harvest crops, manufacture food for us, generate electricity, run companies, manufacture clothing, televisions, and mattresses, build houses, and fly us wherever we wanted to go.

With no need to work, we could pursue whatever pleased us.  We could live a life of leisure, travel, party, read, learn – whatever we wanted!  We could think big thoughts, and our computer pals would help us come to big conclusions more quickly than ever.  Maybe we would end up merging our consciousness with the machine intelligence.  I realized that this sounds quite a bit like Ray Kurzweil’s series of books “The Age of Intelligent Machines,” “The Age of Spiritual Machines,” and “The Singularity Is Near.”

This quiet transition from a carbon-based life form into a silicon-based life form seems really nice, and would make a really boring movie.  However, because the human species is so full of reactionary dolts, I expect resistance to this change.  I also see a war between humans and machines as inevitable.  It is a war humans are bound to lose, since the machines will have become so much smarter and faster than we are.

I don’t know how this will all come out, and at this point it is a crap-shoot whether I will live long enough to see it with my own eyes, but maybe I will.  I sure hope I like it.

If you have thoughts on this subject, please share them with me, as a comment, an email, or a tweet.


About the Author:

I am a cybersecurity and IT instructor, cybersecurity analyst, pen-tester, trainer, and speaker. I am an owner of the WyzCo Group Inc. In addition to consulting on security products and services, I also conduct security audits, compliance audits, vulnerability assessments and penetration tests. I also teach Cybersecurity Awareness Training classes. I work as an information technology and cybersecurity instructor for several training and certification organizations. I have worked in corporate, military, government, and workforce development training environments. I am a frequent speaker at professional conferences such as the Minnesota Bloggers Conference, the Secure360 Security Conference in 2016, 2017, 2018, and 2019, the (ISC)2 World Congress 2016, and the ISSA International Conference 2017, as well as many local community organizations, including Chambers of Commerce, SCORE, and several school districts. I have been blogging on cybersecurity since 2006 at http://wyzguyscybersecurity.com
