“Never trust a computer you can’t throw out a window.” – Steve Wozniak, co-creator of the Apple computer
I just started reading a book from 1999 by Ray Kurzweil called The Age of Spiritual Machines. The book explores the consequences of our technological changes as they relate to AI, or Artificial Intelligence, sometimes called “the Singularity,” and the future of the human race in the age of self-aware machines. We have all seen the “Terminator” movie series, with its dystopian view of a future where machines become self-aware and quickly move to exterminate the human race. So that is one plausible outcome. Or the “Matrix” trilogy, where we are stored in gooey capsules and dream about a life that we live only virtually, while providing heat and electricity to the machine overlords. I found myself thinking about the inevitability of a not-very-distant future where machines start to improve upon themselves, and where slow-thinking carbon-based intelligence is outpaced by much quicker silicon-based intelligence. Would we just be in the way? I casually mentioned to my spouse of many years that my hope is that the machines may keep us around as pets.
So imagine my surprise when I ran across an article on Sophos suggesting the same idea, broached by both Steve Wozniak and Elon Musk in recent but separate interviews. That this quiet hope is being voiced by two technological luminaries is interesting.
According to Ray Kurzweil’s book, this should all happen by 2020, which is right around the corner. This is a bigger issue than human genetic engineering, or climate change, or nuclear proliferation, or terrorism, and yet it is getting very little public play. The scary part for the future of humans is that in the history of the planet so far, the more technologically advanced life forms have generally eliminated or subdued the technologically less sophisticated forms. For instance, where are the Neanderthals? Got one on your block? Not really, because our Cro-Magnon ancestors were smarter and had better tools, and either competed better for resources or just killed off their less gifted rivals. What about Columbus and the indigenous Caribbean population? How about Cortez and the Aztecs? So how will it be for us? After the smart machines take over businesses and lay off all the slow humans, will they bother to continue to produce and distribute food for us? Will they see us as a nuisance? Or a threat?
Big questions, and even a slow carbon-based life form can see the problems that could arise when our tools become smarter than we are. Can this be stopped or even controlled? The Future of Life Institute has an interesting open letter addressing this issue, which has been signed by many technical luminaries, including Stephen Hawking. Go ahead and add your name to the list.
So take a minute to click through to some of these resources, and see where you come down on this issue.
About the Author: I am a cybersecurity and IT instructor, cybersecurity analyst, pen-tester, trainer, and speaker. I am an owner of the WyzCo Group Inc. In addition to consulting on security products and services, I also conduct security audits, compliance audits, vulnerability assessments, and penetration tests. I also teach Cybersecurity Awareness Training classes. I work as an information technology and cybersecurity instructor for several training and certification organizations. I have worked in corporate, military, government, and workforce development training environments. I am a frequent speaker at professional conferences such as the Minnesota Bloggers Conference, the Secure360 Security Conference in 2016, 2017, 2018, and 2019, the (ISC)2 World Congress 2016, and the ISSA International Conference 2017, as well as at many local community organizations, including Chambers of Commerce, SCORE, and several school districts. I have been blogging on cybersecurity since 2006 at http://wyzguyscybersecurity.com