On the Edge of the New Era

I am going to put on my futurist hat and make a really far-out prediction: within a hundred years, or perhaps even fewer, the timeline of history is going to be turned on its head and rewritten.

Right now, we are living through the second of the two biggest macro-changes in recording the history of mankind on this planet. The first great change was signified by the establishment of our current historical dating system: BC and AD (Before Christ and Anno Domini, "year of the Lord"), using the birthdate of Christ and the formation of Christianity as the fulcrum of history, from ancient times to today. This is also now referred to as BCE and CE (Before Common Era and Common Era), a more secular adaptation.

In the scheme of mankind's history, Christianity changed everything. It changed religious practice, society and culture, politics, daily life and human relationships, and set the stage for the first real world war: the Crusades. It established a world headquarters of doctrine and affairs in Rome (after an earlier headquarters in Constantinople), with a titular head, the Roman Catholic Pope. Christianity and Catholicism eventually spread to all corners of the globe, including the New World. The Christian religion was a principal driver of the creation of art and architecture, global trade, the build-up of armies, the exploration of far-flung geographies, books and the printing press, and new fields of study, which expanded exponentially during the Renaissance, paving the way for the advances of science and technology in the modern era.

BDE/DE: Before Digital Era/Digital Era

Now, we are entering the second great monumental epoch for mankind: the Digital Era. Its impact has already been dramatic and far-reaching, just in the few years since the computer and the internet went mainstream.

The precursor to the Internet was the ARPANET, a noncommercial interconnection of academic and military networks. An article in The Wall Street Journal, "Life as We Know It Turns 50," by Andy Kessler (12/2/18), marks the anniversary of the 1968 Joint Computer Conference, where Doug Engelbart gave a crude demonstration of hypertext, video conferencing, teleconferencing and a network operating system. His presentation came to be known as "The Mother of All Demos." The first internet transmission was sent soon after: a message from a UCLA lab to the Stanford Research Institute. Most experts date widespread public access to the internet to April 30, 1993, now just past the 25-year mark. On that date, the European Organization for Nuclear Research (CERN) put the Web into the public domain. (Of course, the history of the computer goes back further still, including Alan Turing's famous Bombe machine, designed to break the German Enigma codes during World War II, and even more nascent beginnings in 1822 with Charles Babbage's Difference Engine, an early mechanical calculator.)

The past 50 years have flown by so fast, triggering so many head-spinning changes, that it feels like we have barely been able to keep up. Indeed, many of us are constantly driven by FOMO, the fear of missing out on the latest technology. By the end of the next century, the difference between the then-contemporary world and the world prior to the Digital Era will be mind-shattering, perhaps as great as the difference between the 20th century and the Paleolithic period, when stone tools were first invented.

Going forward, the Digital Era promises even more sweeping and pervasive changes, far more all-encompassing in the lives of humans, with the potential merger of man and machine. So perhaps it is not so far-fetched to predict that a redefinition of man's historic timeline on the planet will be a divide between BDE and DE: Before Digital Era and Digital Era.

Anticipating the Technological Singularity

The divide between historic eras will become undeniably apparent with the emergence of the Technological Singularity, "the hypothesis that the invention of artificial superintelligence (ASI) will abruptly trigger runaway technological growth, resulting in unfathomable changes to human civilization." Singularity predictor Ray Kurzweil, director of engineering at Google, wrote in 2005 that he saw this happening no later than the year 2045. Kurzweil stated that "by 2020, $1,000 will buy the computational power equal to a single brain," and "by 2045, with the arrival of the Singularity, $1,000 will buy one billion times more power than all human brains combined today."

We are already deep into Artificial Intelligence (AI), essentially computers that learn and teach themselves, thereby expanding their capabilities. However, most AI today depends on algorithms programmed for specialized, limited tasks. Businesses see the biggest potential of AI in its capacity for predictive analytics: using past data to predict outcomes and what consumers will want in the future. As AI becomes more powerful, there is debate as to whether human and AI philosophies and goals will align, or whether AI will eventually dominate and control humans. In fact, many scientists and IT engineers believe that AI, once fully unleashed, may become impossible for humans to control. Renowned physicist Stephen Hawking said the emergence of Artificial Intelligence "could be the worst event in the history of our civilization unless society finds a way to control its development."
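To make the predictive-analytics idea concrete, here is a minimal, hypothetical sketch in Python of "using past data to predict outcomes": fitting a simple trend line to six months of sales for one product and projecting next month's demand. The product, the figures and the linear model are illustrative assumptions only; real retail systems draw on far richer data and far more sophisticated models.

    import numpy as np

    # Hypothetical past data: units of one product sold in each of the last six months
    months = np.array([1, 2, 3, 4, 5, 6], dtype=float)
    units_sold = np.array([120, 135, 150, 158, 171, 185], dtype=float)

    # Fit a least-squares trend line: units = slope * month + intercept
    slope, intercept = np.polyfit(months, units_sold, deg=1)

    # "Predict what consumers will want": project demand for month 7
    forecast = slope * 7 + intercept
    print(f"Projected demand for month 7: about {forecast:.0f} units")

Even this toy version captures the core of the approach: yesterday's behavior becomes the input, and tomorrow's purchase becomes the prediction.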

With AI's help, humans are already looking forward to a world filled with self-driving cars, drone taxi cabs, smart homes that run themselves, 3D-printed goods, and robots that wait on us hand and foot. Looking a bit further into the future, here are some other scenarios we might see, including the merger of man and machine:

  • Brain-implanted microchips that facilitate immediate, perpetual connection to the internet and instantaneous personal communication over distances, eliminating the need for physical mobile devices.
  • The development of vocal translator implants for pets that will let cats and dogs tell their owners when they are hungry, when they want to play, or when they do not feel well.
  • The emergence of surgically augmented humans: artificial limbs that facilitate high-speed running and robotic hands that can crush rocks; eye implants that allow instantaneous adjustments for various magnifications, as well as night vision and thermal imaging; CIA spy-level super-hearing ear implants; etc. Ordinary human abilities may seem obsolete in comparison, as these digital enhancements become as common as cosmetic surgery procedures are today.
  • As an option to surgical augmentation, special protective and ability-enhancing exoskeletons, made of lightweight, impenetrable materials, will be worn by those in hazardous occupations: the military, first responders, miners, explosives technicians, etc.
  • The implantation of multiple sensors throughout the human body that will constantly report on the health and functioning of every organ and biological process, alerting when there is a problem or the need to seek medical attention.
  • Treatment by nano-robots that can rearrange cells at the molecular level, identifying damaged or diseased tissue and toxins, and encapsulating and removing such materials from the body without invasive surgery, providing cures for many diseases, including cancers.
  • AI cyber-security bots that will constantly monitor all internet connections around the globe, instantly eliminating any attempts at security breaches, hacking or malware-while also monitoring all user behaviors and activities, on the lookout for criminal activities or other deviations from the norm, thus significantly reducing the crime rate.
  • Many of the masses may live in small, modest "cubicle homes" in the real world, while living online in virtual-reality fantasy palaces or mansions, filled with replicas of the finest furnishings, clothing, and other accoutrements. They may spend many hours practically living in these virtual environments, interacting with friends and lovers through avatars and playing customized videogames.

Pandering to, and Pampering, Consumers

In this emerging new AI world, we can only speculate about the role of consumers. Will consumers still be in control, selecting what they want and making buying decisions, either online or in stores? Or will AI, with its access to limitless data about what every individual human being is thinking and doing at all times, decide that it knows better which goods and services each person should have? Will we transition from being selective, discerning consumers to receiving dictates about what we should buy, what we are allowed to buy, or what we are directed to buy, with goods shipped to us when it is determined that we should have them, leaving us to simply pay the bills?

The role of consumers is already drastically changing. For example, a 20-something man, sitting on his sofa at home, orders dozens and dozens of menu items from an endless array of restaurant offerings, all delivered to his door in minutes. Or a young woman at home is delighted when her new car is brought right to her door (she could even have selected it from a vending machine), all without inspecting or test-driving it.

Consumers are being trained to leave more and more of the shopping process to others, as long as they think they are in control. Like privileged potentates, they sit and wait to have their every desire, whim and command immediately fulfilled. Are we in danger of developing a huge consumer base of spoiled, lazy couch potatoes who ultimately defer all decisions about their lives to a digital third party?

It seems retailers may be counting on just that, and they want to know a lot more about their customers. Notably, retail is spending more on AI systems than any other economic sector: $5.9 billion in 2019, out of $35.8 billion in total annual AI investment, according to The Wall Street Journal and International Data Corporation (IDC). Yet according to a recent survey by SmarterHQ, 79 percent of respondents think that companies already know too much personal information about them, and consumers do little or nothing to push back.

How long before the click-to-select button goes away, and consumers just eat or drive or wear whatever is delivered to them, based on complex neural-network interfaces with machines? When AI automatically stocks the refrigerator, books the vacation, buys and sells homes, schedules medical appointments, drives the car, and even controls the temperature of the bath water, what is left for humans to do? So all-subsuming is this technology that we are fast approaching the time when consumers will no longer know how to do any of these things without digital help.

What happens if AI is eco-friendlier and more economical than we humans are? What if AI decides it would be best if we all drove one energy-efficient model of car? (After all, that's not a new idea: Hitler promoted a "people's car" in the late 1930s, intending it to be more economical and to help erase social boundaries. It was called the Volkswagen.) Or AI could decide that, instead of wasting manufacturing materials and time turning out thousands of styles and designs of apparel, everyone should simply wear the same thing, AKA a uniform. Or, in a draconian scenario, it might conclude that there is little justification for the lives of millions of people with IQs under 110, and that they should be done away with. The science fiction worlds of the past bear an uncanny resemblance to the realities of today.

We have no guarantee that if AI becomes superior in intelligence to mankind, it will understand emotions like love, caring, empathy, sympathy, charity, beauty, or the desire to differentiate oneself from others through the way we dress, the homes we live in, or the cars we drive.

The Digital Era Has More Unknowns than Knowns

This new Digital Era promises to be filled with wonders and benefits, but it also carries the risks of unpredictability, incompatibility and, ultimately, even oppression or subjugation through algorithmic bias.

We have to believe that humans will be clever, resourceful and ethical enough to keep a step ahead of AI, and to never lose sight of what it means to be human: to value social connectedness, concern for our fellow man, fairness in the face of adversity, freedom of personal expression, and the pursuit of spontaneity and joy.
