A Historian Talks About the Future
Are the Terminators Coming?
According to Yuval Noah Harari, the famous history professor, philosopher, and author of bestselling books such as Sapiens: A Brief History of Humankind and Homo Deus: A Brief History of Tomorrow, the big challenges humanity will face are Nuclear War, Climate Change, and Technological Disruption (the rise of AI and biotechnology).
In this article, I provide you with an overview of the ideas, concerns, and predictions Harari shares throughout his appearances on the internet. I limit myself to the topic of Technological Disruption, since there is already plenty of content to cover. However, if you want to dig deeper, I have provided links to the podcasts and interviews in the References section at the end.
The baseline of Harari’s thinking is that we are currently dealing with the wrong problems. Trade agreements, Brexit (the UK leaving the EU), and immigration are problems, but not as significant for humanity as the aforementioned big three. His argument goes as follows: we have a limited amount of energy and time, and governments and politicians use these resources in the wrong way.
“[…] in twenty years when we look back and ask why didn’t we stop climate change on time; why didn’t we regulate AI, we will say ‘we had this Brexit thing’, so we spend five years of attention on that”
In the domain of Technological Disruption, he further expands his argument: there is no regulation of the development of AI or biotechnology whatsoever.
About Free Will
Shockingly, Harari isn’t a strong proponent of free will. He contends that the idea of free will makes us uncurious: we don’t want to explore our desires, so we simply accept them as our ‘free will’. Yet societies, big corporations, or governments might have shaped these desires. This form of ignorance, as you might imagine, makes us extremely easy to manipulate.
“When you make an important decision in life people often don’t stop to really try to understand ‘why do I choose this’, ‘where is this source of this desire?’”
On the Under The Skin podcast, Russell Brand asks him how to develop a different perspective on this “deep programming that exploits biochemical tendencies”. Harari suggests that we should do more digging and exploring. Additionally, we should be much less certain about our opinions and thoughts. As Socrates allegedly once said, “To know is to know that you know nothing.”
Hacking Human Beings
“If we stick with the kind of egocentric and arrogant view of ourselves then it makes us extremely vulnerable of this new kind of technology”
One topic that comes up quite frequently in his appearances is that we soon might be able to “Hack Human Beings”. It seems like this is one of Harari’s major concerns regarding Technological Disruption. The rise of unsupervised AI and biotechnology grants corporations and governments the ability to surveil and manipulate us on a much bigger scale.
He often illustrates his argument with this story: in the Stalin era, the KGB couldn’t follow everyone in the Soviet Union. The amount of paper-based data processing required to monitor every one of its citizens was impracticable. However, now, with the rise of computing power and a deeper understanding of the human body, we, for the first time in history, “could be in the position to hack human beings in a systematic and large scale”.
“[…] If you don’t get to know yourself better there is somebody out there who is right now trying to hack you”
He further argues that corporations such as Google, Amazon, and Coca-Cola currently sell you products that you want, but soon they might be able to manipulate your desires completely. In a typical Apple manner, these corporations create a problem and then sell you their solution. Yes, companies are already doing that all the time; however, these fixes aren’t individualized.
“What you try to do a thousand years ago with the priest preaching from the pulpit you will be able to do in a far more invasive way in 10 or 50 years […]”
This, plus what he said about free will, makes clear what his concerns are.
Can You Explain the Result of Your Algorithm?
Explainability is another of Harari’s concerns. One of his favorite examples is that of a banker who can’t give you a loan. There are laws in place so that the bank has to explain why you can’t get it. However, what if the bank doesn’t know? The algorithm simply said so.
“Well, we have this algorithm and the algorithm went over masses of data. If you want, we can print you all the data, but we can’t make sense of it. We just trust our algorithm.” The thing is that if the algorithm made decisions in the same way as humans do, we wouldn’t need it.
Harari isn’t worried about destructive Terminators. It’s more likely that humans unknowingly give computers a lot of power. And once humans can no longer understand the complexity of the algorithms’ results — a point in time he calls the ‘singularity’ — there is no turning back.
Another story he retells quite frequently involves the president of the United States of America. An algorithm predicts, through data analysis, that a financial crisis will occur in the next few years and therefore suggests further steps to eliminate the threat. However, the algorithm can’t explain why this will happen, since humans can’t comprehend such massive amounts of data. But if the president doesn’t act, it will happen.
What should the president do?
Yuval Noah Harari addresses ideas and problems we will have to face. There is little doubt about that. His often dystopian worldviews might not come true; however, we should at least consider his arguments. I want to close this article with another thought-provoking quote:
“The basis for human success is not the truth, it’s cooperation, and it’s easier to cause people to cooperate with a fiction than it is with the truth.”
References
- https://www.youtube.com/watch?v=gdeOMzY4Ur4 (Yuval Noah Harari & Russell Brand | Under The Skin #49)
- https://www.youtube.com/watch?v=87XFTJXH9sc (Yuval Noah Harari & Natalie Portman)
- https://www.youtube.com/watch?v=6hc6TNV6F-g (21 LESSONS FOR THE 21ST CENTURY with Yuval Noah Harari | The James Altucher Show)
- https://www.gatesnotes.com/Podcast/Why-do-we-believe-lies?WT.mc_id=20201130110000_PodcastLies_BG-TW_&WT.tsrc=BGTW (Bill Gates and Rashida Jones Ask Big Questions — Why do we believe lies?)
- https://www.youtube.com/watch?v=7yhg7NmTeVg (Daniel Kahneman and Yuval Noah Harari: ‘Global Trends Shaping Humankind’)
What You Can Learn About the Future by Yuval Noah Harari was originally published in Mind Cafe on Medium, where people are continuing the conversation by highlighting and responding to this story.