
Yuval Noah Harari sees one glaring problem at the heart of our quest for an AI-led world: the incompatibility of organic humans with always-on information cycles.

Harari cautions that organic beings trying to force themselves into these 24/7 inorganic cycles will eventually fail, and bring about their own destruction. 

His solution? Clearly identify inorganic, AI-generated information, and keep a strict diet for your information consumption.

YUVAL NOAH HARARI: If humans are so smart, why are we so stupid? We have managed to reach the moon, to split the atom, to decipher DNA, and yet with all our knowledge and wisdom, we are on the verge of ecological collapse, perhaps of a third world war. And we are also developing an extremely powerful technology, AI, which might get out of our control and enslave or destroy us. Before we think about it in terms of risks or threats or opportunities, just think what it would mean if we increasingly live our lives cocooned inside the cultural artifacts coming from an artificial intelligence.

There is so much hype around AI, especially in the market. If you want to sell something to people today, you call it AI. But not every automatic machine is an AI. What makes AI AI is that it is able to learn and change by itself, and come up with decisions and ideas that we don't anticipate, can't anticipate. It's more accurate to think of the acronym AI as standing for alien intelligence. With every passing year, AI is becoming less and less artificial and more and more alien, in the sense that we can't predict what kind of new stories and ideas and strategies it will come up with. It thinks and behaves in a fundamentally alien way. This is AI, and it's not just theory; we are seeing it all around us.

I'm Yuval Noah Harari. I'm a professor of history at the Hebrew University of Jerusalem and the author of "Nexus: A Brief History of Information Networks from the Stone Age to AI."

Whenever a new technology is invented, it changes all the social and economic and political structures, and it often does so in a non-deterministic way. For instance, in the 20th century, the rise of mass media and mass information technologies like the telegraph, radio, and television formed the basis, on the one hand, for large-scale democratic systems, and on the other hand, for large-scale totalitarian systems. Before the rise of modern information technology, ancient kings in Mesopotamia or Roman emperors or Chinese emperors had a very limited capacity to collect information. They could not micromanage the social and economic and cultural lives of every individual in the country. They didn't have the information necessary to do it.

All information technologies up to the 21st century were organic networks based on our organic brain. We run by cycles. Sometimes it's day, sometimes it's night. There is winter and summer. There is growth and decay. There are times for activity, and then there are times for sleep and for rest. Think about Wall Street. Wall Street has, until today, also obeyed this organic logic. The market is open only Mondays to Fridays, and then the weekend is off. This is how organic beings function. Even for bankers and investors and financiers, as long as they're humans and not algorithms, there is always time off and there is always private time.

Until the rise of AI, even the most totalitarian regimes, like the Soviet Union, could not monitor, could not surveil, everybody all the time. The Soviet Union did not have enough KGB agents to follow every Soviet citizen 24 hours a day. And even if they had somehow managed to follow all the people all the time, they didn't have enough analysts to go over all the information and make sense of it. So organic information networks always run by cycles: there is always time to rest, and there is always a measure of privacy. But we now see the rise of a new type of information network which is inorganic, which is based on AI.
These inorganic networks need not have any breaks. They are always on, and therefore they might force us to be always on, always being watched, always being monitored. And this is destructive for organic animals like ourselves. If you force an organic being to be on all the time, it eventually collapses and dies. We see it happening all around us: a 24-hour news cycle that never rests, markets that never rest, politics that never rests. Anything you do or say at any time might be watched and recorded, and it can come back to you 10 or 20 years down the line. So basically the whole of life is becoming like one long job interview.

Now, all this is made possible by the fact that AI is the first technology in history that can make decisions by itself. Until today, all our big information networks were managed and populated by human bureaucrats. A curious fact is that, at least in the United States, there is already a legal path open for AIs to become legal persons. Corporations are considered legal persons that even have rights like freedom of speech. So, for instance, an incorporated AI can open a bank account. Corporations open bank accounts, so why can't the AI do it? It's a corporation. It can earn money, take that money, and invest it. And because it's so good at making investment decisions, it earns billions and billions. We could be in a situation where the richest person in the United States is not a human being; the richest person in the United States is an incorporated AI. So the legal and practical path to this situation is open.

What we are facing is not, you know, a Hollywood science-fiction scenario of one big evil computer trying to take over the world. No, it's nothing like that. It's more like millions and millions of AI bureaucrats that are given more and more authority to make decisions about us in banks, in armies, in governments. And again, there is good potential in that as well: they could provide us with the best healthcare in history. But there are, of course, huge risks. What happens if you can no longer understand why the bank refused to give you a loan? Why the government or the army did this or did that? When power shifts from organic humans to these alien, inorganic AIs, it just becomes more and more difficult for us to understand the decisions that shape our lives.

The biggest misconception about information is that information is truth. Most information is not truth. The truth is a very rare and costly type of information. Let's think about images and portraits. Over the last 2,000 years, people have created billions and billions of portraits of Jesus, and not a single one of them is an authentic depiction, because we have no idea what Jesus looked like. There is not a single portrait made during his lifetime, and the Bible doesn't say a single word about what he looked like. It's very easy to create fictional information because you don't need to research anything. You don't need evidence. You just come up with something and draw it. If you want to paint a truthful picture of anything, of a person, of an economy, of a war, you need to invest a lot of time and effort and money in research to make sure that you get it right. If we just flood the world with information and expect the truth to float up, it will not; it will sink.

So to deal with the era of AI, it should be clear that we cannot anticipate how this technology will develop over the next few decades.
What we need is living institutions, staffed by the best human talent and with access to the best technology, that will be able to identify and react to dangers and threats as they arise. So I'm not talking about rigid regulation in advance; I'm talking about the need for new institutions, because you can never rely on just, you know, the letter of the law, or on a charismatic individual, some genius, to do it. In history, humans have again and again encountered these problems, and it always comes back to the same solution: institutions.

- The conference of the United Nations on international organization is now convened.

- Good institutions are characterized by having strong self-correcting mechanisms. A self-correcting mechanism is a mechanism that allows an entity to identify and correct its own mistakes. You don't have to rely on the environment, on something out there, to correct your mistakes. You can correct your own mistakes. This is a basic feature of any functioning organism. How does a child learn to walk? Mostly through self-correction. And this goes all the way up to entire countries; it is the heart of democratic systems. What are elections? Elections are a self-correcting mechanism. You give power to a certain party or individual: let's try your policies. After some time, if you think you made a mistake, that this was the wrong policy, this was the wrong party, there is another round of elections: let's try something else this time.

The whole meaning of democracy is that you have large numbers of people conversing about the issues of the day. And it is no coincidence that the democratic conversation is breaking down all over the world, because the algorithms are hijacking it. We have the most sophisticated information technology in history, and we are losing the ability to talk with each other, to hold a reasoned conversation. In order to protect the conversation between people, we need to ban bots from the conversation. We need to ban fake humans. AIs should be welcome to talk with us only if they identify as AIs.

And as individuals, my best recommendation is to go on an information diet, the same way that people go on food diets. Information is the food of the mind, and more information isn't always good for you. It's actually good, from time to time, to take an information fast: we don't put anything more in, we just digest and detoxify. And similarly, we should watch the quality of the information we feed our minds. If we feed our minds with all this junk information, full of greed and hate and fear, we will have sick minds.

