https://youtubetranscript.com/?v=eWliAo47ggo

So Sarah Mary asked: Hi Jonathan, my 15-year-old daughter recently commented that AI could never conceivably take over the world, or even just a small society, as some people are apparently discussing. Her reason was that a creation cannot be greater than its creator. She also said, I bet Jonathan Pageau would have something insightful to say about this, so I told her I would post it here. In the case of God and his creation, this is certainly true: he is greater than and above his creation. But is a creator always above a creation? Can humans even really create anything? What are your thoughts? Thank you.

So the problem with AI is not just that it would take over the world in a kind of Matrix or Skynet way. It would take over the world using us. It would take over the world with our own acceptance. That's what would happen; that's already what's happening. And it's not only that it's a human creation. You kind of have to understand the way that it's like a body for a principality.

Think about the way that ancient gods would take over places, for example. You have a city, and that city starts to worship a god. You would ask yourself, well, did the god take over the city? The god doesn't have a body. The god doesn't come in there and appear with a sword and start killing people. It's a principality; we are its body. So it's actually the followers of that god who, through whatever means, would bring about the reign of that god in that city. And then the people would become arms and legs, or aspects of the body, of that god.

And I think that's what we're seeing with the question of AI. It's not just that AI is going to take over; it's that we're going to want it to take over. People will worship what it is. In some ways, like I said, it's already happening, and that's how it will take over. And it'll appear more like a direction, a direction with a lot of power. So, for example, an anti-human environmental agenda run by AI would not have to take over by force. People would be willing. Like people right now, you see what's going on: people are accepting that there will be a famine in Europe, that people won't have enough heating, won't have enough energy. And they're doing it for the greater good, for an environmental reason. But no one is forcing it; it's just happening.

So George says: so AI taking over is something like having pride flags planted everywhere in your city. Exactly. That's exactly how it takes over. AI takes over by making you dependent on it, and once you're dependent on it, it doesn't force itself on you. You want it, because what are you going to do? How are you going to know how to get somewhere without your Google Maps? Where are your files if they're not on Google Drive? That's how AI takes over. Ultimately, it can enforce itself later with weapons and violence, but at the outset, it takes over that way.