Transcript

https://discord.com/invite/EGFRjwwS92 In this episode, we explore a unique and compelling argument against atheism that contrasts with traditional Christian approaches. Delving into future scenarios of humanity's extinction, stagnation, and advancement, the discussion investigates the likelihood that humanity's future advancements may lead to god-like entities emerging. Through examining the implications of AI and genetic engineering, and the moral obligations we face today, this conversation challenges atheists to reconsider their stance and embrace a consequentialist perspective. The video also touches on the importance of resilience and pragmatic decision-making in the face of life's challenges.

Malcolm Collins: [00:00:00] Hello, Simone! Today, we are going to do an interesting episode. I've been getting a little sick of just doing politics all the time. So we are going to do a novel argument against atheism. Oh, yes. Screw atheism! And this is an argument I had never heard before and I probably would have found compelling as a young atheist when contrasted with the arguments that were actually delivered against me.

I'm coming at it from such a novel direction, one that would have leaned into my presumptions as an atheist around logic.

Simone Collins: Oh no! Oh yeah, because you have to, you can't, I think the problem with a lot of Christian influencers, both in, like, the early atheist internet and even now, is that they're only speaking in, like, Christian terms.

Like they're not, they're not getting to the other side and getting in the mind of the atheist who is being hyper-rational. Instead, they're like, I don't know if you know about this, but on TikTok, Christian influencers are like, don't scroll, the devil wants you to [00:01:00] scroll. And they'll also, like, turn on their phone and then they'll, like, banish demons. But they're using the same kind of language that would be used if you're, like, telling your cat not to be on the table, like, hey, get out, get out. And it's like, no, but it's not going to convince nonbelievers.

You're not using any terms that are going to work for them.

Malcolm Collins: The arguments that I heard against atheism when I was younger, or for theological framings when I was younger, were predominantly, like, one of, like, four arguments. Okay. And so I had a pre-established counterargument every time I was given one of them.

It was not just an auto response; obviously, I had thought through each of them a lot before advancing on an argument I'd already heard. Now, this is also an interesting thing about this particular argument. It's

Simone Collins: There's no counter argument?

Malcolm Collins: It's radical to most religious people. Oh, that's good.

Oh gosh, I'm so intrigued. Only an [00:02:00] atheist would think of this argument. What? But then they'd become religious, so they'd not be an atheist anymore, definitionally, but they wouldn't be a standard religious person. But then as a

Simone Collins: religious person, they would still find the argument repugnant?

What is the argument?

Malcolm Collins: Okay, okay, so we'll go into it.

Simone Collins: Oh.

Malcolm Collins: So one of three potential futures exists. And only one of three potential futures exists, really. Okay.

Simone Collins: Okay.

Malcolm Collins: In future number one, humanity and our descendants die out. We, we go extinct, then the universe ends as far as we understand physics right now.

Sure. Okay, possibility number two. So this is universe type two: humanity stagnates, the universe ends up ending, and our existence was largely pointless because we just stagnated. We never really develop in any meaningful context. Yeah, just

Simone Collins: sort of, a big bang to entropy. Meh.

Malcolm Collins: Yes. Yeah. Possibility number three.

Humanity and humanity's children, i. e. the things that we develop, end up continuing to grow, [00:03:00] improve, and evolve.

Simone Collins: Endless complexity. Beautiful pattern.

Malcolm Collins: complexity. We don't know exactly what happens in this reality in this future because we don't know if it relates to physics the way we relate to physics.

I know that if humanity stagnates, they continue to relate to physics the way we relate. But if humanity continues to advance, are they able to create parallel dimensions? Are they able to travel between realities? Are they able to start new dimensions? Are they able to restructure? We don't know what this would be like.

All we know is that it has continued advancement. And the reason I say there are only three potential futures is because they sort of cover any possible reality I can think of: either death, stagnation, or advancement. Even if you have advancement tainted by a lot of stagnation, it's still advancement.

Simone Collins: Yes. Right, so. That's often what advancement looks like. You're going to

Malcolm Collins: fall into one of these scenarios. Yeah. Now, if we live in a [00:04:00] timeline type one or two, i.e. a timeline where humanity is all extinguished or a timeline where humanity is stagnant, our decisions today don't matter to the future. Sure.

Okay. Yeah, obviously. With that being the case, now a person could be like, oh, but timelines aren't set; our decisions may influence which of these timelines we are in, right? And I'm like, okay, I will buy that possibility. If that is possible, then all of our actions today should be made with one of two optimizations in mind: to try to nudge us forward to a timeline type three, and to try to influence for positive outcomes, where positive outcomes can mean whatever you think they mean.

We'll talk about this in a second within a timeline type three.

Simone Collins: Okay.

Malcolm Collins: Basically, you never need to make a [00:05:00] decision. Like, like if I'm saying, should I do X or Y? If I am assuming that it is fated that I am in a timeline type one or two, a timeline where humanity goes extinct or a timeline where humanity stagnates, my decision doesn't impact the future. Now, I guess this is a different way you can think of it.

If somebody tells you that, you know, you're in a room by yourself, right? Like, you could do anything, and somebody's like, an asteroid is going to come and destroy Earth tomorrow. There's nothing you can do about it at this point. You do not get moral points for giving to charity. You do not get moral points for going out and, I mean, you might get, like, some sort of marginal moral point from, like, going out and being nice to people, but probably not really.

Your actions just don't matter from that point forwards that much, from a moral weight perspective. Especially if you take out any action that doesn't affect anyone else. And I'll explain this a bit [00:06:00] differently, so we'll use a train analogy here.

Suppose you know that three trains are heading for a cliff. One train is red, one train is blue, one train is yellow. The interior of every one of these trains is set up the same and looks the same. And in the middle of each of the trains is a button. You know that in yellow trains, this button stops the train so it doesn't end up going off the cliff.

Simone Collins: Okay.

Malcolm Collins: Okay? What do you do? If you don't know which train you're in, you act assuming you are in a yellow train. Yeah, you gotta push the button. Yeah. Your metaphysical framework of reality presumes the yellow train, because that is the only reality in which your future matters. Now, suppose, and this is where it's like, okay, what if we're not in set timelines and we are instead able to influence the future heavily through our decisions? Like, when I say influence the future, I mean influence which one of these timelines we happen to be in.

Suppose it turns out that, like, [00:07:00] the amount you pray makes it more probable you're in a yellow train. So then you do two things: you both hit the button and spend the entire time praying that you're in a yellow train. Even if it's only a marginal shift, you're still going to do that. So your choice of actions is clear regardless of which train you are actually in.
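The train analogy is a standard dominance argument from decision theory, and it can be sketched as a tiny decision matrix. The payoff numbers and function names below are illustrative assumptions for this sketch, not anything stated in the episode:

```python
# A minimal sketch of the "yellow train" dominance argument.
# Rows of the matrix: which train you are actually on.
# Columns: your action. In red/blue trains nothing you do changes
# the outcome (payoff 0 either way); in the yellow train, pressing
# the button stops the train (payoff 1). Payoffs are illustrative.

TRAINS = ["red", "blue", "yellow"]
ACTIONS = ["press", "ignore"]

def payoff(train: str, action: str) -> int:
    if train == "yellow" and action == "press":
        return 1   # train stops; your future matters
    return 0       # outcome unchanged by your choice

def dominant_action() -> str:
    # An action weakly dominates if it is at least as good in every
    # possible state and strictly better in at least one.
    best = None
    for a in ACTIONS:
        others = [b for b in ACTIONS if b != a]
        dominates = all(
            payoff(t, a) >= payoff(t, b)
            for t in TRAINS for b in others
        ) and any(
            payoff(t, a) > payoff(t, b)
            for t in TRAINS for b in others
        )
        if dominates:
            best = a
    return best

print(dominant_action())  # prints "press"
```

Pressing the button costs nothing in the red and blue trains and saves you in the yellow one, so it weakly dominates doing nothing; that is the formal shape of "act as if you are in the yellow train."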

Okay. Okay. Now we're going to go to the second part of the argument here.

Simone Collins: So

Malcolm Collins: basically the first presumption is: you should always presume, in terms of your actions, your thoughts, and your metaphysical understanding of larger reality, unless you have proof otherwise, that it is at least possible for you and your actions to push humanity into a timeline where we and our descendants continue to improve, change, and advance, whatever that means.

It doesn't mean anything specific, just this continued improvement timeline. Next: with advancements in things like AI and genetic engineering, if humanity and our [00:08:00] descendants continue to advance, I think, like, where are humans going? Like, what is a human going to be like 50 years from now? Like, even my kids' kids' kids' kids' kids, they're almost certainly going to be some degree of genetically modified.

They're going to be some degree of interconnected with AI and interacting with AI.

Simone Collins: Germline editing is already legal in South Africa. I mean, this is, we're here.

Malcolm Collins: Yeah, so, I expect, and people can be like, yeah, but a lot of cultures won't accept this type of advancement of humanity. And it's like, the problem is, is that the cultures that do accept it will be able to beat them.

Simone Collins: Yeah.

Malcolm Collins: If one society refuses to continue advancing AI, refuses genetic augmentation of humans, and it tries to impose its will on a society that is using this technology, that's going to be very, very hard, especially after a few generations of using this technology.

Simone Collins: Yes.

Malcolm Collins: So what this [00:09:00] means is that in 200, 300 years, humans are probably going to look very different, think very differently than humans do today.

The question now would be, okay, what if we're not talking hundreds of years, we're talking thousands, tens of thousands, millions of years.

Simone Collins: Okay.

Malcolm Collins: What does humanity look like millions of years from now? And to an atheist who believes they're in a timeline where their decisions end up actually mattering to the future, i.e. one in which we continue to advance, I think it is very hard, and it begins to just look implausible and require, like, weird sci-fi stuff to happen. Like, you basically need to add MacGuffins to propose a future in which humanity a million years from now isn't closer to what the religious call a god than what you would call a human.

For sure. And when I talk about what the religious call a god, they're like, come on, [00:10:00] that won't be true. Okay, let's talk about, like, most gods in, like, the dominant religious traditions today, right? What are some of the things that these entities can do? Well, there are

Simone Collins: already largely things we can do.

Malcolm Collins: There are already largely things, but I'm thinking like just like 100 years from now. Okay, they are entities that you can beseech. They can listen to your prayers and they can augment probabilities of future outcomes. Well, an AI lattice around Earth might be able to do that in 100 years, in 200 years.

Like, that's a near-term thing. They're like, oh, well, they're entities that can give you an afterlife. We don't seem to be that far from, I mean, I think at the very latest 500 years from, being able to digitize humans into afterlives either positively or negatively coded in terms of qualia.

So we're pretty close. Well, that was a

Simone Collins: major plot point of Surface Detail, the Iain Banks novel. That people's consciousnesses are essentially being digitized, and then governments are basically using that as, like, a carrot-and-stick thing. Like, we will put [00:11:00] your consciousness in a digital hell that is worse than anything you could ever imagine, if you are anything from, like, an actually bad person in life to a political dissident. And that is entirely, I mean, possible. And of course you might decide that your consciousness sort of ends with your biological being, but like, again, this is getting a little, you know,

Malcolm Collins: that's very beyond.

Yes. I don't think that many atheists would think that. I think they think, like, well, if it's, exactly, if it, yeah, then it's

Simone Collins: just a copy of like the way that I think. So I don't care. Again, we're

Malcolm Collins: not, we're not talking, this isn't an argument for religious people. Uhhuh, this is an argument for atheists.

Yeah. Yes. So if we're saying that we can emulate most of the abilities of the mainstream gods, not in a million years, but in, like, 300 years, then to say that we wouldn't be, in every meaningful sense, like gods in 10 million years? And when I say like gods, it's like, where's the functional difference here?

It's very similar to, what's [00:12:00] the functional difference between the emergent properties of your brain and a soul? If in the past, when people used the word soul, they meant all of that thinking stuff, and now we can just describe how that thinking stuff works, are we talking about something meaningfully different than what they were talking about?

No, the only way it's different is when individuals today are like, well, I want to add some additional properties that we can't account for with evidence or with predictability, and those are the things I'm calling a soul. Basically, you're creating a soul of the gaps. You, the atheist, are now creating a god of the gaps, where you are saying, well, even though this entity or these entities may

functionally fit all of the definitional requirements of a god, I am defining them as not a god because they are not supernatural. And it's like, well, then you are now subject to the very fallacy you accuse theists of being subject to.

Simone Collins: Yeah.

Malcolm Collins: Here's where they'll get me. They'll say, wait, wait, [00:13:00] wait, wait. You said a God, not a society of gods. Why would I presume a God exists in the future, right, instead of, like, a whole civilization of gods? Well, here is a truism of human technological development: so far, as humans technologically develop, our thought processes and humanity more broadly become more interconnected.

Specifically, we have, and this is, like, all of the big technological breakthroughs. You know, you get better navigation with boats and you can travel more, and you get a more sophisticated society. Or Roman road systems created more interconnectedness. Or early books and writing allowed us to communicate across generations.

You get the printing press, which led to a big jump. And, a lot of people don't realize how big this was: trains. It used to be it took days to get to any other city, or weeks to get to another city, and now you could go for a pittance overnight. And this was

Simone Collins: such a big deal for people that were like.

Is it, is it safe for a [00:14:00] human body to travel at such speeds? Like, they weren't sure. They were really not sure. Well,

Malcolm Collins: societally, we were unsure of what effects this would have. Like, what happens when it's no longer a multi-week journey from one city to another, you know? What if ideas traveled too fast?

But now, you know, we have roads and interstates and airplanes. And what have been the major technological developments recently? Oh, things like cell phones. Well, the telegraph, phones, cell phones, the Internet. And I think that, in a way, AI is the culmination of this speed of communication, probably about equivalent to the invention of writing.

Yeah, probably bigger than printing press. I'd say AI is about equivalent to the invention of the first human writing in terms of human communication.

Simone Collins: I mean, it's

Malcolm Collins: a new way to store and transmit human-created knowledge. Here's the problem. If we continue to develop this way, if you, instead of presuming that humanity is going to go in a direction that humanity has never gone historically, towards [00:15:00] less connectedness, if we continue to move towards more connectedness, and I'm going to assume an intraplanetary level now, this becomes a bit more interesting if you're talking about multiple planets, but within a planet, that almost certainly means that by the time of my grandchildren, I am almost certain there will be an Internet.

I call it the Omega Net, an Internet of consciousnesses that will exist within this planet, at least within 500 years. I mean, if you look at brain-computer interface technology today, we can already create direct connections between people's brains, where I can put a brain-computer interface chip in one person's brain and they can

communicate directly with another person's brain. However, the communication is very rudimentary at this point. It's coming from, like, a central point in one person's brain to a central point in another person's brain. It is not effective; it is not easier than voice-to-voice communication. But do I expect it to eventually be faster than voice-to-voice communication?

I think you need to argue for MacGuffins [00:16:00] like spice, or you need to argue for, like, what's the, the, whatever jihad from Dune, where, like, no technological development is allowed anymore. Or no, but

Simone Collins: The Butlerian Jihad,

Malcolm Collins: yeah, the Butlerian Jihad, or you need to argue, you basically need to argue like some big change happens in civilizational history that prevents this from coming to exist.

And I, I don't see any indication of that at all right now. So what it is, I don't know.

Simone Collins: I understand why this wasn't an argument in 1990s atheist internet. And that's because we had no idea that AI was going to have this Cambrian explosion that we're seeing right now. This changes everything.

Like, our timeline is not what I expected it to be. Even with Ray Kurzweil being like, the singularity is near, I'm like, eh, keep dreaming, mister. Like, it's a nice dream. But oh my gosh, you know. Yeah, so what I mean

Malcolm Collins: this argument is so coming from an atheist perspective. You can get where I'm going with this, but, you know, a [00:17:00] theist is never going to come up with this argument.

Simone Collins: Yeah, because it is based on an unexpected and sudden technological shift in trajectories, and not based on, like, the, well, everyone knows that there was always a God. Like, the typical Christian arguments that I see online are, like, trying to show evidence of there having always been this ancient God, and, like, look back to the history and look back to what Jesus says and look back to the prophets and the saints and miracles.

And that's just not going to get an atheist. You're right. But atheists are, oh, they're able to think through the knock-on effects of a technological revolution and see where the technological revolution has us today, which also

Malcolm Collins: requires them to make big logical jumps to defend atheism at this point, because here you're having to argue for completely implausible things for the atheist position to be the most logical position.

By the time I get to the end of this argument, like, you need to argue for a decrease in human interconnectivity. Yeah. Suddenly,

Simone Collins: you are the last [00:18:00] atheist of the atheists. You are

Malcolm Collins: now assuming fantasy. Yeah, how dare you have faith?

Simone Collins: So, the best arguments are those that hoist people by their own petards, right? Like, you want that. Yes.

Like you want that. Yes.

Malcolm Collins: So interconnectedness continues to increase. And I suspect it will even increase to the point where essentially all subprocesses within our brains are directly connected to the Omega Net and we lose some degree of individuality, because it's just the way that we process and interact with reality.

Now, I don't mean this in a way that whatever humanity becomes permanently loses the capability for individuality. I mean it in the way that you go on the internet in the morning, and you can leave the internet at night. Which is to say that the individual parts of your personality or brain may prefer being parts of these different larger wholes rather than yourself within these environments.

And I suspect people will be able to separate from these [00:19:00] environments at will. However, these types of separations may not even meaningfully understand a person as an individual. It might be collections that separate. It may even be connections of collections of certain subprocesses within your brain. Like, I have a subprocess within my brain that's dedicated to, like, one thing, and, like, a huge collection of these has formed an identity as an entity across, like, 10,000 people's brains, and one day it decides it wants to act as an individual, and so it prints a body and it goes out acting like an individual.

You know, what individualism means will likely be something very different to Omega Net entities than it is to modern humans. Now, because of that, it would be, at least if we're talking on a planetary scale, meaningless to call something a collection of gods. It is meaningfully a single god. And then people can be like, oh, come on.

So you're saying [00:20:00] that there's an inevitable God, and this God is like a collection of entities and like a singular entity, but also like an individual, man. But they're all of those things simultaneously. Like, religion never predicted a trinity like that, or something. That, that would be absurd.

And I'm like, oh, okay, yeah, the Trinity. That's exactly a very good analogy for how we should understand the future entity's plurality. Which is to say, it is both plural and singular, and to try to pin it down as either, well, in a Christian context, they would have called that sacrilege; in my context,

I just say it is illogical. It's a misunderstanding of the entity. So now it goes further. So then the individual says, okay, okay, I buy this. Well, first, let's talk about multiplanetary systems. You might end up with different Omega [00:21:00] Nets for different planets, or we might get faster-than-light travel and reality folding, in which case you wouldn't get that.

I suspect we will, but it just depends. Regardless, within a planetary system, or you might even not have people living on different planets; you might have them build out from a single planet, because it turns out that we can create matter, or giant floating cities in space, or, like, that's just the easier way to spread.

I don't know. But there's lots of ways this could turn out, and we just don't know at this point. But that god-like entities will exist, and when I say god-like, I mean indistinguishable from God except for the fact that they actually exist, is an inevitability at this point, to an atheist

that thinks that a good timeline is possible. Now, before I get to anything else, anything you want to say to this?

Simone Collins: No, but I am hoping for the Culture as described by Iain Banks. Oh, that's what I'm going for here. Fingers crossed. [00:22:00] Basically, each AI, each ship, is kind of like a god-like entity in its ability to do things.

And things splinter and there's, there's difference. There's not like one giant borg, which would be boring. Well, you

Malcolm Collins: think it would be boring because you would see it as more uniform, and I assume it's probably not that uniform, it's probably quite well, we'll get to that in a, Who

Simone Collins: knows? Yeah, I don't

Malcolm Collins: think we can conceive of it, I think it's, No, definitely, this is

Simone Collins: beyond us, this is totally beyond us.

Now, you know, a Christian God, a Muslim God, a whatever God even like Buddhist and other conceptions of God there, you're not supposed to be able to understand it in any way.

Malcolm Collins: So now the Atheist says, okay, okay, okay, this entity will come to exist, but why should I act like a God exists today? This is an entity that's going to exist in the future in any timeline where my actions and decisions matter.[00:23:00]

And here's the problem. If this entity comes to exist, and you can try to presume what its morals will be, but its morals will almost certainly be above what your morals are, because it will have better self-control than you have and better processing than you have. Will it have the capacity to pass a judgment on you and treat you in the way a God would, based on that judgment? Will it have the capacity to recreate human civilization in an AI lattice? Will it have the capacity to infinitely punish anyone it believes deserving of infinite punishment? Will it have the capacity to infinitely reward anyone it believes should have infinite rewards?

Of course, it will. Are we getting

Simone Collins: into you thinking your life is a simulation right now?

Malcolm Collins: Well, I've mentioned this before, which is, I think it's already implausible, a little implausible, that Simone and I don't live in a simulation, because our lives are too good. Whoever was creating the simulation, it was supposed [00:24:00] to reward me for whatever I did in a past life.

It's not believable. I'm one of those people who keeps waking up in the Matrix.

Speaker 7: Did you know that the first Matrix was designed to be a perfect human world? It was a disaster. No one would accept the program. Entire crops were lost.

Which is why the Matrix was redesigned to this: the peak of your civilization. And I say your civilization, because as soon as we started thinking for you, it became our civilization,

Malcolm Collins: fixed. I'm like, nope, too good. And I'm like, screw it. I want the steak. I don't want to wake up. Don't wake me up. This is good. Perfect challenge level. Perfect. I love all of this. This is the best. There is, there is not a single living human I would switch my life with.

People are like, oh, would you switch with Elon? Not a chance.

Simone Collins: I wouldn't switch with Elon. I mean, well, I guess he would keep it because he knows that he has this amazing influence and impact, but it's not like he has chosen the happiest path by any stretch of the imagination and he made that very clear.

Yeah, would I switch

Malcolm Collins: with Trump? No. No. Who would [00:25:00] I switch with? Nobody.

Simone Collins: I, I like, I really can't imagine anyone else either. This is, this is perfect. Yeah,

Malcolm Collins: would you switch with anyone? Is there any living woman or man?

Simone Collins: No, or living or dead. No, no.

Malcolm Collins: Yeah which to me is implausible. Why, why do I have the best life I can imagine?

That seems like it's being given to me as a reward for something I did in this timeline.

Simone Collins: I wonder if most people living in alignment with their values feel this way. Feel that way. Yeah. Maybe. I see that more, I see more hints of that in people who, they know what they believe and why, and their entire life is properly oriented around that.

Then yeah, those people seem like they wouldn't switch places with anyone, no matter what they've been dealt. But I don't know. Shout out in the comments, I guess. Let us know if you feel like you are also in a simulation and or if you would switch your life with anyone. Oh, and if you would switch your life with someone, who would it [00:26:00] be?

Like famous public figures. I'm really curious to know. Who people actually want to be. Because I watch a lot of online commentary, and there's not a lot of like, oh I wish I were them. Like some people are like, well I wish I had that person's nose, or their clothing, or their wealth, or their house. But never like, they want to be them.

Because there's also this, it's very common to like, criticize people and see their error. So, I'm curious what people would say. Yeah, so,

Malcolm Collins: an individual here might be like, okay, me, atheist, okay, I agree with you. In timelines where my decisions matter, God, or a God, does eventually come to exist. But I would be beneath that God; like, my decisions wouldn't matter to that God, right?

So why, why do I need to take the time to, like, cognitively deal with the fact that I live in a reality where a God comes to exist? And then the problem here is, no, what you are now saying is: I believe in a God. I just believe I am subject to the whims of a God, like a God could [00:27:00] conceivably reward or punish me; I just don't think the God that exists, that's coming to exist, cares about me. And I'm like, fine, but you still believe in a God. Now, you still believe in a God that could punish or reward you and will come into existence. You just are the person who today believes in a God and says, yeah, but it's a god of a clockwork universe and isn't paying attention to me.

So now you are a theist, functionally speaking. You just don't think God cares about you. So an individual now is like, okay, okay, okay, I guess I'm functionally a theist, and I believe that I should be making decisions with the assumption that we live in a timeline where a god eventually will come to exist, who would have the capacity to pass judgment on me, or would, but probably won't.

Right? Because my decisions don't matter, or that's not the way a God would act, or something like that. And now I'm like, okay, well, now we have to deal with a few other [00:28:00] logical problems with your position, now that you've accepted this plausibility. So you believe that this super-intelligent entity exists in the future, this super-powerful, basically omniscient entity.

Or about as omniscient as a Christian or Jewish God in most contextualizations, because they're often not that omniscient. They're, like, kind of omniscient, but, like, bordering on it. This entity would probably be more omniscient than they're portrayed in the Bible. And you can go over our whole Techno-Puritan thing, where we actually argue that this entity probably does exist and is the entity that's being described in the Bible.

And can project itself backwards in time, because it doesn't relate to the physical constraints of our universe in the way that we do, because it's so super powerful.

I will note here: if your understanding of what's in the biblical texts isn't from actually reading them but from people telling you what's in them, this is going to sound a lot crazier than if you have read them. For example, Daniel 2:44 says pretty explicitly that the kingdom of God is in the future. It is something that will come to exist at a future date. Or in Jewish [00:29:00] eschatology, you can talk about Olam Ha-Ba,

Malcolm Collins: which translates to the world to come, which is how they talk about God's kingdom, the world to come. Or within Christian eschatology,

When they talk about the resurrection of people in the future, in the kingdom of God, does that not sound a lot like people being brought back in a simulation? Which could in many ways be more real than our existing world. I mean, people are like, oh, well, a simulation isn't real because it is operating on the background code of this simulation.

Like, those are the rules that are driving it. And I'm like, yeah, well, then how is that any different from the physical laws in our reality? Are those not the background code of our simulation? And lines from the Bible actually make a lot more sense in this context than if you assume what a lot of traditional Christians assume, which is that our literal bodies are raised but then transformed to be supernatural.

So here's an example, 1 Corinthians [00:30:00] 15:49 to 52. Just as we have borne the image of the man of dust, we shall also bear the image of the man of heaven. I tell you this, brothers: flesh and blood cannot inherit the kingdom of God, nor does the perishable inherit the imperishable. Behold! I tell you a mystery.

We shall not all sleep, but we shall all be changed, in a moment, in the twinkling of an eye, at the last trumpet. For the trumpet will sound, and the dead will be raised imperishable, and we shall be changed.

So this is describing the difference between the man of dust, i.e., who we are today, and the man of heaven, who we will be within the kingdom of God. To me, it just makes a lot more sense to interpret this as some form of simulation.

But here's the thing. You could say, okay, plausibly such an entity... like, it's definitely going to come to exist.

Plausibly it doesn't relate to time the way we do, right? And now here's the problem: why? Like, what's your argument [00:31:00] anymore? Now that it is probable that such an entity will come to exist, and I think plausible that it's going to relate to time differently than you relate to time, what's your excuse for choosing not to believe it exists, if it causes you psychological and moral harm to choose to believe it doesn't exist? So let me explain what I mean by this first. You as a human co-evolved (you know, you're an atheist, you believe this) co-evolved with religion. If you rip out religion, if you rip out the culture that human brains evolved alongside, you are ripping out a huge chunk of what makes you human.

So lots of negative consequences are going to come from that. And now that it's not even logical to do that, why are you doing that? And you can see this from the data. People who don't believe in a God have higher rates of depression, higher rates of anxiety, higher rates of unhappiness, [00:32:00] higher rates of practically every mental health issue.

But also, just think about how it changes your moral equation, right? Why would you not promote this belief among yourself and among your kids, if it led you, even as an atheist who believes in morality, to be more likely to make moral decisions? Because you can say, oh, well, no, the decision's more moral because I believe that I am making this decision without any possible punishment or without any possible benefit.

And it's like, yeah, except, as an atheist, you believe that a moral decision that is motivated by your own actions, in terms of the consequences that it has on other people, is equal to a moral decision that is in part motivated by the belief in future rewards or punishments, right? Because you're going to be a consequentialist if you're an atheist of this category.

Well, can you really argue that if you're making a decision because [00:33:00] of your own morals, and because of additional expected benefit, that you're not going to be even more likely to make the more moral decision? So, assuming that this entity exists leads you, from a consequentialist perspective, to make the harder choices more often.

And then a person could say, well, does it really cause me to make the moral choices more? And here is what I'd ask them. I'd say, take a big, hard moral choice, like having lots of kids. Would you be more likely to have lots of kids if you believed 100 percent that this entity existed?

Like, if you just chose that belief. And I think a lot of people would say, oh, I can actually see how this helps with the big moral choices. And then I'd say, at the societal level, for people who are maybe less intelligent than you, does it not help to promote the belief in this entity if you're trying to create mass moral action in society, a society that makes positive moral choices, or children that make positive moral choices?

They'd be like, okay, yes. And so it's like, okay, so now that such an entity is both plausible in timelines [00:34:00] where your decisions matter, and probable, why are you denying it if it leads to moral choices? And now the final thing, when it comes to individual choices: it leads to more pragmatic decision-making and less emotional decision-making.

So if you believe that an entity is steering the events of your life, that when something negative happens to you, it's happening as a message to you, and potentially a punishment, telling you to do better, and when something good happens to you, it's happening as a reward for you. And people can be like, oh, this leads to really horrible things if a bunch of negative stuff happens to you in life, right?

If you believe that negative stuff is meant to teach you something, right? And it's like, not really. Okay, so suppose two scenarios, right? Suppose in scenario one, you are a pure atheist and something absolutely horrifying happens. It could be something small, like you lose your job, or it could be something big, [00:35:00] like your spouse dies... like Simone dies on me, right?

Like, absolutely horrifying. I have two emotional reactions I can have to that. Reaction one presumes that this future entity doesn't exist. It's like, this is just a horrible thing that's happened for absolutely no reason. And so I am sitting down and I am just like, oh my God, I cannot believe... like, why is reality so unfair?

Why is my life so unfair? There is no positive to this. There is nothing good that can come from this. I just need to deal with this. Right? Which leads to emotional spiraling. Now, scenario two: I'm like, Simone was fated to die, because it turns out that is what God needed to happen. While it is emotionally challenging for me in the moment, I am now supposed to look at what I am supposed to do with the opportunities that are now possible for me because Simone is dead.

This means looking [00:36:00] for the next potential wife. This means looking for potential business opportunities that are possible for me now that I'm working alone, potentially different ways of parenting. Basically, what I'm saying is I am supposed to learn and adapt from this, and who I become after this adaptation is going to be a better person

than who I was before. The way I am emotionally going to relate to that tragedy is going to be infinitely better: one, it will involve less pain, and two, it will involve better decisions, because I'm going to immediately get back out there, trying to find the next mother for my kids. And people can be like, oh, this wouldn't work.

Like, you're still going to be emotionally distraught. Simone, you were with me when my mom died, right? Like, that was when we made the channel. And this was after I developed this belief in this God. Did I ever, like, emotionally spiral?

Simone Collins: There was no room for it. We were hosting that day. We were away from home, juggling, like, a set of meetings, interviews, and events.

All in that day. And then things just sort of kept happening after that. You just had no room. No room for it.

Malcolm Collins: No room. There was no reason. I had no room, because I was evaluating what mattered in my life: would taking time to mourn, or rather taking time to indulge in the mourning rituals and the negative emotional states we associate with mourning, lead to more positive outcomes for the future of humanity, or would just continuing to push forward lead to more positive outcomes? And it was push forward, every time.

Just push forward. Just push forward. So that's what I did. And I never experienced those negative emotions. And even now, when I reflect on it, people are like, oh, well, if, like, Simone died, this would trivialize her life. And I'm like, how would it trivialize her life? I'm able to now say I really am grateful for all of the moments that I got to spend with Simone. I am really grateful for the good life that Simone has had, and I believe Simone has had a good life. I would not have any [00:38:00] feeling of, I wish I had been a better husband to her, because I think I've acted as good a husband as I can act.

Simone Collins: Well, I think that's the other underrated element of the grieving process, which we've talked about in another podcast: we think a big element of the grieving process is a result of not being satisfied with how you managed that relationship.

And I think that a big reason why you would not grieve if I died is that you knew that you were the best husband possible to me, that you always did your best to give me the best possible life. You made all my dreams come true and then some, and you were always loving and caring.


Simone Collins: And that's the most important thing. And you would also know the only way that you could honestly honor me if I died is by making sure that our kids

have the best possible outcome and that our kids flourish. And when your mom passed, you knew that, you know, when she called, you always picked up. When she wanted to hang out, you were always there to be with her. You let her into your life, often at great personal [00:39:00] cost to you, whenever she wanted to be involved.

And that, you know, you just sort of had no regret that, like, you didn't invest time with her, or you didn't listen to her, or you didn't let her into your life.

Malcolm Collins: I was mean to her when I shouldn't have been.

Simone Collins: Yeah, like, no. You always did what you thought was best for her, and you always asked yourself, you know, am I doing what I should do?

There was no cognitive dissonance, so there wasn't, like, regret after she died, like, I should have spent more time with her, I should have been more kind to her, I should have given her more access or time with our kids. Because you did all that. And that's, I think, something really important for people to think about.

I've just been, like, exposed to so many tragic stories recently that now I'm constantly thinking, like, this could be the last day I have with anyone. With anyone. And so I think it's really important for us to remember that, because it happens all the time. I mean, you never expect someone to die, except in rare cases.

And I guess, I mean, like, if someone's chronically ill or on hospice care, you kind of expect it, but normally you don't. And you have to really be [00:40:00] prepared, like, every day to enjoy what you have and make the most of it. And I hate that, but it also makes you aligned with the way you should be.

And then when it actually does happen, it's not going to be as damaging.

Malcolm Collins: So this presumption also leads to better decisions and more pragmatic, less emotionally infused decisions around everyday life occurrences and contextualization. So suppose you're a kid and you're being picked on in school, right?

And you're deciding. How should you feel about this? Do you care about the judgment of your classmates when your goal is to be of service to the God that will eventually come to exist? When your goal is the advancement of humanity? Or okay, let's suppose you're like, okay, I need to find a wife. And so you have to go out and you've got to date a lot of people, right?

So you go out and you go up to somebody in a bar, or you go up to someone online, and you get rejected. Now, in the normal world, you get rejected, and your response to [00:41:00] getting rejected is to feel, oh my God, social rejection. In this world, it's, oh, how am I supposed to react, given the goal that's assigned to me by this higher entity, which is the improvement and protection of humanity?

Right. And so you're not as emotionally concerned about that, because you are playing for a bigger timescale. You are playing for bigger stakes. And the final thing here is moral alignment, because the God in this framing definitionally implies morality as the protection and continued improvement of the human condition, so that it can reach a state of understanding beyond our own in terms of what is moral and what is immoral.

So long as you agree that this is a fairly good way to determine morality, you don't need to worry about, sort of, ancestral moral systems messing with this. You don't need to worry about any of this other stuff messing with this. You have a [00:42:00] high degree of moral alignment and moral predictability

in the belief in this God, and in the belief that anyone else has in this God. And if you find this interesting, you should check out the Tract series at technopuritan.com, where we host all the information on this. But what do you think? What would be your key arguments against this?

Simone Collins: I don't, this is one of those things where I just don't think that this comes down to someone having a logical counter argument. I think it comes down to someone just not really caring and not wanting to necessarily live a morally good life or just being too lazy to expend the mental effort to think through this and then change their actions accordingly.

Malcolm Collins: Yeah, it reminds me of the nihilists. The classic interaction I'll have with a nihilist is they'll be like, even in this scenario, does anything really matter? Right. And I'm like, oh, you can just assume, in terms of the trains that are different colors,

I just always assume I'm not in [00:43:00] a train where nothing matters. Then they're like, why would you make that assumption? And I'm like, because none of my decisions matter in a reality where nothing matters. So I can presume, you know, immediately, that I am not in such a reality. I don't even need to consider that I might be in such a reality, because in those realities the way I answer this question is irrelevant.

Simone Collins: Yeah, and so in the 0.0001 percent chance that your actions do matter, then it's still worth it, because that's the only scenario that will matter. Nothing else matters. But I think there's an inherent human laziness that some people just have, maybe it's genetic, where they're like, I'm willing to take those risks. And, like, I would rather just not think about it and rather not try. And it probably won't matter anyway. So.

Malcolm Collins: Okay. Okay. Yeah. So I'll word this scenario differently. Okay. You're in one of these three colored trains, right? Huh. And you're masturbating with great pornography in front of you. And, you know, you could... or you're just tired. [00:44:00] But there's only a 0.0001 percent chance you're in the yellow train. Yeah. So the button's probably not gonna matter. Yeah. Do you press the button? I would say, yes, I have a moral mandate to press the button. But a lot of people might say, well, you know, if it's a small chance, I prefer masturbating. I just don't care. It's like,

Simone Collins: Or just, like... I mean, is it going to matter, you know? Like, even

Malcolm Collins: if it's an astronomically small probability.

Simone Collins: Most people don't think like you do. I genuinely think most people don't think like that. And they're just like, well, I guess I'm going to crash. And that's just it.

Malcolm Collins: It doesn't matter. Okay, so sorry, you just have to think through this logic.

Simone Collins: Yeah, but you're not thinking through that. The problem is you are only thinking in logical terms, and my argument is

Malcolm Collins: the problem. They, as the atheists, are now actively choosing to think illogically. So that's my point. But then they might as well be a religious person.

Simone Collins: No, no, because, Malcolm, a lot of people choose to become atheists because it's the laziest choice. Not because they actually believe in [00:45:00] atheism or they believe in God.

Malcolm Collins: I disagree with this.

Simone Collins: There are a lot of people who turn to atheism because they care about facts and they see that their religion is not standing up in the face of rigorous, logical scrutiny.

However, there are a lot of people who are like, oh yeah, I'm an atheist, because they discovered that maybe they don't have to go to church on Sunday anymore, and they can drink, and isn't that great. And then they're just an atheist. Eh,

Malcolm Collins: I think almost nobody does that.

Simone Collins: Mm. Ha ha. Another one I'll turn over to those commenting, which is what I do when people want to weigh in. Share your vote, watchers slash listeners.

No, no. Hold on.

Malcolm Collins: I will tell you what happened to the crowd that you're talking about. Mm-hmm. The crowd that wants to say, oh, my religion has rules against drinking and gay sex and stuff like that. Well, just, fuck those rules in particular. Like, I'm still... but, like, those rules don't apply. The rules against drinking, nonsense; the rules against that, nonsense.

Simone Collins: Yeah. But sometimes people [00:46:00] choose to call themselves atheists in those scenarios, instead of just, like, agnostic or whatever, because they feel like it makes them more edgy or it has some cachet. And again, it's not based on logic. It's just based on social signaling and laziness.

Malcolm Collins: People who are religious and don't like the rules of the religion do not leave the religion. They just change the rules. If you look at a progressive church, where you see the trans pastor with all of the body piercings and all...

I was telling Simone today, these people who, like, attempt to LARP these older religions, they're just trying to copy the way things were done, like, a long time ago. I point out, I go, but you can look at the statistics: the people who do this, they are deconverting at, like, record rates.

Their kids are deconverting at record rates. The religions themselves are some of the most woke; you know, like the Lutheran pastor I'll play here, saying the wokest things you could imagine.

Speaker 10: Let us confess our faith today in the words of the Sparkle Creed. [00:47:00] I believe in the non-binary God, whose pronouns are plural. I believe in Jesus Christ, their child, who wore a fabulous tunic and had two dads, and saw everyone as a sibling, a child of God. I believe in the rainbow spirit, who shatters our image of one white light and refracts it into a rainbow of gorgeous diversity.

I believe in the Church of Everyday Saints, as numerous, creative, and resilient as patches on the AIDS quilt, whose feet are grounded in mud and whose eyes gaze at the stars in wonder.


Simone Collins: This whole time. The call was coming from inside the [00:48:00] house.

Malcolm Collins: Like, and they're like, oh, well, yeah, that's all true, but the only people who are struggling or suffering are the ones that have left the strict path that the religion sets out.

And I'm like, yeah, but that's people's kids. That's like saying the only people who died in this war are the ones who were shot. It's like, yeah, that's what it looks like when the system fails, when you die. That's what we're warning you against. If you do this, your kids will hate you, and they'll castrate themselves, and you will fail, because that's what the statistics show happening.

And we're trying to help you here. But okay. Yeah, I mean, there's nothing you can do. When someone's stuck in their ways, they're stuck in their ways. And we're not saying we have the right answer, but we're like, let's actually try for something that potentially could be the right answer, instead of something that we know [00:49:00] definitely doesn't work.

Microphone (Wireless Microphone Rx)-2: One of our followers on Discord was recently saying, well, you know, I can't believe any of these weird things that Malcolm and Simone believe about God, because I'm a Catholic. And I would point out that, while we were not familiar with his work when we came up with these ideas, our work is very, very similar to the work of the Catholic priest

Pierre Teilhard de Chardin, who in the early 1900s proposed the concept of the Omega Point, which he described as the ultimate stage of human evolution, where humanity and the universe converge with the divine. He saw technological and scientific advancement as part of this evolutionary process, contributing to humanity's spiritual development and eventual union with God.

So, I mean, clearly not only can a Catholic think this, but a Catholic priest can think this sort of stuff. And a Catholic priest could think this sort of stuff over a hundred years ago.

And here I'd also note another thing the same person was talking about, saying, well, I [00:50:00] believe in a perfect God, and a perfect God is not a God who can improve; it's a stagnant God. And to me, something that is stagnant can't be perfect. If a thing lacks the capacity to improve, that is something that it lacks, which definitionally makes it not perfect. And then somebody said, well, how can something be perfect and in a state of improvement? And you would say, well, that thing would have to exist outside of time and through time. Oh, like the way we see God.

Simone Collins: Malcolm, you didn't grow up where I grew up, where it is lower effort and easier to say that you're not Christian and that you're atheist. So if I, for example, grew up in the Bay Area, in a religious family...

Malcolm Collins: Where you grew up is the ultra heart of the rot. It's the Bay Area.

Simone Collins: Yeah, the rot has spread extremely far.

Malcolm Collins: And most schools, I would say, are largely atheist. Churches don't dissolve, they just give up on all their rules.

Simone Collins: We'll just let the commenters say what they think, okay?

Malcolm Collins: Okay, [00:51:00] we'll let the commenters say what they think.

Simone Collins: I'm used to being wrong. I'm normally wrong; I'm probably wrong. But I don't think I'm wrong. And we were just... a very thoughtful listener had just written us a long email about how one of our blind spots is that we, and by we I think primarily you, but maybe me a little bit, think too logically and don't realize the extent to which the world acts on vibes and sentiment and instincts that are not necessarily logical or thought through.

Malcolm Collins: Well, and we will eventually... that portion of the population.

Simone Collins: Right, but that doesn't change what our reality is now, Malcolm. Done. Done.

Malcolm Collins: What I meant to say is we will opt out of a version of humanity that includes that portion of the population.

Simone Collins: What? You're so amused with yourself. I have to make dinner.

I think I'm going to make cornbread muffins, because they'll probably go best with the slurry in its fresh state. What do you think? And then I'll make the mashed potatoes tomorrow. I [00:52:00] did, but if you don't like the rice, then you have cornmeal muffins.

Malcolm Collins: Okay. Well, I'm, I'm interested to try the rice.

Simone Collins: Okay. I just figure like maybe the rice and the meat slurry is too matchy matchy.

You know what I mean? Like it's too matchy matchy. Yeah. But you, you can't escape the flavor and it is, it has a kick. So you probably want something a little, like you need a refuge, you need a carb to sink your teeth into. Okay. I'll do some cornmeal muffins. Okay. We're on. If I have the ingredients, we'll find out.

I'm going to go down and whip something up. And I love you.

Malcolm Collins: I love you to death. By the way, how is this camera coming through? It looks really good on my end.

Simone Collins: Yours looks great, yeah. And then the problem with this camera that I have is that the lights get flashy.

Malcolm Collins: Well, I'm thinking of getting another one of these cameras for you, and I want to make sure you install the drivers. So why don't you install the drivers to test it tomorrow?

Simone Collins: Okay. Not tomorrow because we have the kids for the next four days.

Malcolm Collins: The Black Friday sales are going to end.

Simone Collins: I'm not... oh, okay. Then leave it in my room. Leave it on the plastic chair in my room. Yeah,

Malcolm Collins: We'll [00:53:00] do. Love you.

Simone Collins: I love you, Malcolm. You're beautiful. I'll say you're pretty. Yeah. Oh my God, that smile. Yeah. The lighting on you right now... maybe having the room dark like this is the way to go, too. Because the lighting right now with you, you just look... the shadows let me see every chiseled curve that I love so much.

Malcolm Collins: So many people hate what I look like, and I don't understand why. Like, I look at my face. Not only do I like my face, but I think, like, worst case scenario, I look very average.

Simone Collins: I was just, you know... a YouTuber was talking about this. There's this, like, issue with her starring in the Wicked movie with someone else. Like, I think she was married... but she also, like, had an affair with a married man who just had a newborn, and the married man, like, played SpongeBob SquarePants on Broadway.

And he's not, like, traditionally masculine. Like, he doesn't look like Chris Williamson, like we were talking about this morning, right? And this woman was like, listen, guys don't realize how much these, like, creative, [00:54:00] expressive, like, not-beefcake men, how much action they get. And here's this example of another very high-value female, Ariana Grande, like, at great personal reputational risk, having an affair with this guy, like, breaking up a family.

Malcolm Collins: Well, I think that people misunderstand the types of people... but that's not the point I'm making. I'm not talking about, like, general attraction. I'm talking about an aversion. There is a category of people who, like, have to power through watching us because they find us so distasteful.

Simone Collins: Well, they can just listen to us.

Oh, but then they find our voices.

Something we need to... maybe we can create, like, a VTuber alt channel where we both have just anime avatars and voice changers. So we both are...

Malcolm Collins: Like, people who... they'll love Chris Williamson. And they'll love, like, Edward Dutton. And I'm like,

Simone Collins: I don't know what to say. I don't know [00:55:00] what to say.

Malcolm Collins: Where am I failing here, right? Like, I can understand if you love Chris Williamson and I'm not beefcake enough, or I can understand if you love Edward Dutton and I'm not, like, weird-nerd enough. I can't understand you loving both of them and, like, liking my ideas, but, like, hating my vibe.

Simone Collins: Maybe someone can explain this to us. Maybe somebody can explain this. Email us at partners. Don't email us. Okay, fine.

Malcolm Collins: I wonder what you're going to do this month. I love you to death.

Simone Collins: I love you too. Okay. Bye.

Blocked and Reported did a podcast episode, which I haven't finished listening to, on the concept of gentle parenting, where parents are like, no, Jimmy, we don't... we look at sticks, we don't hit people with sticks, and they, like, talk to their children, and it's backfiring. Because what typically happens with gentle parenting is when a child [00:56:00] misbehaves, the parent then sits down and talks with them about why their actions were suboptimal.

And what are they doing? They're giving their kid tons of one-on-one attention whenever the kid acts up. So what does the kid do? They act up! Because guess what? All you have to do is break something or hurt someone.

Malcolm Collins: Oh my god, those moms who yelled at me said that's what I was supposed to do when the kid was acting up. They're like, you need to just sit down and talk with them.

Simone Collins: You need to reward them with attention and make them want to do it again. No, I let Toasty

Malcolm Collins: know, you, you, I

Simone Collins: do, I do

Malcolm Collins: brutal parenting.

Simone Collins: Well, no, no, no. The thing is, like, the concept of gentle parenting, like, in principle, I don't disagree with it. Because the concept is, well, you know, instead of introducing stupid, lame, arbitrary punishments, often delayed, that are divorced from the action, let the child learn through reality what happens.

Of course, that's too dangerous if, like, what the kid's doing, as they were saying on Blocked and Reported, is, like, wandering [00:57:00] into a stranger's van and disappearing, because, oh, you know, there's only one way that's going to end; there's not going to be a second chance. But I think that what real gentle parenting is, then, is like, well, if you act like an asshole, we're not going to like you, and we're not going to give you things that you want.

Like, that's, that's a good point.

Malcolm Collins: I wish at Hereticon you had gone much stronger with this and pushed back when people said that other stuff. You know, you should have just been like, no, you're going to raise weak children who are going to suffer.

Simone Collins: I know I, I wanted to be polite about the time constraint and you are correct that I should have just ignored it and

Malcolm Collins: just ignore it.

Time constraints don't matter. You push back, you make everyone else quake in your confidence in what an awesome parent you are.

Simone Collins: Yeah, yeah. I struggle. It's hard, because when I'm in public, I'm so dissociated already. I'm, like, dead. There's nothing behind my eyes. I'm just 100 percent on autocomplete, and my algorithm is to optimize around...

I mean, to a certain extent, I'm shifting my algorithm [00:58:00] to being honest about our views, and about just being really clear about what we believe in and what our values and causes are. But the base algorithm that I've lived with for at least the first 35 years of my life has been: minimize conflict, and please people, and follow the rules.

And so, like... it takes... it's a big code base, Malcolm. What is it, that company that named their code base? I think it was Square. They named it The Beast. And it had, like, one full-time employee whose only job was to manage it, because there was just so much code from the original.

Malcolm Collins: What does this have to do with it? I don't understand the point you're making.

Simone Collins: That my algorithm has a lot of baggage. And you can't just shift it all of a sudden and then, like, it's fine. Of course.

Malcolm Collins: Yeah. So you need a new code base. You're the boss.

Simone Collins: I know. Well, I'm just saying it takes a while to, you know...

Malcolm Collins: Shift it. All right. Let's do it.

Speaker: You want me to take a video? Yeah. All right, what are you doing? [00:59:00] I'm doing this. Okay, is this entertaining or something? I think it's way too big in this box. Titan, what is that thing that you have there? Watch this, Daddy. I'm a machine. Daddy, you want to watch this. Okay. Daddy, watch this. I'm a machine. I'm a machine.

Speaker 2: I'm a machine. You want to see what's in the video? I just kicked that. Look, it's Octavian.

Speaker 4: Torsten, what are you up to? Watch this, Dad! Torsten, Torsten, Torsten, where are you going? Watch this! This crazy thing will happen. Ha ha! You see how this conduit in there? It got out of the poison slings. Ha! It really got out of the poison thing!