Humans love a good apocalypse. Give us a blockbuster about a virus that obliterates the population, an asteroid that wipes out the entire planet, or anything with aliens and we lap it up. But have you ever thought about what will actually kill us at the finish line? Sure, we’d like to think the zombie apocalypse will be the winner, but if we’re talking about plausible ways to exterminate humanity, what’s a good way to go?


The end of the world as we know it isn’t all fiction. Life on Earth has come pretty close to getting wiped out a few times, actually. Genetic analysis shows humanity plummeted to perilously low numbers, about 1,200 breeding humans (yes, we are all related), roughly 900,000 years ago, probably thanks to some kind of climatic disruption. And long before us, intense volcanic activity in Siberia caused global warming and wiped out 96% of plants and animals.


Another wild time to be alive was the end of the Ordovician, when intense glacial and interglacial periods created large sea-level swings and moved shorelines dramatically. The tectonic uplift of the Appalachian mountains drove massive weathering, which sequestered CO2 and, with it, changed the climate and ocean chemistry. That wiped out 86% of living things.


And for all those gardeners out there, be wary. At one point long before humans were around, the rapid growth and diversification of land plants (AKA weeds) generated severe global cooling. This killed 75% of species on Earth. 


But life persisted. We might be inbred, but it seems total annihilation is harder to pull off than you’d think. The asteroid that wiped out the dinosaurs 65 million years ago didn’t completely destroy the Earth, and humans managed to scrape through the Black Death. So what could be our final end?


Nick Bostrom, a philosopher who's done a bunch of work on apocalyptic events, classified four types of apocalypse: bangs, crunches, shrieks and whimpers. Very sciency.


Bangs are the sudden disasters like asteroids, volcanoes or a colossal tidal wave - those Hollywood blockbusters we can’t get enough of. Crunches are when human life continues in some form, but it’s horrible and nothing gets any better. If the idea of a crunch hasn’t made you depressed enough, watch The Road. Trust us, you don’t want the crunch.


Shrieks allow for some form of post-humanity but there’s an extremely narrow band of existence. A chosen few go on to become elite and powerful but there are vastly fewer of us. Think living in a TV series like The Peripheral and Altered Carbon - we don’t love it. 


Then there are whimpers, where a post-human civilization arises but life gradually, irrevocably, evolves into a state where everything we value disappears. Civilization as we know it no longer exists. It could be horrible, or it could be interesting like the recently released show Fallout.


Life on Earth has weathered volcanoes, asteroids and drastic climate change and made it out the other side. Could nuclear war be our final end? Or what about a geoengineering backfire where some billionaire blocks too much of the sun? Oops, slight miscalculation! 


Of course, there’s the pretty legitimate threat of AI, robots getting way too powerful and taking over, or being way too dumb and making stupid decisions about nuclear bombs and whatnot. Speaking of technology, what if there wasn’t any? A large electromagnetic pulse or solar flare could literally take out all of the computer chips in the world. The impact of that would be catastrophic, to say the least. 


It’s fun to think about the bangs and crunches, but according to Ethan Mollick, Associate Professor at the University of Pennsylvania, what’s likely to happen is the boring apocalypse: no single cataclysmic event, merely a series of smaller, cumulative disasters like climate change, resource depletion and geopolitical instability - kinda like what’s happening right now (gulp).


But if an apocalypse has to happen, we want the interesting kind where we re-imagine ourselves. The whimper sounds like the best pick of the bunch.

 
 
 
    [00:00:00] Will: One way of defining when humans started is when we started fire, you know, controlling things. 

    [00:00:07] Rod: That's when we became human. 

    [00:00:09] Will: That's one way. It's one way of thinking. So fire, but what it gives us is a little bit of a marker of when did we start becoming us? And it might be about a million years ago. Possibly up to 2 million. Now, over that time, there's been a lot of human development. You know, a few things are a bit different from a million years ago. But during that time, during that million years, three times, some people reckon, we've come close to the brink. Not just stuff getting a bit hairy, but things getting really "we're fucked here" bad. We were almost done. 

    [00:00:40] So there's debate on all of these, but some of them are a bit stronger. Some of them, you know, maybe not, but the first one seems to have been somewhere about like 900,000 years ago, give or take a few tens of thousands of years. 

    [00:00:51] Rod: Rounding error 

    [00:00:51] Will: Before that time, there weren't heaps of us, but there was a few, there was like 98,000 of us breeding individuals. I don't think that's counting children yet or old people. 

    [00:01:01] Rod: It really shouldn't. Have you met my breeding individuals? They're your children and they're four.

    [00:01:04] Will: And so sometime around then, it seems there might've been some sort of climatic interruption. Don't know quite what. But we lost almost 99 percent of the population. 

    [00:01:16] Rod: Almost 99.

    [00:01:17] Will: Almost 99 percent of the population. We came down. This is through genetic analysis. We can see this came down to just 1,200 breeding humans. 1,200.

    [00:01:28] Rod: So, so humanity is inbred. 

    [00:01:32] Will: Yes, we are. But we went from, you know, 100,000 people down to just over 1,000 people. So that was almost a million years ago. 

    [00:01:42] Rod: Easy to get a park. Less people to, I don't know, stock the shelves. 

    [00:01:46] Will: That was the first one. There's another one that they reckon happened maybe 70,000 years ago, so much closer. This one might be associated with a volcano that erupted in Indonesia. 

    [00:01:56] Rod: That recently though. 

    [00:01:57] Will: Yeah, that recently. Now, some people are a little bit more skeptical of this, but it can be seen in the genetic record, and at that point we might have dropped down to possibly 3,000 surviving people. So again, you know, population gradually builds back up and drops down to 3,000. 

    [00:02:13] Rod: Let's inbreed again. 

    [00:02:14] Will: The most recent one, not as big, but maybe about 5,000 years ago. So this is, you know, 

    [00:02:21] Rod: that's yesterday 

    [00:02:22] Will: not in living memory, but it's yesterday. Oh, and we've got writing. I think, I don't know when writing starts, but it's not far off that

    [00:02:28] Rod: but it's the first time someone wrote down the word. 

    [00:02:30] Will: This one, we don't have full evidence of what quite happened and it doesn't seem to be a huge drop in population, but it is something interesting. A huge drop in male genetic diversity, where suddenly 

    [00:02:44] Rod: so we are one 

    [00:02:45] Will: It seems like where there was, you know, a lot of breeding between the men's and the women's of the time, suddenly we get this huge drop in male genetic diversity.

    [00:02:52] Rod: So for a while, just the women bred? 

    [00:02:54] Will: I think so. And not with the men. Well, it may well have been, it was just Keith was more successful. 

    [00:02:59] Rod: I want ya. And this is like, I got two mates and I'm better. 

    [00:03:05] Will: When I was looking at this, what they call the human bottleneck. So these are times when we as a population have lived through an apocalypse, basically. We've gone through a time when something has gone really bad. The world's gotten shit. And for a very long time, we were post apocalyptic and it's happened one time, maybe twice, maybe close to three times in human, not quite history, but human time. And that got us thinking, could it happen again? And if it could happen again, What would be your favourite version?

    [00:04:01] Welcome to the Wholesome Show. 

    [00:04:03] Rod: The podcast in which a couple of academics, who obviously, we've now worked out, are exactly related, knock off early. Grab a couple of beers. 

    [00:04:11] Will: And dive down the rabbit hole. I'm Will Grant. 

    [00:04:15] Rod: I am Roderick G. Lamberts. And we are not the same human. We are both related to Keith. 

    [00:04:21] Will: We are all children of Keith. 

    [00:04:23] Rod: Finally, my name for my cult. Children of Keith. It's on. It's on. I knew it'd come one day. I didn't think it'd be today. This is so good. 

    [00:04:37] Will: All right. So 

    [00:04:38] Rod: the t shirts, I can already see seven designs. 

    [00:04:41] Will: Listener, you might not like the idea of everybody dying, but we love it as a species. Like we look, there's a movie or two or three where a lot of people die. Like there's movies where we were looking at the moment of the dying and also the moment after the dying and where it all gets bad and fucked up. 

    [00:05:01] Rod: And then how we continue to live, if you could call it living. I got a question about Apocalypses. Is it Apocalypses or Apocalypses? 

    [00:05:09] Will: Well, I went with Apocalots. Like if there's lots of apocalypse, it'd be an apocalots. So we're gonna in a second dive into, you know, what are some options out there? What might shit be like before, you know, because, 'cause we's from the sciences, we gotta do some factual stuff first.

    [00:05:24] Rod: First physics 

    [00:05:25] Will: I will Come to physics 

    [00:05:27] Rod: I've often come to physics, carry on. 

    [00:05:28] Will: Well, first of all, there's been as well as, you know, those ones I mentioned in the human time where we humans have suffered a lot. I went and had a quick look at mass extinctions in the past.

    [00:05:38] Rod: Oh, your Northumbrians 

    [00:05:39] Will: no these are the times when not just humans, but the whole planet has gone through apocalypse 

    [00:05:45] Rod: That's true, we're either all keith's or trilobites. 

    [00:05:47] Will: Yeah, we're children of Keith, children of the trilobite. Oh, that's a band, the children of the trilobite. 

    [00:05:52] Rod: Who work in the children of Keith. 

    [00:05:54] Will: No, so there's been five mass extinction events over the last 500 million years. There might've been more before that, but (a) there weren't as many species and (b) they didn't leave video. So we don't know. But what do you reckon might've caused those five? 

    [00:06:10] Rod: What do you call them? Asteroids. 

    [00:06:11] Will: No, one out of five 

    [00:06:12] Rod: Then there was the virus the lizard virus, 

    [00:06:16] Will: So, the most recent one was the 65 million year ago one, killed the dinosaurs. 76 percent of species died. So it's the second smallest of the extinctions. 

    [00:06:25] Rod: 76 percent is second smallest. 

    [00:06:27] Will: Yeah, that's the second. And that was the asteroid impact in Mexico. So that can clearly fuck shit up. That is a species fuck up level of event. The other thing on that remembering is that it might've taken a fair while for a lot to die, but a lot died very quickly in that one. Like within hours, like 

    [00:06:45] Rod: that's weird, isn't it? Planet wide within hours. And they didn't even need nukes. 

    [00:06:51] Will: You got any guesses for what the others might've been?

    [00:06:54] Rod: So there's only one asteroid. 

    [00:06:55] Will: One, one, one of the five were asteroids. 

    [00:06:57] Rod: Shift in the earth's axis. 

    [00:07:00] Will: Oh, I don't think so although 

    [00:07:03] Rod: magnetic field reversal. 

    [00:07:04] Will: Oh my God. That would be no. Okay. So, the first one, 444 million years ago, at the end of the Ordovician. 86 percent died from it. It's the second biggest. Intense glacial and interglacial periods created large sea level swings and moved shorelines dramatically, and then tectonic uplift of the Appalachian mountains created lots of weathering, sequestration of CO2. I don't know. It seemed to be, yeah, like climate change, but driven by the plates moving a lot more. I don't know. I don't quite understand what 

    [00:07:39] Rod: hadn't been bedded down yet. Okay. 

    [00:07:41] Will: Two were mega volcanoes. 

    [00:07:43] Rod: Yeah, those are the ones that, that, that's probably the one that makes me the most kind of edgy. It's not easy to duck out of the way of one of those. 

    [00:07:51] Will: So, the big daddy. This is the end of your Permian, 250 million years ago. 96 percent of species. Like that is a cleansing. That is some global cleansing. Like I know that this is before God was really doing shit.

    [00:08:07] Like this is before Noah, like, but it's 96 percent is a Noah's ark sort of thing. We are brushing the species off into the ocean. 

    [00:08:14] Rod: That's a dose of salts and a large amount of fiber in all gone. 

    [00:08:17] Will: Yeah. Intense volcanic activity in Siberia caused global warming, elevated CO2, all of that. You know, basically it's like you get a mega volcano and so much shit gets spewed into the atmosphere. Like, is it really climate change or is it everything is so bad? 

    [00:08:32] Rod: As a warm, cuddly blanket. 

    [00:08:34] Will: And the other one was the beginning of the dinosaurs, 200 million years ago. 80 percent died. So not the worst, but this is a weird one. Underwater volcanic activity in the middle of the Atlantic.

    [00:08:45] Rod: So they were killed by bubbles? 

    [00:08:46] Will: Yes. No this actually is, some people have called this the giant burp. 

    [00:08:51] Rod: Oh, seriously? This isn't good. If I'm going to make a joke and it's real, where's the fun in that? 

    [00:08:55] Will: From the middle of the Atlantic, like a huge release of gases that changed the chemistry of the oceans and atmospheres and a lot died.

    [00:09:05] Rod: 80 percent did you say? 

    [00:09:06] Will: Yeah, that was 80%. 

    [00:09:07] Rod: Seriously. So something went baboom and then chemicals. 

    [00:09:11] Will: Yeah. I got one last of the big mega extinction events. And I like this one because it's like, but they're meant to be the good guys. Land plants. So 360 million years ago, 75 percent of all species died because of the rapid growth and diversification of land plants, 

    [00:09:28] Rod: weeds.

    [00:09:29] Will: No, just, well, yeah. Weeds killed by weeds. We didn't have gardens back then, but basically plants started growing massively on land, changed the CO2 dynamics radically, and suddenly you get a rapid and severe global cooling. 

    [00:09:41] Rod: Okay. And so I assume creatures pre that who are really into CO2 got starved as well. Would that be right? Plants would have absorbed it and shut out. 

    [00:09:49] Will: Let's say it's all changing so radically, like you suck up all the CO2. 

    [00:09:52] Rod: So your couch grass, your willows, and that Paterson's curse, purple flowers, boom, killed everything. 

    [00:09:59] Will: But it does say, it does say, you know, clearly climate change is big in here, but asteroids, volcanoes and other. 

    [00:10:07] Rod: it's fair to say I've changed the climate in a few rooms independently. So let's make it planetary. No surprise. 

    [00:10:14] Will: So Rod and I, we've been exploring this week, just going, okay, well, what's a good, what's a good apocalypse? 

    [00:10:19] Rod: There are so many. I think about 15.

    [00:10:22] Will: When I was looking through, there was some nice classifications. You just said plausible. That's what, you know, like how likely something is like, clearly asteroids have happened. Volcanoes have happened. They're likely in the future. There's some that we ran across the plausibility is low.

    [00:10:35] I did see this other classification that was nice. This was from Nick Bostrom, who's a philosopher who's done a bunch of work on apocalypse events. And he classified four types of apocalypse. He's got the bangs. Which this is your sudden fast disaster deliberate or accidental, but this is your asteroids, your super volcanoes 

    [00:10:55] Rod: mutually assured destructions.

    [00:10:56] Will: Yeah. Your nuclear. So, a bang. So you could go out in a bang. It's literally over in, could be hours, could be, you know, in geological time, a couple of years or something like that. Then you've got the crunches. This is where the potential of humankind to develop into post-humanity is permanently thwarted.

    [00:11:14] So we can't go any further, but life continues in some form. Human life continues in some form. Well, I think what this means, and I've got some examples. This is like, Shit's just horrible. We've lost most people. We've lost civilization. But there's a few people around eking out life and nothing is getting better.

    [00:11:34] Rod: I'm thinking of the movie. 

    [00:11:36] Will: Yeah, it's that one. It's that one. 

    [00:11:39] Rod: That movie, if you're not feeling suicidal enough or at all. Watch that movie. 

    [00:11:44] Will: All right. All right. That was The Road. Oh, we'll come back there. But I think I've got a movie for each of these. So I was thinking that the bangs is like Infinity War. It's like Thanos snaps his finger and like, it's just in an instant. And of course, plenty of others, like the Mayan calendar one. 

    [00:12:00] Rod: I love that because, you know, one of the reasons I like disaster movies: fucking colossal tidal waves. I love a colossal tidal wave. It's just amazing. You see a cityscape and then there's a wall of water, like eight times. Do you know that movie in the Himalayas? 

    [00:12:13] Will: Well, do you know how big the tsunami was after the asteroid impact that killed the dinosaurs? Yes, it lands in Mexico and the tsunami. I don't have the height in front of me, but I know that it went across nearly all of continental North America.

    [00:12:27] Rod: Just go to higher ground. You heard the siren.

    [00:12:30] Will: I don't know. It might've been getting up to the Rocky Mountains or something like that. Don't quote that one. But it went thousands of kilometers. 

    [00:12:37] Rod: To be fair, and this is the bit I didn't think about until I narrowly avoided my first tsunami not going to Bali, many years ago. What we forget is, I'd forgotten, they don't have to be fast to fucking kill you. A huge slow moving body of water, if it's a little bit faster than you, and only has to be a little bit, you're probably in trouble. 

    [00:12:54] Will: So we've got the bangs and the crunches. So the bang is very fast. The crunch stops humanity from going any further, and we're probably eking out a shitty existence.

    [00:13:02] The shrieks. Some form of post-humanity is attained, but it's an extremely narrow band of what is possible and desirable. Like a few of us go on to become, you know, this super powerful, super whatever, but there are vastly fewer of us. I was thinking here, The Peripheral was a recent example. Like there's only billionaires left. 

    [00:13:24] Rod: Altered Carbon was kind of moving in that direction with the super serious people and then the underclass 

    [00:13:29] Will: I think so. And I think, you know, we've talked about this before about should people be able to live forever? There's some sort of version of that where some people get to live forever and no one else exists. And so that's the shriek. 

    [00:13:40] Rod: I think living forever could be fun or at least having a crack. 

    [00:13:42] Will: And look, it is saying that the species might continue. The last one is the whimpers: a post-human civilization arises, but evolves in a direction that gradually, but irrevocably, leads to either the complete disappearance of things that we value, or a state where those things are realized to only a minuscule degree.

    [00:14:02] Rod: Is that called the great meh? 

    [00:14:03] Will: No, I would call that the great interesting 

    [00:14:07] Rod: slow and boring is interesting? 

    [00:14:08] Will: No, not necessarily boring. This to me is Fallout, the recent TV show. So this is where civilization as we know it doesn't exist. Humans are doing stuff. 

    [00:14:18] Rod: Different types of humans doing different types of things and then working out how we interrelate or don't 

    [00:14:23] Will: or don't or whatever it is. So it's interesting. So of them, I mean the bangs where everyone dies, that doesn't sound great. The crunches where shit gets horrible, also not great. 

    [00:14:34] Rod: I see no reason why the bangs and the crunches can't go hand in hand. 

    [00:14:37] Will: If we have to have one I'm leaning towards the whimpers. 

    [00:14:40] Rod: I'm sure I've talked about this in other episodes. I have epic dreams in general. I tend to have huge monster dreams. It's crazy. Like I wake up and I feel like I've seen three movies and it's like, whoa, that was amazing. And one of the repeating ones or variation on the theme is it's, pretty close post apocalypse. But I find it fascinating. 

    [00:14:56] Will: It's 10 years after the event 

    [00:14:57] Rod: or it might even be just like, it's more like the slow moving when we ah, shit's starting to collapse now. Okay. Let's get things together and sort out what might happen. And I'm, I always wake up like energized because it's fascinating. I'm like, okay, so we're going to do this. We tried that. We went over here and we did it. 

    [00:15:11] Will: We got tasks. 

    [00:15:12] Rod: Yeah, exactly. It's like jobs. It's like playing a game. And I literally, I wake up from every one of those. I'm like bit scary, but damn, I woke up interested. 

    [00:15:20] Will: So this is the possibilities for everything could end. What do you got? What'd you find? 

    [00:15:25] Rod: I've got some obvious ones first. So like, the virus gig is an obvious one, some kind of virus bug that spreads faster and weirder than the ones we've already seen. And we handle it even better than we did with the most recent one, aka 

    [00:15:38] Will: Well, I mean, going back there, the Black Death, the plague, bubonic plague, killed a third of Europe. So, yeah, shit was not as good back then. 

    [00:15:45] Rod: But also shit didn't spread as easily back then. 

    [00:15:47] Will: That's true. So that's true so we can kill significant numbers. 

    [00:15:50] Rod: Oh, we'd be much better at it now. A Black Death now would really be impressive. And then you've got the enclaves. There's a lot of post-apocalypse stories about that: the survivors who, for no overt reason, are immune, with no other common characteristics, and then you watch how those societies somehow work themselves out. So you've got very well equipped people, terribly equipped people who love it, and then it's like, all right, we got to reimagine. 

    [00:16:15] Will: I mean, obviously coming out of the pandemic, it's the one that we would all think about the most. And clearly super plausible. Obviously it'd have to be a pretty precise virus, or bacteria or some disease like that, to be spreadable enough. I saw some people saying, okay, what if AIDS was as spreadable as the common cold or something like that? But, you know, it's that combination of how fast you die from it, how fast you spread it, how guaranteed is the death 

    [00:16:44] Rod: and the quick dying ones don't spread as well.

    [00:16:46] Will: Yeah. Yeah. You've got to spread while you're 

    [00:16:48] Rod: yeah, you got to keep your your host alive and suffering for as long as possible. Otherwise you don't reproduce 

    [00:16:54] Will: Bunch of books and movies on that. Like, The Stand was a super virus. Station Eleven, I think, was like a super flu or something like that.

    [00:17:03] Rod: Outbreak obviously, but it wasn't, it wasn't quite an apocalypse. 

    [00:17:06] Will: All right. Well, you've got, you've gone virus. 

    [00:17:09] Rod: Well, I thought I'm starting more obvious. 

    [00:17:10] Will: Alright, well, I'll go to super volcano, because we've done that, and we know, you know, clearly, it happened in the past, you know, not too many millions of years ago.

    [00:17:20] Yellowstone, but there's other potential super volcanos. I saw something saying Krakatoa may erupt again or something like that. 

    [00:17:27] Rod: Yeah. We got through it the first time. 

    [00:17:28] Will: That was only a big volcano, not a super volcano. 

    [00:17:31] Rod: Yeah. The caldera at Yellowstone park is like 8 million volcanoes woven together. It looks like it's just one giant zit hole, ready to go. I can't relax. Cause what do you do then? What do you do? They go, okay, our technology for predicting volcanoes has gotten way better. We can do it 20 minutes ahead of time. 

    [00:17:52] Will: I think the only thing, depending on the amount of time. And so, you know, the asteroid is the other comparative. We go, okay, cataclysmic event, but we can probably see an asteroid. I wouldn't be surprised if in most likely scenarios, a world ending asteroid has to be pretty big. We'd probably like to see that a couple of years out, maybe a decade out. Super volcano? You might not get any 

    [00:18:12] Rod: no, there could be one about to burst now. 

    [00:18:14] Will: Oh, you could go to the moon or something like that. 

    [00:18:15] Rod: Not quickly. You might get blown there. 

    [00:18:18] Will: The thing that I think about with the super volcano, cause that is the underpinning for the disaster in the road. And the road is the most depressing post apocalyptic movie.

    [00:18:27] Rod: It's impressive. 

    [00:18:28] Will: There's no one left except people that eat each other 

    [00:18:30] Rod: Oh no, and the child lives on. One of the first movies, maybe the only movie I've seen where, don't worry, the kid survives. And you're like, was that good though? Am I happy for the kid? The kid could have died so much more quickly and sooner. Oh, sorry. I've got to say spoiler alert. 

    [00:18:45] Will: There's spoilers in this episode, but that's a classic example of a crunch. Like everything collapses down. And the one I've been thinking about is that's a scenario where the food pyramid eats itself up, which would probably happen. You know, we rely on photosynthesis for energy to come into our food system.

    [00:19:01] That's the thing. That's the thing. And. If there's no photosynthesis anymore, then the animals will eat all of the plants until they're gone. 

    [00:19:10] Rod: We'll eat all the animals until they're gone. 

    [00:19:11] Will: We'll eat the animals until they're gone. And so it's like apex predators last the longest, but they will be forced to eat themselves at the very end.

    [00:19:19] Rod: Yeah, little lion on lion action. 

    [00:19:20] Will: Oh my god. Eat up the food chain. That scares me entirely. 

    [00:19:26] Rod: It's a horrifying. 

    [00:19:28] Will: And you know, like, like we can talk about that in the abstract, but you know, it's literally represented in their pantry. You know, you start with some food and then you have less food. 

    [00:19:36] Rod: Then you got like, Hey, let's invite the neighbors over for dinner. No, I don't want that. I don't want a super volcano and I don't want to, was that the crunch? 

    [00:19:47] Will: That's a version of the crunch. Yeah, definitely. What else you got? 

    [00:19:51] Rod: Well, I got obviously nukes, because it's easy to ignore it, because we've been worried about other things since nukes originally, you know, the 70s particularly, et cetera, the 50s, 60s, 70s.

    [00:20:02] But you know, the increase in, let's say, strongmen taking over countries with, you know, strong arsenals. I don't think it's at all off the table that someone's just going to fuck up. Someone's just going to do something fucking stupid. Like, yeah, some smallcock man wants to prove his power and, you know, reclaim Russia's glory or make America great again.

    [00:20:23] Will: But you know, we have come perilously close. Like, there are stories out there about literally, you know, a false alarm in a Soviet system, an auto-fire Soviet system, and we were preserved by the actions of one hero, one person who said, ah, no, let's double check.

    [00:20:38] Rod: This seems like incorrect. And you're like, oh, thank God for you. One fucking man who had the nuts. 

    [00:20:43] Will: I think there's another story as well of, I don't know if this was American or Soviet, but nuclear submarine and when they don't get their comms check in with naval headquarters, every so often, they have to assume boom.

    [00:20:53] Rod: But you need, like, I mean, I think that's an upside, but like some people say that it's only one guy is a horrifying thing, but actually they are the ones who are going to say no, you know, the big boys in the bunkers in Moscow and Washington, et cetera, because they are remote.

    [00:21:06] The dudes were sitting in the submarine going, well, we're underwater. We don't know what's going on. And when we do this, boom. And when we surface, we got Nevil Shute, On the Beach. I don't, have you seen versions of that? 

    [00:21:15] Will: On the Beach. No, I haven't. But this is the one that's set in Australia 

    [00:21:19] Rod: because we were the last country that the radioactive cloud hit. So it was both boom and slow creep. And so there was a TV series. The last one I saw was years ago. Brian Brown was a star of that show, and he was youngish, so it's quite a long time ago. But the government just starts to issue these, you know, safe and user friendly ways for families to slowly suicide.

    [00:21:38] And they just literally now, and then you'd see a screen and you watch the radioactive cloud just slowly creeping down. So people, they know they're fucked. It is coming. It is going to nuke your entire environment's going to be irradiated. There's no question. How do you spend the time? how do you end the time? Do you wait? 

    [00:21:53] Will: You work on your poems. 

    [00:21:56] Rod: The funniest thing in that one is Brian Brown. He ends up, what is it, he gets in like a Ferrari or something that he's nicked and he just drives really fast. And then I remember a close up on his face and him kind of going, and then he gets airborne and hits a billboard or something and dies. That's how he suicides. But I get it. I mean, like, oh, fair enough. For me, the creepiest part is, you know, when the family of three, can't we lie down on the bed and take the pills together and just hug, and they die, and I'm like, I don't know. 

    [00:22:17] Will: I ain't a veteran of a post apocalyptic world, but I feel like you try.

    [00:22:21] Rod: I would think so too. But eventually to me, that's the creepiest when it's just like, okay, kids, we're just going to take your pills and lie down together. And I'm like, it's different fighting. 

    [00:22:31] Will: All right. Okay. I'll give you another one of the classics. Yeah. Obviously it's the bad robot of various sorts. 

    [00:22:37] Rod: So many types.

    [00:22:38] Will: You know, we've got, you know, the bad AI in the sense of it was good intentions, but badly programmed, you know? So it turns us all into paper clips or some version of that. 

    [00:22:47] Rod: Oh, yeah. The gray goo. 

    [00:22:48] Will: Yeah, gray goo. So, you know, a nanotech-scale robot that can replicate and its programming is just to keep replicating. Or potentially you've got the supervillain-type AI, you know, it doesn't like us anymore. Like the Terminator sort of scenario, sees us as a threat.

    [00:23:01] Rod: Do you know what I was thinking of on that though? What if you get the dumb AI? Like they're just a fucking dickhead who's just a borfing idiot, because they evolve intellectually really quickly in some ways, knowledge and so forth, but in terms of function and thinking about stuff, whatever that constitutes, and pondering consequences, it's a giant child with huge power

    [00:23:18] Will: and we put it in positions of power 

    [00:23:20] Rod: or just stumbles into it, like kind of a blundering dummy. Like giving the nuclear codes to a three-year-old who's like, I don't mean to cause harm, but oh. That's the AI that really worries me.

    [00:23:33] Will: Yeah. Well, look, that's the one that your ChatGPTs and the Microsofts and the Googles are saying is the big threat. So you gotta take that one seriously at the moment.

    [00:23:40] Rod: And the other evidence we have on that, the hallucinating AI. So the AI that doesn't mean any harm, but it gets dumb information, bases it on bullshit, and goes, oh, I'm about to be destroyed. You're like, dude, no, no one

    [00:23:51] Will: What else you got? 

    [00:23:53] Rod: Oh, yeah, running out of energy, related to AI. Not because we can't generate it, but because we get to a point where, if an AI is getting really excited, we know how much energy is sucked out of server farms, etc. They could accidentally fuck themselves too, because we get to a point where we just can't grab enough energy quickly enough.

    [00:24:08] Will: So, I mean, one simple version of that is we may remain super constrained by the limits of physics. And so interstellar travel is something that you can only do slowly, you know, to get to the nearest star is going to take two or three hundred years or something like that. And so we remain locked around this sun. And so you can work an idea that there's a limited amount of energy that you can capture from the sun. You put solar panels all the way around the sun. You capture everything.

    [00:24:33] Rod: That's Dyson spheres. 

    [00:24:35] Will: That's it. And so potentially that's all the energy you've got. I feel like we'd still be all right with that amount of energy.
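    Will's hunch here checks out on the back of an envelope. A rough sketch for scale, where the solar luminosity, Earth's intercepted sunlight, and current human power use are approximate, well-known order-of-magnitude figures, not precise data:

```python
# Rough order-of-magnitude figures (approximate, for scale only):
SUN_OUTPUT_W = 3.8e26       # total power the sun radiates, watts
EARTH_INTERCEPT_W = 1.7e17  # portion of sunlight that actually hits Earth
HUMAN_USE_W = 2e13          # roughly current average human power consumption

# A full Dyson sphere captures everything the sun puts out.
dyson_headroom = SUN_OUTPUT_W / HUMAN_USE_W
earth_headroom = EARTH_INTERCEPT_W / HUMAN_USE_W

print(f"Dyson sphere: roughly {dyson_headroom:.0e} times current human use")
print(f"Earth's sunlight alone: roughly {earth_headroom:.0f} times current human use")
```

    So even just Earth's slice of sunlight, let alone a full sphere, leaves thousands of times our current consumption, which is Will's point.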

    [00:24:41] Rod: But then again, I mean, if you told people 150 years ago, we're going to have so many cars that would actually potentially impact the planet. 

    [00:24:47] Will: And well, here you go. And so someone decides to make Bitcoin with all of that energy and there's nothing left over for the rest of us. 

    [00:24:54] Rod: So that's also partially that sort of big bang thing where you could flip over where suddenly you're like, Oh Jesus Christ. It's not that we don't know how to do it. We just can't do it fast enough and it gets sucked out of the system. 

    [00:25:04] Will: Okay. Oh, that's cool. 

    [00:25:05] Rod: Yeah. That's my own. I was just pondering that on my own little brain. 

    [00:25:08] Will: Oh, that's nice. I'm going to go for some weird ones now cause I like some of these well, this one's not super weird, but this one's a, this one's a super plausible and it sort of connects with your no energy. There's one version of electricity gone. Cause I can't quite see how that would necessarily happen.

    [00:25:25] You could lose a grid, but the more likely one is computers are gone. A large EMP, electromagnetic pulse, or a solar flare could literally take out all of the computer chips in the world

    [00:25:40] Rod: that would have an effect 

    [00:25:41] Will: that would have an effect. You think, I mean, every power station is down. Every car is gone. Every communication facility, every plane, suddenly we're in a radically different world. Yeah. Like I think, you know, in some senses that's one of the simplest. 

    [00:25:55] Rod: I saw that on solar flares, actually, you might've caught this: the largest one on record, the Carrington event.

    [00:26:00] Will: Yeah. In 1880 or something. 

    [00:26:02] Rod: 1859. Yeah. It was enormous. Basically it ignited telegraph cables and things, and you could see the Northern Lights as far as Cuba.

    [00:26:13] Will: As far as Cuba! Cause we just had some, whenever you're listening to this, some big Southern Lights. But as far as Cuba! Cuba is close to the equator.

    [00:26:20] Rod: Yeah. So they were saying it went off, but this happened in 1859, not a lot of chips. There's only three. But the idea that the same size event, if it could incinerate or ignite telegraph cables, what the hell would it do with the stuff we have now?

    [00:26:36] Will: And you know, A, we have very little we can do to predict it, I don't think these things are very predictable much in advance. And B, our hardening against it is very small. I know that there are some military chips that are hardened, like you can protect them with lead or whatever it is. But that is like 0.00001 percent of computers.

    [00:26:54] Rod: And you'd imagine that. So all that functional day-to-day infrastructure's gone, but it's cool, we can control nukes. And you're like, sweet. Well done, guys. Congratulations. We protected what matters. So yeah, that's a real cracker, because if we get another Carrington event with modern tech? My God, amazing. I almost want to see it because I'd be fascinated to see how strong it could be. Actually, most of these. I'm curious.

    [00:27:16] Will: Surely. No one says I want everyone to suffer through it, but anyway, it would be an interesting thing if you could move your body throughout history to your moment of death and go, okay, I get to choose where it happens. All right. What else?

    [00:27:27] Rod: Oh, yeah. Obviously the bees. Or pollinators, but particularly bees, was it Varroa virus? 

    [00:27:34] Will: It's a mite. 

    [00:27:35] Rod: Mite, yeah, sorry, yeah. 

    [00:27:36] Will: Although, just point of order, yes, it's a virus on the mite. The mite brings the virus. 

    [00:27:41] Rod: We're both right. 

    [00:27:42] Will: Both right. Finally. 

    [00:27:43] Rod: As often happens. And so it wasn't that long ago that there was truly big-damage language around the idea that we're going to lose, you know, vast tens of percent of our actual plant pollinators, and people going, dude, this matters. That's not off the table.

    [00:27:57] And then what's the next step is, of course, the black mirror, little tiny artificial pollinators that turn into monsters.

    [00:28:02] Will: You know, the parallel here, and it's the genetic diversity that we rely on for food, is shockingly narrow. I mean, one version of this is like bananas. You know, 90 percent, don't quote me on the number, of the bananas we eat are one type of banana. And if there is a disease that banana falls prey to, we don't have bananas.

    [00:28:20] But beyond that, the global food system relies on, you know, it's wheat, potatoes, rice, corn, maybe a few more, but the staples, a shockingly huge amount of what we eat, are those things. And within that there's small genetic diversity. It doesn't take very much to suddenly lose enormous percentages of the world's food crop.

    [00:28:42] Rod: Just think Ireland and potato blight, or in the Pacific when they had troubles with taro. It might've been a parasite, but it was fucking taro up. And of course, that's a lot more monocultural crop-wise than many other places, but you're fucking screwed.

    [00:28:56] Will: I think the interesting thing as well is, if you lose 20 percent of the global food system, then a lot of other things are happening at that point.

    [00:29:05] Rod: Yeah, people don't go to work and they don't, like, you know, cook our dinner and deliver it to us via the Hungry Panda, because they're worried about eating, themselves.

    [00:29:12] Will: What else we got? 

    [00:29:13] Rod: Generically climate, but that's a bit boring 

    [00:29:15] Will: I'll give you a potential climate one: the geoengineering backfire. So you know, the theory, you know, we go for some sort of geoengineering solution to solve climate change. We try to block 1 percent of the sun. We accidentally block 40 percent of the sun.

    [00:29:28] Rod: How much did you block again? Most. I thought I pushed 10%, but I pushed 110. We're actually giving the sun back energy. 

    [00:29:37] Will: And you know, this is the one that always still gets me. It's the, like, billionaire goes, okay, I want to be the guy that solves climate change. I can just do this, or I'm turning into Russell Crowe's billionaire and I can put a couple of billion dollars into, you know, some cloud seeding or something like that. And they don't check the sums, and suddenly some dickhead billionaire has blocked the sun and we are all dead.

    [00:30:03] Rod: I've geoengineered to get rid of a climate change problem. You're welcome. What? They're dead. Now you don't care about it, do you? No, no whinges left, that's good. One of my faves that came up when I was rummaging around: the second Y2K bug. The 2038 one. 2038?

    [00:30:22] Will: Is that when Linux goes? 

    [00:30:23] Rod: Yeah. Unix. They call it the Unix epoch. And so that one, I mean, I know, you know, but others: a lot of systems use 32-bit integers to store Unix time, which basically counts the number of seconds from January the 1st, 1970.

    [00:30:36] Will: Is that when we run out of seconds? 

    [00:30:38] Rod: Yeah. We run out on January 19, 2038 at 03:14:07 UTC. Okay. We've run out of room in that 32-bit integer system. So again, that could fuck the clocks of computers that are based on Unix. To be fair, we have 14 years. Programmers: change it.
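    Rod's numbers can be checked directly: a signed 32-bit counter of seconds since the 1970 epoch maxes out at 2^31 - 1. A minimal Python sketch (Python's own integers don't overflow, so the 32-bit cap is imposed explicitly here):

```python
import datetime

# Unix time counts seconds from the epoch, 1970-01-01 00:00:00 UTC.
EPOCH = datetime.datetime(1970, 1, 1, tzinfo=datetime.timezone.utc)

# The largest value a signed 32-bit integer can hold.
MAX_INT32 = 2**31 - 1  # 2,147,483,647 seconds

# The last second a 32-bit time_t can represent; one tick later it wraps.
overflow_moment = EPOCH + datetime.timedelta(seconds=MAX_INT32)
print(overflow_moment)  # 2038-01-19 03:14:07+00:00
```

    One second after that moment, a signed 32-bit counter wraps to a large negative number, which naive code reads as a date in December 1901.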

    [00:30:58] I was actually, bizarrely, working in a Y2K-related industry briefly in the build-up, a couple of years before 2000. And they're all about it. They're all about Y2K, Y2K. And I thought, okay, this sounds real. I don't know what it is, but okay, fine. I was the graduate entry, they don't know why they hired me, and it all seemed very busy

    [00:31:18] Will: And this is the thing. People use Y2K as an example, idiots use it as an example, to say we spent too much time focusing on this, and it was all that argument

    [00:31:28] Rod: Nothing happened. You were wrong. 

    [00:31:29] Will: So, someone fixes a problem properly and it's like, Oh, it wasn't a problem in the first place. Was it? 

    [00:31:35] Rod: I worked in a bar as a bouncer, with this manager who'd walk in. I remember one day I was running the shift, there's about eight of us bouncing, big crowd, really quiet, bouncers just standing around. He goes, why are you wasting all this money on bouncers? It's like, you fucking idiot.


    [00:31:49] Will: Look, in his defense, that's where you end up with the American military, which is a giant military-industrial complex. This is how we don't have wars, by having the giantest military in the world.

    [00:31:58] Rod: That is exactly what my job led to. That is exactly how it happened. That guy ended up running. But yes, it is that kind of thinking though. But there's no problems. Why would you have people to stop the problems? 

    [00:32:07] Will: So here you go. I'll give you another version. There are leverages on this. I saw some people talking about a potential apocalypse, which is a slavery military-industrial complex

    [00:32:17] Rod: with robots. 

    [00:32:18] Will: Well, it doesn't necessarily have to be, but a government goes total police state, and they're a successful police state, and they start taking over the world and they don't care for humanity. And they can successfully bring in more slave workers to go and fight and expand the territory. It's like, what if Hitler won everything and just decided to kill nearly everyone?

    [00:32:37] Rod: Fuck, I love alternative history stories. The beauty of these apocalypses, or post-apocalypses, is you get to speculate. I mean, it's like playing those big real-time strategy games. What if I tweak that parameter and see what happens? So interesting.

    [00:32:53] Will: Give me more. What else you got? 

    [00:32:54] Rod: Well, one of the ones I really liked: someone turns off the simulation. And I hadn't thought of that till I read it in passing. It's like, oh yeah, good call. Someone goes, oh, it is a simulation.

    [00:33:02] Will: Where I read it was, I think, in a paper I'll talk about in a second, or it might've been in the Nick Bostrom paper, and it was just, you know, classifying a bunch of different risks, blah, blah, blah. And one of them was simulation turn-off. And I'm like, oh, excuse me?

    [00:33:15] So, you know, there are a lot of people out there that believe potentially we live in a simulation that is run on a computer in another part of the multiverse.

    [00:33:23] Rod: A lot of people, you mean some 40 percent of humanity. 

    [00:33:26] Will: No, not really. I think it's a pretty dumb idea actually. Though in their defense, they're like, mathematically, it has to be. In the sense that if there are simulations, then the odds that you are living in the first universe versus the many trillions of simulations are very low. So you live in a simulation.
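    The counting argument Will is paraphrasing reduces to one line of probability: if there are N indistinguishable simulations for every base reality, a random observer's odds of being in the base one are 1 in N + 1. A toy sketch (the trillion is an arbitrary illustrative number, not a figure from the argument itself):

```python
def p_base_reality(n_simulations: int) -> float:
    # With n indistinguishable simulations per base reality, a random
    # observer is in the base one with probability 1 / (n + 1).
    return 1 / (n_simulations + 1)

print(p_base_reality(10**12))  # ~1e-12: vanishingly unlikely to be "real"
```

    The whole dispute then moves to the premise: whether anyone ever runs those N simulations at all, which is exactly where the energy objection below comes in.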

    [00:33:44] Rod: Oh, look, maybe we do, but it's quite realistic. I mean, you seem real. 

    [00:33:49] Will: I feel like that's a pretty lame apocalypse scenario cause there's no post apocalypse.

    [00:33:52] Rod: Why have a simulation? What for? Fun. 

    [00:33:55] Will: Well, for science. No, this is higher-level beings that are doing science on us. Well, it's sort of a sociology thing. This is like my sociology undergrad project: run a simulated universe for a few years and see what happens.

    [00:34:06] Rod: Did you do well on that? 

    [00:34:07] Will: Yeah, I built a whole simulation of the universe. And all of them, all of the people in it developed souls. 

    [00:34:13] Rod: Are you an avatar of that human? 

    [00:34:16] Will: So I mean, do you develop a soul if you're in a simulated universe or? And does the soul escape the simulated universe? Or is God sort of living in a higher level in the simulated universe and God gets turned off too? 

    [00:34:28] Rod: That's a version I was speculating on. If we go with the simulation, one of the ones I can imagine is where God goes, all right, you got me. I'm real. And then, you guys, anyway, closes the simulation. It's just like, yeah, I exist. And man, you got a lot of shit wrong. I'm just going to start again.

    [00:34:45] Will: I kinda like it, but it doesn't give me anything. It's like, what do you do about it? How do you write a letter to the people up there and say, hey, please don't turn us off? I feel like, if you are a high enough being to have science that can develop a simulation, you also have ethics, and you're like, oh no, I have developed this thing and I feel kind of bad that I did it, but I've got to keep running this thing.

    [00:35:07] Rod: And the question then is what's the cost to them if they keep us running? 

    [00:35:10] Will: And that's the thing about the simulation argument. It assumes there is zero cost to them. Clearly there is an energy usage for the computation

    [00:35:18] Rod: back to server farms. 

    [00:35:20] Will: The amount of computation you need to run an entire universe simulation would be on the order of close to the amount of energy in a universe.

    [00:35:27] Rod: Not if you use advanced quantum multinomial energy.

    [00:35:33] Will: Energy is always the thing. This is why I don't believe the simulation thing. I just don't see that there is enough energy to sustain all of this calculation, plus over a trillion different instances.

    [00:35:43] Rod: Given what we understand of computational power, the idea that we could is fucking phenomenal. Phenomenal.

    [00:35:49] Will: There is one. Okay, I got one that I like. Physics change. So we know that there were a number of people during the Manhattan Project who had significant worries that, okay, this is not a physics change, but, what if the bomb goes off, you know, the first one, the Little Boy, and it just sets all of the oxygen in the world on fire and we all die

    [00:36:11] Rod: and goodbye atmosphere 

    [00:36:12] Will: Plausible. Fascinating. Didn't happen, but enough people were worrying about it. But then the step beyond that is the Large Hadron Collider. We had a bunch of people saying, hey, what if this creates a tiny black hole and sucks us all in?

    [00:36:27] So that's not a physics change either, but that's a, you know, catastrophic intervention of experimental physics into our local world. But I saw, what was it, an idea of collapsing the gap between particles at the local level.

    [00:36:42] And I saw something describing this as an expanding light-speed bubble throughout the galaxy that could suck in enormous volume. And I was like, damn. But like, there are some cool ideas that if something changes about physics, and I don't know why something would, it could have intensely catastrophic and intensely rapid effects.

    [00:37:02] Rod: Actually, there was a great fucking sleeper movie. Might have been 80s? New Zealand sci-fi movie, The Quiet Earth. And basically the premise was this dude wakes up, he's in normal New Zealand, but there's no people around, so he's like, what the fuck's going on? And he starts running around looking for people.

    [00:37:17] It turns out, luckily, he's a physicist and they were working on some experiment. It's not quite clear what it was, but it turns out it changed fundamental elements of the universe, like the charge of an electron or something like this. And this guy, just at the point, he was worried about the experiment going wrong.

    [00:37:34] So he took too many pills to kill himself, and he was dying at the moment the effect happened, and it turns out the only people who lived through the effect were the people who were dying in that moment.

    [00:37:43] After a while he went insane. He was just doing whatever he wants, driving around, all the resources exist, plants and animals. Then he finds a woman and another dude, and at first it goes fine. But you're like, there's going to be problems.

    [00:37:54] Will: And then it's a love triangle. 

    [00:37:55] Rod: Yep. And then there's a fight, blah, blah, blah. And he's doing calculations going, shit, it's going to happen again. It's going to get worse. And the big boy he found is in love with the woman as well. They start fighting, the big boy's beating him to death, and he smacks him in the head and kills him just as the next effect happens. So he wakes up in a post-apocalyptic world where he survived too. And the odds of there being any more people now? Fucking zero. Like maybe there's one on the other side of the planet.

    [00:38:19] Will: There was that guy that lived through both Hiroshima and Nagasaki. 

    [00:38:22] Rod: Get serious. 

    [00:38:23] Will: No, there was a guy. Like, literally, I can't remember exactly the story, but he was from Nagasaki and he was going to Hiroshima for work and turned up on the outskirts, saw the bomb and went, oh. Oh. What else you got?

    [00:38:37] Rod: There's one I found, there's interesting ideas of success that leads to failure. So the two that I saw, thanks to New Scientist: one was the idea, and people have promoted or proposed this, that we become, you know, super interconnected with electronic interventions to our brains. Which is great, but we end up actually abolishing the notion of the individual, become one homogenous lump of beings.

    [00:38:58] Will: Does that mean like we've evolved to sort of a post human as we understand it? 

    [00:39:02] Rod: Plugged in to become one. So not necessarily evolved, but suddenly, you know, we start plugging in. We'd be able to interact with each other at the speed of, you know, at least sound or light, but then suddenly we become one entity. We don't have individuality anymore. Is that an apocalypse? Is that the end of us? It's certainly a dramatic change in us.

    [00:39:20] Will: That's a good one.

    [00:39:20] Rod: So that's a success thing that may go, if not sideways, let's say in unexpected directions. And another one was the end of disease. Like, great, imagine we actually end diseases. But then the flow-on effects: if we end them too quickly, suddenly no diseases are killing people, though there's still accidents, et cetera. What are the ramifications of that? I mean, it's back to, you know, ultra populations. People never die, we keep reproducing, we run out of resources.

    [00:39:44] Will: So in a sense, it's an apocalypse bred by high population, a dystopian sort of thing,

    [00:39:49] Rod: but kicked off by what would be unambiguously a success.

    [00:39:52] Will: Well, I'll go another version of that. There've been right-wing people worrying about this kind of thing for ages, and I think it came along around when IVF was first developed. Technological interventions in human gestation, key thing here. And so one version is that, via having more babies born through cesarean or born through other technologies, potentially we might be in a place where we wouldn't be able to have babies in a post-apocalyptic world. So I think this is a combined version. You know, we evolve to have giant heads like yours, and then we wouldn't be able to survive, you know, if we don't have technology around us.

    [00:40:27] Rod: I don't know how my mother survived this head, but somehow she did. Look, also, I'm hoping, if it's going to happen, I want it to be interesting. And so I was thinking, like, extra-dimensional shit, aliens land and make it weird, you know,

    [00:40:41] Will: aliens landing. Yeah, of course. 

    [00:40:42] Rod: Yeah. But also if we kind of, you know, we talk about infinite universes or, you know, there's extra dimensions within string theory and unfolding. If we can actually play in those realms or at least see them, et cetera, I would be a hundred percent full bore behind any research into that, but that could also have cataclysmic or unexpected side effects. 

    [00:41:01] Will: So, so you're saying we get a researcher who says I can go through the multiverse. And, you know, eight days later, we're like, Oh, okay, we shouldn't have done that. That was a terrible idea 

    [00:41:09] Rod: or the whole, you know, okay. If there are five-dimensional entities, are we but shadows to them? And is that a good idea? Like, if we were exposed to greater multidimensional, more dimensional stuff, does that fuck with the notion of existence? Or does it collapse us? These are just speculations. Speculations.

    [00:41:26] Will: Well, I got one in the opposite direction. They call it, in the technical term, dysgenic. So, opposite of eugenic. So eugenic would be, you know, social intervention to have the best genes going on.

    [00:41:36] Not good. Not good. Yeah. Dysgenic is not necessarily interventions, but pressures that make the worst genes continue. So this is Idiocracy. So there are people that worry about this, enough to make a movie, but, you know, people that think about this and go, oh, but what if the pressures are to have lots of dopey babies? What happens then?

    [00:41:56] Rod: Well, it's like, who has more babies? It doesn't tend to be us wealthy wankers.

    [00:42:00] Will: Oh my god. This is bullshit. 

    [00:42:02] Rod: That's the problem. How many babies do I have? 

    [00:42:06] Will: It's not because of your brain size though.

    [00:42:08] Rod: It's because of my brain size. It's like, I would breed, but look how intelligent I am.

    [00:42:13] Will: I think it's plausible to go, okay, we could get to a way of civilization working where suddenly we're choosing, more and more, dumber and dumber people. Well, in fact, there is some evidence that the two key pressures that we've seen in human evolution over the last 5,000 years are in lower population centers.

    [00:42:34] So either hunter-gatherer populations or people now that live in smaller communities. Potentially you're dealing with more chaotic environments, and the pressures there might be about intelligence. But on the other side, in urban environments, the pressures are much more about surviving disease, cause you've got a lot of people around you. And clearly in urban environments there are success factors about being smart or being strong or whatever, but disease is the biggest one.

    [00:43:01] So basically swap it around. You get a kid from Papua New Guinea or a kid from the middle of Beijing or New York, and the selection pressures are a little bit different, and potentially the urban world is evolving to cope with disease, and the world where we came from was about intelligence. So maybe there's a theory that we're selecting for better at surviving in giant packs, but not necessarily as smart. I don't know if that's true.

    [00:43:23] Rod: I see the sense of that. Cause I often thought about this. The more I stepped through universities, deciding to become a clever-clogs knowledge economy guy, I was thinking, if the shit went down, I had no skills, I had nothing. Like, what can you do? Well, I can reason why the arguments between these two warring tribes, where one has corn and one has weapons, I can reason why they may have different points of view. But can I fix that car? No. So, like, yay, egghead. Fucking useless. Useless.

    [00:43:50] Will: I guess you would say the pressures might have been on us for the last 2 million years to get smarter, but that pressure might not be in the same direction. 

    [00:43:58] Rod: We may have outsmarted ourselves, so to speak. 

    [00:44:00] Will: There was a couple of other ones. A little bit like the virus, cause they did make The Last of Us all about this. This is a fungus, a mushroom or a fungus that fucks with what we do, and I did like the idea. So a virus might just keep us normal but sick and dying, and we pass that on. But the idea of a pathogen, it could be a fungus or a virus or a bacteria: what if it changed human instincts and human behaviors, human desire? You know, so instead of cooperating and talking with each other, we become vicious, eat-each-other sort of animals

    [00:44:33] Rod: or weirdly, overly cooperative, but cooperative in ways that we wouldn't consider to be excellent. So it changes our fundamental behavior or drives. Yeah. I don't know. Maybe it's because I'm an ignorant ding dong, but that sounds scary.

    [00:44:50] Will: There's a couple of others that I thought of. Day of the Triffids, oh yeah. There's a bright light from aliens or plants

    [00:44:57] Rod: or meteors. In the original one there were meteor showers and people went, we must look at it.

    [00:45:00] Will: And everyone goes blind due to an effect. Now, clearly, you know, people looked at the first nuclear explosions, so you're going to look. And, you know, it's not impossible you take out a key human sense, and en masse, all of a sudden, a lot of things fall over. You know, even if 1 percent of the population goes blind in a night, it would be difficult, but it's not the same sort of thing.

    [00:45:22] So I thought that's a fun one. And I think, you know, another one is like Children of Men. And so this is where we don't have children anymore, and you could imagine, what does it take to not have children? Is it, you know, some sort of virus or something that we don't even detect, but it ruins fertility for men or women or something?

    [00:45:40] Rod: And then what it does to the structure of society, when breeding becomes this outrageous and unusual property or ability, I don't know what you call it, function. Yeah. That's shocking because of the social shifting. Now we're back towards The Handmaid's Tale and all that shit, like, blah. Well, we could do some terrible things.

    [00:45:58] Will: I'm going to end on the last one though. 

    [00:45:59] Rod: Yeah, we're going to do it. We got to do it. 

    [00:46:01] Will: Because, you know, right after this we're going to have to talk about which one's your favorite, which one's the most likely, that kind of thing. Yeah. But there's one that, reading around, I was like, oh, this one is the most likely. It's the one that they ain't going to make movies about.

    [00:46:14] Rod: Do you know why it's most likely? Because it's happening now. It's happening right now. 

    [00:46:19] Will: And they ain't going to make movies about it, because even in the paper, like, the paper that this comes from is by Hin-Yan Liu, Kristian Lauta and Matthijs Maas. It's called the boring apocalypse.

    [00:46:30] Rod: We can't even get that right. We're going to wipe ourselves out. Let's make it the shittest version. 

    [00:46:34] Will: I've got to say for an academic paper with the word boring, literally in the title, cool paper, like cool. Like you did well. 

    [00:46:40] Rod: So what is the boring apocalypse? I'm going to guess the boring. 

    [00:46:44] Will: Well, you know, most movies focus on your single-origin events, and there's a lot of disaster-planning thinking in academic and government circles about these big things, you know, either a big nuclear event or a big foreign government. We think about those things a lot, but we don't combine them.

    [00:47:04] We don't think about how they might all join together. And there was a little bit of this just a few years ago: you get a pandemic, and then you get some bushfires, and then you get an earthquake, and then you get something else. And the thing that happens is your societal, governmental capability to adapt and take care of it actually falls apart

    [00:47:30] Rod: or rather is proven to be inadequate in the first place 

    [00:47:32] Will: and is inadequate. Yeah. And so the one in literature that this comes from, and I mentioned this before, The Peripheral. So this was one I described as a shriek, where some form of post-humanity is attained but it's an extremely narrow thing. But basically he calls this a jackpot, where just a rolling cascade of super disasters happens and we don't have the capability to deal with any one of them. So, you know, in one country it's giant fires, in another country it's giant floods. Then you get an earthquake, then you get some wars, because once these things are happening, you're fighting over resources more.

    [00:48:05] And gradually, you just get fewer and fewer people. You actually get things really falling apart, and it doesn't make a movie. And the thing that got me as I was reading about this, this is from Ethan Mollick on Twitter, who's commenting on the paper, but I think similar things are in the paper: we've developed successful hardened systems to lower the risk of apocalypse based on catastrophic failures like nuclear weapons or plant accidents. Those aren't perfect, but we kind of know how to think, okay, what if there's a catastrophic earthquake, catastrophic tsunami, or any of these kinds of things.

    [00:48:38] Rod: And it's one off, right? Like what if something goes boom, that's it. One thing. 

    [00:48:42] Will: But the more that you harden against one thing means that you can't adapt to deal with the thing next door or the other thing over here. 

    [00:48:49] Rod: What you mean is if you harden against one thing, you soften against others.

    [00:48:52] Will: Become water. You know, I think this is the thing: it doesn't make a disaster movie, but the idea of the boring apocalypse means we have to actually recognize that we might just have a whole syndemic of shit washing over us.

    [00:49:07] Rod: Leak into oblivion.

    [00:49:10] Will: All right. I want to know, what's your favorite? What's the one what's the one that you want to take? 

    [00:49:14] Rod: You know, I just want it to be interesting. And particularly, to be honest, it's not the apocalypse itself. It's, I want the post-apocalypse to be interesting.

    [00:49:21] Will: Okay. So you don't want, you don't want a bang

    [00:49:23] Rod: No, I want the one where we all start to reimagine ourselves. What was that? The Peripheral, the whimper.

    [00:49:27] Will: Okay, well, no, the shriek is a posthuman sort of thing?

    [00:49:31] Rod: No. The Fallout kind of situation, where shit goes wrong but then there are multiple ways to imagine how you might do it. So it's a reconfiguring for me, it's a new broom. The broom could be horrifying and lead to spikes and guns.

    [00:49:43] Will: Brooms are horrifying. We don't eat the little bits on the ground. We're terrible. But some stuff gets missed. 

    [00:49:47] Rod: It would be interesting. And also the thing I love most about that kind of apocalypse or post apocalypse.

    [00:49:51] Will: And what causes that then? 

    [00:49:52] Rod: So many options. Well, I think it is the boring. There's just a sequence of events or radiating coincidence of events that things just start to fall over and we have to redefine ourselves more locally and work out what we do. 

    [00:50:03] Will: You want the chips one?

    [00:50:05] Rod: Yeah, maybe 

    [00:50:06] Will: and the thinking machines, and then we end up in Dune. But that's a thing that doesn't kill people instantly, but it ruins our ability to operate in the global sense. We're operating much more locally.

    [00:50:17] Rod: And it really changes the way we ponder ourselves and where we fit. And I quite like that because I think it's exciting. Also, I mean, maybe because I'm just feeling a bit tired at the end of semester and all that, when we're recording, but it means all that shit disappears, like all your day to day stuff. It's like, well, that's gone now. Cause it doesn't matter. It's like, Oh, that sounds relaxing.

    [00:50:34] Will: Wholesome Verdict: choosing the apocalypse scenario here based on when you have to get your marking done.

    [00:50:39] Rod: Yeah, I'm a bit tired. I wouldn't mind. Could we like nuke a bunch of shit so I don't have to worry? 

    [00:50:44] Will: It's a TISM song, you know: the chances of an asteroid coming in the next 10,000 years, I'm banking on it coming before my end of year exam.

    [00:50:50] Rod: It's exactly that. I'm like, ah, I've got all these deadlines. And what if something went really wrong and had to reconfigure it, you know, based on day to day survival stuff, that sounds more relaxing right now. What about yours? 

    [00:51:02] Will: Look, no, I've got to be, I've got to be close to that. I don't want any scenario where a narrow slice of us are becoming perfect and ruling the, you know, I don't want that.

    [00:51:11] If we have to have this thing. No, even me, I don't want to be that guy. But even if we have to have this thing, of course I don't want everyone to die in a second. And of course I don't want the few people who don't die in a second to live horrible lives until, you know, they die 50 years later. And I don't want billionaires to become super. So it's gotta be that version. I don't want people to die, but if we're picking an apocalypse, I want, you know, we get flowers of creativity.

    [00:51:40] Rod: Great. Re imagining. Cause you don't want to do your marking either. Wholesome verdict. We just don't want to do our fucking marking.

    [00:51:48] Will: Oh my God. Anything else you've been thinking about? 

    [00:51:51] Rod: Yeah. Yeah. Yeah. Look, weirdly, I didn't realize until I wrote them down two things and both of them money related. What? Okay. One is can animals be taught to use money? Or the equivalent thereof. 

    [00:52:03] Will: There's experiments that they tried? Oh my god.

    [00:52:05] Rod: I know the answer, and the thing that I like more about it is some of the ways they very quickly chose to use it are quite unexpected.

    [00:52:14] Will: Animals. You're great. I love you. I love you. You're the best. That's awesome. 

    [00:52:18] Rod: Yeah. Yeah. Like I started looking into it and I was just rummaging around as we do for this show and otherwise. And then suddenly I went, Oh my God, they chose that version. Holy shit. 

    [00:52:27] Will: That's really cool. I've been thinking about the politics of first names, what I mean is given names. And mainly because, you know, a friend of ours has mentioned that he grew up in Iran and there was a shift in babies' names before and after the revolution.

    [00:52:43] But I did read, I was reading just the other day, that in the Soviet Union there were a bunch of kids named October after the revolution, but there was someone named after the five year plan. I'm just like, you named your kid Five Year Plan.

    [00:53:03] Rod: Second report on the annual crop rotation.

    [00:53:08] Will: So I was wondering, you know, is there, like, there must be some trends in going, okay, not just we like this name less or more, but actually there are some driving forces why we're choosing Five Year Plan or Oliver as our names.

    [00:53:22] Rod: Christ, I want to be called Five Year Plan. Because, imagine, I mean, I'm pretending I'm 20 again. It's like, and what's your name? It's like, Five Year Plan. And she'd lean in and go, that's weird. Yes, I am. You're going to keep talking to me.

    [00:53:34] Will: You don't know. Maybe she's called Five Year Plan too.

    [00:53:36] Rod: But you've got something else in common. But the feminine Five Year Plan. Yeah. Is that how it works?

    [00:53:44] Will: No, they take the, it's the patronyms or something. I don't know. I don't know what else you got. 

    [00:53:48] Rod: Another money one. And I think about this a lot too, given my day job and everything else. And, you know, we have all these great intentions and we want to change behaviors and make people better. But in the end, I want to know, I said it was money related.

    [00:53:59] Should we just stop fucking around? Is money just the best motivator? Ultimately, we want to change behaviors and stuff. Is it just about money? And there are many studies that sort of ponder this from different angles. I'd like an excuse to read more and talk about it. 

    [00:54:11] Will: Well, I do an exercise in one of my classes. It's a simple storytelling exercise. Yeah. It's where I ask for a volunteer, and the volunteer comes up, and I say, bring that glass up there, fill it with water, carry it to wherever, put it down there. And if you get there without spilling a drop, I'll give you something. And I've played with this over a while. When I first did the exercise,

    [00:54:32] I thought I'll make it bad. You know, I'll put a curse on here: I'll stab you in the face. No. Like, you failed your PhD. And then I realized, hang on, this is as close as you can get to putting a magic spell on someone.

    [00:54:43] Rod: Also, or they're going to go. You can't do that.

    [00:54:46] Will: Well, yes. And so it was either hostile or didn't work or both. Okay. Yeah. So then I thought, okay, what's a nice simple bit of value? I wanted to make it real. A chocolate bar. I thought, you know, it costs a dollar or whatever. And you go, okay, this is something, a dollar.

    [00:54:59] Rod: You're adorable. A Three Musketeers bar, which we can't get any more.

    [00:55:04] Will: Do they put DDT in it or something? It's full of asbestos. 

    [00:55:07] Rod: The asbestos chocolate bar, asbestos bar, only the wrapper. 

    [00:55:10] Will: But then I thought, okay, chocolate bar. But then what if someone's like, oh, I'm dealing with something, you know, like I'm vegan or whatever. And it's like, that is not neutral.

    [00:55:21] And then I thought, well, you know, the simplest one is money. It's a neutral value; the only people that don't like it are full hippies, and even then, money is what we have decided to be that neutral value. And so is it a motivator? Whether it is or not, it is as neutral as you can get.

    [00:55:40] Rod: And part of the reason that drew me to that as a question is, you know, things like, I haven't read the details yet, so you could decide whether we'll do it, but things like weight loss and sustained weight loss. Fuck it, we'll pay you.

    [00:55:50] You know, here's some motivators, and here's some stuff about your childhood, and here's some little gamified apps, etc. It's like, tell you what, you lose this much, we'll pay you this much. You keep it off for six months, we'll give you this much. And you know, there's a part of me wants that to be true, because it's simple, and part of me would be horrified if it's true, because Jesus Christ. 

    [00:56:06] Will: Oh my God. I'll give you a money one that I've been thinking about, because, what does it mean to be the dominant currency?

    [00:56:13] We know for at least the last, well, since World War Two, the US dollar has been the world's dominant currency. Some people say it may be replaced, and there's two other potential threats, the Chinese yuan or the euro, potentially becoming the dominant one. But we've had other dominant currencies in the past, you know, so the British pound was a dominant one, the tulip bulbs, and, I don't know, the Roman gold coin. But what does it actually mean?

    [00:56:42] Why do you fight to become the dominant currency? And why is it, you know, still such a hidden bit of US foreign policy to maintain the dollar as the prime currency in the world? What does that mean?

    [00:56:55] I don't think it's pride. I think it's actually functional. It's actually functional power. And it's not clear what. It's somehow like the debts all come back to us or something like that.

    [00:57:04] Rod: And weirdly, it still works. Like, I was in Europe last year, got in a cab, got our card out to pay, and they went, no, we can't do it. And I was like, fuck, I don't have any local currency. And I went like, pat my pockets. He said, it's 12 euro. And I said, oh, I've got 20 US. And he said, that would be fine. He didn't even flinch. And I thought, fucking Paris, of all places, who are not into that, they were like, yep, no problem.

    [00:57:25] Will: So, you know, and that's because I think this is something that manifests literally down at the local level, like people choosing it and going, okay, I'll take that, all the way up to the mega governments. And they're like, this means something. What does it mean?

    [00:57:38] Rod: It is interesting. Money, eh? 

    [00:57:41] Will: Money. Anyway, if you have apocalypses that you want, like if you're planning one, let us know. Put it in the comments. Tell someone. Tell someone. Tell us.

    [00:57:52] Rod: Tell us at cheers@wholesomeshow.com. And tell someone else too, who can stop you.

    [00:57:57] Will: If you've got other apocalypses or any other concept that you're like, fuck, I want some academics to take the time off work to drill into that, then you call us. We'd like the academics to do that for you. See you later.
