Is the possibility of human extinction this century truly 1 in 6?

In 2020, Oxford-based philosopher Toby Ord published a book called The Precipice about the risk of human extinction. He estimated the chances of an “existential catastrophe” for our species over the next century at one in six.

This is a strikingly specific and alarming figure. The claim attracted headlines at the time, and has been influential ever since – recently discussed by Australian politician Andrew Leigh in a speech in Melbourne.

It’s hard to disagree with the idea that we face worrying prospects over the coming decades, from climate change, nuclear weapons and bioengineered pathogens (all major concerns in my opinion) to malicious AI and large asteroids (which I would consider less of a concern).

But what about this number? Where does that come from? And what does that actually mean?

Coin flips and weather forecasts

To answer these questions, we must first answer another: what is probability?

The most traditional view of probability is called frequentism and gets its name from its heritage in dice and card games. From this perspective, we know that there is a one in six chance that a fair die rolls a three (for example) by observing the frequency of threes in a large number of rolls.
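
To make that concrete, here is a minimal Python sketch (purely illustrative; the function name is my own) that estimates the probability of rolling a three by simulating a large number of rolls of a fair die.

```python
import random

def estimate_chance_of_three(num_rolls: int = 600_000) -> float:
    """Estimate P(roll == 3) for a fair die as the observed frequency of threes."""
    threes = sum(1 for _ in range(num_rolls) if random.randint(1, 6) == 3)
    return threes / num_rolls

# With enough rolls, the observed frequency settles close to 1/6 (about 0.167).
print(estimate_chance_of_three())
```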

Or consider the more complicated case of weather forecasting. What does it mean when a meteorologist tells us there is a one in six (or 17%) chance of rain tomorrow?

It is hard to believe the meteorologist wants us to imagine a vast collection of “tomorrows”, some fraction of which will experience rain. Instead, we need to look at a large number of such forecasts and see what happened after them.

If the forecaster is doing their job well, we should find that when they say “one chance in six that it will rain tomorrow”, it did in fact rain the next day about one time in six.

Thus, traditional probability depends on observations and procedure. To calculate it, we need to have a set of repeated events on which to base our estimate.
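
That verification idea can be written down directly. The sketch below (with a made-up forecast history and a function name of my own choosing) checks how often it actually rained on days that received a “one in six” forecast.

```python
def observed_rain_frequency(history: list[tuple[float, bool]],
                            stated_prob: float = 1 / 6) -> float:
    """Among days given a particular stated probability of rain,
    return the fraction on which it actually rained."""
    outcomes = [rained for prob, rained in history if abs(prob - stated_prob) < 1e-9]
    if not outcomes:
        raise ValueError("no forecasts at that stated probability")
    return sum(outcomes) / len(outcomes)

# Hypothetical records: (stated probability of rain, did it rain the next day?)
history = [(1/6, False), (1/6, True), (1/6, False),
           (1/6, False), (1/6, False), (1/6, False)]

# A well-calibrated forecaster's "one in six" days should turn out rainy
# roughly one time in six over a long enough record.
print(observed_rain_frequency(history))
```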

Can we learn from the Moon?

So what does this mean for the likelihood of human extinction? Well, such an event would be unique: once it happened, there would be no room for a repeat.

Instead, we might find parallel events to learn from. Indeed, Ord’s book discusses a number of potential extinction events, some of which can potentially be examined in light of history.

Counting craters on the Moon can give us clues about the risk of asteroid impacts on Earth. (Provided: NASA)

For example, we can estimate the chances of an extinction-sized asteroid hitting Earth by examining how many such space rocks have hit the Moon over its history. French scientist Jean-Marc Salotti did exactly this in 2022, calculating the likelihood of an extinction-level impact in the next century at around one in 300 million.

Of course, such an estimate is fraught with uncertainty, but it relies on something approximating a proper frequency calculation. Ord, by contrast, estimates the risk of extinction from an asteroid at one in a million, while noting a considerable degree of uncertainty.
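
The frequency logic behind such an estimate is simple enough to sketch, even though Salotti’s actual analysis is far more careful. The numbers below are placeholders chosen only to reproduce the order of magnitude quoted above; they are not his data.

```python
def per_century_probability(impacts_observed: int, record_years: float) -> float:
    """Crude frequency estimate: impacts per year from a long record,
    scaled up to a single century."""
    rate_per_year = impacts_observed / record_years
    return rate_per_year * 100

# Placeholder figures for illustration only: suppose the cratering record implied
# one extinction-sized impact per roughly 30 billion years of exposure.
print(per_century_probability(impacts_observed=1, record_years=30e9))
# About 3.3e-9, i.e. on the order of one in 300 million per century.
```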

A ranking system for outcomes

There is another way of thinking about probability called Bayesianism, named after the English statistician Thomas Bayes. It focuses less on the events themselves and more on what we know, expect, and believe about them.

In very simple terms, we can say that Bayesians view probabilities as a sort of ranking system. From this perspective, the specific number attached to a probability should not be taken at face value, but rather compared with other probabilities to understand which outcomes are more and less likely.

Ord’s book, for example, contains a table of potential extinction events and his personal estimates of their likelihood. From a Bayesian perspective, we can treat these values as relative rankings. Ord thinks extinction by asteroid impact (one in a million) is much less likely than extinction by climate change (one in a thousand), and both are far less likely than extinction from what he calls “unaligned artificial intelligence” (one in 10).

The difficulty here is that initial Bayesian probability estimates (known as “priors”) are often rather subjective (I, for example, would rank the chances of AI-driven extinction much lower). Traditional Bayesian reasoning moves from “priors” to “posteriors” by incorporating observational evidence of relevant outcomes to “update” the probability values.
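
To show what “updating” means mechanically, here is a textbook Beta-Binomial example in Python. It is unrelated to Ord’s actual analysis: a subjective prior about how often some event occurs is revised once observed outcomes arrive.

```python
def beta_binomial_update(prior_alpha: float, prior_beta: float,
                         successes: int, trials: int) -> tuple[float, float]:
    """Conjugate update of a Beta(alpha, beta) prior on an event's probability
    after observing `successes` out of `trials`."""
    return prior_alpha + successes, prior_beta + (trials - successes)

# Subjective prior with mean 1/6 (alpha / (alpha + beta) = 1 / 6).
alpha, beta = 1.0, 5.0

# Suppose the event is then observed 2 times in 60 relevant trials.
alpha, beta = beta_binomial_update(alpha, beta, successes=2, trials=60)

# The posterior mean moves from the prior guess toward the observed frequency.
print(alpha / (alpha + beta))   # about 0.045, well below the 1/6 prior
```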

And once again, observations relevant to the probability of human extinction are scarce.

Subjective estimates

There are two ways to think about the accuracy and usefulness of probability calculations: calibration and discrimination.

Calibration refers to the accuracy of the actual probability values themselves. We cannot assess this without appropriate observational information. Discrimination, on the other hand, refers only to the relative rankings.
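
The distinction is easy to illustrate with invented numbers: a set of estimates can rank outcomes in the right order (good discrimination) even when the stated values are systematically off (poor calibration). The following sketch, with hypothetical function names and data, shows both checks.

```python
def mean_calibration_error(pairs: list[tuple[float, float]]) -> float:
    """Average gap between each stated probability and the observed frequency."""
    return sum(abs(stated - observed) for stated, observed in pairs) / len(pairs)

def correctly_ranked(pairs: list[tuple[float, float]]) -> bool:
    """Discrimination check: do higher stated probabilities go with
    higher observed frequencies?"""
    observed_in_stated_order = [obs for _, obs in sorted(pairs)]
    return all(a <= b for a, b in zip(observed_in_stated_order,
                                      observed_in_stated_order[1:]))

# Hypothetical (stated probability, observed frequency) pairs.
estimates = [(0.1, 0.01), (0.3, 0.05), (0.6, 0.20)]

print(mean_calibration_error(estimates))  # large gaps: poorly calibrated
print(correctly_ranked(estimates))        # True: the ranking is still informative
```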

We have no basis for believing that Ord’s values are correctly calibrated. Of course, that is probably not his intention; he himself indicates they are mostly meant to give “order of magnitude” indications.

Even so, without any related observational confirmation, most of these estimates remain in the subjective realm of prior probabilities.

Not well calibrated – but perhaps still useful

So what should we make of “one in six”? Experience suggests that most people have a less than perfect understanding of probability (as evidenced by, among other things, the continued volume of lottery ticket sales). In this environment, if you’re making an argument in public, an estimate of “probability” doesn’t necessarily need to be well-calibrated – it just needs to have the right kind of psychological impact.

From that perspective, I’d say “one in six” does the trick. “One in 100” might seem small enough to ignore, while “one in three” might cause panic or be considered apocalyptic delusion.

As someone concerned about the future, I hope that risks such as climate change and nuclear proliferation receive the attention they deserve. But as a data scientist, I hope that the careless use of probability will be left behind and replaced with widespread education about its true meaning and proper use.

Steven Stern is a professor of data science at Bond University. This piece first appeared on The Conversation.
