
posted by janrinok on Sunday April 30, @08:13AM
from the caught-beneath-the-landslide-in-a-champagne-supernova dept.

An exploded star can pose more risks to nearby planets than previously thought:

This newly identified threat involves a phase of intense X-rays that can damage the atmospheres of planets up to 160 light-years away.

[...] Earth is not in danger of such a threat today because there are no potential supernova progenitors within this distance, but it may have experienced this kind of X-ray exposure in the past, scientists say.

Before this study, most research on the effects of supernova explosions focused on the danger from two periods: the intense radiation produced by a supernova in the days and months after the explosion, and the energetic particles that arrive hundreds to thousands of years afterward.

However, even these alarming threats do not fully catalog the dangers in the wake of an exploded star. Researchers have discovered that between these two previously identified dangers lurks another. Supernovae always produce X-rays in their aftermath, but if the blast wave strikes dense surrounding gas, it can produce a particularly large dose of X-rays that arrives months to years after the explosion and may last for decades.
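As a rough illustration of where a number like the 160 light-year danger zone could come from, here is a minimal back-of-the-envelope sketch: assume the supernova radiates some total X-ray energy, dilute it with the inverse-square law, and ask at what distance the fluence drops below a threshold for severe ozone damage. The total X-ray energy (~3×10^42 J) and the critical fluence (~100 kJ/m², a figure often used in the gamma-ray-burst ozone-depletion literature) are assumptions chosen for illustration, not values quoted from the study.

```python
import math

# Back-of-the-envelope estimate of an X-ray "danger zone" radius using
# inverse-square dilution. The inputs below are illustrative assumptions,
# not numbers taken from Brunton et al. (2023).

LIGHT_YEAR_M = 9.461e15   # metres per light-year

E_X = 3e42      # assumed total X-ray energy of a very X-ray-luminous supernova, in joules
F_CRIT = 1e5    # assumed critical fluence for severe ozone damage, in J/m^2 (~100 kJ/m^2)

# Fluence at distance d is E_X / (4 * pi * d^2); solve F(d) = F_CRIT for d.
d_crit_m = math.sqrt(E_X / (4 * math.pi * F_CRIT))
d_crit_ly = d_crit_m / LIGHT_YEAR_M

print(f"Estimated danger-zone radius: {d_crit_ly:.0f} light-years")
# With these assumed inputs, this prints roughly 160 light-years.
```

With those assumptions the estimate lands near the 160 light-years quoted above, and it makes the scaling clear: the danger-zone radius grows as the square root of the total X-ray energy divided by the tolerable fluence.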

[...] "The Earth is not in any danger from an event like this now because there are no potential supernovae within the X-ray danger zone," said Illinois undergraduate student Connor O'Mahoney, a co-author of the study. "However, it may be the case that such events played a role in Earth's past."

There is strong evidence – including the detection in different locations around the globe of a radioactive type of iron – that supernovae occurred close to Earth between about two and eight million years ago. Researchers estimate these supernovae were between about 65 and 500 light-years away from Earth.
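The radioactive iron mentioned here is generally identified as iron-60, which has a half-life of roughly 2.6 million years. A quick decay calculation (a sketch assuming that half-life) shows why debris deposited two to eight million years ago is still detectable in geological records today:

```python
# Surviving fraction of iron-60 (half-life assumed to be ~2.6 Myr) as a
# function of time since deposition. Illustrates why supernova debris from
# two to eight million years ago can still be measured today.

FE60_HALF_LIFE_MYR = 2.6  # approximate half-life of Fe-60 in millions of years

def surviving_fraction(age_myr: float) -> float:
    """Fraction of the original Fe-60 remaining after age_myr million years."""
    return 0.5 ** (age_myr / FE60_HALF_LIFE_MYR)

for age in (2.0, 4.0, 8.0):
    print(f"Deposited {age:.0f} Myr ago: {surviving_fraction(age):.0%} of the Fe-60 remains")
# -> roughly 59%, 34%, and 12% for deposits from 2, 4, and 8 Myr ago
```

Even at the old end of that window, a measurable fraction of the original isotope survives, which is what makes it a useful tracer of nearby supernovae.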

[...] The study reports that although the Earth and the solar system are currently in a safe space in terms of potential supernova explosions, many other planets in the Milky Way are not. These high-energy events would effectively shrink the Galactic Habitable Zone, the region of the Milky Way where conditions would be conducive to life.

Journal Reference: Ian R. Brunton et al 2023 ApJ 947 42 doi: 10.3847/1538-4357/acc728 [open]


Original Submission

  • (Score: 2) by hendrikboom on Sunday April 30, @11:31AM (1 child)

    by hendrikboom (1125) on Sunday April 30, @11:31AM (#1304038) Homepage Journal

    So the supernovae two to eight million years ago did not exterminate life on Earth. What effect did they have besides creating some iron isotopes? Did they shrink the habitable zone?

  • (Score: 2) by maxwell demon on Sunday April 30, @05:35PM

    by maxwell demon (1608) Subscriber Badge on Sunday April 30, @05:35PM (#1304074) Journal

    I think the summary didn't state often enough that Earth is not currently in danger from an event like this. :-)

    --
    The Tao of math: The numbers you can count are not the real numbers.
  • (Score: 1) by Runaway1956 on Sunday April 30, @11:37PM (1 child)

    by Runaway1956 (2926) Subscriber Badge on Sunday April 30, @11:37PM (#1304103) Homepage Journal

    This tells me that if the galaxy has any intelligent species, you'll find them migrating around the outer rim of the galaxy, where they can avoid the effects of supernovas. Unless they have mastered the Dyson Sphere, making them safe in the crowded interior of the galaxy.

    --
Abortion is the number one killer of children in the United States.
    • (Score: 2, Interesting) by dalek on Monday May 01, @03:51AM

      by dalek (15489) on Monday May 01, @03:51AM (#1304138) Journal

The flaw in what you're saying is that the galactic habitable zone is also based on metallicity. Although supernovae are hazardous, they're also necessary for life because they're a source of the heavier elements that life requires. It would be unlikely to find life around a first-generation star because of the lack of heavier elements. Such a star could certainly have planets, but I would expect them to be gas giants like Jupiter, with abundant hydrogen. The outer rim of the galaxy has a low metallicity, which limits the potential for life in that region.

      However, I also think the concept of a galactic habitable zone is somewhat flawed. It's based on the assumption that life would have a difficult time coping with the effects of frequent supernovae. I'm not convinced. There have been at least six mass extinctions in Earth's history: 1) the Great Oxygenation Event, 2) the Ordovician-Silurian extinction, 3) the Late Devonian extinction, 4) the Permian-Triassic extinction, 5) the Triassic-Jurassic extinction, and 6) the Cretaceous-Paleogene extinction. The Great Oxygenation Event not only resulted in life that could tolerate oxygen, which was poisonous to life that existed at that time, but also resulted in the evolution of life that requires oxygen for its survival. The Permian-Triassic extinction is the closest that Earth has come in at least the past half billion years to once again being devoid of life. Still, life that could adapt to the extremely harsh conditions of the early Triassic was able to survive.

My point is that frequent supernovae in the vicinity of a planet do not necessarily imply that life cannot exist on that planet. Instead, they imply that if life exists on that planet, it must have evolved to withstand the stresses caused by the supernovae. Life on Earth is incredibly resilient, which is why, despite the extreme conditions of the Permian-Triassic extinction, the planet did not revert to a lifeless state.

The Quaternary Period, which we live in, is marked by long periods of extreme glaciation punctuated by rapid warming into brief interludes of relatively mild conditions. We are currently in one of those interludes, following a rapid warming out of the Last Glacial Maximum. Observers on another planet might note the significant climate fluctuations that we experience and assume that they occur too frequently and are too extreme, that they should cause mass extinctions and prevent Earth from sustaining life. Instead, the life we see has adapted to the cold conditions and to the oscillations between deep glaciation and brief warming events. Similarly, I don't think we should be so quick to presume that life could not evolve to cope with the effects of frequent supernovae, either.

      --
      Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest just whinge about SN.
  • (Score: 2, Interesting) by dalek on Monday May 01, @04:41AM

    by dalek (15489) on Monday May 01, @04:41AM (#1304143) Journal

    Gamma rays and x-rays are in two different parts of the electromagnetic spectrum. The mechanism for producing gamma ray bursts in a supernova is very different from the mechanism discussed in the article for generating x-rays. These are not equivalent mechanisms at all.

    However, both gamma ray bursts and the x-ray emissions discussed in the article are theorized to affect inhabited planets in similar ways. Specifically, they are both expected to deplete ozone, leading to an increase in ultraviolet radiation that is ultimately harmful to life, and also to produce nitrogen dioxide. Is there a reason to think that this particular mechanism would be more harmful to life than a gamma ray burst would be?

The effects seem nearly identical, if I'm understanding them correctly. There is at least some understanding of the risk from gamma ray bursts. In my reply to Runaway's comment [soylentnews.org], I listed six mass extinction events in Earth's history. Some of these are far enough in the past that many of the rocks and much of the fossil record are gone, making it difficult to determine the cause of the event. However, one theory for the Ordovician-Silurian mass extinction is a gamma ray burst, and it does seem to explain some aspects of the event. Also, there is possible evidence of damage to life from ultraviolet radiation during the Late Devonian extinction, suggesting a similar mechanism. The article clearly describes a different mechanism, but is there a significant difference in the risk to life from this mechanism compared with a gamma ray burst? It says that this should reduce the size of the galactic habitable zone, but I thought it was already fairly well understood that supernovae can be harmful to life and would make life more difficult near the galactic center. Does this change our understanding in some way?

    --
    Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest just whinge about SN.