
The science behind RHCs liver thread

Discussion in 'Liverpool' started by Prince Knut, Apr 30, 2016.

  1. Prince Knut

    Prince Knut GC Thread Terminator

    Joined:
    May 23, 2011
    Messages:
    25,525
    Likes Received:
    12,875
    Merging black holes may create bubbles that could swallow the universe
    The area between a pair of large black holes on the verge of colliding could provide the conditions to create dangerous bubbles of "true vacuum"

    SPACE 26 November 2021
    By Leah Crane




    Artistic impression of two black holes spiralling towards each other

    ESA



    Large colliding black holes could be a breeding ground for tiny black holes. If we spot signs of these cosmic lightweights, it could provide proof of the fundamental nature of our universe.

    There have been hints in particle physics that our universe may not be in the lowest possible energy state – instead of a true vacuum, it may be in a state called a false or metastable vacuum. If any part of the universe were to collapse into a true vacuum, the laws of physics as we know them would collapse inside that bubble of vacuum, which would expand at the speed of light and eventually swallow up everything.

    Some research has suggested that the extreme gravity near a black hole could create a foam of small bubbles of true vacuum. If those bubbles immediately fell into the black hole, though, that process could occur without destroying the universe.


    Rostislav Konoplich at Manhattan College in New York and his colleagues calculated what might happen if these vacuum bubbles formed in the region between two colliding black holes. “In the region between the colliding horizons of the black holes, you have gravitational pull from both sides balancing out, so maybe for a short time interval the bubble can exist sandwiched between the two black holes,” says Konoplich.

    The surface of each vacuum bubble would be expected to form a kind of film similar to a regular soap bubble. Given even a small amount of time to percolate between a pair of black holes, the bubbles could be expected to collide with one another. The researchers calculated that if multiple bubbles collided at once, the intersecting surface could become infinitely dense, forming a micro-black hole.

    Read more: The universe may be full of enormous clusters of tiny black holes
    Because of a process called Hawking radiation, these tiny black holes would emit a random mix of particles and evaporate away extremely quickly. Konoplich and his colleagues calculated that this entire process could take place in just about 10 milliseconds before the larger black holes collided and devoured any bubbles or micro-black holes in their way.
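    The "evaporate away extremely quickly" claim can be put in rough numbers with the standard semiclassical lifetime formula t ≈ 5120πG²M³/(ħc⁴). This is a minimal back-of-envelope sketch, not a calculation from the paper, and it ignores greybody factors and particle content:

```python
from math import pi

# Physical constants (SI, approximate CODATA values)
G    = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
HBAR = 1.055e-34   # reduced Planck constant, J s
C    = 2.998e8     # speed of light, m/s

def evaporation_time(mass_kg):
    """Semiclassical Hawking evaporation time, t = 5120*pi*G^2*M^3 / (hbar*c^4)."""
    return 5120 * pi * G**2 * mass_kg**3 / (HBAR * C**4)
```

    By this estimate a 1-kilogram black hole is gone in well under a femtosecond, so micro-black holes born between the horizons would indeed vanish almost instantly, comfortably inside the roughly 10-millisecond window the researchers describe.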


    But if bubbles of true vacuum do exist, it isn’t necessarily a certainty that the bubbles will safely fall into the huge black holes that enable their formation, says Ruth Gregory at King’s College London. “We know that these bubbles, once they’re formed, start to expand quite quickly and rapidly reach the speed of light,” says Gregory. “If they’re outside the horizon, it might be that they would expand instead of falling in.”

    This would be a disaster of apocalyptic proportions. “If one of these bubbles of true vacuum escaped, it would destroy the universe – oops,” says Gregory. The fact that the universe is still around suggests that bubbles of true vacuum are rare, if they exist at all, she says.

    However, if they do exist and form micro-black holes, we could detect the random radiation from their eventual evaporation. “If we can detect something like this, it would be very important because it would prove that our universe is metastable from an observational result, not just theoretical,” says Konoplich. That would be a major insight into the fundamental nature of our universe, which theoretical physicists are still debating.

    Reference: arxiv.org/abs/2111.07178

     
    #841
  2. Red Hadron Collider

    Red Hadron Collider The Hammerhead

    Joined:
    Mar 2, 2011
    Messages:
    57,485
    Likes Received:
    9,843
    Good read
     
    #842
  3. Prince Knut

    Prince Knut GC Thread Terminator

    Joined:
    May 23, 2011
    Messages:
    25,525
    Likes Received:
    12,875
    Wasn't it Hawking who theorised that black holes eventually link up and swallow the whole universe, then each other - then spew it all out again to create another Big Bang? (But who knows in which dimensions, and under what laws of physics).
     
    #843
  4. Prince Knut

    Prince Knut GC Thread Terminator

    Joined:
    May 23, 2011
    Messages:
    25,525
    Likes Received:
    12,875
    When did paedophiles and nonces become the ultimate test of shroud-waving, virtue-signalling? :huh:
     
    #844
  5. Prince Knut

    Prince Knut GC Thread Terminator

    Joined:
    May 23, 2011
    Messages:
    25,525
    Likes Received:
    12,875
    Is the universe conscious? It seems impossible until you do the maths


    The question of how the brain gives rise to subjective experience is the hardest of all. Mathematicians think they can help, but their first attempts have thrown up some eye-popping conclusions



    SPACE 29 April 2020, updated 4 May 2020
    By Michael Brooks


    Darren Hopes

    THEY call it the “unreasonable effectiveness of mathematics”. Physicist Eugene Wigner coined the phrase in the 1960s to encapsulate the curious fact that merely by manipulating numbers we can describe and predict all manner of natural phenomena with astonishing clarity, from the movements of planets and the strange behaviour of fundamental particles to the consequences of a collision between two black holes billions of light years away. Now, some are wondering if maths can succeed where all else has failed, unravelling whatever it is that allows us to contemplate the laws of nature in the first place.

    It is a big ask. The question of how matter gives rise to felt experience is one of the most vexing problems we know of. And sure enough, the first fleshed-out mathematical model of consciousness has generated huge debate about whether it can tell us anything sensible. But as mathematicians work to hone and extend their tools for peering deep inside ourselves, they are confronting some eye-popping conclusions.

    Not least, what they are uncovering seems to suggest that if we are to achieve a precise description of consciousness, we may have to ditch our intuitions and accept that all kinds of inanimate matter could be conscious – maybe even the universe as a whole. “This could be the beginning of a scientific revolution,” says Johannes Kleiner, a mathematician at the Munich Centre for Mathematical Philosophy in Germany.


    If so, it has been a long time coming. Philosophers have pondered the nature of consciousness for a couple of thousand years, largely to no avail. Then, half a century ago, biologists got involved. They have discovered correlations between the activity of brain cells and individual instances of experience, known as qualia. But the harsh truth is that neuroscience has brought us no closer to answering the question of how neurons give rise to joy or anger, or to the smell of coffee.

    This is what philosopher David Chalmers termed the “hard problem” of consciousness. Its unique difficulty stems from the inherently subjective nature of felt experience. Whatever it is, it isn’t something you can prod and measure. One philosopher called consciousness the “ghost in the machine”, and some people think we may never exorcise it.

    But, as Wigner pointed out, maths has a track record with hard problems. That is down to its ability to translate concepts into formal, logical statements that can draw out insights that wouldn’t be exposed from just talking about things in messy human language. “This might help us to quantify experiences like the smell of coffee in ways that we can’t if we rely on plain English,” says Kleiner.


    This is why he and Sean Tull, a mathematician at the University of Oxford, have begun formalising the mathematics behind the first and arguably only theory of consciousness with a halfway-thought-through mathematical underpinning (see “Models of experience”). Integrated information theory, or IIT, was conceived more than a decade ago by Giulio Tononi, a neuroscientist at the University of Wisconsin. His basic idea was that a system’s consciousness arises from the way information moves between its subsystems.

    One way to think of these subsystems is as islands, each with their own population of neurons. The islands are connected by traffic flows of information. For consciousness to appear, Tononi argued, this information flow must be complex enough to make the islands interdependent. Changing the flow of information from one island should affect the state and output of another. In principle, this lets you put a number on the degree of consciousness: you could quantify it by measuring how much an island’s output relies on information flowing from other islands. This gives a sense of how well a system integrates information, a value called “phi”.

    If there is no dependence on a traffic flow between the islands, phi is zero and there is no consciousness. But if strangling or cutting off the connection makes a difference to the amount of information it integrates and outputs, then the phi of that group is above zero. The higher the phi, the more consciousness a system will display.
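    The idea of putting a number on how much one island's output depends on another has a simple information-theoretic cousin: mutual information. The sketch below is emphatically not IIT's phi (the real measure works over partitions of a system's full cause-effect structure), just a toy illustration of quantifying dependence between two "islands" from observed joint states:

```python
from math import log2
from collections import Counter

def mutual_information(pairs):
    """I(A;B) in bits, estimated from a list of (a, b) joint samples."""
    n = len(pairs)
    joint = Counter(pairs)
    pa = Counter(a for a, _ in pairs)
    pb = Counter(b for _, b in pairs)
    mi = 0.0
    for (a, b), count in joint.items():
        p_ab = count / n
        mi += p_ab * log2(p_ab / ((pa[a] / n) * (pb[b] / n)))
    return mi

# Perfectly dependent islands: knowing A fully determines B (1 bit shared)
linked = [(0, 0), (1, 1)] * 500
# Independent islands: A tells you nothing about B (0 bits shared)
independent = [(a, b) for a in (0, 1) for b in (0, 1)] * 250
```

    Cutting the connection between the linked islands would cost the system a full bit of integrated information; cutting it in the independent case would cost nothing, which is the intuition behind a phi of zero.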

    Another key feature of IIT, known as the exclusion postulate, says that a group will explicitly display consciousness only when its phi is “maximal”. That is to say, its own degree of consciousness has to be bigger than the degree of consciousness you can ascribe to any of its individual parts, and simultaneously bigger than the degree of consciousness of any system of which it is a part. Any and all parts of the human brain might have a micro-consciousness, for example. But when one part has an increase in consciousness, such as when a person is brought out of anaesthesia, the micro-consciousnesses are lost. In IIT, only the system with the largest phi displays the consciousness we register as experience.

    The idea has won adherents since Tononi first proposed it. “Theoretically, it’s quite appealing,” says Daniel Bor at the University of Cambridge. “We have this association between consciousness and intelligence: creatures able to recognise themselves in the mirror also seem to be the most intelligent. So some connection between consciousness and intelligence seems reasonable.” And intelligence has a link to gathering and processing information. “That means you may as well make the related connection that in some way consciousness is related to information processing and integration,” Bor says.


    Darren Hopes

    It also seems to make sense given some of what we know about consciousness in the human brain. It is compromised, for example, if there is damage to the cerebral cortex. This region has a relatively small number of highly interconnected neurons, and would have a large phi in IIT. The cerebellum, on the other hand, has a much higher number of neurons, but they are relatively unconnected. IIT would predict that damage to the cerebellum might have little effect on conscious experience, which is exactly what studies show.

    IIT is less convincing when it comes to some details, though. Phi should decrease when you go to sleep or are sedated via a general anaesthetic, for instance, but Pedro Mediano, now part of Bor’s lab at the University of Cambridge, and his colleagues have shown that it doesn’t. “It either goes up or stays the same,” says Bor. And explaining why information flow gives rise to an experience such as the smell of coffee is problematic. IIT frames conscious experience as the result of “conceptual structures” that are shaped by the arrangement of parts of the relevant network, but many find the explanation convoluted and unsatisfying.

    Philosopher John Searle is one of IIT’s detractors. He has argued that it ignores the question of why and how consciousness arises in favour of making the questionable assumption that it is simply a by-product of the existence of information. For that reason, he says, IIT “does not seem to be a serious scientific proposal”.

    Perhaps the most troubling critiques of IIT as a mathematical theory concern a lack of clarity about the underlying numbers. When it comes to actually calculating a value for phi for the entirety of a system as complex as a brain, IIT gives a recipe that is almost impossible to follow – something even Tononi admits.

    “As it’s currently given, phi is very difficult to calculate for a whole brain,” Tull says. That might be a bit of an understatement. Researchers have worked out that using the current method, calculating phi for the 86 billion neurons of the human brain would take longer than the age of the universe. Bor has worked out that just calculating it for the 302-neuron brain of a nematode worm would take 5 × 10^79 years on a standard PC.
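    The intractability is easy to see from the combinatorics alone: the number of ways to cut a network in two grows exponentially with its size. A back-of-envelope sketch (the real IIT algorithm does far more work than one check per cut, which is why the published figure is even larger):

```python
def bipartitions(n):
    """Number of ways to split a network of n nodes into two non-empty groups."""
    return 2 ** (n - 1) - 1

# Even a toy brute-force search over cuts blows up exponentially:
cuts = bipartitions(302)                  # a C. elegans-scale network
SECONDS_PER_YEAR = 3.15e7
years = cuts / 1e9 / SECONDS_PER_YEAR     # at a generous 1e9 cuts per second
```

    For 302 nodes there are more than 10^90 cuts to consider, so even at a billion cuts per second the search runs for on the order of 10^74 years before the per-cut cost of the actual phi computation is even counted.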

    And when you calculate phi for things you wouldn’t expect to be conscious, you get all sorts of strange results. Scott Aaronson, a theoretical computer scientist at the University of Texas at Austin, for example, was initially excited by the theory, which he describes as “a serious, honourable attempt” to figure out how to get common sense answers to the question of which physical systems are conscious. But then he set to testing it.

    Aaronson took the principles of IIT and used them to compute phi for a mathematical object called a Vandermonde matrix. This is a grid of numbers whose values are interrelated, and can be used to build a grid-like circuit, known as a Reed-Solomon decoding circuit, to correct errors in the information that is read off CDs and DVDs. What he found was that a sufficiently large Reed-Solomon circuit would have an enormous phi. Scaled to a large enough size, one of these circuits would end up being far more conscious than a human.

    The same problem exists in other arrangements of information processing routines, Aaronson pointed out: you can have integrated information, with a high phi value, that doesn’t lead to anything we would recognise as consciousness. He concluded that IIT “unavoidably predicts vast amounts of consciousness in physical systems that no sane person would regard as particularly ‘conscious’ at all”.

    Aaronson walked away, but not everyone sees highly conscious grid-shaped circuits as a deal-breaker. For Kleiner, it is simply a consequence of the nature of the beast: we lack information because any analysis of consciousness relies on self-reporting and intuition. “We can’t get reports from grids,” he says. “This is the problem.”

    Rather than abandoning a promising model, he thinks we need to clarify and simplify the mathematics underlying it. That is why he and Tull set about trying to identify the necessary mathematical ingredients of IIT, splitting them into three parts. First is the set of physical systems that encode the information. Next is the various manifestations or “spaces” of conscious experience. Finally, there are basic building blocks that relate these two: the “repertoires” of cause and effect.

    In February, they posted a preprint paper demonstrating how these ingredients can be joined in a way that provides a logically consistent way of applying the IIT algorithm for finding phi. “Now the fundamental idea is well-defined enough to make the technical problems go away,” says Kleiner.

    Their aspiration is that mathematicians will now be able to create improved models of consciousness based on the premises of IIT – or, even better, competitor theories. “We would be glad to contribute to the further development of IIT, but we also hope to help improve and unite various existing models,” Kleiner says. “Eventually, we may come to propose new ones.”

    One consequence of this stimulus might be a reckoning for the notion, raised by IIT’s application to grid-shaped circuits, that inanimate matter can be conscious. Such a claim is typically dismissed out of hand, because it appears to be tantamount to “panpsychism”, a philosophical viewpoint that suggests consciousness is a fundamental property of all matter. But what if there is something in it?

    “Particles or other basic entities might have simple forms of consciousness that combine to make our own”

    To be clear, no one is saying that fundamental particles have feelings. But panpsychists do argue that they may have some semblance of consciousness, however fragmentary, that could combine to generate the various levels of consciousness experienced by birds or chimpanzees or us. “Particles or other basic physical entities might have simple forms of consciousness that are fundamental, but complex human and animal consciousness would be constituted by or emergent from this,” says Hedda Hassel Mørch at Inland Norway University of Applied Sciences in Lillehammer.

    The idea that electrons could have some form of consciousness might be hard to swallow, but panpsychists argue that it provides the only plausible approach to solving the hard problem. They reason that, rather than trying to account for consciousness in terms of non-conscious elements, we should instead ask how rudimentary forms of consciousness might come together to give rise to the complex experiences we have.

    With that in mind, Mørch thinks IIT is at least a good place to start. Its general approach, analysing our first-person perspective in terms of what we perceive when certain brain regions become active and using that to develop constraints on what its physical correlate could be, is “probably correct”, she says. And although IIT as currently formulated doesn’t strictly say everything is conscious – because consciousness arises in networks rather than individual components – it is entirely possible that a refined version could. “I think that the core ideas underlying IIT are fully compatible with panpsychism,” says Kleiner.




    Can pure mathematics describe subjective experience?

    Eduard Muzhevskyi/Science Photo Library



    That might also fit in with indications from elsewhere that the relationship between our consciousness and the universe might not be as straightforward as we imagine. Take the quantum measurement problem. Quantum theory, our description of the basic interactions of matter, says that before we measure a quantum object, it can have many different values, encapsulated in a mathematical entity called the wave function. So what collapses the many possibilities into something definite and “real”? One viewpoint is that our consciousness does it, which would mean we live in what physicist John Wheeler called a “participatory universe”.

    “The universe’s consciousness might have been excluded by the evolution of our own”

    There are many problems with this idea, not least the question of what did the collapsing before conscious minds evolved. A viable mathematical model of consciousness that allows for it to be a property of matter would at least provide a solution for that.

    Then there’s University of Oxford mathematician Roger Penrose’s suggestion that our consciousness is actually “the reason the universe is here”. It is based on a hunch about quantum theory’s shortcomings. But if there is any substance to this idea, the framework of IIT – and its exclusion postulate in particular – suggests that information flow between the various scales of the universe’s contents could create different kinds of consciousness that ebb and flow depending on what exists at any particular time. The evolution of our consciousness might have, in IIT’s terms, “excluded” the consciousness of the universe.

    Or perhaps not. There are good reasons to remain sceptical about the power of maths to explain consciousness, never mind the knock-on effects for our understanding of physics. We seem to be dealing with something so involved that calculations may not even be possible, according to Phil Maguire, a computer scientist at Maynooth University in Ireland. “Breaking down cognitive processes is so complex that it is not feasible,” he says.

    Others express related doubts as to whether maths is up to the job, even in principle. “I think mathematics can help us understand the neural basis of consciousness in the brain, and perhaps even machine consciousness, but it will inevitably leave something out: the felt inner quality of experience,” says Susan Schneider, a philosopher and cognitive scientist at the University of Connecticut.

    Philip Goff, a philosopher at Durham University, UK, and a vocal advocate for panpsychism, has a similar view. Consciousness deals with physical phenomena in terms of their perceived qualities, he points out – the smell of coffee or the taste of mint, for example – which aren’t conveyable in a purely quantitative objective framework. “In dealing with consciousness, we need more than the standard scientific tools of public observation and mathematics,” Goff says.

    But Kleiner isn’t put off. He is developing a mathematical model that can incorporate ineffable, private experiences. It is currently undergoing peer review. And even if it doesn’t work, he says, something else will: “I’m fully convinced that in combination with experiments and philosophy, maths can help us proceed much further in uncovering the mystery of consciousness.”

     
    #845
  6. Red Hadron Collider

    Red Hadron Collider The Hammerhead

    Joined:
    Mar 2, 2011
    Messages:
    57,485
    Likes Received:
    9,843
    That's some heavy **** <yikes>
     
    #846
  7. Prince Knut

    Prince Knut GC Thread Terminator

    Joined:
    May 23, 2011
    Messages:
    25,525
    Likes Received:
    12,875
    Links an awful lot with that Penrose set of books you recommended I read a few years back, don't you think?
     
    #847
  8. Red Hadron Collider

    Red Hadron Collider The Hammerhead

    Joined:
    Mar 2, 2011
    Messages:
    57,485
    Likes Received:
    9,843
    In certain ways, yeah <ok>
     
    #848
  9. M!LK

    M!LK Well-Known Member

    Joined:
    Nov 12, 2021
    Messages:
    8,602
    Likes Received:
    6,561
    #849
  10. Prince Knut

    Prince Knut GC Thread Terminator

    Joined:
    May 23, 2011
    Messages:
    25,525
    Likes Received:
    12,875
    “I have a foreboding of an America in my children's or grandchildren's time -- when the United States is a service and information economy; when nearly all the manufacturing industries have slipped away to other countries; when awesome technological powers are in the hands of a very few, and no one representing the public interest can even grasp the issues; when the people have lost the ability to set their own agendas or knowledgeably question those in authority; when, clutching our crystals and nervously consulting our horoscopes, our critical faculties in decline, unable to distinguish between what feels good and what's true, we slide, almost without noticing, back into superstition and darkness...

    The dumbing down of America is most evident in the slow decay of substantive content in the enormously influential media, the 30 second sound bites (now down to 10 seconds or less), lowest common denominator programming, credulous presentations on pseudoscience and superstition, but especially a kind of celebration of ignorance”
    ― Carl Sagan, The Demon-Haunted World: Science as a Candle in the Dark
     
    #850
    organic red likes this.

  11. Prince Knut

    Prince Knut GC Thread Terminator

    Joined:
    May 23, 2011
    Messages:
    25,525
    Likes Received:
    12,875
    Two years of covid-19: What we’ve learned during the pandemic so far
    It's now been two years since Chinese authorities first informed the World Health Organization about an unknown virus in Wuhan. How has our understanding of the virus changed since then and where does that leave us?

    HEALTH 31 December 2021
    By Helen Thomson




    A new hospital was rapidly built in Wuhan, China, in early 2020

    STR/AFP via Getty Images



    On 31 December 2019, Chinese authorities informed the World Health Organization (WHO) about a cluster of “viral pneumonia” cases of unknown cause in the city of Wuhan. Two years later, the coronavirus now known as SARS‑CoV-2 has resulted in at least 5.3 million deaths. As the world awaits the full impact of the new variant omicron, New Scientist looks back at the phenomenal scientific endeavour across the pandemic, and at how much we now know about the virus and how to fight it.

    Where did covid-19 come from?
    In March 2021, a group tasked by the WHO to investigate covid-19’s origins concluded that SARS-CoV-2 is most likely to be an animal virus that moved into humans through contact with an animal host, either at the Huanan Seafood Wholesale Market, a live animal market in Wuhan, or at another step in the trade of wildlife.

    The WHO group hedged its bets because the first person reported to have become ill with covid-19 on 8 December 2019 had no link with the market. A more recent analysis, however, suggests that this individual actually developed symptoms on 16 December, and only visited a hospital on 8 December for dental problems.


    This means the earliest known case may indeed have had ties to the market: a seafood vendor who became sick on 11 December. A third of the 168 people later identified as having had the virus in December 2019 had connections to the market.

    The mounting evidence for a market origin weakens the case for a lab leak, a premise that couldn’t be ruled out by an investigation commissioned by US president Joe Biden in 2021. Since these investigations, coronaviruses that are the closest match yet found to SARS‑CoV-2 have been discovered in bats in Laos, says Marion Koopmans at Erasmus University Medical Centre in the Netherlands, who was part of the WHO’s investigation team. Certain features of these wild viruses were the same as those that some researchers claimed could only have arisen during “gain of function” tests in a lab, in which an organism is genetically altered to enhance certain characteristics.

    A June 2021 paper in Scientific Reports adds further support to the market origins story. The authors were serendipitously surveying markets in the Wuhan area that were selling wild animals for food or pets between May 2017 and November 2019. They discovered many animal welfare and exploitation issues with “considerable implications for food hygiene”. The animals traded are capable of hosting a wide range of infectious diseases, they say.


    Some early covid‑19 cases were linked to an area of Huanan market where wild animals such as raccoon dogs were kept. These animals can be infected and display few symptoms, boosting the idea that animals in the market acted as an intermediate reservoir for the virus, says Koopmans.

    In response to covid-19, China temporarily prohibited all wildlife trade for the duration of the pandemic and permanently banned the eating and trading of non-livestock animals for food.

    How does the coronavirus spread?
    Back in January 2020, researchers urgently needed to understand the nature of the virus and how it was spreading. On 3 January, Yong-Zhen Zhang at Fudan University in Shanghai, China, was given a box containing swabs from people with the mysterious pneumonia sweeping Wuhan. By 5 January, having worked two nights straight, Zhang’s team had sequenced the virus and identified it as a coronavirus. That same day, Zhang uploaded the genome to the US National Center for Biotechnology Information. By comparison, in 2003, scientists took two months to identify the cause of an international outbreak of a new disease, SARS, as a coronavirus.

    It soon became clear that SARS‑CoV-2 spread easily and could cause severe disease, particularly in older age groups or in those with underlying health issues. By the end of February, its death count had surpassed those caused by the coronaviruses responsible for the SARS outbreak and MERS, a disease that emerged in 2012.

    The WHO declared the covid-19 outbreak a pandemic on 11 March 2020. Working out how to minimise transmission was key, but from early on, there was disagreement among experts. At first, the focus was on surface transmission – infected people contaminating surfaces that were then touched by others. Swabs from hospitals found the virus lurking everywhere, from stethoscopes to reading glasses. Sales of hand sanitiser soared.

    Other researchers concentrated on transmission via large droplets spread as an infected person coughs or sneezes near others. Droplets are heavy and fall from the air within seconds, rarely travelling more than 2 metres.

    Social distancing and face coverings were widely implemented as a way to help prevent this type of spread. But rigorous evidence on the effectiveness of face coverings was slow to appear and often contested. The WHO initially only recommended them for people who were actively coughing or caring for those with covid-19.

    Today, we know that all face coverings help cut the risk of catching and transmitting the virus to a certain extent.

    Then, there is the issue of aerosols. These tiny particles hang in the air and so can travel further than 2 metres, but many researchers initially disregarded this route of spread. The WHO stated at a press conference on 27 March 2020 that “transmission of covid‑19 is through droplets, it is not airborne”.

    This is because doctors have traditionally assumed that respiratory diseases, like tuberculosis and influenza, are spread mainly by droplets – “coughs and sneezes spread diseases”, says Trisha Greenhalgh at the University of Oxford. “It’s a mindset that’s deeply ingrained in the infectious disease community.” But more recent research has shown that both TB and flu can be spread via aerosols, upending conventional wisdom.

    The tide began to turn in July 2020, when 239 scientists from 32 countries published evidence that SARS-CoV-2 was airborne, appealing to the WHO and others to acknowledge its impact. However, it wasn’t until May 2021 that the WHO and the US Centers for Disease Control and Prevention changed their guidance, stating that aerosols are the primary route for transmitting the virus, mainly between people in close proximity to each other, typically 1 to 2 metres apart, or in poorly ventilated or crowded indoor environments.

    Subsequent research has shown that surface transmission is likely to be a factor in the spread of the virus, but not a primary means. Good ventilation is now seen as a vital control measure.

    All of this has left some scientists urging a paradigm shift in how we combat respiratory infection. In a call for action published in Science in May 2021, a group of more than 30 scientists and doctors pointed to the great disparity in the way in which we address different sources of environmental infection. While governments have long invested in food safety, sanitation and clean drinking water, the group argued that airborne infections haven’t been targeted strongly enough through changes to regulations, standards and building design that could help prevent their transmission.




    An electron micrograph of the SARS-CoV-2 virus that causes covid-19

    SCIENCE PHOTO LIBRARY



    How has the virus evolved?
    As soon as the coronavirus started spreading, it also began to mutate, leading to new variants. “Omicron should not surprise anyone, it is what viruses do,” said Tedros Adhanom Ghebreyesus, director general of the WHO, at a press conference in December 2021. “It’s what this virus will continue to do as long as we allow it to continue to spread.”

    Each time a virus replicates, it has a chance of mutating. Some mutations make it better at moving through a population. The first new variant to spread widely was alpha, which was sequenced in September 2020 and is about 50 per cent more transmissible than earlier variants. It was first identified in the UK and research suggests that it may have evolved in someone with a weakened immune system. This meant they couldn’t wipe out the virus, encouraging it to evolve and mutate.

    Next came beta, which was first spotted in South Africa and was first sequenced in October 2020. Among its mutations is one that alters the shape of a key protein, helping it to evade antibodies that are effective against other variants. Recent work suggests it spread quickly because it is 20 per cent better than previous variants at evading the immune response in previously infected people.

    In late 2020, another variant, gamma, emerged and caused a surge of cases in Manaus, Brazil. Here, it was estimated that 75 per cent of the population had already been infected with SARS-CoV-2. The new variant had a mutation allowing the virus’s spike protein to bind more easily to cells, making it more infectious. This protein is the part of the virus that recognises host cells, and is the main target of our immune response. Another mutation helped it evade antibodies from past infections.

    Then delta swept the world. The variant was sequenced in October 2020 and first detected in India, where it caused a huge wave of infections. At least 50 per cent more transmissible than alpha, delta outcompeted all other variants over the course of 2021, becoming the most common one in the world. Vaccines are still effective against it, but are around 15 per cent worse at preventing infection by delta than by alpha.

    Omicron, which emerged in November 2021, has the highest number of mutations so far seen in the spike protein, and we don’t yet know their full impact. You can eyeball the mutations in order to work out what effect they might have, says Danny Altmann at Imperial College London, “but many are new”.

    Omicron spread rapidly in South Africa, where a large majority of the population has previously been infected but only about 25 per cent are fully vaccinated.

    By 18 December 2021, a total of 89 countries had detected the presence of omicron. The variant appears to spread much faster than others. A December study of data from South Africa suggested that omicron is 4.2 times more transmissible in its early stages than delta, and there is some evidence that it may multiply in our airways 70 times faster.
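    A transmissibility advantage like this compounds over successive generations of infection. Below is a minimal sketch, not an epidemiological model: the reproduction number assumed for delta is purely illustrative, with omicron taken at the 4.2-fold early-stage advantage quoted above.

```python
# Toy numbers, not an epidemiological model: if one variant spreads 4.2
# times faster per generation of infection than another, the gap in case
# numbers compounds quickly. Reproduction numbers are assumed for scale.
def cases_after(generations, r):
    """Cases descended from one infection after n generations at reproduction number r."""
    return r ** generations

R_DELTA = 1.5          # assumed effective R for delta (illustrative)
R_OMICRON = 1.5 * 4.2  # 4.2 times more transmissible (early-stage estimate)

for n in (1, 3, 5):
    ratio = cases_after(n, R_OMICRON) / cases_after(n, R_DELTA)
    print(f"after {n} generations, omicron ahead by a factor of {ratio:,.0f}")
```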

    The variant also seems to exhibit “immune escape”, to some extent evading the immune responses of people who have already had covid-19 or been vaccinated. Lab studies by Pfizer suggest that three doses of the vaccine it developed with BioNTech offer significant protection against infection from omicron, but two doses don’t.

    Uğur Şahin, CEO of BioNTech, said in a press statement that a component of our immune system, called memory T-cells, generated by the vaccine, may prevent severe disease in those who haven’t had three shots.

    The variant’s ability to infect the double-vaccinated prompted the UK to open up its booster programme to all adults in December. Infection numbers in the UK have since hit record-breaking highs, but there was some good news in the run-up to Christmas, as preliminary analyses of data from the UK suggested that infection with omicron may be 20 to 70 per cent less likely to result in a hospital visit. In people who haven’t yet caught covid-19 or been vaccinated, hospitalisation with omicron appears to be about 11 per cent less likely than with delta. However, this is unlikely to be enough to counteract the variant’s extreme transmissibility, and health systems worldwide are bracing for surges in hospital admissions.

    How good are the vaccines?
    The major success story of the pandemic was how fast vaccines were created. Thanks to years of research following the SARS and MERS outbreaks, researchers had a good idea of what aspects of SARS‑CoV-2 to target. The pandemic also coincided with the maturation of mRNA vaccine technology.

    Traditional vaccines tend to contain weakened or inactivated virus that the body learns to recognise so it is ready to fight the virus when next encountered. The new Pfizer/BioNTech and Moderna vaccines introduce an mRNA sequence that tells the body to make a harmless part of the coronavirus’s spike protein, which triggers an immune response. These vaccines can be developed faster and more cheaply than traditional ones.

    For vaccines of all types, money was pumped into trials so multiple studies could be run at the same time, and cash was given to manufacturers to increase production capacities.

    We now have 23 covid-19 vaccines in use, and around 135 others in various stages of human trials. There have, of course, been hurdles. The Oxford/AstraZeneca jab was linked to rare blood clotting events, which led to some countries restricting its use.

    Still, the vaccine programme has worked so well in high-income nations that covid‑19 was referred to as a disease of the unvaccinated by Andrew Pollard, director of the Oxford Vaccine Group at the University of Oxford. He wrote in The Guardian in November 2021 that the “ongoing horror” of people with covid-19 fighting for breath in intensive care units across Britain “is now largely restricted to unvaccinated people”.

    While we wait to see omicron’s impact on hospitalisations and deaths, the good news is that if it, or any other variant, undermines the current vaccine programme, scientists are prepared. Pfizer CEO Albert Bourla has said his company could make an updated vaccine in less than 100 days. Others are working on variant-specific and multi-variant vaccines.




    A queue for booster shots in London in December

    Rob Pinney/Getty Images



    What treatments do we have?
    Vaccines aren’t our only tool against the virus. Steroids, including dexamethasone, the first drug proven to save lives from covid-19, have been used by medics from the beginning of the pandemic. Doctors reasoned that steroids would help reduce the impact of severe disease by preventing the immune system from going into overdrive and damaging organs. That turned out to be true – a discovery that was unprecedented in its speed, thanks to collaboration across seven clinical trials in 12 countries, coordinated by the WHO.

    Three monoclonal antibodies, which are manufactured versions of antibodies that attach to the virus’s spike protein and make it harder for it to enter human cells, have been given emergency approval by the US Food and Drug Administration (FDA).

    The drugs showed promise in reducing hospitalisation in infected people at high risk of more severe disease. They also decreased the spread of disease to other people in the household when taken prophylactically. However, there are suggestions from recent data that some monoclonal antibody drugs may not be effective against omicron.

    Monoclonal antibodies are also expensive and hard to give outside a hospital setting. Oral antivirals that can be taken at home may be a better option. One, a drug made by Pfizer called Paxlovid, has shown very promising results. When taken for five days shortly after symptoms start, the drug cut hospital admissions by 89 per cent in adults at high risk of severe illness. The drug appears to work well against omicron, and was given emergency approval by the FDA on 22 December. President Biden has already ordered enough pills to treat 10 million people.

    Another antiviral, molnupiravir from Merck, appears to reduce the risk of hospitalisation or death by about 30 per cent in at-risk people with mild to moderate covid-19. The UK approved this drug in November 2021.

    Other treatments are in human trials. For instance, the cheap oral antidepressant fluvoxamine has shown strong evidence of preventing covid-19 progressing from a mild case to a severe one in those at serious risk.

    What do we still not know?
    Two years on, several key questions about the virus are yet to be resolved, including the virus’s origins. Although evidence suggests it began in a market in China and that it derives from a bat coronavirus, it isn’t clear how it spread to humans.

    We don’t know the dose of SARS-CoV-2 needed to transmit infection. To work this out, several human challenge trials are under way, in which volunteers are given varying viral doses in controlled conditions.

    We also need to identify the level of antibodies needed to prevent infection, which is helpful for assessing how effective vaccines are and also for rapidly deciding whether they need to be changed. Researchers met in December 2021 to discuss data on antibodies for all the variants of concern, to reach an agreement on what antibody levels are required to protect people against severe disease. Results are forthcoming.

    We don’t know what future variants might be called, once we have run out of Greek letters. The World Health Organization is considering using lesser-known constellations next, says Maria Van Kerkhove at the WHO.

    And finally, we don’t know how dangerous future variants may be.

  12. Prince Knut

    What the thermodynamics of clocks tell us about the mysteries of time
    Surprising new insights about the strange physics underlying how clocks work could transform our understanding of time's arrow – and hint at how time works at the quantum scale

    PHYSICS 29 December 2021
    By Miriam Frankel


    Hardziej Studio

    A CENTURY ago, two intellectual giants met to debate the nature of time. One was the French philosopher Henri Bergson, a superstar whose fans caused the first traffic jam on Broadway in New York as they flocked to one of his earlier appearances. He believed there was more to time than something that can be measured by clocks, captured by mathematics or explained by psychology. He argued that the way we experience it, with a duration and direction, could only be revealed through philosophy.

    Bergson’s opponent, a physicist called Albert Einstein, disagreed. After developing his theories of relativity, he believed time was a physical entity, separate from human consciousness, that could speed up or slow down. Einstein thought that time was interwoven with space in a static cosmos called the block universe, which lacks a clear past, present or future.

    Almost 100 years later, the question of why the time we perceive is so different from the time postulated in physics is still hotly debated. Now, fresh clues are starting to suggest the devices we use to measure time might be crucial to arriving at an answer.


    Those clues relate to the fact that in general relativity, clocks are incorporated into the theory as perfectly idealised objects, with smooth readings that are accurate no matter how much you zoom in, when they actually are anything but. “Clocks are physical things which are made up of physical systems, and so we kind of know that idealisation can’t be right,” says Emily Adlam at the Rotman Institute of Philosophy at Western University in Canada. “A more realistic understanding of clocks may ultimately be the key to understanding time.”

    We can measure time using anything that goes through a change – sundials use the shifting sun, water clocks tap the flow of water and even the temperature of a cup of tea can help us estimate when it was brewed. Today, we mostly use sophisticated mechanical and atomic clocks, which can measure time much more accurately than a cup of tea, because they tick reliably with a certain frequency.

    Since astronomer Christiaan Huygens invented the first pendulum clock in the 17th century, we have been steadily improving the accuracy of scientific clocks, with phenomenal results. The most impressive machines can now measure each second so accurately that they wouldn’t miss a beat in 20 billion years, longer than the age of the universe. But it turns out there may be a price to pay for such accuracy.
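    As a back-of-the-envelope check, reading “wouldn’t miss a beat in 20 billion years” as less than one second of drift over that span implies a fractional frequency error of roughly one part in 10^18, the level reported for today’s best optical lattice clocks. The interpretation of the phrase is an assumption here:

```python
# Reading "wouldn't miss a beat in 20 billion years" as less than one
# second of drift over that span gives the implied fractional accuracy.
SECONDS_PER_YEAR = 365.25 * 24 * 3600    # Julian year, in seconds
span_seconds = 20e9 * SECONDS_PER_YEAR   # 20 billion years
fractional_error = 1.0 / span_seconds    # one second of drift over the span
print(f"implied fractional accuracy: {fractional_error:.1e}")  # ~1.6e-18
```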


    To produce their ticks, clocks need a source of energy. A grandfather clock must be wound up and a wall clock is powered by a battery. The most accurate atomic clocks, with ticks that correspond to electromagnetic signals given off by atoms changing energy levels, are driven by high-powered lasers.

    This isn’t surprising. But rather than just requiring energy to run each mechanical part, new research suggests something more might be at play. Clocks could be a type of so-called thermodynamic machine, with fundamental constraints on their performance set by the underlying physics. If this is true, not only will it mean there could be a limit to how accurately we can measure time’s passing, it “will have a huge impact on how philosophers think about time”, says Gerard Milburn, a quantum physicist at the University of Queensland, Australia.

    We know of two types of thermodynamic machine. The first comprises heat engines – things like fridges and combustion engines – which have a maximum efficiency set by thermodynamics. The second group encompasses information storage devices, like DNA and hard discs. In these, thermodynamics tells us the cost of erasing information. If clocks are a third, it would mean there are limits on how accurately we can tell the time, due to the involvement of energy’s messy cousin, entropy.
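    The “cost of erasing information” for that second group has a precise form, Landauer’s bound: wiping one bit of information dissipates at least k_B T ln 2 of heat. A quick calculation at an assumed room temperature:

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K (exact in SI since 2019)
T = 300.0            # assumed room temperature, K

# Landauer's bound: minimum heat dissipated when erasing one bit.
landauer_joules = K_B * T * math.log(2)
print(f"minimum cost to erase one bit at {T:.0f} K: {landauer_joules:.2e} J")
```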

    The maximum efficiency of heat engines was determined by engineer Sadi Carnot in 1824, before entropy was defined. But his calculation paved the way for the discovery of the second law of thermodynamics, which says any closed system – something that nothing can enter or leave – will increase in entropy, a measure of disorder or randomness, over time.
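    Carnot’s limit can be stated in one line: an engine running between absolute temperatures T_hot and T_cold can convert at most a fraction 1 - T_cold/T_hot of the heat it draws into work. The temperatures below are illustrative, not from the article:

```python
def carnot_efficiency(t_hot, t_cold):
    """Maximum efficiency of a heat engine between two absolute temperatures (kelvin)."""
    return 1.0 - t_cold / t_hot

# e.g. a turbine running at 800 K and rejecting heat at 300 K:
eta = carnot_efficiency(800.0, 300.0)
print(f"maximum possible efficiency: {eta:.1%}")  # 62.5%
```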

    Low entropy means high order. If the atoms in a box of gas clustered in one corner rather than being spread out chaotically, entropy would be low. But because there are fewer ways for atoms to be ordered than disordered, making the latter more likely, closed systems – like the universe – tend towards disorder. A cup of hot tea loses heat to its surroundings, raising overall entropy, but never spontaneously heats up. This creates an arrow of time.
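    The counting argument behind “fewer ways for atoms to be ordered than disordered” is easy to make concrete. For N atoms, each free to sit in either half of a box, there is exactly one arrangement with all of them in the left half, but C(N, N/2) arrangements splitting them evenly; Boltzmann’s formula S = k ln W then turns the count into entropy. N = 100 is chosen only to keep the numbers printable:

```python
import math

N = 100                        # atoms in the box (illustrative)
clustered = 1                  # one way: every atom in the left half
spread = math.comb(N, N // 2)  # ways to split the atoms 50/50

print(f"even split outnumbers clustered by about {spread:.2e} to 1")

# Boltzmann's S = k ln W: more microstates means higher entropy.
K_B = 1.380649e-23             # J/K
delta_S = K_B * math.log(spread / clustered)
print(f"entropy difference: {delta_S:.1e} J/K")
```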

    The second law is the only law of physics that is irreversible in time. Because of this, thermodynamics is used to explain the arrow of time we perceive. But the second law doesn’t tell the whole story. There is still the question of why we only ever experience time moving forwards – many physicists today argue that this is simply an illusion.

    If the arrow of time in thermodynamics could be linked to the practical reality of measuring time, then thermodynamics could help explain how we perceive time after all, says Adlam. What is needed, she says, is a direct link between thermodynamics and practical timekeeping – something explaining why all clocks run in the same direction as the entropy increase of the universe. Find this link, and we might just answer some of the questions Einstein and Bergson were at odds over. In search of this connection, a handful of researchers are turning to clocks.




    The efficiency of engines is governed by thermodynamic laws

    Alpegor/Alamy



    A few years ago, Paul Erker at the Institute for Quantum Optics and Quantum Information in Vienna, Austria, teamed up with Marcus Huber at the Vienna University of Technology in an attempt to understand what clocks really are. They started off by modelling quantum clocks, simple systems in which the flow of energy is easy to track. In a 2017 paper, they and their colleagues showed a clock made of just three atoms – one hot, one cold and one “ticking” thanks to the energy flow between the two others – should dissipate more energy the more accurate it is. This was a big step, but still purely theoretical. By 2020, they were ready to test it.
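    The trade-off reported for the three-atom model is linear: accuracy – roughly, the number of ticks counted before the clock is off by one tick – grows in proportion to the entropy dissipated per tick. The sketch below shows only that scaling; the linear form is taken from the result described above, but the proportionality constant and units are arbitrary, not the paper’s formula:

```python
def toy_accuracy(entropy_per_tick, coupling=0.5):
    """Toy linear model: ticks before the clock drifts by one tick (arbitrary units)."""
    return coupling * entropy_per_tick

# Doubling the dissipated entropy doubles the accuracy in this sketch.
for s in (1.0, 10.0, 100.0):
    print(f"entropy per tick {s:6.1f}  ->  accuracy {toy_accuracy(s):6.1f}")
```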

    “The way we perceive time may be physically built into the process of timekeeping”

    Teaming up with researchers including Natalia Ares at the University of Oxford and Edward Laird at Lancaster University, both in the UK, the researchers built a simple pendulum clock from a suspended membrane of silicon nitride with a thickness of about 50 nanometres. “You could think of it more like a drum than a pendulum,” says Laird. They made their tiny drum vibrate, with each vibration corresponding to one tick of the clock. The strength of the vibrations could be increased by applying an electric field. To determine the clock’s accuracy – how regularly the ticks occurred – they connected it to an electrical circuit including a voltmeter. “It is a beautiful experiment,” says Milburn.

    The crux of that experiment was that the clock became more accurate as more energy was supplied to the drum. And the more accurate it was, the more entropy it produced. This was the first result to explain why clocks move forwards in time, because as they measure time, they increase entropy, an irreversible process. “This research gives a very nice explicit link between the thermodynamic arrow of time and perceptual time,” says Adlam.

    Carlo Rovelli at Aix-Marseille University in France agrees the work sharpens our understanding of the strict relationship between time and heat. “Simply put, if there is no heat involved, there is no way to distinguish the past from the future,” he says. The research strengthens his thermal time hypothesis, which argues that time emerges from the laws of thermodynamics on the macroscopic scale of humans, regardless of what is going on at a microscopic level.

    Crucially, the research also shows that the arrow of time isn’t something only humans can experience. “It doesn’t really matter if it’s a conscious agent who observes the clock or a device, such as a detector,” says Huber. The entropy still increases. “It’s true for anything.” Rather than being a consequence of our consciousness, this suggests the way we perceive time may be physically built into the process of timekeeping. If so, Bergson’s argument falls apart and Einstein looks right to have believed time is a physical entity.

    This isn’t the first time a link between energy cost and the accuracy of clocks has been explored. A similar relationship between accuracy and energy cost has been seen in the biochemical clocks that operate inside ocean-dwelling cyanobacteria, helping them generate the chemicals needed for photosynthesis early in the morning before the sun rises. But these clocks don’t follow exactly the same rules, partly because they are living organisms, not mechanical clocks. “Evolution probably places additional constraints on what it means for a clock to be good, beyond the energetic constraints of precision,” says Jordan Horowitz at the University of Michigan.

    But not all clocks entirely follow the rules, it would seem. The most accurate atomic clocks appear more efficient than the research predicts. These clocks involve complex circuits, detectors and feedback, making their energy flow difficult to model. Both Erker and Huber are confident they will be shown to obey the same constraint. “I’m not able to prove this statement yet,” says Erker. “But my hunch definitely goes in this direction.”

    If he’s right, it would have meaning beyond proving an arrow of time exists outside of our consciousness. The link between clocks and thermodynamics may also reflect time on a smaller scale. If there is a limit on how accurately we can resolve time, could this be a sign that time itself isn’t perfectly smooth, but instead lumpy – packed into tiny units in the same way that light comes in photons?

    Answering this could be tricky. To probe space-time at this tiniest of scales, below those we can currently reach with our best particle accelerators, would require vast amounts of energy. At a certain level of energy, you would expect to create a black hole that would swallow the entire experiment, suggesting it is impossible to resolve time perfectly. “You end up with a sort of fundamental limit on the sensitivity to which you can measure a time interval,” says Adlam. This might be related to the limit caused by thermodynamics, she says, but the link isn’t clear yet.
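    The article doesn’t name the scale, but the regime where measuring ever-shorter intervals is expected to collapse the experiment into a black hole is conventionally the Planck time, sqrt(ħG/c⁵):

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
G = 6.67430e-11          # Newtonian gravitational constant, m^3 kg^-1 s^-2
C = 299_792_458.0        # speed of light, m/s

# The Planck time: the conventional scale below which smooth time is
# expected to break down.
planck_time = math.sqrt(HBAR * G / C ** 5)
print(f"Planck time: {planck_time:.2e} s")  # ~5.39e-44 s
```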

    “Could this be a sign that time is not perfectly smooth, but lumpy – packed into tiny units?”

    Probing time at minuscule scales is exciting, but what Huber is most thrilled about relates to quantum mechanics and a mystery called the measurement problem. “I have a long-standing obsession with it,” he says.

    Unlike relativity, in which time is local and relative, quantum mechanics assumes there is a universal background time. Time in quantum mechanics doesn’t have an arrow: its equations work equally well forwards and backwards in time. But sometimes this reversibility can be broken. When we measure a quantum system, the act of measuring causes the system to collapse from a superposition, a mix of different possible states, into a specific outcome. This cannot be reversed, creating an arrow of time. How time manages both to have, and not have, an arrow is just one of quantum mechanics’ many puzzles. But if the thermodynamic arrow can explain our perceptual time arrow, maybe it can explain the quantum one too.
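    The irreversibility of measurement shows up in a two-line entropy calculation. A qubit in an equal superposition is a pure state with zero von Neumann entropy; measure it and discard the outcome, and you are left with an equal classical mixture carrying entropy ln 2 (in units of k_B) – an increase that cannot be undone. A sketch working directly from the state’s eigenvalue spectrum:

```python
import math

def von_neumann_entropy(eigenvalues):
    """Entropy -sum p ln p over a density matrix's eigenvalues (0 ln 0 taken as 0)."""
    return -sum(p * math.log(p) for p in eigenvalues if p > 0)

before = von_neumann_entropy([1.0, 0.0])  # pure superposition: eigenvalues {1, 0}
after = von_neumann_entropy([0.5, 0.5])   # post-measurement mixture: {1/2, 1/2}

print(f"entropy before: {before:.3f}, after: {after:.3f} (ln 2 = {math.log(2):.3f})")
```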

    This is what Huber wants to tackle next. We know that whenever we measure something, we affect it, but the nitty-gritty of this process is often ignored in quantum mechanics. According to Huber, the act of measuring should create a flow of energy that may be best described by the laws of thermodynamics. “I think the measurement postulate is the second law in disguise,” he says. Perhaps quantum measurements, like clocks, create rising entropy and hence an emergent arrow of time.




    The most accurate atomic clocks are powered by lasers

    G.E. Marti/JILA



    Erker, on the other hand, points out the research could also help to test ideas that combine the notoriously clashing theories of quantum mechanics and general relativity into a quantum theory of gravity. Such tests are extremely hard. Because gravity is so weak, you either need to put massive objects in a quantum superposition state to probe gravitational effects, which is tricky and has only been done with molecules of up to 2000 atoms, or you need to make incredibly precise measurements – and quantum clocks could help with that. “If we could build clocks that are accurate on very short timescales, we could actually build tabletop quantum experiments that test for gravitational effects,” says Erker.

    Any theory that explains gravity and quantum mechanics needs to describe how clocks work on the quantum scale. “All this research on understanding what clocks really are and how they kind of interact with quantum mechanics and with relativity is probably an important step to understanding how those theories fit together,” says Adlam.

    Bergson and Einstein’s debate cost the physicist the Nobel prize for general relativity. The president of the Nobel committee said that, while it was complex, “it will be no secret that the famous philosopher Bergson in Paris has challenged this theory”. Instead, Einstein won for the less-glamorous photoelectric effect. But a century on, Einstein now seems the real winner of the debate. The next question is whether there will ever be a way to merge his theory of general relativity with quantum mechanics. On that, only time will tell.

  13. Prince Knut

    2022 preview: Large Hadron Collider will reach for the edge of physics
    TECHNOLOGY 29 December 2021
    By Matthew Sparkes




    The Large Hadron Collider has been shut down for upgrades since 2018

    CERN



    THE Large Hadron Collider (LHC) at CERN near Geneva, Switzerland, will start running again after a three-year shutdown and delays due to the covid-19 pandemic. The particle collider – known for its role in the discovery of the Higgs boson, the particle linked to the mechanism that gives other fundamental particles their mass – will return in 2022 with upgrades that give it a power boost.

    Work has been under way to conduct tests on the collider and calibrate new equipment. Now, it is gearing up for experiments that could give physicists the data needed to expand the standard model, our best description of how particles and forces interact.

    Phil Allport at the University of Birmingham in the UK says the upgrades could allow new measurements that give us insight into the way the Higgs boson decays, leading to a better understanding of how it fits into the standard model.


    “These measurements shed light on what’s happening at the highest energies that we can reach, which tells us about phenomena in the very early universe,” he says. They will also allow us to test ideas that try to account for things that aren’t fully described by the standard model, he says.

    This includes mysteries that have plagued physicists for decades, such as the so-called hierarchy problem, which deals with the vast discrepancy between the mass of the Higgs and those of other fundamental particles, plus dark energy and dark matter, the unexplained phenomena that make up most of the universe.

    “All of these things require extensions to the standard model of particle physics to accommodate, and all of those theories make predictions. And the best place to look to test those predictions is usually in the highest energies achievable,” says Allport. He says the LHC upgrades also pave the way to entirely new observations that signal a departure from the standard model.


    Part of the upgrade work has been to increase the power of the injectors that supply highly accelerated particle beams to the collider. Prior to the last shutdown in 2018, protons could reach an energy of 6.5 teraelectronvolts, but the upgrades mean this can now be pushed to 6.8 teraelectronvolts.
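    Those beam energies translate into speeds almost indistinguishable from light. A quick calculation: a 6.8 teraelectronvolt proton has a Lorentz factor equal to its energy divided by its rest energy (about 0.938 GeV), from which its speed follows:

```python
import math

BEAM_ENERGY_GEV = 6800.0    # 6.8 TeV per proton after the upgrade
PROTON_REST_GEV = 0.938272  # proton rest energy

gamma = BEAM_ENERGY_GEV / PROTON_REST_GEV  # Lorentz factor, ~7250
beta = math.sqrt(1.0 - 1.0 / gamma ** 2)   # speed as a fraction of c
print(f"gamma = {gamma:.0f}, v = {beta:.9f} c")
```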

    Rende Steerenberg at CERN says that these more powerful beams will cause collisions at higher energies than ever before, and other upgrades in the future will also allow more particles to be collided at the same time.

    There are already plans for further improvements in 2024, which will narrow the LHC’s beams and drastically increase the number of collisions that take place. The 2018 run saw around 40 collisions every time a pulse of protons passed each other, but upgrades will push this to between 120 and 250. At that point, the LHC will take on a new name, the High Luminosity Large Hadron Collider, and it should begin experiments in 2028.

    There are still many tests to be run before the power of the new components can be unleashed. Scientists at CERN hope to finish these by late February and then slowly ramp up to a small number of full-power collisions in May. The frequency of these collisions will be increased in June, which is when Steerenberg says “meaningful” physics will begin.
     
  15. Prince Knut

    Why haven't we heard from aliens? There is a reason for the silence
    The search for extraterrestrial intelligence has been going on for 60 years without success. Given the hurdles to interstellar communication, that's just a blink of an eye

    LIFE 17 November 2021
    By Abigail Beall


    ESA/Hubble & NASA

    IN 1960, astronomer Frank Drake began an experiment. With a radio telescope, he studied two nearby sun-like stars, hoping to find signals that could only have been generated by life on planets orbiting these stars. He came up blank. In the six decades since Drake started the search for extra-terrestrial intelligence (SETI), astronomers have kept listening, carefully and systematically. Still, we have heard nothing.

    One possibility is that there simply are no aliens out there – that we truly are alone. But this seems unlikely, given the vastness of the cosmos, with hundreds of billions of galaxies containing hundreds of billions of stars, most of which have at least one planet orbiting them, at least according to our burgeoning knowledge of exoplanetary systems in our own galactic neighbourhood.
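    That intuition can be made explicit with a Drake-equation-style chain of factors. Every value below is an assumption picked purely for illustration – the biological and social terms are simply unknown – but even pessimistic guesses tend to leave the expected count well above zero:

```python
# Multiply a chain of guessed fractions, Drake-equation style.
# All factors are placeholders, not measurements.
def signalling_civilisations(stars, f_planets, f_habitable, f_life,
                             f_intelligence, f_signalling):
    return stars * f_planets * f_habitable * f_life * f_intelligence * f_signalling

estimate = signalling_civilisations(
    stars=2e11,           # stars in a Milky Way-sized galaxy
    f_planets=1.0,        # most stars host at least one planet
    f_habitable=0.1,      # guess: fraction with a potentially habitable world
    f_life=0.01,          # guess: fraction where life arises
    f_intelligence=0.01,  # guess: fraction evolving intelligence
    f_signalling=0.01,    # guess: fraction detectably broadcasting
)
print(f"guesstimate: {estimate:,.0f} signalling civilisations per galaxy")
```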

    Jill Tarter, co-founder of the SETI Institute in California, says we haven’t listened for long enough or looked hard enough to make any such sweeping statements yet. Astronomers have studied all kinds of electromagnetic radiation – light, radio waves, gamma rays – looking for signals. Such a search has to cover all directions and distances in space, plus the different ways a signal might manifest itself, such as shifts in polarisation, frequency, modulation and intensity. Tarter sees these parameters as a multi-dimensional ocean. “When SETI turned 50, we had explored one glass of water from that ocean. By the time it turned 60 it was more like a small hot tub,” she says. “It’s getting better and faster all the time, but there’s a lot more to explore.”


    According to Beth Biller, an astronomer at the University of Edinburgh, UK, searching through time is the biggest challenge. Humans have only lived on Earth for the blink of an eye compared with the age of the universe, and we have only been broadcasting our presence with things like radio waves for just over a century.

    “The civilisation that you want to contact has to exist at the same time as your own civilisation,” says Biller. And because light travels at a finite speed, a signal arriving now may have been sent thousands, millions or even billions of years ago, depending on how far away its source lies. “When you’re talking about finding aliens, you just have to get a lot of timings correct,” she says. Electromagnetic waves from other worlds radiate in all directions, so the further away the source, the fainter any signal will be. Even the closest neighbouring star system to Earth, Proxima Centauri, is more than 4 light years away, putting a big delay on any conversation.
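    Two of those hurdles in numbers: an isotropically broadcast signal weakens with the square of distance, and even the nearest star imposes a years-long round trip on any exchange. The normalisation to 1 light year is an arbitrary choice for illustration:

```python
def relative_strength(distance_ly):
    """Inverse-square falloff of an isotropic signal, normalised to 1 at 1 light year."""
    return 1.0 / distance_ly ** 2

PROXIMA_LY = 4.25  # approximate distance to Proxima Centauri

print(f"signal strength at Proxima vs 1 ly out: {relative_strength(PROXIMA_LY):.3f}")
print(f"fastest possible round-trip exchange: {2 * PROXIMA_LY:.1f} years")
```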

    Even if a transmitting alien civilisation were close enough, we might not see it. Around 70 per cent of exoplanets have been found using the transit method, which involves observing the light from stars periodically dimming when planets pass in front of them. A study published in June 2021 by Lisa Kaltenegger, an astronomer at Cornell University in New York, and her colleagues turned this logic around to ask how likely aliens would be to see us using this method.


    They identified just over 2000 systems within about 300 light years of Earth that might see our planet in this way at some point between 5000 years ago and 5000 years from now. Within the list, there are seven stars with planets in the habitable “Goldilocks zone”, where it is the right temperature for liquid water on the surface, of which four are close enough to have already received radio waves. Most of them lie in a heavily populated area of space so far unexplored by exoplanet surveys, at least until NASA’s Transiting Exoplanet Survey Satellite (TESS) started operating in April 2021. “And yes, I gave them the star list to search for planets,” says Kaltenegger.
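    For scale, the dimming the transit method looks for is set by the ratio of the planet’s disc to the star’s: the depth is roughly (R_planet / R_star)². For an Earth crossing a sun-like star that is under a hundredth of a per cent, which is part of why such detections demand precise, patient photometry:

```python
R_EARTH_KM = 6371.0    # mean radius of Earth
R_SUN_KM = 696_340.0   # mean radius of the sun

# Fraction of starlight blocked when the planet crosses the star's disc.
transit_depth = (R_EARTH_KM / R_SUN_KM) ** 2
print(f"Earth's transit depth across the sun: {transit_depth:.2e} ({transit_depth:.4%})")
```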

    Even a continued no-show might not tell us much. If alien life forms exist, it might be that intelligence or technology are rare. Perhaps technological civilisations are simply too combustible, liable to destroy themselves before they can make their presence unambiguously known. Perhaps they do know about us – but have decided to leave us alone.

    Or perhaps we are simply looking for the wrong thing, our focus on electromagnetic signals reflecting the state of our current technology. Why not gravitational signals, say – or something else entirely? “We may have to discover new physics before we get it right,” says Tarter.

     
    #855
    saintKlopp likes this.
  16. Prince Knut

    Prince Knut GC Thread Terminator

    Joined:
    May 23, 2011
    Messages:
    25,525
    Likes Received:
    12,875
    Apollo 17 rock sample hints the moon cooled faster than we thought
    A rock sample collected from the moon during the Apollo 17 mission in 1972 has been re-examined, and the results suggest the lunar surface might have cooled in just 20 million years

    SPACE 14 December 2021
    By Chen Ly




    NASA image of Troctolite 76535

    NASA/Johnson Space Center

    A rock collected from the moon during the Apollo 17 mission in 1972 cooled to its current state much more rapidly than we thought, according to a reanalysis of the sample. The finding highlights the value of re-examining old lunar samples.


    Most of our current knowledge of lunar evolution comes from rocks collected by astronauts half a century ago during NASA’s Apollo programme, but they can still yield new information.

    William Nelson at the University of Hawaiʻi at Mānoa and his colleagues reinvestigated one of the collection’s best-studied rocks, known as troctolite 76535. It weighs roughly 156 grams and is 5 centimetres across at its widest point. The sample is part of a group known as the magnesian suite (Mg-suite). These rocks represent some of the first stages of what is known as secondary crust formation, which happened when the lower parts of the moon’s mantle rose to the surface and crystallised.

    Using high-resolution analytical techniques, Nelson and his team found that phosphorus was distributed fairly unevenly through the sample. This suggests the rock cooled quite quickly: the element didn’t have enough time to spread out uniformly before the rock solidified. Using computer modelling, the team then deduced that the sample must have taken around 20 million years to solidify from its initial molten state, significantly shorter than previous estimates of around 100 million years.
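    The logic here, that uneven phosphorus implies fast cooling, rests on the diffusion length scale L ≈ √(Dt): given more time, the element diffuses further and smooths out. A minimal sketch of that scaling follows; the diffusivity value is purely an illustrative placeholder, not a figure from the study, and only the ratio between the two timescales matters:

```python
import math

# Diffusion length scaling: an element smooths out over a distance
# L ~ sqrt(D * t). If phosphorus variations survive on scales smaller
# than this, the rock must have solidified before diffusion caught up.
# NOTE: D below is a hypothetical placeholder, not the study's value.

SECONDS_PER_MYR = 3.156e13
D_ILLUSTRATIVE = 1e-22   # m^2/s, illustrative solid-state diffusivity

def diffusion_length_m(diffusivity_m2_s: float, time_s: float) -> float:
    """Characteristic distance an element diffuses in time t."""
    return math.sqrt(diffusivity_m2_s * time_s)

# Because L grows only as the square root of time, cutting the cooling
# estimate from ~100 Myr to ~20 Myr shrinks the predicted smoothing
# distance by a factor of sqrt(5), regardless of the value of D.
L_20 = diffusion_length_m(D_ILLUSTRATIVE, 20 * SECONDS_PER_MYR)
L_100 = diffusion_length_m(D_ILLUSTRATIVE, 100 * SECONDS_PER_MYR)
print(round(L_100 / L_20, 3))  # 2.236 == sqrt(5)
```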


    The result shows that early lunar evolution is more complicated than we thought, say the researchers, although further work is needed to determine whether the cooling history of troctolite 76535 is representative of the entire Mg-suite, given that only a single sample has been reanalysed. “This is a sample size of one right now,” says Nelson.

    “There’s still value to be had in looking back at old samples to try to get a good idea of how the moon as a whole formed,” says Nelson. “You can always go back and reanalyse old data sets with new techniques to pull out new nuggets of information.”
     
    #856
  17. M!LK

    M!LK Well-Known Member

    Joined:
    Nov 12, 2021
    Messages:
    8,602
    Likes Received:
    6,561
    There are finite resources in the galaxy, and organisms try to spread and reproduce. Logically speaking, if two civilisations are spreading across the galaxy, they will compete for resources. The strong will eliminate the weak to remove any chance of the weak growing stronger than them.

    We're idiots for trying to broadcast our existence to the other stars.
     
    #857
    Prince Knut likes this.
  18. Prince Knut

    Prince Knut GC Thread Terminator

    Joined:
    May 23, 2011
    Messages:
    25,525
    Likes Received:
    12,875
    That's Hawking's point of view. But I've always thought that if they've mastered the science of producing enough energy to warp spacetime and/or use inter-dimensional travel, then they'll hardly need to come across the galaxy to nick our nitrogen. There's unlimited energy at the heart of each galaxy if you can tap into the event horizon of a supermassive black hole.
     
    #858
  19. M!LK

    M!LK Well-Known Member

    Joined:
    Nov 12, 2021
    Messages:
    8,602
    Likes Received:
    6,561
    Well, it's not just that. We almost certainly would have no method of communication. I know there's the whole trope about communicating with maths and prime numbers, etc. That might work as an intentional announcement that one exists, but nothing more.

    If we received a signal, there would be no way to know whether it represented a written language, an encoding of chemicals intended to be smelled, a pattern of flashing lights or colours, or even genetic instructions for a gene system we don't understand.

    You could put all the world's computers behind it and there is little chance we could understand one iota. A signal received could mean "hi, friend" or a declaration of war.

    Even a "me Tarzan, you Jane" approach in a face to face meeting probably wouldn't work, especially if the alien had no fingers or concept of pointing. It's fanciful to think one could communicate between the stars with a species that shares zero culture or history with our own.


    Heck, we share a planet with dolphins. We know dolphins have language. They can communicate instructions and have grammar that includes complexities such as tense. Yet the only thing we can understand of dolphin language, despite sharing a planet and biology with them for millennia, is that they say "hi" by identifying themselves with a name as a greeting.

    If we can't understand dolphins who we share a planet and biology with I have zero confidence we will be able to communicate with aliens.

    There's too much danger in communicating. Best-case scenario, we encounter space hippies. Worst case, space Nazis. Not worth announcing ourselves in case the space Nazis show up on our doorstep.
     
    #859
    Prince Knut likes this.
  20. Prince Knut

    Prince Knut GC Thread Terminator

    Joined:
    May 23, 2011
    Messages:
    25,525
    Likes Received:
    12,875
    Hmm, yes and no. Bees communicate. So do dogs and birds. Ants too, chemically. What I'm saying is that transferring matter and information over vast distances, or shortcutting through other dimensions, involves immense 'knowledge' that even we don't possess. By the time either we or they obtained it, why would we need to exploit the other? Curiosity, perhaps, but all I'm saying is that by the time you've acquired such knowledge, you've no need to exploit or steal someone else's resources when there's so much 'free' energy on tap.
     
    #860
    M!LK likes this.

Share This Page