Wednesday, 19 July 2017

How scientists reacted to the US leaving the Paris climate agreement

What the United States' departure from the historic pact means for efforts to fight global warming.


Luke Sharrett/Bloomberg/Getty
US President Donald Trump wants to boost the US coal industry.
Nature rounds up reaction from researchers around the world to US President Donald Trump's decision to pull the United States out of the Paris climate agreement.

Jane Lubchenco, marine ecologist at Oregon State University in Corvallis and former administrator of the US National Oceanic and Atmospheric Administration:

Where to start? President Trump’s decision to withdraw from the Paris Agreement shows a blatant disregard for the wishes of most Americans and business leaders, an irresponsible and callous dismissal of the health, safety and economic well-being of Americans, a moral emptiness in ignoring impacts to the poorest people in the US and around the world, and gross ignorance about overwhelming scientific evidence. Far from “protecting America” as the president stated, withdrawing from Paris will make America more vulnerable and diminish its world leadership. It is terrifying that the individual who should be leading the rest of the world is so arrogant and irresponsible.
Our collective future and that of much of the rest of life on Earth depends in part on confronting climate change and ocean acidification. Doing so requires global collective action. It’s hard to imagine anyone consciously choosing to leave a legacy of impoverishment, economic disruption, increasingly bizarre weather, health impacts ranging from heat strokes to spread of diseases, rising sea levels and flooding — but that is just what the president has done. Moreover, the new path and the president’s proposed budget would forego significant economic opportunities.
Fortunately, mayors, governors, faith leaders, scientists and business executives understand what is at risk, respect the scientific evidence, and see the powerful economic potential and moral imperative in shifting to renewable energy, preparing to adapt to changes already under way, and investing in science and monitoring to guide future decisions. There is strong economic momentum to continue these actions, but they would have been accelerated and more effective with strong action and forceful leadership from the president. Alas, he has chosen instead to stick his head in the sand.

Jean-Pascal van Ypersele, climate scientist at the Catholic University of Louvain in Louvain-la-Neuve, Belgium, and former vice-chair of the Intergovernmental Panel on Climate Change (IPCC):

President Trump's decision to introduce a request to leave the Paris agreement in 2020 is regrettable. It negates both the results of (1) serious scientific analyses (many made by US scientists) about the urgency to address the climate change problem; and (2) the rigorous assessment made by the IPCC about the technical and socio-economic aspects of response options, including their significant co-benefits in other areas like air quality, energy security, health or job creation.
President Trump's speech attempting to justify his decision was an astonishing concentration of some of the worst arguments of climate confusers and fossil-fuel lobbyists.
The United States has played a very important role over the years to foster and nurture quality scientific research about the causes and processes of climate change, the potential risks and the response options. It is a shame that this leadership by the US is temporarily lost. Others in Europe, Asia and emerging economies will most likely compensate for this loss, transforming a difficulty into an opportunity.
Almost 150 countries, representing close to 85% of greenhouse-gas emissions, have now ratified the Paris agreement. Removing the US contribution from this total still leaves almost two-thirds of the emissions covered by the remaining countries, which have confirmed their plans to honour the agreement. This means that the transition to a low-carbon economy, now seen as an opportunity by many, will continue unabated, with or without the US.

Susanne Dröge, climate-policy researcher at the German Institute for International and Security Affairs in Berlin:

The US pull-out is bad news for the international climate process. The United Nations negotiations need to focus on implementation. This will become more difficult, not least because it is unclear how Trump wants to renegotiate the agreement. The US move absorbs political attention that is needed for much more important issues, such as bringing climate action forward.

Thomas Stocker, former co-chair of climate science for the IPCC, and climate and environmental physicist at the University of Bern, Switzerland:

Trump’s decision to ignore scientific facts of climate disruption and the high risks of climate-change impacts is irresponsible not only towards his own people but to all people and life on this planet. The US administration prefers old technology over innovation and transformation. It is rejecting the enormous benefits and returns that leadership in the next industrial revolution — decarbonization — has to offer.
The United States is the second-biggest emitter of carbon dioxide worldwide (and has contributed, with Europe, 52% of the share of cumulative carbon emissions since industrialization). It is withdrawing from its historical responsibility to reduce greenhouse-gas emissions and lead the way forward. Given the continuous commitment of most countries to reduce emissions, and the firm leadership of Europe, China and Russia in shaping the transformation towards a decarbonized economy, the United States runs the risk of being left behind and missing one of the greatest economic opportunities of our time.

Susan Lozier, oceanographer at Duke University in Durham, North Carolina:

Trump’s decision is as short-sighted as it is disheartening. The oceans already hold about 35% of the carbon dioxide that has been released to the atmosphere since the Industrial Revolution. Nothing good for the ocean and the life it contains comes from this storage. Whether you simply admire marine life or count on it for your livelihood, this decision shouldn’t sit well. An already fragile ocean is further imperilled.

Kevin Anderson, deputy director of the Tyndall Centre for Climate Change Research in Manchester, UK:

Beneath the veil of the low-carbon rhetoric of the Paris agreement, there is no evidence of a mitigation agenda even approaching the scale of our international obligations. Trump’s ostensibly reckless decision can be used either as a further excuse for continued apathy or as a catalyst for transforming our comfortable rhetoric into meaningful and timely action. In that regard, Trump’s ignorant blunderings can inadvertently be a force for good. Channelled positively, they could yet oblige the rest of us to forego our increasing reliance on speculative technologies and incremental carbon prices and begin to shape a mitigation agenda that is fit for purpose.
We need to take Trump at face value. If he is successful in returning the US to a coal-based economy (and that looks unlikely), then the European Union needs to borrow his ‘protectionist’ cloak and put in place carbon standards for imported goods.
Finally, let’s keep Trump in context. US states and cities have considerable devolved powers — and many of their leaders continue to favour climate science.

Joeri Rogelj, energy researcher at the International Institute for Applied Systems Analysis in Laxenburg, Austria:

The US withdrawing from the Paris agreement is damaging for international collaborative efforts to limit climate change, but will likely be most damaging to the US economy itself. The US has decided to sideline itself, internationally, diplomatically and morally — not to prepare itself for the future, but to gaze into the past for a few more years. Many other major economies, including China and the European Union, have indicated their strong commitment to implementing the climate agreement. This signal will spur innovation and business development in these regions. However, the US government refuses to give US businesses such a clear sense of direction and is disregarding the most robust scientific evidence by doing so. By setting research, innovation and business priorities based on misleading short-term political goals, the US will miss the boat and might become a laggard in the global technology and innovation landscape.
The climate issue is a global and a cumulative problem that was not solved in one go with the Paris agreement, but requires incremental updates and adjustments of climate action. To halt climate change, global carbon dioxide emissions need to be capped and annual emissions need to be brought to zero. One country failing on its commitments thus implies that deeper emissions cuts are required in other regions or later in the future. This makes the problem harder and less equitable to solve.

Oliver Geden, visiting research fellow at the Institute for Science, Innovation and Society, University of Oxford, UK:

The United States gave up climate leadership on the day of Trump's inauguration. In March, Trump announced his rollback of Obama-era climate regulations. So it’s been clear for some time that the US federal government is not going to act on climate change in the foreseeable future. Withdrawing from the Paris agreement is just another step, although a highly symbolic one.
For now, it seems that this step reunites the rest of the world, but only on the symbolic level. It is quite easy for a government to declare that it will stick to the Paris agreement. But in a regime of bottom-up climate policy that still aims to achieve top-down temperature targets, other governments would need to step up and declare that they will increase their mitigation pledges — and act accordingly. That's obviously the harder thing to do.

Katharine Hayhoe, director of the Climate Science Center at Texas Tech University in Lubbock:

The biggest loser from the decision could be the United States itself. Why? Because although the Paris agreement is a climate treaty, a triumph for evidence-based decision-making, it’s also much more: a trade agreement, an investment blueprint and a strong incentive for innovation in the energy and the economy of the future.
Earlier this week, India broke its own record for the lowest bids for electricity from solar power. Last month, Ernst & Young listed its most attractive markets for renewables: the United States came third, behind China and India. And earlier this year, China announced a US$360-billion investment in clean energy to create 13 million new jobs. The US announcement shows that it will be doing its best to turn back the clock, while the rest of the world accelerates into the future.
It’s true that federal policy is only one piece of the pie, and not even the biggest one. Cities, states and private industry have arguably played an even more important role in shaping US technological innovation, energy mix and carbon emissions over the past ten years, even under proactive federal climate policy. But Trump’s announcement sends a strong message that the US would rather be one of only two nations in the world that are not interested in preventing “dangerous anthropogenic interference with the climate system”. That other nation? War-torn Syria. (Note that Nicaragua is also opting out of the agreement — but in that case it’s because it wants to do more, not less.)

Atte Korhola, climate-policy and environmental-change researcher at the University of Helsinki, Finland:

The US withdrawal from the Paris climate agreement is very disappointing and unfavourable for the United States and the rest of the world. Many climate scientists consider the Paris agreement insufficient for limiting warming to 2 °C, so the task will be all the harder now. However, international climate agreements have not been very effective so far in reducing emissions, so there is still hope that the United States will proceed on other fronts, such as through bilateral agreements, clean-tech development and investing in new ‘negative emissions’ technologies.
But the plans by the Trump administration to cut more than 30% from the Environmental Protection Agency’s budget and about 70% of the funding for renewable-energy research and development unfortunately don’t point in this direction. The situation in all respects is quite depressing. The only hope is that the US states, cities and companies will continue their effective work to cut emissions.

Benjamin Santer, climate scientist at Lawrence Livermore National Laboratory in California:

In Shakespeare's Julius Caesar, Brutus speaks these famous lines: "There is a tide in the affairs of men, Which, taken at the flood, leads on to fortune; Omitted, all the voyage of their life Is bound in shallows and in miseries."
Today, the United States pulled out of the Paris climate agreement and missed the rising tide. Far from "Making America Great Again", this decision condemns the United States to becoming one of the 'has-beens' of history. We will become increasingly irrelevant to the rest of the world. They are going forward; we are going backward.

Hans Joachim Schellnhuber, director of the Potsdam Institute for Climate Impact Research in Potsdam, Germany:

It will not substantially hamper global climate progress if the US really quits the Paris agreement, but it will hurt the American economy and society alike. China and Europe have become world leaders on the path towards green development already and will strengthen their position if the US slips back at the national level. Innovative states such as California, the world's sixth-largest economy, will keep going for climate action, however. The Washington people around Trump hide in the trenches of the past instead of building the future. They fail to recognize that the climate wars are over, while the race for sustainable prosperity is on.

David Victor, climate-policy expert at the University of California, San Diego:

The odds of other countries renegotiating Paris are low to zero. The whole structure of the Paris agreement is to allow countries to set their own commitments. So there is nobody to negotiate with if a country needs to adjust. This claim that the problem with Paris is that the deal wasn’t struck properly is a disingenuous argument that is not informed by how Paris actually works, nor by any reality about how the world actually crafts big complex deals.

Glen Peters, climate-policy expert at the Center for International Climate and Environmental Research in Oslo:

It seems that Trump and his advisers have completely misconceived what the Paris agreement is. All his reasons for pulling out were basically the concessions that forged the path to the creation of the Paris agreement. Paris is the agreement that Trump desires!
The genius of Paris is to allow countries to put forward emission pledges that they feel they can meet (Nationally Determined Contributions). The US pledge was put forward by the US, alone. Countries are already enacting their emissions pledges, and — as could be expected by the design of the Paris agreement — most countries show signs of exceeding their conservative emissions pledges. China looks like it may peak its emissions a decade earlier than pledged. India has slowed down on coal consumption and sped up on solar deployment. Even the US has made great strides in the past decade, and was poised to make more.
The irony is that Paris is working, because it is designed to be flexible to the national circumstances that Trump himself champions!

Myles Allen, climate scientist at the University of Oxford, UK:

The Paris agreement is far from perfect, and one of its problems, as we are seeing now, is the lack of any real penalty for pulling out. Talk of trade sanctions is pure hyperbole and the last thing the world needs right now. But perhaps it is time to think about a simple product label: “Made in and sourced from regions that support the Paris climate agreement.” With California and Oregon insisting they will abide by the terms of the Paris agreement anyway, we could then have an interesting discussion about whether and how this could be stuck on Californian orange juice — or computers containing Intel chips.
Painful though it may be for the agreement’s supporters, acknowledging that it isn’t perfect must also be part of the response to this proposal to renegotiate the US terms of participation. Some, no doubt, will see this as just a distraction tactic. Others would argue that even to begin to negotiate would be to deliver Trump an ill-deserved political “win”. But thinking beyond 2020, we will eventually need to work out how to make the agreement both more effective and more acceptable to nations, companies and individuals that own substantial fossil-fuel reserves — or the US won’t be the last to leave.

Benjamin Sanderson, climate modeller at the National Center for Atmospheric Research in Boulder, Colorado:

Today's announcement that the US will depart from the Paris agreement is unfortunate, but it is no time for fatalism. From this point forward, there are now large uncertainties in global mitigation efforts over the coming years. The long-term evolution of the climate hinges on what other countries, and agents both within and outside of the US, do in response to the US departure from the agreement.
A complete failure of the agreement at this point, with business-as-usual growth for another decade, would almost certainly commit the planet to significantly more warming than the Paris goals, and the human consequences of this would be catastrophic. However, some major remaining signatories have expressed a commitment to increasing mitigation goals, and within the US, many states, cities and some of the country's largest companies are committed to mitigation irrespective of the US participation in the agreement.
Decisions made today are made in the context of confident projections of future warming with continued emissions, but clearly there is more to do to better characterize the human and economic consequences of delaying action on climate change and how to frame these issues in the context of other concerns. The role of the scientific community is more important than ever, both to continue to provide the best possible research to inform decisions, and to communicate any risks associated with further emissions in a publicly accessible fashion.

Saturday, 15 July 2017


LHC Physicists Unveil a Charming New Particle

The discovery could offer fresh insight into how fundamental forces bind together subatomic particles.
A view of CERN's Large Hadron Collider in Geneva, Switzerland. Credit: View Pictures/Getty Images
Physicists using the Large Hadron Collider beauty (LHCb) experiment at CERN in Geneva, Switzerland, have discovered a new kind of heavy particle, they announced this week at a conference in Venice.
The particle, known as Xi-cc++ (pronounced “Ksī-CC plus-plus”), is composed of three smaller elementary particles called quarks—specifically, one lighter-weight “up” quark like those found in protons and neutrons as well as two “charm” quarks, which are a heavier and more exotic variety. 
(The designations “up” and “charm” are two of the six “flavors” physicists assigned to quarks based on the particles’ varying masses and charges.) The Standard Model of particle physics predicts Xi-cc++ and many other possible particles with various configurations of the six known flavors of quarks. But until now such “doubly charmed” particles had eluded conclusive detection. 
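As an aside, the "++" in the particle's name simply records its total electric charge. The Standard Model assigns charge +2/3 (in units of the elementary charge) to both the up and the charm quark, so one up plus two charms gives +2. A quick check in Python (the quark charges are standard values; the helper function is purely illustrative):

```python
from fractions import Fraction

# Electric charges of the relevant quark flavors, in units of the
# elementary charge e (Standard Model values).
QUARK_CHARGE = {
    "up": Fraction(2, 3),
    "charm": Fraction(2, 3),
}

def baryon_charge(quarks):
    """Sum the constituent quark charges of a baryon."""
    return sum(QUARK_CHARGE[q] for q in quarks)

# Xi-cc++ is one up quark plus two charm quarks.
charge = baryon_charge(["up", "charm", "charm"])
print(charge)  # prints 2, hence the '++' in the particle's name
```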
Further studies of the new particle—and other members of the doubly charmed particle family—could reinforce the Standard Model or lead to new vistas in particle physics. Either way, the new particle could be a tool to unlock a deeper understanding of the fundamental “strong” force that binds quarks together to form protons and neutrons, which in turn form atoms—as well as planets, stars, galaxies and people.
Any particle made of quarks is called a hadron. The world’s largest and most powerful particle accelerator, CERN’s Large Hadron Collider (LHC), slams these particles together in search of new particles and interactions. Hadrons fall into two broad families: mesons, exotic particles with one quark and one antiquark; and baryons, particles composed of three quarks. The new Xi-cc++ particle is a baryon. But due to its doubly charmed nature it is almost four times heavier than more familiar baryons such as protons and neutrons, which are made up entirely of light quarks rather than heavy ones. 
“Finding a doubly heavy quark baryon is of great interest, as it will provide a unique tool to further probe quantum chromodynamics [QCD]—the theory that describes the strong [force], one of the four fundamental forces,” LHCb spokesperson Giovanni Passaleva said in a statement. “Such particles will thus help us improve the predictive power of our theories.”
The featherweight quark triplets within protons and neutrons all uniformly zip around one another at nearly the speed of light, making them very challenging to study. In a Xi-cc++ particle, the sole light quark whips at high speed around the slower-moving pair of heavy quarks, creating a situation easier for physicists to investigate. 
The situation, says former LHCb spokesperson and University of Oxford physicist Guy Wilkinson, is roughly analogous to a planetary system in which the light quark is akin to a planet orbiting a binary pair of massive stars.
New Particle
Credit: Amanda Montañez
Led by University of Glasgow physicist Patrick Spradlin, the LHCb team found evidence of more than 300 of the new particles in data collected last year by the experiment, teasing out their signals from a dense forest of more common particles produced by high-energy proton collisions at the LHC. 

Specifically, they looked for a telltale distribution of “daughter” particles, including other baryons as well as kaons and pions — exotic particles produced by the decay of short-lived Xi-cc++ particles. The distributions they observed not only show that the LHC’s collisions are producing Xi-cc++ particles but also hint that other researchers’ previous claims of double-charm particle production may be spurious.
In 2002 researchers using the SELEX experiment at the Fermilab accelerator in Illinois announced they had found a similar particle. That detection, however, was just below the threshold of unassailable statistical significance, and the putative particle’s estimated mass was wildly out of sync with predictions. 
After other facilities failed to confirm the results, many theorists began questioning the claim. By contrast, the signal of the LHCb’s newfound particle “is statistically overwhelming and matches very nicely with the theoretical expectations,” Wilkinson says. “It looks, smells and tastes like a doubly charmed baryon should.” The LHCb team’s findings have been submitted to Physical Review Letters.
With the detection of the new, heavy particle firmly in hand, physicists at the LHC are now producing more of these particles to precisely measure their lifetimes and learn exactly how often they are created in collisions. 
Next, Wilkinson says, the LHCb experiment will seek out other postulated members of the doubly charmed family, such as the Xi-cc+ and Omega-cc particles. “All these results can be compared against predictions to test QCD,” he says. “There are exciting times ahead!”

Tuesday, 27 June 2017

See hypervelocity stars arising from the Milky Way


26 June 2017
With the help of software that mimics a human brain, ESA's Gaia satellite spotted six stars zipping at high speed from the centre of our Galaxy to its outskirts. 

This could provide key information about some of the most obscure regions of the Milky Way.
Stars speeding through the Galaxy. Credit: ESA, CC BY-SA 3.0 IGO
Our galactic home, the Milky Way, houses more than a hundred billion stars, all kept together by gravity. Most are located in a flattened structure – the Galactic disc – with a bulge at its centre, while the remaining stars are distributed in a wider spherical halo extending out to about 650 000 light-years from the centre.
Stars are not motionless in the Galaxy but move around its centre with a variety of velocities depending on their location – for example, the Sun orbits at about 220 km/s, while the average in the halo is about 150 km/s.
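The Sun's 220 km/s orbital speed quoted above already fixes, to first order, how much mass lies inside its orbit: for a circular orbit, the centripetal condition gives M = v²r/G. A rough back-of-the-envelope sketch, assuming a round 8 kpc Galactocentric distance (a standard approximate value, not from the article):

```python
# Back-of-the-envelope: treat the Sun's orbit as circular around the
# mass enclosed within it. Then GM/r^2 = v^2/r, so M = v^2 * r / G.
G = 6.674e-11           # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30        # solar mass, kg
PC = 3.0857e16          # one parsec, m

v = 220e3               # Sun's orbital speed, m/s (from the article)
r = 8_000 * PC          # assumed Galactocentric radius: 8 kpc

M_enclosed = v**2 * r / G
print(f"enclosed mass ~ {M_enclosed / M_SUN:.1e} solar masses")
```

The result, roughly 10^11 solar masses, is consistent with the article's "more than a hundred billion stars" kept together by gravity.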
Occasionally, a few stars exceed these already quite impressive velocities.
Some are accelerated by a close stellar encounter or the supernova explosion of a stellar companion, resulting in runaway stars with speeds up to a few hundred km/s above the average.
A new class of high-speed stars was discovered just over a decade ago. Swooping through the Galaxy at several hundred km/s, they are the result of past interactions with the supermassive black hole that sits at the centre of the Milky Way and, with a mass of four million Suns, governs the orbits of stars in its vicinity.
"These hypervelocity stars are extremely important to study the overall structure of our Milky Way," says Elena Maria Rossi from Leiden University in the Netherlands, who presented Gaia's discovery of six new such stars today at the European Week of Astronomy and Space Science in Prague, Czech Republic.
Catching speeding stars.
Credit: ESA/Gaia/DPAC
"These are stars that have travelled great distances through the Galaxy but can be traced back to its core – an area so dense and obscured by interstellar gas and dust that it is normally very difficult to observe – so they yield crucial information about the gravitational field of the Milky Way from the centre to its outskirts."
Unfortunately, fast-moving stars are extremely difficult to find in the stellar haystack of the Milky Way, as current surveys list the speed of at most a few hundred thousand stars.
Artist's impression of Gaia. Credit: ESA/ATG medialab; background image: ESO/S. Brunier
To find them, scientists have been looking for young, massive stars that would stand out as interlopers in the old stellar population of the Galactic halo. Given away by their out-of-place age, these stars are likely to have received an extra kick to reach the halo. Further measurements of their speeds and estimates of their past paths can confirm if they are indeed hypervelocity stars that were shoved away from the centre of the Milky Way.
So far, only 20 such stars have been spotted. Owing to the specific selection of this method, these are all young stars with a mass 2.5 to 4 times that of the Sun. However, scientists believe that many more stars of other ages or masses are speeding through the Galaxy but remain unrevealed by this type of search.
The billion-star census being performed by Gaia offers a unique opportunity, so Elena and her collaborators started wondering how to use such a vast dataset to optimise the search for fast-moving stars.
After testing various methods, they turned to software through which the computer learns from previous experience.
"In the end, we chose to use an artificial neural network, which is software designed to mimic how our brain works," explains Tommaso Marchetti, PhD student at Leiden University and lead author of the paper describing the results published in Monthly Notices of the Royal Astronomical Society.
"After proper 'training', it can learn how to recognise certain objects or patterns in a huge dataset. In our case, we taught it to spot hypervelocity stars in a stellar catalogue like the one compiled with Gaia."
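The team's actual network architecture and input features are not described here, so the following is only a schematic illustration of the idea: a small feed-forward network (one tanh hidden layer, logistic output) trained by full-batch gradient descent to separate "fast" candidates from ordinary stars. The two input features, the labels and every threshold are synthetic, invented purely for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-feature catalogue (e.g. a proper-motion proxy and a
# distance proxy); real Gaia/TGAS features differ. Labels are synthetic:
# a star is "fast" when a noisy linear combination exceeds a threshold.
n = 1000
X = rng.normal(size=(n, 2))
y = (X @ np.array([2.0, -1.0]) + rng.normal(scale=0.3, size=n) > 1.0).astype(float)

# One hidden layer of 8 tanh units, logistic output, random small init.
W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=8);      b2 = 0.0

def forward(X):
    h = np.tanh(X @ W1 + b1)                     # hidden activations
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))     # sigmoid output
    return p, h

lr = 1.0
for _ in range(2000):
    p, h = forward(X)
    g = (p - y) / n                     # grad of mean logistic loss w.r.t. logits
    W2 -= lr * (h.T @ g); b2 -= lr * g.sum()
    gh = np.outer(g, W2) * (1.0 - h**2) # backpropagate through tanh
    W1 -= lr * (X.T @ gh); b1 -= lr * gh.sum(axis=0)

p, _ = forward(X)
acc = ((p > 0.5) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

The point of the sketch is the workflow — train on labelled examples, then score a huge catalogue cheaply — not the particular architecture, which in the published work was tuned to the Gaia data.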
As part of Elena's research project to study these stars, the team started developing and training this program in the first half of 2016, in order to be ready for the first release of Gaia data a few months later, on 14 September.
Gaia's first sky map. Credit: ESA/Gaia/DPAC. Acknowledgement: A. Moitinho & M. Barros (CENTRA – University of Lisbon), on behalf of DPAC.
Besides a map of over a billion stellar positions, this first release included a smaller catalogue with distances and motions for two million stars, combining observations from Gaia's first year with those from ESA's Hipparcos mission, which charted the sky more than two decades ago. Referred to as the Tycho–Gaia Astrometric Solution, or TGAS, this resource is a taster for future catalogues that will be based solely on Gaia data.
"On the day of the data release, we ran our brand new algorithm on the two million stars of TGAS," says Elena.
"In just one hour, the artificial brain had already reduced the dataset to some 20 000 potential high-speed stars, reducing its size to about 1%.
"A further selection including only measurements above a certain precision in distance and motion brought this down to 80 candidate stars."
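The precision cut described in the quote can be pictured as a simple filter over candidate stars. The 20% relative-error threshold and every entry below are invented for illustration and are not the team's actual values:

```python
# Hypothetical candidate table: (id, speed in km/s,
# relative distance error, relative proper-motion error).
candidates = [
    ("star_a", 420.0, 0.08, 0.05),
    ("star_b", 510.0, 0.35, 0.10),   # distance too uncertain -> rejected
    ("star_c", 380.0, 0.12, 0.04),
    ("star_d", 300.0, 0.09, 0.40),   # motion too uncertain -> rejected
]

# Keep only stars whose distance and motion measurements are both
# precise enough; the threshold is an illustrative choice.
MAX_REL_ERR = 0.20
selected = [c for c in candidates
            if c[2] < MAX_REL_ERR and c[3] < MAX_REL_ERR]
print([c[0] for c in selected])  # prints ['star_a', 'star_c']
```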
The team looked at these 80 stars in further detail. Since the TGAS data include only information on each star's motion across the sky, the team had to find additional clues to infer full velocities, looking at previous stellar catalogues or performing new observations.
"Combining all these data, we found that six stars can be traced back to the Galactic Centre, all with velocities above 360 km/s," says Tommaso.
Most importantly, the scientists succeeded at probing a different population from the 20 stars that were already known: the newly identified stars all have lower masses, similar to the mass of our Sun.
One of the six stars seems to be speeding so fast, at over 500 km/s, that it is no longer bound by the gravity of the Galaxy and will eventually leave. But the other, slightly slower stars are perhaps even more fascinating, as scientists are eager to learn what slowed them down – the invisible dark matter that is thought to pervade the Milky Way might also have played a role.
While the new program was optimised to search for stars that were accelerated at the centre of the Galaxy, it also identified five of the more traditional runaway stars, which owe their high speeds to stellar encounters elsewhere in the Milky Way.
"This result showcases the great potential of Gaia opening up new avenues to investigate the structure and dynamics of our Galaxy," says Anthony Brown from Leiden University, a co-author on the study and chair of the Gaia Data Processing and Analysis Consortium.
The scientists are looking forward to using data from the next Gaia release, which is planned for April 2018 and will include distances and motions on the sky for over a billion stars, as well as velocities for a subset.
Dealing with a billion stars, rather than the two million explored so far, is an enormous challenge, so the team is busy upgrading their program to handle such a huge catalogue and to uncover the many speeding stars that will be lurking in the data.
"The sheer number of stars probed by Gaia is an exciting but also challenging opportunity for astronomers, and we are glad to see that they are happily embracing the challenge," says Timo Prusti, Gaia project scientist at ESA.


"An artificial neural network to discover Hypervelocity stars: Candidates in Gaia DR1/TGAS," by T. Marchetti et al., is published in Monthly Notices of the Royal Astronomical Society.
These results were presented today at the European Week of Astronomy and Space Science in Prague, Czech Republic.


Elena Maria Rossi
Leiden Observatory
The Netherlands
Tel: +31 6 8112 1440
Email: emr@strw.leidenuniv.nl
Tommaso Marchetti
Leiden Observatory
The Netherlands
Tel:  +31 6 4776 9205
Email: marchetti@strw.leidenuniv.nl
Anthony Brown
Leiden Observatory, Leiden University
Leiden, The Netherlands
Email: brown@strw.leidenuniv.nl
Timo Prusti
Gaia Project Scientist
European Space Agency
Email: timo.prusti@esa.int
Markus Bauer
ESA Science and Robotic Exploration Communication Officer
Tel: +31 71 565 6799
Mob: +31 61 594 3954
Email: markus.bauer@esa.int

Last Update: 26 June 2017


Friday, 23 June 2017

A Theory of Reality as More Than the Sum of Its Parts

New math shows how, contrary to conventional scientific wisdom, conscious beings and other macroscopic entities might have greater influence over the future than does the sum of their microscopic components. Olena Shmahalo/Quanta Magazine

June 1, 2017

In his 1890 opus, The Principles of Psychology, William James invoked Romeo and Juliet to illustrate what makes conscious beings so different from the particles that make them up.
“Romeo wants Juliet as the filings want the magnet; and if no obstacles intervene he moves towards her by as straight a line as they,” James wrote. “But Romeo and Juliet, if a wall be built between them, do not remain idiotically pressing their faces against its opposite sides like the magnet and the filings. … Romeo soon finds a circuitous way, by scaling the wall or otherwise, of touching Juliet’s lips directly.”
Erik Hoel, a 29-year-old theoretical neuroscientist and writer, quoted the passage in a recent essay in which he laid out his new mathematical explanation of how consciousness and agency arise. The existence of agents — beings with intentions and goal-oriented behavior — has long seemed profoundly at odds with the reductionist assumption that all behavior arises from mechanistic interactions between particles. Agency doesn’t exist among the atoms, and so reductionism suggests agents don’t exist at all: that Romeo’s desires and psychological states are not the real causes of his actions, but merely approximate the unknowably complicated causes and effects between the atoms in his brain and surroundings.
Hoel’s theory, called “causal emergence,” roundly rejects this reductionist assumption.
“Causal emergence is a way of claiming that your agent description is really real,” said Hoel, a postdoctoral researcher at Columbia University who first proposed the idea with Larissa Albantakis and Giulio Tononi of the University of Wisconsin, Madison. “If you just say something like, ‘Oh, my atoms made me do it’ — well, that might not be true. And it might be provably not true.”
Erik Hoel, a theoretical neuroscientist at Columbia University. Julia Buntaine


Using the mathematical language of information theory, Hoel and his collaborators claim to show that new causes — things that produce effects — can emerge at macroscopic scales. They say coarse-grained macroscopic states of a physical system (such as the psychological state of a brain) can have more causal power over the system’s future than a more detailed, fine-grained description of the system possibly could. Macroscopic states, such as desires or beliefs, “are not just shorthand for the real causes,” explained Simon DeDeo, an information theorist and cognitive scientist at Carnegie Mellon University and the Santa Fe Institute who is not involved in the work, “but it’s actually a description of the real causes, and a more fine-grained description would actually miss those causes.”
“To me, that seems like the right way to talk about it,” DeDeo said, “because we do want to attribute causal properties to higher-order events [and] things like mental states.”
Hoel and collaborators have been developing the mathematics behind their idea since 2013. In a May paper in the journal Entropy, Hoel placed causal emergence on a firmer theoretical footing by showing that macro scales gain causal power in exactly the same way, mathematically, that error-correcting codes increase the amount of information that can be sent over information channels. Just as codes reduce noise (and thus uncertainty) in transmitted data — Claude Shannon’s 1948 insight that formed the bedrock of information theory — Hoel claims that macro states also reduce noise and uncertainty in a system’s causal structure, strengthening causal relationships and making the system’s behavior more deterministic.
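Shannon's noise-reduction idea can be made concrete with the simplest error-correcting code: send each bit three times over a noisy channel and decode by majority vote. This is a sketch of the analogy only, not Hoel's formalism; the flip probability `p = 0.2` is an arbitrary choice for illustration.

```python
import random

random.seed(1)
p = 0.2          # probability the binary symmetric channel flips a bit
N = 100_000      # number of transmissions to simulate

def channel(bit):
    """Send one bit through the noisy channel."""
    return bit ^ (random.random() < p)

# Uncoded: transmit the bit once; errors occur at rate p.
raw_errors = sum(channel(1) != 1 for _ in range(N)) / N

# 3-bit repetition code: transmit three copies, decode by majority vote.
def send_coded(bit):
    votes = sum(channel(bit) for _ in range(3))
    return int(votes >= 2)

coded_errors = sum(send_coded(1) != 1 for _ in range(N)) / N

print(f"uncoded error rate: {raw_errors:.3f}")   # ~ p = 0.200
print(f"coded error rate:   {coded_errors:.3f}") # ~ 3p^2 - 2p^3 = 0.104
```

A majority-vote decode fails only when at least two of the three copies flip, so the error rate drops from p to 3p² − 2p³; the code buys reliability at the cost of redundancy, which is the trade Hoel claims macro states make.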
“I think it’s very significant,” George Ellis, a South African cosmologist who has also written about top-down causation in nature, said of Hoel’s new paper. Ellis thinks causal emergence could account for many emergent phenomena such as superconductivity and topological phases of matter. Collective systems like bird flocks and superorganisms — and even simple structures like crystals and waves — might also exhibit causal emergence, researchers said.
The work on causal emergence is not yet widely known among physicists, who for centuries have taken a reductionist view of nature and largely avoided further philosophical thinking on the matter. But at the interfaces between physics, biology, information theory and philosophy, where puzzles crop up, the new ideas have generated excitement. Their ultimate usefulness in explaining the world and its mysteries — including consciousness, other kinds of emergence, and the relationships between the micro and macro levels of reality — will come down to whether Hoel has nailed the notoriously tricky notion of causation: Namely, what’s a cause? “If you brought 20 practicing scientists into a room and asked what causation was, they would all disagree,” DeDeo said. “We get mixed up about it.”
A Theory of Cause
In a fatal drunk driving accident, what’s the cause of death? Doctors name a ruptured organ, while a psychologist blames impaired decision-making abilities and a sociologist points to permissive attitudes toward alcohol. Biologists, chemists and physicists, in turn, see ever more elemental causes. “Famously, Aristotle had a half-dozen notions of causes,” DeDeo said. “We as scientists have rejected all of them except things being in literal contact, touching and pushing.”
The true causes, to a physicist, are the fundamental forces acting between particles; all effects ripple out from there. Indeed, these forces, when they can be isolated, appear perfectly deterministic and reliable — physicists can predict with high precision the outcomes of particle collisions at the Large Hadron Collider, for instance. In this view, causes and effects become hard to predict from first principles only when there are too many variables to track.
Furthermore, philosophers have argued that causal power existing at two scales at once would be twice what the world needs; to avoid double-counting, the “exclusion argument” says all causal power must originate at the micro level. But it’s almost always easier to discuss causes and effects in terms of macroscopic entities. When we look for the cause of a fatal car crash, or Romeo’s decision to start climbing, “it doesn’t seem right to go all the way down to microscopic scales of neurons firing,” DeDeo said. “That’s where Erik [Hoel] is jumping in. It’s a bit of a bold thing to do to talk about the mathematics of causation.”
Friendly and large-limbed, Hoel grew up reading books at Jabberwocky, his family’s bookstore in Newburyport, Massachusetts. He studied creative writing as an undergraduate and planned to become a writer. (He still writes fiction and has started a novel.) But he was also drawn to the question of consciousness — what it is, and why and how we have it — because he saw it as an immature scientific subject that allowed for creativity. For graduate school, he went to Madison, Wisconsin, to work with Tononi — the only person at the time, in Hoel’s view, who had a truly scientific theory of consciousness.
Tononi conceives of consciousness as information: bits that are encoded not in the states of individual neurons, but in the complex networking of neurons, which link together in the brain into larger and larger ensembles. Tononi argues that this special “integrated information” corresponds to the unified, integrated state that we experience as subjective awareness. Integrated information theory has gained prominence in the last few years, even as debates have ensued about whether it is an accurate and sufficient proxy for consciousness. But when Hoel first got to Madison in 2010, only the two of them were working on it there.
Giulio Tononi, a neuroscientist and psychiatrist at the University of Wisconsin, Madison, best known for his research on sleep and consciousness. John Maniaci/UW Health
Tononi tasked Hoel with exploring the general mathematical relationship between scales and information. The scientists later focused on how the amount of integrated information in a neural network changes as you move up the hierarchy of spatiotemporal scales, looking at links between larger and larger groups of neurons. They hoped to figure out which ensemble size might be associated with maximum integrated information — and thus, possibly, with conscious thoughts and decisions. Hoel taught himself information theory and plunged into the philosophical debates around consciousness, reductionism and causation.
Hoel soon saw that understanding how consciousness emerges at macro scales would require a way of quantifying the causal power of brain states. He realized, he said, that “the best measure of causation is in bits.” He also read the works of the computer scientist and philosopher Judea Pearl, who developed a logical language for studying causal relationships in the 1990s called causal calculus. With Albantakis and Tononi, Hoel formalized a measure of causal power called “effective information,” which indicates how effectively a particular state influences the future state of a system. (Effective information can be used to help calculate integrated information, but it is simpler and more general and, as a measure of causal power, does not rely on Tononi’s other ideas about consciousness.)
The researchers showed that in simple models of neural networks, the amount of effective information increases as you coarse-grain over the neurons in the network — that is, treat groups of them as single units. The possible states of these interlinked units form a causal structure, where transitions between states can be mathematically modeled using so-called Markov chains. At a certain macroscopic scale, effective information peaks: This is the scale at which states of the system have the most causal power, predicting future states in the most reliable, effective manner. Coarse-grain further, and you start to lose important details about the system’s causal structure. Tononi and colleagues hypothesize that the scale of peak causation should correspond, in the brain, to the scale of conscious decisions; based on brain imaging studies, Albantakis guesses that this might happen at the scale of neuronal microcolumns, which consist of around 100 neurons.
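The effective-information calculation can be sketched on a toy Markov system. This is my own minimal construction in the spirit of the published examples, not the researchers' code: `ei` treats a transition matrix as a channel from a uniform (maximum-entropy) intervention over current states to next states, and the specific matrices are illustrative.

```python
import numpy as np

def ei(tpm):
    """Effective information of a transition probability matrix (rows sum to 1):
    the mutual information between a uniform intervention over current states
    and the resulting distribution over next states, in bits."""
    tpm = np.asarray(tpm, dtype=float)
    avg = tpm.mean(axis=0)                      # effect distribution under uniform interventions
    def H(p):
        p = p[p > 0]
        return -(p * np.log2(p)).sum()
    return H(avg) - np.mean([H(row) for row in tpm])

# Micro system: states 0-2 jump uniformly among {0, 1, 2}; state 3 is a fixed point.
micro = np.array([
    [1/3, 1/3, 1/3, 0.0],
    [1/3, 1/3, 1/3, 0.0],
    [1/3, 1/3, 1/3, 0.0],
    [0.0, 0.0, 0.0, 1.0],
])

# Coarse-grain: macro state A = {0, 1, 2}, B = {3}; the macro dynamics are deterministic.
macro = np.array([
    [1.0, 0.0],
    [0.0, 1.0],
])

print(f"EI(micro) = {ei(micro):.3f} bits")   # ≈ 0.811
print(f"EI(macro) = {ei(macro):.3f} bits")   # = 1.000
```

The noisy micro dynamics waste causal power (EI ≈ 0.811 bits), while the coarse-grained description is perfectly deterministic (EI = 1 bit); the difference, about 0.19 bits, is the kind of gain the group calls causal emergence.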
Larissa Albantakis, a theoretical neuroscientist at the University of Wisconsin, Madison. Sophia Loschky
Causal emergence is possible, Hoel explained, because of the randomness and redundancy that plagues the base scale of neurons. As a simple example, he said to imagine a network consisting of two groups of 10 neurons each. Each neuron in group A is linked to several neurons in group B, and when a neuron in group A fires, it usually causes one of the B neurons to fire as well. Exactly which linked neuron fires is unpredictable. If, say, the state of group A is {1,0,0,1,1,1,0,1,1,0}, where 1s and 0s represent neurons that do and don’t fire, respectively, the resulting state of group B can have myriad possible combinations of 1s and 0s. On average, six neurons in group B will fire, but which six is nearly random; the micro state is hopelessly indeterministic. Now, imagine that we coarse-grain over the system, so that this time, we group all the A neurons together and simply count the total number that fire. The state of group A is {6}. This state is highly likely to lead to the state of group B also being {6}. The macro state is more reliable and effective; calculations show it has more effective information.
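Hoel's two-group example can be simulated directly. This is a hypothetical sketch: the random-matching wiring (each firing A-neuron excites a distinct, randomly chosen B-neuron) is my simplifying assumption, chosen so that exactly as many B-neurons fire as A-neurons did, making the macro count fully reliable while the micro pattern stays unpredictable.

```python
import random

random.seed(0)

# The micro state from the article: six of the ten A-neurons fire.
A = [1, 0, 0, 1, 1, 1, 0, 1, 1, 0]

def step(a_state):
    """One noisy update: each firing A-neuron excites one B-neuron, chosen at
    random but without collisions (a random matching), so the number of firing
    B-neurons is fixed even though their identities are not."""
    targets = random.sample(range(10), 10)   # fresh random wiring each trial
    b = [0] * 10
    for i, fired in enumerate(a_state):
        if fired:
            b[targets[i]] = 1
    return tuple(b)

trials = [step(A) for _ in range(5000)]
micro_states = set(trials)                   # which B-patterns occurred
macro_counts = {sum(t) for t in trials}      # how many B-neurons fired

print(f"distinct micro B-states seen: {len(micro_states)} of C(10,6) = 210 possible")
print(f"distinct macro counts seen:   {macro_counts}")   # always {6}
```

Over 5,000 trials the micro outcome scatters across essentially all 210 possible six-neuron patterns, while the macro description lands on {6} every single time: the coarse-grained state is the one with reliable causal power.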
 A real-world example cements the point. “Our life is very noisy,” Hoel said. “If you just give me your atomic state, it may be totally impossible to guess where your future [atomic] state will be in 12 hours. Try running that forward; there’s going to be so much noise, you’d have no idea. Now give a psychological description, or a physiological one: Where are you going to be in 12 hours?” he said (it was mid-day). “You’re going to be asleep — easy. So these higher-level relationships are the things that seem reliable. That would be a super simple example of causal emergence.”
For any given system, effective information peaks at the scale with the largest and most reliable causal structure. In addition to conscious agents, Hoel says this might pick out the natural scales of rocks, tsunamis, planets and all other objects that we normally notice in the world. “And the reason why we’re tuned into them evolutionarily [might be] because they are reliable and effective, but that also means they are causally emergent,” Hoel said.
Brain-imaging experiments are being planned in Madison and New York, where Hoel has joined the lab of the Columbia neuroscientist Rafael Yuste. Both groups will examine the brains of model organisms to try to home in on the spatiotemporal scales that have the most causal control over the future. Brain activity at these scales should most reliably predict future activity. As Hoel put it, “Where does the causal structure of the brain pop out?” If the data support their hypothesis, they’ll see the results as evidence of a more general fact of nature. “Agency or consciousness is where this idea becomes most obvious,” said William Marshall, a postdoctoral researcher in the Wisconsin group. “But if we do find that causal emergence is happening, the reductionist assumption would have to be re-evaluated, and that would have to be applied broadly.”
New Philosophical Thinking
Sara Walker, a physicist and astrobiologist at Arizona State University who studies the origins of life, hopes measures like effective information and integrated information will help define what she sees as the gray scale leading from nonlife to life (with viruses and cell cycles somewhere in the gray area). Walker has been collaborating with Tononi’s team on studies of real and artificial cell cycles, with preliminary indications that integrated information might correlate with being alive.

In other recent work, the Madison group has developed a way of measuring causal emergence called “black-boxing” that they say works well for something like a single neuron. A neuron isn’t simply the average of its component atoms and so isn’t amenable to coarse-graining. Black-boxing is like putting a box around a neuron and measuring the box’s overall inputs and outputs, instead of assuming anything about its inner workings. “Black-boxing is the truly general form of causal emergence and is especially important for biological and engineering systems,” Tononi said in an email. 
Walker is also a fan of Hoel’s new work tracing effective information and causal emergence to the foundations of information theory and Shannon’s noisy-channel theorem. “We’re in such deep conceptual territory it’s not really clear which direction to go,” she said, “so I think any bifurcations in this general area are good and constructive.”
Robert Bishop, a philosopher and physicist at Wheaton College, said, “My take on EI” — effective information — “is that it can be a useful measure of emergence but likely isn’t the only one.” Hoel’s measure has the charm of being simple, reflecting only reliability and the number of causal relationships, but according to Bishop, it could be one of several proxies for causation that apply in different situations.
Hoel’s ideas do not impress Scott Aaronson, a theoretical computer scientist at the University of Texas, Austin. He says causal emergence isn’t radical in its basic premise. After reading Hoel’s recent essay for the Foundational Questions Institute, “Agent Above, Atom Below” (the one that featured Romeo and Juliet), Aaronson said, “It was hard for me to find anything in the essay that the world’s most orthodox reductionist would disagree with. Yes, of course you want to pass to higher abstraction layers in order to make predictions, and to tell causal stories that are predictively useful — and the essay explains some of the reasons why.”
It didn’t seem so obvious to others, given how the exclusion argument has stymied efforts to get a handle on higher-level causation. Hoel says his arguments go further than Aaronson acknowledges in showing that “higher scales have provably more information and causal influence than their underlying ones. It’s the ‘provably’ part that’s hard and is directly opposite to most reductionist thinking.”
Moreover, causal emergence isn’t merely a claim about our descriptions or “causal stories” about the world, as Aaronson suggests. Hoel and his collaborators aim to show that higher-level causes — as well as agents and other macroscopic things — ontologically exist. The distinction relates to one that the philosopher David Chalmers makes about consciousness: There’s the “easy problem” of how neural circuitry gives rise to complex behaviors, and the “hard problem,” which asks, essentially, what distinguishes conscious beings from lifeless automatons. “Is EI measuring causal power of the kind that we feel that we have in action, the kind that we want our conscious experiences or selves to have?” said Hedda Hassel Mørch, a philosopher at New York University and a protégé of Chalmers’. She says it’s possible that effective information could “track real ontological emergence, but this requires some new philosophical thinking about the nature of laws, powers and how they relate.”
The criticism that hits Hoel and Albantakis the hardest is one physicists sometimes make upon hearing the idea: They assert that noise, the driving force behind causal emergence, doesn’t really exist; noise is just what physicists call all the stuff that their models leave out. “It’s a typical physics point of view,” Albantakis said, that if you knew the exact microscopic state of the entire universe, “then I can predict what happens until the end of time, and there is no reason to talk about something like cause-effect power.”
One rejoinder is that perfect knowledge of the universe isn’t possible, even in principle. But even if the universe could be thought of as a single unit evolving autonomously, this picture wouldn’t be informative. “What is left out there is to identify entities — things that exist,” Albantakis said. Causation “is really the measure or quantity that is necessary to identify where in this whole state of the universe do I have groups of elements that make up entities? … Causation is what you need to give structure to the universe.” Treating causes as real is a necessary tool for making sense of the world.
Maybe we sort of knew all along, as Aaronson contends, that higher scales wrest the controls from lower scales. But if these scientists are right, then causal emergence might be how that works, mathematically. “It’s like we cracked the door open,” Hoel said. “And actually proving that that door is a little bit open is very important. Because anyone can hand-wave and say, yeah, probably, maybe, and so on. But now you can say, ‘Here’s a system [that has these higher-level causal events]; prove me wrong on it.’”
Correction: This article was revised on June 1, 2017, to clarify that Erik Hoel was not Giulio Tononi’s first collaborator on integrated information theory and that Tononi, Hoel and Larissa Albantakis formalized, but did not devise, the measure used to assess causal emergence.

This article was reprinted on Wired.com.

SOURCE: https://www.quantamagazine.org/a-theory-of-reality-as-more-than-the-sum-of-its-parts-20170601/