Thursday, 5 June 2014

Astronomers discover first Thorne-Żytkow object, a bizarre type of hybrid star 


In a discovery decades in the making, scientists have detected the first member of a “theoretical” class of stars proposed in 1975 by physicist Kip Thorne and astronomer Anna Żytkow. Thorne-Żytkow objects (TŻOs) are hybrids of red supergiants and neutron stars that superficially resemble normal red supergiants, such as Betelgeuse in the constellation Orion. They differ, however, in their distinct chemical signatures, which result from unique activity in their stellar interiors.

TŻOs are thought to be formed by the interaction of two massive stars―a red supergiant and a neutron star formed during a supernova explosion―in a close binary system. While the exact mechanism is uncertain, the most commonly held theory suggests that, during the evolutionary interaction of the two stars, the much more massive red supergiant essentially swallows the neutron star, which spirals into the core of the red supergiant.

While normal red supergiants derive their energy from nuclear fusion in their cores, TŻOs are powered by the unusual activity of the absorbed neutron stars in their cores. The discovery of this TŻO thus provides evidence for a kind of stellar interior that astronomers had never before observed.

Project leader Emily Levesque of the University of Colorado Boulder, who earlier this year was awarded the American Astronomical Society’s Annie Jump Cannon Award, said, “Studying these objects is exciting because it represents a completely new model of how stellar interiors can work. In these interiors we also have a new way of producing heavy elements in our universe. You've heard that everything is made of ‘star stuff’—inside these stars we might now have a new way to make some of it.”

The study, accepted for publication in the Monthly Notices of the Royal Astronomical Society Letters, is co-authored by Philip Massey, of Lowell Observatory in Flagstaff, Arizona; Anna Żytkow of the University of Cambridge in the U.K.; and Nidia Morrell of the Carnegie Observatories in La Serena, Chile.

The astronomers made their discovery with the 6.5-meter Magellan Clay telescope on Las Campanas, in Chile. They examined the spectrum of light emitted from apparent red supergiants, which tells them what elements are present. When the spectrum of one particular star—HV 2112 in the Small Magellanic Cloud―was first displayed, the observers were quite surprised by some of the unusual features. Morrell explained, “I don’t know what this is, but I know that I like it!”

When Levesque and her colleagues took a close look at the subtle lines in the spectrum, they found that it contained excess rubidium, lithium and molybdenum. Past research has shown that normal stellar processes can create each of these elements. But high abundances of all three at the temperatures typical of red supergiants are a unique signature of TŻOs.

“I am extremely happy that observational confirmation of our theoretical prediction has started to emerge,” Żytkow said. “Since Kip Thorne and I proposed our models of stars with neutron cores, people were not able to disprove our work. If theory is sound, experimental confirmation shows up sooner or later. So it was a matter of identification of a promising group of stars, getting telescope time and proceeding with the project.”

The team is careful to point out that HV 2112 displays some chemical characteristics that don’t quite match theoretical models. Massey said, “We could, of course, be wrong. There are some minor inconsistencies between some of the details of what we found and what theory predicts. But the theoretical predictions are quite old, and there have been a lot of improvements in the theory since then. Hopefully our discovery will spur additional work on the theoretical side now.”

This work was partially supported by NASA and the National Science Foundation.

Surprisingly Strong Magnetic Fields Challenge Black Holes’ Pull 


A computer simulation of gas (in yellow) falling into a black hole (too small to be seen). Twin jets are also shown with magnetic field lines. Image credit: Alexander Tchekhovskoy, LBL


A new study of supermassive black holes at the centers of galaxies has found that magnetic fields play an impressive role in the systems’ dynamics. In fact, in dozens of black holes surveyed, the magnetic field strength matched the force produced by the black holes’ powerful gravitational pull, says a team of scientists from the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) and the Max Planck Institute for Radio Astronomy (MPIfR) in Bonn, Germany. The findings are published in this week’s issue of Nature. “This paper for the first time systematically measures the strength of magnetic fields near black holes,” says Alexander Tchekhovskoy, the Berkeley Lab researcher who helped interpret the observational data within the context of existing computational models. “This is important because we had no idea, and now we have evidence from not just one, not just two, but from 76 black holes.”

Previously, Tchekhovskoy, who is also a postdoctoral fellow at the University of California, Berkeley, had developed computational models of black holes that included magnetic fields. His models suggested a black hole could sustain a magnetic field that was as strong as its gravity, but there was not yet observational evidence to support this prediction. With the two forces balancing out, a cloud of gas caught on top of the magnetic field would be spared the pull of gravity and instead levitate in place.

The magnetic field strength was confirmed by evidence from jets of gas that shoot away from supermassive black holes. Formed by magnetic fields, these jets produce radio emission. “We realized that the radio emission from black holes’ jets can be used to measure the magnetic field strength near the black hole itself,” says Mohammad Zamaninasab, the lead author of the study, who did the work while at MPIfR.

Other research teams had previously collected radio-emission data from “radio-loud” galaxies using the Very Long Baseline Array, a vast network of radio telescopes in the United States. The researchers analyzed this pre-existing data to create radio-emission maps at different wavelengths. Shifts in jet features between different maps let them calculate the field strength near the black hole.

Based on the results, the team found not only that the measured magnetic fields can be as strong as a black hole’s gravity, but also that they are comparable in strength to those produced inside MRI machines found in hospitals, roughly 10,000 times greater than the field of the Earth itself.
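
As a rough sanity check on that comparison, here is a minimal arithmetic sketch; the field values are generic assumptions (a ~0.5 tesla field, at the low end of clinical MRI strengths, and Earth's ~50 microtesla surface field), not numbers taken from the Nature paper.

```python
# Order-of-magnitude check of the MRI-vs-Earth field comparison above.
# Both values are illustrative assumptions, not measurements from the study.
EARTH_FIELD_TESLA = 5e-5   # Earth's surface magnetic field, ~50 microtesla
MRI_FIELD_TESLA = 0.5      # low end of clinical MRI field strengths

ratio = MRI_FIELD_TESLA / EARTH_FIELD_TESLA
print(f"An MRI-class field is roughly {ratio:,.0f} times Earth's field")  # ~10,000
```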

Tchekhovskoy says the new results mean theorists must re-evaluate their understanding of black-hole behavior. “The magnetic fields are strong enough to dramatically alter how gas falls into black holes and how gas produces outflows that we do observe, much stronger than what has usually been assumed,” he says. “We need to go back and look at our models once again.”

Black Hole ‘Batteries’ Keep Blazars Going and Going


Astronomers studying two classes of black-hole-powered galaxies monitored by NASA's Fermi Gamma-ray Space Telescope have found evidence that they represent different sides of the same cosmic coin. By unraveling how these objects, called blazars, are distributed throughout the universe, the scientists suggest that apparently distinctive properties defining each class more likely reflect a change in the way the galaxies extract energy from their central black holes.

"We can think of one blazar class as a gas-guzzling car and the other as an energy-efficient electric vehicle," said lead researcher Marco Ajello, an astrophysicist at Clemson University in South Carolina. "Our results suggest that we're actually seeing hybrids, which tap into the energy of their black holes in different ways as they age."



What astronomers once thought were two blazar families may in fact be one, as shown in this artist's concept. Energy stored in the black hole during its salad days of intense accretion may later be tapped by the blazar to continue its high-energy emissions long after the inflowing gas has been depleted.
Image Credit: NASA's Goddard Space Flight Center


Active galaxies possess extraordinarily luminous cores powered by black holes containing millions or even billions of times the mass of the sun. As gas falls toward these supermassive black holes, it settles into an accretion disk and heats up. Near the brink of the black hole, through processes not yet well understood, some of the gas blasts out of the disk in jets moving in opposite directions at nearly the speed of light.

Blazars are the highest-energy type of active galaxy and emit light across the spectrum, from radio to gamma rays. They make up more than half of the discrete gamma-ray sources cataloged by Fermi's Large Area Telescope, which has detected more than 1,000 to date. Astronomers think blazars appear so intense because they happen to tip our way, bringing one jet nearly into our line of sight. Looking almost directly down the barrel of a particle jet moving near the speed of light, emissions from the jet and the region producing it dominate our view.

To be considered a blazar, an active galaxy must either show rapid changes in visible light on timescales as short as a few days, exhibit strong optical polarization, or glow brightly at radio wavelengths with a "flat spectrum" — that is, one exhibiting relatively little change in brightness among neighboring frequencies.

Astronomers recognize two main classes of blazars. One, known as flat-spectrum radio quasars (FSRQs), shows strong emission from an active accretion disk, much higher luminosities, smaller black hole masses and lower particle acceleration in the jets. The other, called BL Lacs, is dominated by the jet emission, with the jet particles reaching much higher energies and the accretion disk emission either weak or absent.

Speaking at the American Astronomical Society meeting in Boston on Tuesday, Ajello said he and his team wanted to probe how the distribution of these objects changed over the course of cosmic history, but solid distance information for large numbers of gamma-ray-producing BL Lac objects was hard to come by.

"One of our most important tools for determining distance is the movement of spectral lines toward redder wavelengths as we look deeper into the cosmos," explained team member Dario Gasparrini, an astronomer at the Italian Space Agency's Science Data Center in Rome. "The weak disk emission from BL Lacs makes it extremely difficult to measure their redshift and therefore to establish a distance."

So the team undertook an extensive program of optical observations to measure the redshifts of BL Lac objects detected by Fermi.

"This project has taken several years and simply wouldn't have been possible without the extensive use of many ground-based observatories by our colleagues," said team member Roger Romani, an astrophysicist at the Kavli Institute for Particle Astrophysics and Cosmology, a facility run jointly by Stanford University and the SLAC National Accelerator Laboratory in Menlo Park, California.

The redshift survey included 25 nights on the Hobby-Eberly Telescope at McDonald Observatory in Texas, led by Romani; eight nights on the 200-inch telescope at Palomar Observatory and nine nights on the 10-meter Keck Telescope in Hawaii, led by Anthony Readhead at Caltech in Pasadena, California; and nine nights on telescopes at the European Southern Observatory in Chile, led by Garret Cotter at the University of Oxford in England. In addition, important observations were provided by the Chile-based GROND camera, led by Jochen Greiner at the Max Planck Institute for Extraterrestrial Physics in Garching, Germany, and the Ultraviolet/Optical Telescope on NASA's Swift satellite, led by Neil Gehrels at Goddard Space Flight Center in Greenbelt, Maryland.

With distances for about 200 BL Lacs in hand -- the largest and most comprehensive sample available to date -- the astronomers could compare their distribution across cosmic time with a similar sample of FSRQs. What emerged suggests that, starting around 5.6 billion years ago, FSRQs began to decline while BL Lacs underwent a steady increase in numbers. The rise is particularly noticeable among BL Lacs with the most extreme energies, which are known as high-synchrotron-peaked blazars based on a particular type of emission.

"What we think we're seeing here is a changeover from one style of extracting energy from the central black hole to another," adds Romani.

Large galaxies grew out of collisions and mergers with many smaller galaxies, and this process occurs with greater frequency as we look back in time. These collisions provided plentiful gas to the growing galaxy and kept the gas stirred up so it could more easily reach the central black hole, where it piled up into a vast, hot, and bright accretion disk like those seen in "gas-guzzling" FSRQs. Some of the gas near the hole powers a jet while the rest falls in and gradually increases the black hole's spin.

As the universe expands and the density of galaxies decreases, so do galaxy collisions and the fresh supply of gas they provide to the black hole. The accretion disk becomes depleted over time, but what's left is orbiting a faster-spinning and more massive black hole. These properties allow BL Lac objects to maintain a powerful jet even though relatively meager amounts of material are spiraling toward the black hole.

In effect, the energy of accretion from the galaxy's days as an FSRQ becomes stored in the increasing rotation and mass of its black hole, which acts much like a battery. When the gas-rich accretion disk all but disappears, the blazar taps into the black hole's stored energy that, despite a lower accretion rate, allows it to continue operating its particle jet and producing high-energy emissions as a BL Lac object.
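
The battery analogy can be made semi-quantitative with a standard textbook relation (not a result from this study): the rotational energy a spinning black hole can in principle give up is the difference between its total mass-energy and the energy locked in its irreducible mass.

```latex
% Irreducible mass and extractable rotational energy of a black hole
% with mass M and dimensionless spin a_* (a_* = 1 is maximal spin):
\[ M_{\mathrm{irr}} = M \sqrt{\tfrac{1}{2}\left(1 + \sqrt{1 - a_*^{2}}\right)},
   \qquad
   E_{\mathrm{rot}} = \left(M - M_{\mathrm{irr}}\right) c^{2}. \]
% For maximal spin, E_rot is about 0.29 M c^2, an enormous reservoir
% that a jet can draw on even when little gas is falling in.
```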

One observational consequence of the hybrid blazar notion is that the luminosity of BL Lacs should decrease over time as the black hole loses energy and spins down.

The astronomers say they are eager to test this idea with larger blazar samples provided in part by Fermi's continuing all-sky survey. Understanding the details of this transition also will require better knowledge of the jet, the black hole mass and the galaxy environment for both blazar classes.

The Majorana nature of neutrinos and the neutrinoless double-beta decay:

No evidence of the double nature of neutrinos


The EXO-200 detector – Photo: SLAC


After two years of searching for a special radioactive decay that would provide an indication of new physics beyond the standard model, an experiment deep underground near Carlsbad (New Mexico, USA) has so far found no evidence of its existence. If this decay indeed exists, its half-life must be more than a million-billion times longer than the age of the universe.


Neutrinos are tiny, neutral elementary particles that, contrary to the standard model of physics, have been proven to have mass. One possible explanation for this mass could be that neutrinos are their own antiparticles, so-called Majorana particles.

Though experimental evidence for this is still lacking, many theoretical extensions of the standard model of physics predict the Majorana nature of neutrinos. If this hypothesis proves to be true, many previously unanswered questions about the origin of our universe and the origin of matter could be answered.

650 meters of shielding


In the EXO-200 experiment (Enriched Xenon Observatory), which is operated in the U.S. state of New Mexico, 650 meters below the earth’s surface, scientists are looking for the evidence. Physicists from the research group of Professor Peter Fierlinger of the Excellence Cluster Universe at the Technische Universitaet Muenchen are major contributors to this experiment.

The most sensitive method to experimentally verify the Majorana question is the search for a process called "neutrinoless double-beta decay". This process is a special radioactive decay that may only occur if neutrinos are their own antiparticles.
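
Written out for the xenon-136 that EXO-200 uses (described further below), the ordinary two-neutrino decay and the hypothetical neutrinoless mode look like this; the second channel can only occur if neutrinos are Majorana particles.

```latex
% Standard two-neutrino double-beta decay (allowed, already observed):
\[ {}^{136}\mathrm{Xe} \;\longrightarrow\; {}^{136}\mathrm{Ba} + 2e^{-} + 2\bar{\nu}_{e} \]
% Neutrinoless double-beta decay (hypothetical, requires Majorana neutrinos):
\[ {}^{136}\mathrm{Xe} \;\longrightarrow\; {}^{136}\mathrm{Ba} + 2e^{-} \]
```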

Unprecedented accuracy


The EXO-200 experiment has searched for these decays over several years. From the fact that not one of these decays has been detected, the scientists can now deduce a lower limit for the half-life of the decay of at least 10²⁵ years – around one million-billion times the age of the universe.
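
The comparison with the age of the universe is simple arithmetic; assuming an age of roughly 1.4 × 10¹⁰ years:

```latex
\[ \frac{T_{1/2}}{t_{\mathrm{universe}}}
   \;\gtrsim\; \frac{10^{25}\ \mathrm{yr}}{1.4\times 10^{10}\ \mathrm{yr}}
   \;\approx\; 7\times 10^{14}
   \quad\text{(about a million-billion).} \]
```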

"Although this measurement attains unprecedented accuracy, the question about the nature of neutrinos can still not be answered," says Dr. Michael Marino, member of the research group of Professor Peter Fierlinger and responsible for the analysis of the now published data. "That's why this open issue remains one of the most exciting questions in physics."

This result demonstrates the high sensitivity of the detector and also the future potential of this method. Hence the EXO-200 measurements are also the basis for a much larger future experiment that finally could confirm or refute the Majorana nature of neutrinos.

International cooperation


The EXO-200 experiment uses liquid xenon that is enriched to 80.6 percent of xenon-136, an isotope that is allowed by theory to undergo neutrinoless double-beta decay. The experiment's location in the Waste Isolation Pilot Plant (WIPP) 650 meters below ground provides shielding against radioactive decays and cosmic radiation.

EXO-200 is a collaboration of research groups from Canada, Switzerland, South Korea, Russia and the USA; the Technische Universitaet Muenchen is the only German partner.

To Send Astronauts to Mars, NASA Needs New Strategy: Report



An artist's illustration of a manned NASA Orion space capsule in orbit around Mars, with two other vehicles nearby. A National Research Council report released June 4, 2014, found that Mars should be the ultimate goal for NASA's human spaceflight program.
Credit: NASA/JSC


Landing astronauts on Mars should continue to be the ultimate goal for the United States' human spaceflight program, but a change in NASA's approach and a significant boost in funding are needed to make it happen, a new report finds.

A manned mission to Mars, specifically the Martian surface, is the most distant and difficult goal for astronauts that is still feasibly attainable within the foreseeable future, according to the nearly 300-page report by the National Research Council's Committee on Human Spaceflight. The report, entitled "Pathways to Exploration: Rationales and Approaches for a U.S. Program of Human Space Exploration," was released Wednesday (June 4).

The NRC committee found that in order to reach the Red Planet, NASA's current budget-driven, capability-based exploration strategy needs to be replaced by one guided by interim destinations, possibly including the moon. NASA is currently pursuing a path to Mars that omits a return to the lunar surface in favor of sending astronauts to a redirected asteroid by 2025, followed by sending a crew to orbit Mars by the mid-2030s.

In a response to the report, NASA officials agreed with the committee's identification of Mars as the ultimate goal.

"NASA has made significant progress on many key elements that will be needed to reach Mars, and we continue on this path in collaboration with industry and other nations," space agency officials said in a statement. "We intend to thoroughly review the report and all of its recommendations."

Unlike previous reports that have evaluated the path forward for U.S. space exploration and ultimately recommended Mars as the goal, the research council's "Pathways to Exploration" focuses on making that goal attainable, said Jonathan Lunine, co-chair of the committee and director of the Center for Radiophysics and Space Research at Cornell University.

"Yes, the idea of Mars as the horizon goal is not new," Lunine said. "What's different about this report is that we're recommending an approach that will provide a robust way of getting to Mars in an endeavor that will take decades and hundreds of billions of dollars and, quite probably, human lives. It is the staying power of the pathways approach and its ability, essentially, to make the program resilient against changes that we think is a novel aspect of our report."

The report's broad perspective, taking into account such considerations as public opinion and the rationales for continued human spaceflight, also distinguishes this report from previous ones, Lunine said.

The committee found that no single rationale, whether practical or aspirational, justifies the continued pursuit of human spaceflight on its own. But taken together with the practical benefits, the aspirational rationales — including the survival of the human species through off-Earth settlement and the shared human desire to explore — could argue for continuing the program, as long as it adopts a stable and sustainable approach.

"So, in essence here, the whole is greater than the sum of the parts, and it is the aggregate of the aspirational and the pragmatic that, in the committee's opinion, motivate human spaceflight and human space exploration," Lunine said at the hearing.




Pathways to Mars


The report offers three different pathways to illustrate the trade-offs among affordability, schedule, developmental risk and the frequency of missions for different sequences of intermediate destinations. All of the pathways culminate in landing on the surface of Mars and have anywhere between three and six steps that include some combination of human missions to the asteroids, the moon and Mars' moons Phobos and Deimos.

While the committee was not asked to recommend a particular pathway to pursue, it found that a return to extended operations on the surface of the moon would make significant contributions to a strategy ultimately aimed at landing people on Mars, and that it would also likely provide a broad array of opportunities for international and commercial cooperation.

The report also identified 10 high-priority capabilities that should be addressed by current research and development activities, with a particular emphasis on Mars entry, descent, and landing, radiation safety, and in-space propulsion and power. These three capabilities, the committee said, will be the most difficult to develop in terms of costs, schedule, technical challenges, and gaps between current and needed abilities.

"I hope [our report] carries the national conversation forward in the direction of realism — realism about public opinion, realism about risk, realism about cost and the incredibly daunting technical challenges of the horizon goal [of going to Mars] that we believe the world embraces," Daniels said.

"We're optimistic," he concluded. "We believe the public will support it; we believe the rationales justify it; we believe the achievement would be monumental if it occurred. But we believe there is one, and possibly only one, way to get there, and we've offered it up in this report."

Astronomers Find a New Type of Planet: The "Mega-Earth"



An artist's illustration of the mega-Earth planet Kepler-10c, the "Godzilla of Earths" planet that is 2.3 times the size of Earth and 17 times heavier. The planet and its lava-world sibling Kepler-10b (background) orbit the star Kepler-10 about 570 light-years from Earth. Image released June 2, 2014.
Credit: David A. Aguilar (CfA)


Astronomers announced today that they have discovered a new type of planet - a rocky world weighing 17 times as much as Earth. Theorists believed such a world couldn't form because anything so hefty would grab hydrogen gas as it grew and become a Jupiter-like gas giant. This planet, though, is all solids and much bigger than previously discovered "super-Earths," making it a "mega-Earth."

"We were very surprised when we realized what we had found," says astronomer Xavier Dumusque of the Harvard-Smithsonian Center for Astrophysics (CfA), who led the data analysis and made the discovery.

"This is the Godzilla of Earths!" adds CfA researcher Dimitar Sasselov, director of the Harvard Origins of Life Initiative. "But unlike the movie monster, Kepler-10c has positive implications for life."

The team's finding was presented today in a press conference at a meeting of the American Astronomical Society (AAS).

The newfound mega-Earth, Kepler-10c, circles a sunlike star once every 45 days. It is located about 560 light-years from Earth in the constellation Draco. The system also hosts a 3-Earth-mass "lava world," Kepler-10b, in a remarkably fast, 20-hour orbit.

Kepler-10c was originally spotted by NASA's Kepler spacecraft. Kepler finds planets using the transit method, looking for a star that dims when a planet passes in front of it. By measuring the amount of dimming, astronomers can calculate the planet's physical size or diameter. However, Kepler can't tell whether a planet is rocky or gassy.
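
The size measurement comes from a simple geometric relation: the fractional dip in starlight during a transit equals the square of the planet-to-star radius ratio. Here is a minimal sketch with illustrative, assumed numbers (a roughly sun-sized star and a dip of a few hundred parts per million), not the actual Kepler-10 photometry.

```python
import math

# Transit depth ~ (R_planet / R_star)**2, so the planet's radius follows
# directly from the measured dip. All numbers here are illustrative assumptions.
R_SUN_KM = 696_000
R_EARTH_KM = 6_371

r_star_km = 1.06 * R_SUN_KM   # assumed stellar radius, roughly sun-like
depth = 4.0e-4                # assumed fractional dimming during transit

r_planet_km = math.sqrt(depth) * r_star_km
print(f"planet radius ~ {r_planet_km / R_EARTH_KM:.1f} Earth radii")  # ~2.3
```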

Kepler-10c was known to have a diameter of about 18,000 miles, 2.3 times as large as Earth. This suggested it fell into a category of planets known as mini-Neptunes, which have thick, gaseous envelopes.

The team used the HARPS-North instrument on the Telescopio Nazionale Galileo (TNG) in the Canary Islands to measure the mass of Kepler-10c. They found that it weighed 17 times as much as Earth - far more than expected. This showed that Kepler-10c must have a dense composition of rocks and other solids.
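
The step from "heavy for its size" to "must be rocky" is a bulk-density calculation. A quick sketch using the figures quoted in the article (17 Earth masses, 2.3 Earth radii):

```python
import math

M_EARTH_KG = 5.972e24
R_EARTH_M = 6.371e6

mass = 17 * M_EARTH_KG     # mass from the HARPS-North measurement
radius = 2.3 * R_EARTH_M   # radius implied by the Kepler transit

volume = (4.0 / 3.0) * math.pi * radius**3
density = mass / volume    # kg per cubic meter
print(f"bulk density ~ {density / 1000:.1f} g/cm^3")  # ~7.7, versus ~5.5 for Earth
```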

"Kepler-10c didn't lose its atmosphere over time. It's massive enough to have held onto one if it ever had it," explains Dumusque. "It must have formed the way we see it now."

Planet formation theories have a difficult time explaining how such a large, rocky world could develop. However, a new observational study suggests that it is not alone.

Also presenting at AAS, CfA astronomer Lars A. Buchhave found a correlation between the period of a planet (how long it takes to orbit its star) and the size at which a planet transitions from rocky to gaseous. This suggests that more mega-Earths will be found as planet hunters extend their data to longer-period orbits.

The discovery that Kepler-10c is a mega-Earth also has profound implications for the history of the universe and the possibility of life. The Kepler-10 system is about 11 billion years old, which means it formed less than 3 billion years after the Big Bang.

The early universe contained only hydrogen and helium. Heavier elements needed to make rocky planets, like silicon and iron, had to be created in the first generations of stars. When those stars exploded, they scattered these crucial ingredients through space, which then could be incorporated into later generations of stars and planets.

This process should have taken billions of years. However, Kepler-10c shows that the universe was able to form such huge rocks even during the time when heavy elements were scarce.

"Finding Kepler-10c tells us that rocky planets could form much earlier than we thought. And if you can make rocks, you can make life," says Sasselov.

This research implies that astronomers shouldn't rule out old stars when they search for Earth-like planets. And if old stars can host rocky Earths too, then we have a better chance of locating potentially habitable worlds in our cosmic neighborhood.

The HARPS-N project is led by the Astronomical Observatory of the Geneva University (Switzerland). The National Institute for Astrophysics (INAF, Italy) has agreed to provide 80 observing nights per year over five years to use HARPS-N coupled to the TNG. The U.S. partners are the Harvard-Smithsonian Center for Astrophysics and the Harvard University Origins of Life Initiative; and the UK partners are the Universities of St. Andrews and Edinburgh, and Queen's University Belfast.

Headquartered in Cambridge, Mass., the Harvard-Smithsonian Center for Astrophysics (CfA) is a joint collaboration between the Smithsonian Astrophysical Observatory and the Harvard College Observatory. CfA scientists, organized into six research divisions, study the origin, evolution and ultimate fate of the universe.