Lack Of Massive Black Holes in Telescope Data is Caused by Bias (Astronomy)

Our telescopes have never detected a black hole more massive than twenty times the mass of the Sun. Nevertheless, we now know of their existence as dozens of those black holes have recently been “heard” to merge via gravitational wave radiation. A team of astronomers led by Peter Jonker (SRON/Radboud) has now discovered that these seemingly disparate results can be explained by biases against massive black holes in conventional telescope observations.

In 2015 the LIGO facilities detected gravitational waves for the first time. They were emitted by two massive black holes, each several tens of solar masses, in the process of merging. This discovery shook the Universe, and the astronomical community with it, because few astronomers had predicted that such massive black holes would exist, let alone that they could merge. Before the gravitational wave detections, our conventional telescopes had found proof for the existence of stellar-mass black holes in about 20 cases. However, none had ever been found as massive as those now observed through the gravitational wave radiation emitted during merger. By now about 50 such merging black hole pairs have been detected, including by the European Virgo detector, in most cases again involving massive black holes. Telescopes still have not found such black holes.

This disparity can be partially explained by the larger volume of the Universe that is being probed by the gravitational wave detectors. LIGO-Virgo can find more massive black holes more easily because their waves are stronger than those from lighter black holes, implying that these could be rare but loud events. But zero detections of such black holes using telescopes? A black hole, or at least its close environment, lights up when it slowly devours a companion star, and through measurements of the orbital motion of the hapless star, the mass of the black hole can be determined.
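To make the "rare but loud" point concrete: for an inspiral, the strain amplitude grows with the binary's chirp mass, so heavier systems can be heard from farther away. The sketch below is a back-of-envelope scaling, not a calculation from the paper.

```python
# Rough selection-effect estimate: inspiral strain scales roughly as
# h ~ Mc**(5/6) / d (Mc = chirp mass, d = distance), so the horizon
# distance grows as Mc**(5/6) and the surveyed volume as Mc**(5/2).
# Valid only while the merger signal stays in the sensitive band.

def relative_volume(m_heavy: float, m_light: float) -> float:
    """Detection-volume ratio for equal-mass binaries of two masses."""
    return (m_heavy / m_light) ** 2.5

# e.g. 30-solar-mass components versus 10-solar-mass components:
print(f"~{relative_volume(30, 10):.0f}x more volume surveyed")  # ~16x
```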

A team of astronomers led by Peter Jonker (Radboud University/SRON) realized that telescope observations are biased against detecting massive black holes. Such massive black holes can, in principle, be observed if they accrete mass from a companion star. In practice, however, the circumstances for those observations have been too difficult, explaining the lack of telescope detections of massive black holes. The largest black holes are formed through imploding massive stars, rather than exploding massive stars (supernovae). Formed through an implosion, these massive black holes stay put where their progenitor massive stars were born: in the plane of the Milky Way galaxy. That means, however, that they remain shrouded in dust and gas. Their lighter black hole sisters and brothers, born from massive stars through supernova explosions, receive a kick that ejects them out of the plane of the Milky Way, making them more readily observable by the telescopes that measure their mass.

Aggravating this bias, as Jonker and colleagues realized, is that any companion star of a massive black hole must orbit at a relatively large distance, making it rarer for the companion to be devoured in an observable feeding frenzy. Such episodes are what give away the existence and location of black holes, so the more massive black holes betray their location more rarely.

The imminent launch of the James Webb Space Telescope (JWST) on December 18 will allow astronomers to test these ideas. JWST will for the first time allow the mass of several candidate black hole systems in the plane of the Milky Way to be measured. JWST will be sensitive to infrared light, which is much less affected by dust and gas than the optical light typically used by ground-based telescopes. Furthermore, the large size of JWST and its advantageous position in space allow it to pick out the right star to study among the millions of stars in the plane of the Milky Way. Finally, being above the Earth’s atmosphere, JWST will not be hindered by the infrared light emitted by the atmosphere.

Caption header image: Measurements using electromagnetic (EM) radiation have only revealed stellar black holes less massive than about 20 solar masses (purple circles). These black holes all have a companion star that is losing mass to the black hole. This gas stream reveals the existence of the black hole, and detailed study of the motion of the companion allows the mass of the black hole to be measured. LIGO/Virgo measurements of the gravitational wave radiation emitted when two black holes merge have allowed the masses of several tens of black holes to be measured since 2015 (blue circles). These black holes are generally more massive than those found through EM radiation. We now know that the lack of massive black holes studied through EM techniques can be caused by a bias against finding and studying such massive black holes. Incidentally, the LIGO/Virgo measurements favor the detection of massive black holes because the signal of their mergers is louder and can thus be detected from systems further out in the Universe than the signal of merging lower-mass black holes. Nevertheless, LIGO/Virgo is also detecting lower-mass merging black holes. In the near future the JWST telescope will make it possible to remove the EM bias: thanks to its sensitivity, astronomers will be able to measure the mass of candidate black hole systems located at the places where the most massive black holes are thought to reside.

Publication

Peter G. Jonker, Karamveer Kaur, Nicholas Stone, and Manuel A. P. Torres, ‘The observed mass distribution of Galactic black hole LMXBs is biased against massive black holes’, The Astrophysical Journal


Provided by SRON

Simultaneous Readout of 60 Bolometers For Far-infrared Space Telescopes (Astronomy)

TES bolometers with simultaneous readout of multiple pixels are the candidate detector technology for a number of space missions for sub-millimeter and far-infrared wavelengths, such as LiteBIRD and OST. Qian Wang, a PhD student at SRON and RUG, has demonstrated the simultaneous readout of 60 TES bolometers. Publication in Applied Physics Letters.

Light with sub-millimeter and far-infrared wavelengths from deep space can travel long distances, penetrating right through dust clouds, and brings us information about the history of the universe and the origin of galaxies, stars and planets. However, the long journey weakens these signals, so we require sensitive detectors operating at millikelvin temperatures on a space instrument.

Transition edge sensor (TES) bolometers are superconducting detectors that exploit the collapse of the superconducting state: their resistance increases steeply when their temperature rises even slightly. Their resistance is therefore extremely sensitive to a change in temperature caused by the heating power of incoming light. When incoming photons heat the sensor, the tiny temperature change produces a measurable current response in the detector.
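To get a feel for that steepness, here is a toy model of a logistic-shaped superconducting transition; the numbers are purely illustrative and are not the parameters of SRON's devices.

```python
import numpy as np

# Toy transition-edge sensor: a steep, logistic-shaped
# resistance-vs-temperature curve around a ~100 mK transition.
R_n = 0.1      # normal-state resistance, ohms (illustrative)
T_c = 0.100    # transition temperature, kelvin (illustrative)
w   = 0.0002   # transition width, kelvin (what makes a TES steep)

def R(T):
    """Resistance along the superconducting transition."""
    return R_n / (1.0 + np.exp(-(T - T_c) / w))

# A microkelvin of photon heating already shifts the resistance
# measurably, which a voltage bias converts into a current change.
dT = 1e-6
print(f"R(Tc)         = {R(T_c) * 1e3:.4f} mOhm")
print(f"R(Tc + 1 uK)  = {R(T_c + dT) * 1e3:.4f} mOhm")
print(f"dR/dT near Tc ~ {(R(T_c + dT) - R(T_c)) / dT:.0f} Ohm/K")
```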

The challenges of using TES technology in space missions lie not only in the sensitivity, but also in reading out multiple pixels at the same time. Without this so-called multiplexing—combining the signals from many pixels onto a single pair of wires—the separate wiring for each pixel would conduct too much heat, making it impossible to keep the detectors at the necessary temperature close to absolute zero.
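The sketch below illustrates the principle behind frequency division multiplexing as used in such readouts: each pixel amplitude-modulates its own carrier tone, all carriers share one line, and a lock-in step recovers each pixel afterwards. It is a toy demonstration with made-up frequencies, not SRON's readout chain.

```python
import numpy as np

# Toy frequency-division multiplexing (FDM): three "pixels", each
# modulating its own carrier; the carriers are summed onto a single
# line and separated again by digital lock-in demodulation.
fs = 10_000_000                       # sample rate, Hz (illustrative)
t = np.arange(0, 0.01, 1 / fs)        # 10 ms of data
carriers = [1.0e6, 1.3e6, 1.6e6]      # one carrier per pixel, Hz

# Slowly varying "bolometer" signals (what the sky would imprint).
signals = [0.5 + 0.1 * np.sin(2 * np.pi * f_sig * t)
           for f_sig in (100, 250, 400)]

# Modulate each signal onto its carrier and sum onto one line.
line = sum(s * np.sin(2 * np.pi * fc * t)
           for s, fc in zip(signals, carriers))

def lowpass(x, n=2000):
    """Crude boxcar low-pass filter, enough to reject mixing products."""
    return np.convolve(x, np.ones(n) / n, mode="same")

# Lock-in recovery: mix with each carrier, low-pass, restore amplitude.
recovered = [2 * lowpass(line * np.sin(2 * np.pi * fc * t))
             for fc in carriers]

for i, (s, r) in enumerate(zip(signals, recovered)):
    err = np.max(np.abs((s - r)[5000:-5000]))  # skip filter edges
    print(f"pixel {i}: max recovery error ~ {err:.4f}")
```

In the real system each TES typically sits in a resonant circuit that defines its carrier, and a SQUID amplifier reads the summed line; the toy above only shows why many pixels can share one pair of wires without their signals blending.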

Qian Wang, working closely with Pourya Khosropanah and other members of the SAFARI-FDM team at SRON, led by Gert de Lange, has demonstrated a frequency division multiplexing (FDM) system that can read out 60 TES bolometers simultaneously using only a single pair of wires and one amplifier. The readout noise is lower than in previous work reported at SRON and by other laboratories, down to a Noise Equivalent Power of 0.45 aW/√Hz. The sensitivities measured in the multiplexing mode are the same as in single-pixel mode. The researchers expect to read out at least 130 pixels simultaneously if they extend the frequency range used for the current FDM setup. The result demonstrates that the readout technology meets the requirements of the Japanese LiteBIRD space mission and that FDM technology is an option for NASA’s OST mission in the long term.

Caption header image: Part of a TES bolometer array. Each pixel consists of a TES thermometer and a thin tantalum absorber.

Publication

Q. Wang, P. Khosropanah, J. van der Kuur, G. de Lange, M. D. Audley, A. Aminaei, M. L. Ridder, A. J. van der Linden, M. P. Bruijn, F. van der Tak, and J. R. Gao, Frequency division multiplexing readout of 60 low-noise transition-edge sensor bolometers, Appl. Phys. Lett. 119, 182602 (2021); https://doi.org/10.1063/5.0065570


Provided by SRON Netherlands Institute for Space Research

World’s Most Powerful Telescope Keeps Eyes On The Skies (Astronomy)

Australian astronomers hunting for Earth-like planets outside our solar system and investigating the dramatic life and death of stars have been given a major boost.

The newly released 2020 United States Decadal Survey on Astronomy and Astrophysics has listed the revolutionary Giant Magellan Telescope (GMT) as one of its top priorities. The recommendation opens the door for major funding for the construction of the telescope.

The GMT, being built by an international team including researchers from The Australian National University (ANU), is one of the most exciting projects in the world of astronomy.

Sited in the high, dry Atacama desert in Chile, one of the best places on the planet for exploring the Universe, the GMT represents a new generation of ground-based extremely large telescopes. The telescope will use seven of the world’s largest mirrors and the most advanced optics technology to see objects billions of light years away.

ANU is a major partner in the GMT project and researchers from the University are involved in the design of the telescope’s adaptive optics system, which removes atmospheric blur and gives GMT the sharpest possible images.

Professor Matthew Colless from ANU, a long-time member of the international GMT team, said Australian researchers were playing a major role “in making the GMT’s unrivalled vision a reality”.

“ANU scientists are designing and building an integral-field spectrograph for the GMT. This will allow scientists to take images and measure spectra so we can better understand supermassive black holes, the birth and evolution of galaxies, and search for planets outside our solar system,” Professor Colless said.

“ANU researchers are also working with fellow Australian scientists to design and build an optical fibre system that will give GMT the widest field of view of any of the giant new telescopes.

“All of this is a major benefit to Australian research and researchers, and will ensure we remain a global leader when it comes to astronomy and astrophysics.”

Mark McAuley, Chief Executive Officer of Astronomy Australia Limited (AAL), said the US announcement would deliver major benefits for Australian researchers.

“This is another example of Australia playing an active role in a major international research project,” he said.

“Our scientists and engineers will continue to design and construct significant components of this incredible telescope, and Australian astronomers will be among the first in line to access the Giant Magellan Telescope and all its possibilities.”

ANU Vice-Chancellor, Professor Brian Schmidt, who won the 2011 Nobel Prize in physics for his work on the expanding Universe, welcomed the US announcement and congratulated the Giant Magellan Telescope consortium.

“My own work on the expanding Universe will benefit significantly from the power of the GMT,” Professor Schmidt said. “The GMT is an incredible innovation; it could one day be the basis of work that leads to another Nobel Prize.

“It’s really encouraging to see the US astronomy community recommending the GMT for major funding by the US Government. It will take the construction of the GMT to the next level and help us unlock a Universe’s worth of discovery.”

ANU and AAL are equal partners in the Giant Magellan Telescope project.

Astronomy Australia Limited (AAL) is a non-profit organisation, whose members are Australian universities and research organisations with a significant astronomical research capability.

About the Giant Magellan Telescope 
The Giant Magellan Telescope is the work of the GMTO Corporation, an international nonprofit organisation headquartered in Pasadena, California. The consortium’s mission is to design, build, and operate the Giant Magellan Telescope to enable cutting-edge scientific observations that will revolutionise humanity’s fundamental understanding of the Universe.

Featured image: The Giant Magellan Telescope © GMTO Corporation


Provided by ANU

Scientists Detect A “Tsunami” of Gravitational Waves (Astronomy)

A team of international scientists, including researchers from The Australian National University (ANU), have unveiled the largest number of gravitational waves ever detected.

The discoveries will help solve some of the most complex mysteries of the Universe, including the building blocks of matter and the workings of space and time.

The global team’s study, published today on arXiv, reports 35 new detections of gravitational waves caused by pairs of black holes merging, or by neutron stars and black holes smashing together, made using the LIGO and Virgo observatories between November 2019 and March 2020.

This brings the total number of detections to 90 after three observing runs between 2015 and 2020.

The new detections are from massive cosmic events, most of them billions of light years away, which hurl ripples through space-time. They include 32 black hole pairs merging, and likely three collisions between neutron stars and black holes.

ANU is one of the key players in the international team making the observations and developing the sophisticated technology to hunt down elusive gravitational waves across the vast expanse of the Universe. 

Distinguished Professor Susan Scott, from the ANU Centre for Gravitational Astrophysics, said the latest discoveries represented “a tsunami” and were a “major leap forward in our quest to unlock the secrets of the Universe’s evolution”.

“These discoveries represent a tenfold increase in the number of gravitational waves detected by LIGO and Virgo since they started observing,” Distinguished Professor Scott said.  

“We’ve detected 35 events. That’s massive! In contrast, we made three detections in our first observing run, which lasted four months in 2015-16.

“This really is a new era for gravitational wave detections and the growing population of discoveries is revealing so much information about the life and death of stars throughout the Universe.

“Looking at the masses and spins of the black holes in these binary systems indicates how these systems got together in the first place.

“It also raises some really fascinating questions. For example, did the system originally form with two stars that went through their life cycles together and eventually became black holes? Or were the two black holes thrust together in a very dense dynamical environment such as at the centre of a galaxy?”

Distinguished Professor Scott, who is also a Chief Investigator of the ARC Centre of Excellence for Gravitational Wave Discovery (OzGrav), said the continual improvement of gravitational wave detector sensitivity was helping drive an increase in detections.

“This new technology is allowing us to observe more gravitational waves than ever before,” she said.

“We are also probing the two black hole mass gap regions and providing more tests of Einstein’s theory of general relativity.

“The other really exciting thing about the constant improvement of the sensitivity of the gravitational wave detectors is that this will then bring into play a whole new range of sources of gravitational waves, some of which will be unexpected.”

Featured image: Two black holes merge to become one © NASA


Provided by Australian National University

Magnetic Manipulation of Space Junk (Astronomy)

Magnetic manipulation could be the answer for remotely deflecting potentially dangerous space debris, or for slowing damaged satellites that are tumbling out of control so that they can be repaired. The approach proposed by the University of Utah exploits the currents that moving magnetic fields induce in the metal pieces that make up the debris. All the details are in Nature.

They call it space junk, but it behaves more like a volley of bullets. According to NASA, some 27,000 pieces of debris, many the size of tennis balls and travelling at up to 28,000 kilometers per hour, are flying over our heads. This barrage risks causing serious damage to satellites and spacecraft that are still functioning.

As on Earth, the garbage problem becomes more and more burdensome as the population (of orbiting objects, in the case of space) increases. The question is how to handle it properly and, where possible, how to recover objects whose damage can be repaired. A study published in Nature describes a new method for deflecting orbiting debris without touching it, through the use of rotating magnets.

The watchword: deflect, but do not touch. The advantage of magnetic manipulation is precisely that it can be performed without contact, avoiding dangerous and destructive collisions between the manipulating object and its target. The concept is well proven and even achieves six degrees of freedom of movement in the case of ferromagnetic material. Space debris, however, is made of metal that is electrically conductive but does not contain an appreciable amount of ferromagnetic material.

When metal debris is subjected to a time-varying magnetic field, however, its electrons circulate in closed loops, generating eddy currents that interact with the magnetic field itself. In the new study, the scientists showed that this physical effect allows conductive objects to be manipulated with six degrees of freedom by means of rotating magnets: the process effectively turns the debris into an electromagnet whose motion can then be controlled remotely.

To be fair, the idea of using induced currents to move objects in space is not new either, but until now it was limited to a single degree of freedom, such as a horizontal push. By using multiple magnetic field sources in a coordinated way, the researchers worked out how to move objects with six degrees of freedom, including rotation.

“What we wanted to do was manipulate it, not just push it, just like we do on Earth,” says Jake J. Abbott, professor of mechanical engineering at the University of Utah and head of the team that devised this new method. “This form of skillful manipulation has never been done before.”

With this new technique, it would be possible, for example, to stop a damaged satellite that is spinning out of control so that it can be repaired, a maneuver that would otherwise be highly risky. “You have to take this crazy object floating in space, and bring it to a position where it can be fixed by a robotic arm,” explains Abbott. “But if it’s spinning out of control, it could break the robot arm, creating even more debris.”

The method also makes it possible to manipulate particularly fragile objects. While a robotic arm could damage an object by applying excessive force, these magnets would apply a gentler force to the object as a whole, so that no section is damaged.

To test the new technique, the team used a series of magnets to move a copper sphere on a plastic raft in a water tank (the best way to simulate slow-moving objects in microgravity). The magnets not only moved the sphere along a square path, but also managed to rotate it.

“NASA is tracking thousands of pieces of space debris in the same way that air traffic controllers track planes. You have to know exactly where they are to avoid accidentally crashing into them,” says Abbott. “The US government and the governments of the world know about this problem because there is more and more stuff piling up with each passing day.”


Featured image: 70 percent of all cataloged objects are in low Earth orbit (LEO), which extends up to 2,000 kilometers above the Earth’s surface. Probes that observe the Earth must orbit at such low altitudes. The spatial density of objects increases at high latitudes. Note that the debris field shown in the image is an artist’s impression based on real data; it does not show the debris at its true size or density, as the debris is drawn exaggeratedly large to make it visible at the scale shown. Credit: ESA


Provided by INAF

Supporting Life Beyond Earth Could Be Possible Thanks To Graphene Innovation (Astronomy)

Advanced manufacturing experts from Manchester have revealed what human life in space could look like—with a graphene-enhanced space habitat developed to meet anticipated demand for human settlements beyond Earth.

A community of specialists at The University of Manchester have teamed up with global architect firm Skidmore, Owings & Merrill (SOM) to research the design and manufacturing of space habitats for the space industry.

With projections that the global space economy could grow to $1 trillion revenue by 2040, the innovation will raise the technology readiness level (TRL) of new lightweight composites using 2D materials for space applications.

In an international collaboration, Dr. Vivek Koncherry and his team—supported by the Manchester-based Graphene Engineering Innovation Centre—are creating a scaled prototype of a space habitat with pressurized vessels designed to function in a space environment.

SOM, the architects behind the world’s tallest building—Burj Khalifa in Dubai—are contributing design and engineering expertise to the space architecture. Daniel Inocente, SOM’s senior designer in New York, said that “designing for habitation in space poses some of the greatest challenges—it means creating an environment capable of maintaining life and integrating crew support systems.

The view from inside the viewing deck aboard the Graphene Space Habitat—the image shows a child passenger with the earth below. Credit: SOM, Luxigon, and the University of Manchester

“As architects, our role is to combine and integrate the most innovative technologies, materials, methods and above all the human experience in designing inhabited environments,” added Inocente. “Conducting research using graphene allows us to test lightweight materials and design processes that could improve the efficacy of composite structures for potential applications on Earth and future use in space.”

In the next five to 10 years, most governments are expected to want a permanent presence in space to manage critical infrastructure such as satellite networks, as well as to weigh the potential opportunity of accessing space-based resources and further scientific exploration.

Dr. Koncherry said: “A major barrier to scaling up in time to meet this demand is the lack of advanced and automated manufacturing systems to make the specialist structures needed for living in space. One of the space industry’s biggest challenges is overcoming a lack of robotic systems to manufacture the complex shapes using advanced materials.”

The solution is incorporating graphene for advanced structural capabilities, such as radiation shielding, as well as developing and employing a new generation of robotic machines to make these graphene-enhanced structures. This technology has the potential to revolutionize high-performance lightweight structures—and could also be used for terrestrial applications in the aerospace, construction and automotive sectors.

The Graphene Space Station in low earth orbit—this image shows the profile of the vessel that is made up of a collection of capsules, each housing different activities and personnel. The top of the vessel features a viewing deck that offers a unique perspective of earth and our cosmos. An Orion space shuttle is also shown flying in the background as this type of space vehicle would transport people and supplies to the Graphene Space Station. Credit: SOM, Luxigon, and the University of Manchester

James Baker, CEO Graphene@Manchester, says that “the work being led by Dr. Koncherry and his colleagues is taking the development of new composites and lightweighting to another level, as well as the advanced manufacture needed to make structures from these new materials. By collaborating with SOM there are opportunities to identify applications on our own planet as we look to build habitats that are much smarter and more sustainable.”

The space habitat launch coincides with a series of world firsts for graphene in the built environment currently happening here on Earth—including the first external pour of graphene-enhanced Concretene and pioneering A1 road resurfacing—all supported by experts in the city where the super strong material was first isolated.

Tim Newns, Chief Executive of MIDAS, Manchester’s inward investment agency, said that “this exciting piece of research further underlines the breadth of applications where advanced materials and in particular graphene can revolutionize global industries such as the space industry. In addition to world-leading expertise in graphene, facilities such as the new Advanced Machinery & Productivity Institute (AMPI) in Rochdale, will also support the development of advanced machines and machinery required to bring these applications to reality.”

Featured image: The Graphene Space Station in low earth orbit—this image shows the top of the viewing deck with its protective petal-like shields fully open to allow observers to have a unique perspective of earth and our cosmos. Credit: The University of Manchester, SOM (Skidmore, Owings & Merrill) and Luxigon


Provided by University of Manchester

On the Trail of Ultra-Low-Frequency Gravitational Waves (Astronomy)

Researchers have analyzed in detail a promising signal that could be due to the so-called gravitational wave background, produced by the gravitational energy released by pairs of supermassive black holes in mutual approach. The study, in which INAF researchers also took part, represents a step forward on the road to the detection of gravitational waves of very low frequency, of the order of one billionth of a hertz.

The EPTA (European Pulsar Timing Array) collaboration published today in Monthly Notices of the Royal Astronomical Society an article reporting the detailed analysis of a promising signal that could be due to the so-called gravitational wave background (GWB), which astronomers around the world have been chasing for some time and which is produced by the gravitational energy released by pairs of supermassive black holes as they approach each other, eventually leading them to merge. The results of the study were made possible by pulsar data collected over twenty-four years of observations with five large-aperture European radio telescopes, including the 64-meter-diameter Sardinia Radio Telescope (SRT), located near Cagliari.

The beams of radiation emitted by the magnetic poles of pulsars rotate with the star, and we observe them as radio pulses as they cross our line of sight, like the beam of light from a distant lighthouse. Pulsar Timing Arrays (PTAs) consist of an array of pulsars with very stable rotation, a property that allows them to be used as gravitational wave detectors on a galactic scale. In the presence of a gravitational wave, space-time is deformed and the very regular cadence of a pulsar’s radio pulses is in turn altered. PTAs are sensitive to gravitational waves of very low frequency, in the billionth-of-a-hertz regime: a gravitational wave of this type completes a single oscillation in about 30 years.
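A quick arithmetic check of that last figure:

```python
# Period of a one-billionth-of-a-hertz gravitational wave, in years.
f_gw = 1e-9                                         # hertz
period_years = (1.0 / f_gw) / (365.25 * 24 * 3600)  # seconds -> years
print(f"~{period_years:.1f} years per oscillation")  # ~31.7 years
```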

PTAs are therefore able to widen the window of observability of gravitational waves, currently limited to the high frequencies (of the order of hundreds of hertz) studied by the ground-based detectors LIGO, Virgo and KAGRA. Those instruments pick up the gravitational signals generated in short-lived collisions involving stellar-mass black holes and neutron stars, while PTAs can detect the gravitational waves produced by binary systems of supermassive black holes at the centers of galaxies during their slow spiralling motion of mutual approach. The cumulative effect of the signals produced by this population of extreme celestial objects is, in fact, the gravitational wave background.

“The presence of a gravitational wave background,” explains Andrea Possenti, researcher at INAF in Cagliari and co-author of the work, “manifests itself in the form of very low frequency fluctuations in the rhythm of the radio pulses coming from all pulsars, a sort of additional ‘noise’ that disturbs the regular pulse pattern, which we could otherwise compare to the ticking of a very precise clock. Put very simply, an experiment such as the one conducted by EPTA therefore consists in repeatedly observing the array of pulsars, every few weeks and for many years, in search of a very low frequency ‘noise’ that afflicts the ticking of all the pulsars in a common way, and that is not attributable to causes other than gravitational waves.”

Artist’s impression of the EPTA experiment. A group of European radio telescopes observes a network of pulsars spread across the sky. The variations recorded in the arrival times on Earth of the radio pulses emitted by these celestial bodies allow astronomers to study the tiniest perturbations of space-time. These perturbations, called gravitational waves, have propagated relentlessly from the most remote, and therefore oldest, reaches of the universe, produced as the first galaxies merged with each other and the supermassive black holes housed in their central regions orbited one another. © INAF

In fact, the expected amplitude of the “noise” due to the gravitational background is incredibly small: from a few tens to a couple of hundred billionths of a second of advance or delay in the arrival times of the radio pulses, and in principle many other effects could induce a similar “noise”. In order to reduce the role of the other sources of perturbation and validate the results, the analysis of the data collected by EPTA’s measurements made use of two completely independent procedures, with three different modelings of the corrections due to the bodies of the Solar System, and adopting different statistical treatments. This allowed the team to pinpoint a clear signal that could potentially be identified as belonging to the gravitational wave background.

“EPTA had already found indications of the presence of this signal in the data set published in 2015,” recalls Nicolas Caballero, researcher at the Kavli Institute for Astronomy and Astrophysics in Beijing and lead co-author of the publication. “Since the results were then affected by large statistical uncertainties, they were strictly discussed only as upper limits for the amplitude of the signal. Our new data now clearly confirm the presence of this signal, common to all pulsars, making it a candidate for the gravitational wave background.”

Einstein’s general relativity predicts a very specific relationship between the deformations of space-time experienced by the radio signals from pulsars located in different directions in the sky. Scientists call this the “spatial correlation” of the signal. Its detection would uniquely identify the observed noise as due to a gravitational wave background. “At the moment, the statistical uncertainties in our measurements do not yet allow us to identify the presence of the spatial correlation predicted for the gravitational wave background signal,” explains Siyuan Chen, researcher at the LPC2E of the French CNRS in Orléans and first author of the study. “To confirm the nature of the signal, we therefore need to include more pulsar data in the analysis. However, we can say that the current results are very encouraging.”
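That predicted “spatial correlation” is the well-known Hellings-Downs curve. The sketch below evaluates the standard textbook formula for a few pulsar-pair separation angles; it is an illustration of the expected pattern, not code from the EPTA analysis.

```python
import numpy as np

# Hellings-Downs curve: the correlation general relativity predicts
# between the timing noise of two pulsars separated by an angle zeta
# on the sky, if that noise is a gravitational wave background.
def hellings_downs(zeta_rad):
    """Expected cross-correlation for a pulsar pair at angle zeta."""
    x = (1.0 - np.cos(zeta_rad)) / 2.0
    return 1.5 * x * np.log(x) - x / 4.0 + 0.5

for deg in (10, 60, 90, 120, 180):
    corr = hellings_downs(np.radians(deg))
    print(f"{deg:3d} deg -> expected correlation {corr:+.3f}")
```

The distinctive dip at intermediate angles and recovery toward 180 degrees is what would distinguish a true gravitational wave background from, say, clock errors, which would correlate all pulsars identically.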

The Cagliari team that participated in the study. From left: Andrea Possenti, Marta Burgay and Delphine Perrodin. Credit: G. Alvito, P. Soletta/INAF Cagliari

EPTA is a founding member of the International Pulsar Timing Array (IPTA). Since the independent data analyses performed by the other IPTA partners (the NANOGrav and PPTA experiments) have also indicated the presence of a similar signal, the IPTA members are working together to better prepare the next steps, building on the progress obtained by comparing all their data and methods of analysis.

“As it was for high-frequency gravitational waves in 2015, the detection of very low frequency gravitational waves would be an epochal achievement for physics, astrophysics and cosmology,” concludes Delphine Perrodin, researcher at INAF in Cagliari and co-author of the work. “In particular, the discovery and study of the gravitational wave background will give us direct information on the size and evolution of supermassive black holes, and on their contribution to shaping galaxies and the current universe. It is a challenge in which INAF has been immersed since 2006, the year the EPTA collaboration was born, and which now benefits from the asset represented by SRT and from its involvement in LEAP, the Large European Array for Pulsars, in which EPTA’s telescopes work in a synchronized way to reach the capabilities of a single 200-meter-diameter antenna, and thus greatly increase EPTA’s sensitivity to gravitational waves.”

Featured image: The five large European radio telescopes used in this study. From top left: the Effelsberg radio telescope in Germany, the Nançay radio telescope in France, the Sardinia Radio Telescope in Italy, the Westerbork Synthesis Radio Telescope in the Netherlands and the Lovell Telescope in the United Kingdom © INAF


Provided by INAF

Making Martian Rocket BioFuel on Mars (Astronomy)

Researchers at the Georgia Institute of Technology have developed a concept that would make Martian rocket fuel, on Mars, that could be used to launch future astronauts back to Earth.

The bioproduction process would use three resources native to the red planet: carbon dioxide, sunlight, and frozen water. It would also involve transporting two microbes to Mars. The first would be cyanobacteria (algae), which would take CO2 from the Martian atmosphere and use sunlight to create sugars. An engineered E. coli, which would be shipped from Earth, would convert those sugars into a Mars-specific propellant for rockets and other propulsion devices. The Martian propellant, called 2,3-butanediol, already exists: it can be created by E. coli and is used on Earth to make polymers for the production of rubber.

The process is outlined in a paper, “Designing the bioproduction of Martian rocket propellant via a biotechnology-enabled in situ resource utilization strategy,” published in the journal Nature Communications.

Rocket engines departing Mars are currently planned to be fueled by methane and liquid oxygen (LOX). Neither exists on the red planet, which means they would need to be transported from Earth to power a return spacecraft into Martian orbit. That transportation is expensive: ferrying the needed 30 tons of methane and LOX is estimated to cost around $8 billion. To reduce this cost, NASA has proposed using chemical catalysis to convert Martian carbon dioxide into LOX, though this still requires methane to be transported from Earth.
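Those two figures imply a striking delivery price per kilogram; the arithmetic below simply divides the numbers quoted above.

```python
# Implied cost of shipping propellant from Earth to Mars,
# using the estimate quoted above: 30 tons for about $8 billion.
payload_kg = 30_000
cost_usd = 8e9
print(f"~${cost_usd / payload_kg:,.0f} per kilogram delivered")
# ~$267,000 per kilogram: the economic case for making fuel in situ.
```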

As an alternative, Georgia Tech researchers propose a biotechnology based in situ resource utilization (bio-ISRU) strategy that can produce both the propellant and LOX from CO2. The researchers say making the propellant on Mars using Martian resources could help reduce mission cost. Additionally, the bio-ISRU process generates 44 tons of excess clean oxygen that could be set aside to use for other purposes, such as supporting human colonization.

“Carbon dioxide is one of the only resources available on Mars. Knowing that biology is especially good at converting CO2 into useful products makes it a good fit for creating rocket fuel,” said Nick Kruyer, first author of the study and a recent Ph.D. recipient from Georgia Tech’s School of Chemical and Biomolecular Engineering (ChBE).

The paper outlines the process, which begins by ferrying plastic materials to Mars to be assembled into photobioreactors covering the area of four football fields. Cyanobacteria would grow in the reactors via photosynthesis (which requires carbon dioxide). Enzymes in a separate reactor would break down the cyanobacteria into sugars, which could be fed to the E. coli to produce the rocket propellant. The propellant would be separated from the E. coli fermentation broth using advanced separation methods.

The team’s research finds that the bio-ISRU strategy uses 32% less power (but weighs three times more) than the proposed chemically enabled strategy of shipping methane from Earth and producing oxygen via chemical catalysis.

Because the gravity on Mars is only about one-third of that felt on Earth, the researchers were able to be creative as they thought about potential fuels.

“You need a lot less energy for lift-off on Mars, which gave us the flexibility to consider different chemicals that aren’t designed for rocket launch on Earth,” said Pamela Peralta-Yahya, a corresponding author of the study and an associate professor in the School of Chemistry & Biochemistry and ChBE who engineers microbes for the production of chemicals. “We started to consider ways to take advantage of the planet’s lower gravity and lack of oxygen to create solutions that aren’t relevant for Earth launches.”
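The Tsiolkovsky rocket equation makes that lift-off advantage concrete. The delta-v and specific-impulse values below are rough, assumed round numbers for illustration, not figures from the study.

```python
import math

# Tsiolkovsky rocket equation: fraction of lift-off mass that must
# be propellant to achieve a given delta-v at a given specific impulse.
def propellant_fraction(delta_v, isp, g0=9.81):
    return 1.0 - math.exp(-delta_v / (isp * g0))

dv_earth = 9400.0   # m/s, typical surface-to-LEO incl. losses (assumed)
dv_mars = 4100.0    # m/s, Mars surface to low Mars orbit (assumed)
isp = 350.0         # s, illustrative chemical-rocket specific impulse

print(f"Earth ascent: {propellant_fraction(dv_earth, isp):.0%} propellant")
print(f"Mars ascent:  {propellant_fraction(dv_mars, isp):.0%} propellant")
# Roughly 94% versus 70%: far less of a Mars ascent vehicle has to
# be fuel, which opens the door to propellants not viable on Earth.
```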

“2,3-butanediol has been around for a long time, but we never thought about using it as a propellant. After analysis and preliminary experimental study, we realized that it is actually a good candidate,” said Wenting Sun, associate professor in the Daniel Guggenheim School of Aerospace Engineering, who works on fuels.

The Georgia Tech team spans campus. Chemists, chemical, mechanical, and aerospace engineers came together to develop the idea and process to create a viable Martian fuel. In addition to Kruyer, Peralta-Yahya, and Sun, the group included Caroline Genzale, a combustion expert and associate professor in the George W. Woodruff School of Mechanical Engineering, and Matthew Realff, professor and David Wang Sr. Fellow in ChBE, who is an expert in process synthesis and design.

The team is now looking to perform the biological and materials optimization identified to reduce the weight of the bio-ISRU process and make it lighter than the proposed chemical process. For example, improving the speed at which cyanobacteria grows on Mars will reduce the size of the photobioreactor, significantly lowering the payload required to transport the equipment from Earth.

“We also need to perform experiments to demonstrate that cyanobacteria can be grown in Martian conditions,” said Realff, who works on algae-based process analysis. “We need to consider the difference in the solar spectrum on Mars both due to the distance from the Sun and lack of atmospheric filtering of the sunlight. High ultraviolet levels could damage the cyanobacteria.”

The Georgia Tech team emphasizes that acknowledging the differences between the two planets is pivotal to developing efficient technologies for the ISRU production of fuel, food, and chemicals on Mars. That is why they are addressing the biological and materials challenges in the study, in an effort to contribute to the goal of a future human presence beyond Earth.

“The Peralta-Yahya lab excels at finding new and exciting applications for synthetic biology and biotechnology, tackling exciting problems in sustainability,” added Kruyer. “Application of biotechnology on Mars is a perfect way to make use of limited available resources with minimal starting materials.”

The research was supported by a NASA Innovative Advanced Concepts (NIAC) Award.

Citation: Kruyer, et al. “Designing the bioproduction of Martian rocket propellant via a biotechnology-enabled in situ resource utilization strategy” Nature Communications. 10.1038/s41467-021-26393-7.

Featured image: Artist’s conception of astronauts and human habitats on Mars. Courtesy: NASA


Provided by Georgia Institute of Technology

Breakthrough Listen Releases Analysis Of Previously Detected Signal (Astronomy)

Findings Published in Nature Astronomy; Publicly Available at seti.berkeley.edu/blc1.

An intriguing candidate signal picked up last year by the Breakthrough Listen project has been subjected to intensive analysis that suggests it is unlikely to originate from the Proxima Centauri system. Instead, it appears to be an artifact of Earth-based interference from human technologies, the Breakthrough Initiatives announced today. Two research papers, published in Nature Astronomy, discuss both the detection of the candidate signal and an advanced data analysis process that can finely discern “false positives.”

“The significance of this result is that the search for civilizations beyond our planet is now a mature, rigorous field of experimental science,” said Yuri Milner, founder of the Breakthrough Initiatives.

Breakthrough Listen (a program of the Breakthrough Initiatives) is an astronomical science program searching for technosignatures – signs of technology that may have been developed by extraterrestrial intelligence. Listen’s science team, led by Dr. Andrew Siemion at the University of California, Berkeley, uses some of the largest radio telescopes in the world, equipped with the most capable digital processing systems, to capture data across broad swaths of the radio spectrum in the direction of a wide range of celestial targets. The search is challenging because Earth is awash with radio signals from human technology – cell phones, radar, satellites, TV transmitters, and so on. Searching for a faint signal from a distant star is akin to picking out a needle in a vast digital haystack – and one that is changing constantly over time.

The CSIRO Parkes Telescope in New South Wales, Australia (one of the largest telescopes in the Southern Hemisphere, known as ‘Murriyang’ in Wiradjuri) is among the facilities participating in Breakthrough Listen’s search. One of the targets being monitored by Parkes is Proxima Centauri, the Sun’s nearest neighboring star, at a distance of just over 4 light years. The star is a red dwarf orbited by two known exoplanets. The Listen team scanned the target across a frequency range of 700 MHz to 4 GHz, with a resolution of 3.81 Hz – in other words, performing the equivalent of tuning to over 800 million radio channels at a time, with exquisite detection sensitivity.

Shane Smith, an undergraduate researcher working with Listen Project Scientist Dr. Danny Price in the summer 2020 Breakthrough Listen internship program, ran the data from these observations through Breakthrough Listen’s search pipeline. He detected over 4 million “hits” – frequency ranges that had signs of radio emission. This is actually quite typical for Listen’s observations; the vast majority of these hits make up the haystack of emissions from human technology.

As with all of Listen’s observations, the pipeline filters out signals which look like they are unlikely to be coming from a transmitter at a large distance from Earth, according to two main criteria:

  • Firstly, is the signal steadily changing in frequency with time? A transmitter on a distant planet would be expected to be in motion with respect to the telescope, leading to a Doppler drift akin to the change in pitch of an ambulance siren as it moves relative to an observer (a back-of-envelope estimate of the size of this drift follows the list). Rejecting hits with no such signs of motion reduces the number of hits from 4 million to around 1 million for this particular dataset.
  • Secondly, for the hits that remain, do they appear to be coming from the direction of the target? To determine this, the telescope points in the direction of Proxima Centauri, and then points away, repeating this “ON – OFF” pattern several times. Local interfering sources are expected to affect both ON and OFF observations, whereas a candidate technosignature should appear only in the ON observations.
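To put a rough number on the drift mentioned in the first criterion: a transmitter fixed to the surface of a rotating planet is continuously accelerated, and an acceleration a along the line of sight drifts a tone of frequency f at roughly df/dt ≈ f·a/c. The planet parameters below are illustrative Earth-like values.

```python
import math

# Back-of-envelope Doppler drift for a transmitter on the surface
# of a rotating Earth-like planet: df/dt ~ f * a / c.
c = 2.998e8                    # speed of light, m/s
omega = 2 * math.pi / 86164    # sidereal rotation rate, rad/s
radius = 6.371e6               # planetary radius, m
a = omega**2 * radius          # centripetal acceleration, ~0.034 m/s^2

f = 1.0e9   # Hz, an illustrative tone inside the scanned 0.7-4 GHz band
print(f"max drift ~ {f * a / c:.2f} Hz/s")   # ~0.11 Hz/s at 1 GHz
```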

Even after both of these data filters are applied, a handful of candidates remain that must be inspected visually. Sometimes a faint signal is actually visible in the OFF observations but is not quite strong enough to be picked up by automated algorithms. Sometimes similar signals appear in neighboring observations, indicative of interfering sources that may be turning on and off at just the wrong period, or the team can track down the signals to satellites that commonly broadcast in certain frequency bands.

Occasionally an intriguing signal remains and must be subjected to further checks. Such a signal-of-interest was discovered by Smith in Listen’s observations of Proxima Centauri using the Parkes telescope. The signal, narrow-band and Doppler-drifting, persisted over five hours of observations and appeared to be present only in “ON” observations of the target star and not in the interspersed “OFF” observations, giving it some of the characteristics expected of a technosignature candidate.

Dr. Sofia Sheikh, currently a postdoctoral researcher with the Listen team at UC Berkeley, dug into a larger dataset of observations taken at other times. She found around 60 signals that share many characteristics of the candidate, but are also seen in their respective OFF observations.

“We can therefore confidently say that these other signals are local to the telescope and human-generated,” says Sheikh. “The signals are spaced at regular frequency intervals in the data, and these intervals appear to correspond to multiples of frequencies used by oscillators that are commonly used in various electronic devices. Taken together, this evidence suggests that the signal is interference from human technology, although we were unable to identify its specific source. The original signal found by Shane Smith is not obviously detected when the telescope is pointed away from Proxima Centauri – but given a haystack of millions of signals, the most likely explanation is still that it is a transmission from human technology that happens to be ‘weird’ in just the right way to fool our filters.”

Executive Director of the Breakthrough Initiatives Dr. S. Pete Worden remarked, “While we were unable to conclude a genuine technosignature, we are increasingly confident that we have the necessary tools to detect and validate such signatures if they exist.”

Breakthrough Listen is making all of the data from the Parkes scans available to the public to examine for themselves. The team has also just published two papers (led by Smith and Sheikh) outlining the details of the data acquisition and analysis, and a research note describing follow-up observations of Proxima Centauri conducted with the Parkes Telescope in April 2021. Listen will continue monitoring Proxima Centauri, which remains a compelling target for technosignature searches, using a suite of telescopes around the world. And the team continues to refine its algorithms to improve their ability to discriminate between “needles” and “hay”, including through a recently completed crowdsourced data processing competition in collaboration with kaggle.com.

“In the case of this particular candidate,” remarks Siemion, “our analysis suggests that it’s highly unlikely that it is really from a transmitter out at Proxima Centauri. However, this is undoubtedly one of the most intriguing signals we’ve seen to date.”

Preprints of the papers, links to the data and associated software, artwork, videos, and supplementary content may be accessed at seti.berkeley.edu/blc1.

Featured image: Artist’s impression of the Proxima Centauri system. Credit: Breakthrough Listen / Zayna Sheikh


Provided by Breakthrough Initiatives