Can the darkness of twilight – the photons of the cosmic microwave background – together with the light of dawn – the ultraviolet radiation emitted by the first stars as they switch on – unravel the story of reionization and tell us something about primordial inflation? We talked about it with Daniela Paoletti of the Italian National Institute for Astrophysics (INAF), co-author of two new studies that try to answer this question
Almost a year later, Media Inaf caught up again with Daniela Paoletti, researcher at INAF-OAS Bologna and first author of a new study that investigates in an original way the history of reionization, the period – still not well pinned down – in which the primordial gas that pervaded the universe in its early stages passed from the neutral to the ionized state. The work, with the evocative title “Dark Twilight Joined with the Light of Dawn to Unveil the Reionization History”, presents an extensive analysis of the history of reionization based on recent cosmological and astrophysical data. Among the authors are also Dhiraj Kumar Hazra, Fabio Finelli of INAF-OAS, and the 2006 Nobel laureate in physics George Smoot. Alongside this work, another article with her among the authors was submitted in the same period, reporting a detailed study of what the latest Planck data have to say beyond the standard inflationary model. On this hot summer day, we explore with her the details of the analysis, its implications and the hopes for the future.
A year after the publication of your study on the history of reionization, you are about to publish a new one on the same hot topic. What is it about?
«This is a study on how the darkness of twilight and the light of the dawn of the universe, together, can help us understand how one of the most important phases in the history of the universe may have unfolded. The article is the continuation of what we published a year ago in Physical Review Letters, where we presented an original approach to studying the history of the early universe that combines astrophysical data and the cosmic microwave background (CMB). In this article we describe that approach in detail and present many more results which, for reasons of space, we could not include in the previous paper».
How did you come up with such an evocative title?
«The idea for the title came to me while I was preparing the seminar on the Physical Review Letters paper. The innovative aspect of our approach is that it brings together two completely different kinds of data: microwave background radiation and ultraviolet radiation. The first has always been described as the first light of the universe, since it consists of the first photons ever emitted; but, if you think about it, those same photons are also the first twilight, because once this radiation cooled down, the universe entered the so-called dark ages. Ultraviolet radiation, on the other hand, is a tracer of the first stars: it marks the dawn of the universe, when it emerges from the night of the dark ages. If the title had been in Italian I would have used the word aurora, which in my opinion would have been the most beautiful choice, but in English aurora and alba are rendered by the same word. So I went with dawn, because I liked the idea of pairing the twilight before the night with the light after the night».
Microwave radiation and ultraviolet radiation: how did you manage to reconcile such different data?
«They are two totally different types of data, and precisely because they are so different we had to devise and develop this new technique which, instead of assuming a parametrized evolution of the ionized fraction over time, solves the equation for the ionization fraction directly. This lets us test even the ultraviolet data, which we could not use in the classical approach. Those data tell us what is happening to the ionizing sources, that is, to the first stars. We then also use other data which proved very interesting in this work: quasars and gamma-ray bursts (GRBs). Some of these objects sit at very high redshift, hence very far away, and can tell us what the ionization state is around the source, in their local environment. If we assume that this is also representative of what lies beyond, they give us a precise idea of what is happening at that redshift, at that moment».
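The idea of evolving the ionization fraction through its differential equation, rather than assuming its shape, can be sketched in a toy form. Everything below – the per-atom source term, the clumping factor, all numerical values – is an illustrative placeholder and not the model or the values used in the paper:

```python
import numpy as np

# Toy sketch: instead of assuming a fixed shape for the ionized fraction,
# integrate the standard evolution equation for the HII filling factor Q,
#     dQ/dt = n_ion_dot(z) - Q / t_rec(z),
# where n_ion_dot is the ionizing-photon production rate per hydrogen atom.

n_H0 = 1.9e-7   # mean hydrogen number density today [cm^-3] (rough)
H0 = 2.2e-18    # Hubble rate today [s^-1] (~68 km/s/Mpc)
Om = 0.31       # matter density parameter

def hubble(z):
    """Hubble rate H(z) for a flat LCDM-like background [s^-1]."""
    return H0 * np.sqrt(Om * (1.0 + z) ** 3 + (1.0 - Om))

def t_rec(z, clumping=3.0):
    """Effective recombination time with a toy clumping factor [s]."""
    alpha_B = 2.6e-13  # case-B recombination coefficient [cm^3 s^-1]
    return 1.0 / (clumping * alpha_B * n_H0 * (1.0 + z) ** 3)

def n_ion_dot(z):
    """Invented ionizing-photon rate per H atom [s^-1]; placeholder source."""
    return 3e-17 * np.exp(-(z - 6.0) / 2.0) if z > 6.0 else 3e-17

# integrate dQ/dz = (dQ/dt) / [(1+z) H(z)] downward from z = 20 to z = 5
zs = np.linspace(20.0, 5.0, 2000)
Q = 0.0
for z_hi, z_lo in zip(zs[:-1], zs[1:]):
    dQdt = n_ion_dot(z_hi) - Q / t_rec(z_hi)
    Q = min(1.0, Q + dQdt * (z_hi - z_lo) / ((1.0 + z_hi) * hubble(z_hi)))
print(f"toy HII filling factor at z=5: Q = {Q:.2f}")
```

Astrophysical data enter through the source term: UV luminosity measurements constrain the analogue of `n_ion_dot`, while quasar and GRB sightlines pin `Q` at individual redshifts.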
So the method is the same as the one presented in 2020?
«Yes, the basis is the method we had already developed in the study published in Physical Review Letters in 2020, but here we explored what happens when we use different data or change the assumptions. The very first thing we did was check what happens when we leave free a term in the reionization source that would otherwise be fixed by simulations, because it is a term on which we have very little data, and what we have is not very sensitive. In the first work we had fixed it to the value from the simulations, while now we let it be guided by the data, and we found that the quasar data in particular have a good ability to constrain it and that, fortunately, it turned out to be in perfect agreement with the simulations. This confirmed what we had previously assumed».
What is the main novelty of the new study?
«An extremely interesting result of this new study comes when we change the ultraviolet data. For the ultraviolet emission, what we measure with our instruments is the luminosity function, which we then convert into an ultraviolet luminosity density. Since the luminosity function has to be integrated, we need to choose a cutoff magnitude: in other words, we do not count sources fainter than the value assumed as the cut. Until now we had always assumed a fairly conservative value of −17, given that for the fainter sources the data show a change in the behavior of the luminosity function that we do not know whether is real or an artifact of the uncertainties. Now we have also used a more aggressive, more optimistic cut, asking what happens if we go down to −15. With this new cut we are in fact including the contributions of sources that are very distant and very faint, but numerous, and which therefore lead to a slightly different reionization history. We find that a contribution appears at higher redshifts: instead of an extremely steep climb, reionization becomes slower and lasts longer, precisely because of the contribution of these very faint sources, which are still capable of ionizing».
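The effect of moving the magnitude cut from −17 to −15 can be illustrated with a toy Schechter luminosity function. The z ≈ 8 parameter values below are ballpark literature-style numbers chosen only to show the mechanism; they are not the ones used in the study:

```python
import numpy as np

# Integrate a toy Schechter UV luminosity function down to two different
# limiting magnitudes to compare the resulting luminosity densities.
M_star, phi_star, alpha = -20.6, 5e-4, -2.0  # mag, Mpc^-3 mag^-1, faint-end slope

def schechter(M):
    """Schechter function in magnitudes [Mpc^-3 mag^-1]."""
    x = 10.0 ** (-0.4 * (M - M_star))        # L / L*
    return 0.4 * np.log(10.0) * phi_star * x ** (alpha + 1.0) * np.exp(-x)

def rho_uv(M_lim, M_bright=-24.0, n=20000):
    """UV luminosity density (arbitrary units) integrated down to M_lim."""
    M = np.linspace(M_bright, M_lim, n)
    L = 10.0 ** (-0.4 * M)                   # luminosity, arbitrary units
    y = schechter(M) * L
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(M)))  # trapezoid rule

ratio = rho_uv(-15.0) / rho_uv(-17.0)
print(f"rho_UV(M < -15) / rho_UV(M < -17) = {ratio:.2f}")
```

With a faint-end slope this steep (α near −2), the faint sources add a substantial extra contribution to the luminosity density, which is the mechanism behind the slower, longer reionization history described above.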
Is there still the doubt that the sources you are including may not be real?
«Yes, that always remains. Obviously the error bars grow, because these are more difficult measurements. But the beauty of it is that in October JWST will launch, and it will see very well all that faint tail we are considering. I am thrilled to make predictions for JWST because I am very curious to see the impact as the error bars shrink. If this contribution of the sources really is so large, the classical model used in cosmology – which foresees a very steep transition lasting a very short time – starts to be problematic, because at that point astrophysics would be telling us that reionization is actually a little slower. We must always keep in mind that reionization, beyond its intrinsic importance – it is a phase transition, in which the entire universe completely changes state – is fundamental in cosmology, because it represents one of our greatest uncertainties. Suffice it to say that the optical depth – the parameter through which reionization enters the cosmic microwave background – is the only one that, even after Planck, still carries more than 1% uncertainty. It therefore impacts a whole series of extended models, including, as we have also shown, the extended inflationary models such as those in the other article we wrote».
Will the future experiments already planned be able to help in this respect?
«Yes, with the generation of cosmological experiments coming in the next few years we need to be particularly careful about how reionization is treated, and the possibility of using these astrophysical data to reconstruct the history of reionization could, in turn, also have an impact on the constraints on cosmological parameters. What we showed about the quasar data is also very interesting for the future: they proved very powerful because – although they extend less far in redshift than the ultraviolet data – they provide precise points of the ionization fraction. George Smoot pointed out to me that in the future, with Desi and Euclid, we will no longer be talking about five, six, ten points but about samples of thousands of quasars. So in the next ten years the approach and the perspective will change completely».
«An instrument that could have done exceptional things is Theseus, because gamma-ray bursts are very powerful probes: while a quasar has a continuous emission and therefore ionizes the medium around itself, a GRB does not, because it is too fast. It has no time to ionize. It gives exactly a precise point for the ionization fraction. Unfortunately, in the analysis we have carried out we only have one such point, but even that single point, out of the dozen points used, already shows how well it can narrow the error bars».
What astrophysical data are you using? At what distance?
«We use six galaxy clusters from the Hubble Frontier Fields. The quasars used reach up to redshift 8, while the ultraviolet sources, by interpolation, reach up to redshift 10».
There is a parallel study, submitted in the same days, with an equally curious title reminiscent of Toy Story. What is that about?
«The original title of this study led by Dhiraj – the first author, who was in Bologna until January 2020 and is now a professor at IMSc Chennai, in India – was “Inflation story: to slow-roll and beyond”, echoing the leitmotiv of Toy Story, because slow-roll is the standard model of inflation and “beyond” is already the non plus ultra, yet we go even further. That was the idea, quoting Buzz Lightyear. A couple of the co-authors were not entirely on board, because in fact we took slow-roll as a starting point, so the “to” would not have been correct. In the end we dropped the “to”, even though we both liked it because it echoed Buzz’s urge to go further; and, come to think of it, our fate is similar to that of the Toy Story hero: we try to go beyond, but there is a problem. We know that the standard model is a beautiful fit to the data, but we also know that Planck confirmed what we had already seen with Wmap, namely that there are anomalies in these data. These anomalies are extremely elusive because they all sit at 2.8 sigma of significance, while the threshold for calling something anomalous is 3 sigma. So we are in a kind of limbo that does not allow us to tell whether what we are seeing is a statistical fluctuation or really an anomaly».
If it were an anomaly, would it be more intriguing?
«Yes, the nice thing is to go and see whether it is an anomaly. Two of the biggest anomalies are the lack of power at very large angular scales, observed and confirmed by Planck, and small oscillations in the angular power spectrum. These blocks of oscillations, or single oscillations, are called features, and they are not produced in the standard model. One possibility is that inflation was not slow-roll throughout: we have a scalar field – the one that generates inflation – that moves very slowly on an extremely flat potential. If, before ending up slowly rolling on that flat potential, the field had encountered a slightly less flat potential, or a jump, or a cusp, this could produce the loss of power and generate the small oscillations».
How do you check it?
«What is usually done, and has been done in many cases (including in Planck’s paper on inflation), is to take different physical models and check whether they fit the data better or worse. The nice thing here is that we use a framework called Wiggly Whipped Inflation. It is a phenomenological approach: we do not ask what caused a given feature; we ask, if the inflationary potential had a certain shape, can we fit the data? Of course, if we fit the data better, then we can be reasonably confident we have found an inflationary model that works better. First there is the whip – the potential of so-called whipped inflation – in which the scalar field initially rolls a little faster and only then settles into slow-roll. In that case you get a lack of power, because when the field rolls quickly it does not generate many perturbations; it generates them once it reaches slow-roll. Then you also test the case where the wiggles are present, that is, the small oscillations. These can be produced by discontinuities: when the potential has a jump, a wiggle is generated, and the bigger the jump, the more you generate at certain scales, which depend on the size of the jump. It is a general framework that simply gives us an idea of what can best fit the data».
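The kind of feature described here – a jump in the potential imprinting damped oscillations on an otherwise smooth power-law spectrum – can be mimicked with a purely phenomenological toy. The functional form, scales and amplitudes below are invented for illustration and are not taken from the Wiggly Whipped Inflation papers:

```python
import numpy as np

# Toy "wiggles": a featureless power-law primordial spectrum modulated by
# damped oscillations, the sort of signature a step in the potential can
# imprint at scales tied to where the jump occurs.
As, ns, k_pivot = 2.1e-9, 0.965, 0.05   # Planck-like power-law parameters

def P_smooth(k):
    """Featureless power-law primordial spectrum."""
    return As * (k / k_pivot) ** (ns - 1.0)

def P_wiggly(k, amp=0.05, k_step=0.01, width=2.0):
    """Power law times a damped oscillation centred on the step scale."""
    envelope = np.exp(-((np.log(k / k_step)) / width) ** 2)
    return P_smooth(k) * (1.0 + amp * np.sin(2.0 * k / k_step) * envelope)

k = np.logspace(-4, 0, 500)             # wavenumbers [Mpc^-1]
deviation = P_wiggly(k) / P_smooth(k) - 1.0
print(f"max fractional deviation from the power law: {np.abs(deviation).max():.3f}")
```

In an actual analysis the modulated spectrum would be fed through a Boltzmann code to predict the CMB angular power spectra and then compared against the data.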
It sounds simple, but I guess it really isn’t…
«There are two problems. On the one hand, we know that with 100,000 parameters we could fit the entire universe: I could write a Lagrangian of the universe depending on 150 parameters and I would have the universe’s wave function. But it does not work like that, because the degrees of freedom would be far too many. So saying that a model improves the fit is always a balancing act between how free the model is – how many degrees of freedom it has – and how much better it fits the data. If a model, as happened to us in some of these cases, fits the data better than the standard model but uses many parameters, it is not considered preferred. Furthermore, it must be said that Planck’s 2018 data reduced the evidence for the anomalies. In temperature, the loss of power is still there».
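The balancing act between extra freedom and improved fit can be made concrete with an information criterion. The χ² values and parameter counts below are invented purely for illustration, and the Akaike criterion is used here only as a simple stand-in for the fuller model comparisons done in such analyses:

```python
# Toy model comparison: extra parameters must buy enough chi^2 improvement
# to beat a complexity penalty, otherwise the richer model is not preferred.

def aic(chi2, k):
    """Akaike information criterion, AIC = chi^2 + 2k: lower is better."""
    return chi2 + 2 * k

base = aic(chi2=1200.0, k=6)      # standard-model-like fit, 6 parameters
feature = aic(chi2=1194.0, k=10)  # feature model: better fit, 4 extra params

print("Delta AIC (feature - base):", feature - base)
# The 6-point chi^2 gain does not beat the 8-point penalty for the 4 extra
# parameters, so the feature model is not preferred despite fitting better.
```

This is the sense in which a model that "fits the data better but uses many parameters" can still fail to be favored.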
What’s new in this second study?
«The novelty lies in the fact that we also used polarization. In principle, what is done in temperature can also be done in polarization. The problem is that in polarization, on large angular scales, reionization increases the power, masking a possible drop in power. While temperature still favors the models that generate this power deficit, once polarization is considered it becomes apparent that these models are not favored in any way over the standard slow-roll model with a power-law power spectrum. There is another novelty as well: for the smaller angular scales (below about 8 degrees) we used not the official Planck likelihood but CamSpec, in its non-binned version, which takes into account all the single points rather than averages. This likelihood is the result of a reworking of the Planck data by George Efstathiou and Steven Gratton after the publication of the Planck results, which manages to use more sky, slightly improving the error bars. We wanted to use it because it is more complete and more evolved. At the moment there is no evidence in favor of these more exotic models with respect to Lambda-CDM».
What does the future hold on this front?
«The future will be very interesting for two reasons. The first is linked to the improvement of CMB data thanks to LiteBIRD, which will allow us to study the E-mode polarization limited only by cosmic variance. Then we will have the ground-based experiments, which will trace the small and intermediate scales where these oscillations live, seeing them more precisely than the Planck data could. There is also large-scale structure: since the oscillations are present at small scales too, they could be seen by an experiment like Euclid. Another possibility is to use non-Gaussianities, because the same oscillatory effects seen in the power spectrum also show up in the higher-order moments, hence in the non-Gaussianities».
So, for now, any significant news on the history of inflation?
«For the moment, Planck’s data tell us that they still prefer a standard model. But the prospects for the future are good: within the next ten years I expect we can really begin to say whether it was just a standard slow-roll model or not. For now we are like Buzz: we cannot go much beyond our standard-model room. But remember that Buzz eventually found his rocket and took off for real, and so it will be for us with future data; maybe we really will go beyond slow-roll».
To find out more:
- Read on arXiv the preprint of the article “Dark Twilight Joined with the Light of Dawn to Unveil the Reionization History” by Daniela Paoletti, Dhiraj Kumar Hazra, Fabio Finelli and George F. Smoot
- Read on arXiv the preprint of the article “Inflation Story: slow-roll and beyond” by Dhiraj Kumar Hazra, Daniela Paoletti, Ivan Debono, Arman Shafieloo, George F. Smoot and Alexei A. Starobinsky
Provided by INAF