Seeing through Dark Matter with gravitational waves

We covered the topic of dark matter before in this post (Dark Matter under the microscope). Dark matter remains one of the biggest mysteries in science. One of the candidates for dark matter is primordial black holes, or PBH: black holes that formed during the first instants of the universe. Like dark matter, PBH do not emit light and interact with the rest of the universe essentially only through gravity. The LIGO experiment has been detecting a surprisingly high number of massive black holes. The origin of these black holes is uncertain, but one possibility is that they are PBH. We also discussed the LIGO detections in this earlier post (Did LIGO really see massive black holes?). In order to explain the current LIGO observations, only a fraction of the dark matter needs to be in the form of PBH. In particular, a fraction as small as 1% of the total dark matter would be sufficient to explain the unusually elevated rate of black hole mergers with masses above 20 solar masses.

In a new work we discuss a novel method to explore the possibility that PBH constitute part of the dark matter. Our latest paper (see link at the end of this post) studies for the first time the interference produced when gravitational waves cross a portion of the sky populated with a realistic distribution of stellar bodies (stars, neutron stars or black holes), or microlenses. Earlier works considered only the simple, but unrealistic, case of isolated microlenses, at most assuming that they are located near a larger lens (a galaxy or cluster) but always on the side with positive parity (a technicality that describes one of the two possible configurations for a lensed image). Our work goes further than these simple examples by studying the combined effect produced by a realistic population of microlenses, and also considers the unexplored regime of macroimages with negative parity (these constitute roughly half the images produced in the strong lensing regime). The figure accompanying this post shows an example of a single microlens embedded in a macrolens, on the side of the lens plane with negative parity. The numbers in orange represent relative time delays (in milliseconds) between the different microimages, and the numbers in white indicate the magnification of each microimage; the grey scale shows the magnification in the lens plane, with the critical curves appearing as two white circular regions. The inset in the bottom-right shows the corresponding magnification in the source plane, with the positions of two sources, one white and one yellow. At LIGO frequencies (approximately 100-500 Hz), a time delay between 1/500 s and 1/100 s (that is, between 2 and 10 milliseconds) can produce constructive or destructive interference in the incoming gravitational wave at the detector. For the example in the figure, the microlens has a mass of 100 solar masses.
Masses of this order were already known to be capable of producing such interference, but what our work shows is that the mass can still be significantly smaller (a few solar masses) provided several microlenses work together to produce time delays of order several milliseconds. This cooperative behaviour takes place naturally when one observes gravitational waves that are being lensed by large factors (of order 100 or more): in this case, two microlenses that are relatively distant from each other in the lens plane can overlap their regions of high magnification (known as caustics) in the source plane, if the magnification from the macromodel (galaxy or cluster) is sufficiently large. This is similar to how a magnifying glass brings photons separated by a relatively large distance together at its focal point. Our study shows that interference of a gravitational wave with itself due to microlenses is not only possible, but unavoidable, if the magnification from the macromodel is sufficiently large.
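The interference condition above can be sketched with a few lines of code. This is an illustrative toy model, not the calculation in the paper: it assumes just two microimages of equal amplitude, so the combined signal is |1 + e^(i*phi)| with phase difference phi = 2*pi*f*dt.

```python
import math

def phase_shift(f_hz, dt_s):
    """Phase difference accumulated between two microimages
    separated by a time delay dt_s, for a GW of frequency f_hz."""
    return 2.0 * math.pi * f_hz * dt_s

def interference(f_hz, dt_s):
    """Crudely classify the interference of two equal-amplitude
    microimages: the combined amplitude is |1 + exp(i*phi)|."""
    phi = phase_shift(f_hz, dt_s)
    amp = abs(1.0 + complex(math.cos(phi), math.sin(phi)))
    if amp > 1.9:
        return "constructive"
    if amp < 0.1:
        return "destructive"
    return "intermediate"

# A 2 ms delay at 250 Hz gives phi = pi -> destructive interference,
# while the same 2 ms delay at 500 Hz gives phi = 2*pi -> constructive.
print(interference(250.0, 0.002))  # destructive
print(interference(500.0, 0.002))  # constructive
```

This is why millisecond-scale delays matter precisely in the LIGO band: the same delay that cancels the wave at one frequency reinforces it at another.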

This result opens the door to constraining the abundance of PBH. If PBH were as abundant as 1% of the total dark matter, the interference signal observed in gravitational waves detected here on Earth would be significantly different from the case with no PBH. Next on the list is to study by how much we can constrain this abundance as a function of the mass function of the PBH. Stay tuned …

Preprint of the science article

Did LIGO really see massive black holes?

GW being emitted by a pair of black holes

The LIGO (and now Virgo) experiment has opened a new window to explore one of the most mysterious objects in nature: black holes (BH). When two black holes merge, they create a cataclysmic event that sends waves through the fabric of space itself, waves that can travel cosmic distances. This is similar to an earthquake shaking Earth. These waves are known as gravitational waves (GW) and, until 2015, they were pure speculation, as no experiment had ever been able to detect them. Despite the tremendous amount of energy released when two BH merge (a binary BH merger), these waves, or ripples in space-time, are incredibly difficult to observe. The distortion that a binary BH merger in a nearby galaxy induces in space-time is minuscule when it reaches Earth. So minuscule that LIGO needs to measure tiny shifts in the relative position between two mirrors which are several orders of magnitude smaller than the size of the smallest atom. This is an incredible achievement. LIGO’s first detections of GW have brought a few surprises though. And they started with a bang!

The first event detected by LIGO in 2015 was interpreted as a heavier-than-expected binary BH merging in a closer-than-expected galaxy. Similar events have been observed since, raising several questions. Are these events more common than previously thought? Why have we not seen them farther away if they are stronger than expected? The masses of the individual black holes forming the binary BH were inferred to be approximately 30 solar masses each. Note that I say inferred because these masses could not be measured directly. What LIGO can measure with relatively high precision is what is known as the observed chirp mass. The intrinsic chirp mass is a combination of the two masses of the binary black hole. If both masses are similar, the chirp mass is similar to those masses as well. If the two masses of the binary BH are very different, the chirp mass will be a value in between these two masses (but closer to the mass of the lightest component). The observed chirp mass is related to the intrinsic chirp mass by the factor (1+z), where z is the redshift of the binary BH. The redshift is a measure of distance, so more distant objects have a larger redshift (our redshift is zero). In other words, what LIGO can measure with good precision is Mo = Mc*(1+z), where Mc is the intrinsic chirp mass. Mo determines the frequency at which the GW is oscillating, a number that LIGO can estimate quite well. For that very first event, LIGO found that Mo had to be approximately 30 solar masses and that the distance was relatively small, that is, z was close to zero. Hence, the intrinsic chirp mass (and the mass of the individual BH before they merged) also had to be close to 30 solar masses. This came as a surprise, since many predictions made years earlier anticipated that such high values for Mo should be very rare. In fact, what was expected was to find values for Mo between 7 and 15 solar masses.
This was in part motivated by observations of X-ray binary stars in our Galaxy, for which it is possible to estimate the mass of the BH. An X-ray binary is a pair of closely orbiting objects where one is a star and the other is either a neutron star or a BH. In this article we consider only the BH case. Roughly speaking, by measuring the amount of light emitted by the gas (from the star) spiraling towards the BH, one can estimate the mass of the BH. In our Galaxy, the masses of about a dozen BH have been measured using this technique. The results show that these BH masses lie between ~7 and ~14 solar masses. So far, no BH with a mass higher than 20 solar masses has been found in our Galaxy, raising another, even more fundamental, question. Is our Galaxy special, or is there something else we are missing regarding the BH masses and distances inferred by LIGO?
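The chirp-mass degeneracy described above can be made concrete with the standard formula Mc = (m1*m2)^(3/5) / (m1+m2)^(1/5). The specific masses and redshifts below are illustrative choices, not values from the paper:

```python
def chirp_mass(m1, m2):
    """Intrinsic chirp mass of a binary, in the same units as m1, m2."""
    return (m1 * m2) ** 0.6 / (m1 + m2) ** 0.2

def observed_chirp_mass(m1, m2, z):
    """Observed (redshifted) chirp mass: Mo = Mc * (1 + z)."""
    return chirp_mass(m1, m2) * (1.0 + z)

# Two ~30 solar-mass BHs nearby (z ~ 0) give Mo ~ 26 solar masses ...
mo_nearby = observed_chirp_mass(30.0, 30.0, 0.0)
# ... but a binary half as massive at z = 1 gives exactly the same Mo:
mo_far = observed_chirp_mass(15.0, 15.0, 1.0)
print(round(mo_nearby, 1), round(mo_far, 1))
```

Since LIGO constrains Mo but not Mc and z separately, the two scenarios are indistinguishable from the chirp mass alone; only the inferred distance (via the signal intensity) breaks the degeneracy, which is where lensing enters the story.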

This is the question we address in our latest work. Owing to the degeneracy with redshift described above, could the intrinsic chirp mass be smaller if the redshift were higher? If the redshift is, let’s say, z=1 instead of z~0, then the intrinsic chirp mass could be a factor of two smaller than the value inferred by LIGO (while keeping the observed chirp mass constant), bringing it into agreement with the BH masses observed in our Galaxy. There is one caveat though. If the GW originated in a galaxy far away at redshift z=1, instead of in a nearby galaxy (z~0), the intensity of the GW would have been much smaller than what LIGO observed. The intensity of the GW is (to first order) the quantity used by LIGO to determine the distance. The observed intensity then implied that the inferred distance had to be relatively small: a mere few hundred Mpc instead of the several thousand Mpc expected for a galaxy at redshift z~1. One would thus conclude that the GW originated in a nearby galaxy and, consequently, that the intrinsic chirp mass had to be high. But this is the funny thing. Nature has interesting ways of playing with us. One of these ways is gravitational lensing, thanks to which an object that is far away may appear to us as if it were much closer (that is, lensing can amplify the intensity mentioned above). Note that I used the expression infer again when referring to the distance estimation by LIGO. This estimation is made under the assumption that gravitational lensing is not intervening. This is normally a good assumption since, after all, only a very small fraction of distant objects are (significantly) affected by gravitational lensing. To be more precise, roughly 1 in 1000 to 10000 objects at redshifts larger than z=1 is substantially magnified by the gravitational lensing effect.
Hence, is it still possible that a significant fraction of the LIGO events are distant, lower-mass events that are being magnified by gravitational lensing? In our work we find that lensing can do just that. At large distances, the volume of the universe that is reaching us now (and by this I mean the volume where the light or GW we see now originated) is much larger than the corresponding volume at much smaller distances. To visualize this, imagine the volume of a shell of radius R. This volume goes like the square of the radius, so a large shell with a radius 10 times larger than that of a smaller shell will have 100 times its volume (if they both have the same thickness). By precise calculations of the gravitational lensing effect on distant gravitational waves, we prove that the massive and nearby events found by LIGO can in fact be interpreted as normal but more distant events, with masses comparable to the ones found in our Galaxy. This solves the puzzle mentioned at the beginning of this article. Is our Galaxy special? And if it is not, where are the masses that LIGO claims to be finding in nearby galaxies? The answer is that those masses would be the same in our Galaxy and in other galaxies. What is wrong is the interpretation of the observation, since the amplification due to lensing has been ignored (this story is very similar to that of the puzzling first bright galaxies detected by Herschel, which all turned out to be gravitationally lensed distant galaxies).
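The shell-volume argument above can be put into numbers. The following sketch computes the comoving volume of a thin redshift shell, dV/dz ~ D_C(z)^2 / E(z), in a flat LCDM cosmology; the parameters (H0 = 70 km/s/Mpc, Om = 0.3) are illustrative assumptions on my part, not necessarily the values used in the paper:

```python
def E(z, om=0.3):
    """Dimensionless Hubble parameter for flat LCDM."""
    return (om * (1.0 + z) ** 3 + (1.0 - om)) ** 0.5

def comoving_distance(z, n=10000):
    """(c/H0) * integral of dz'/E(z') from 0 to z, trapezoid rule (Mpc)."""
    c_over_h0 = 299792.458 / 70.0  # Mpc
    h = z / n
    s = 0.5 * (1.0 / E(0.0) + 1.0 / E(z))
    for i in range(1, n):
        s += 1.0 / E(i * h)
    return c_over_h0 * s * h

def shell_volume_rate(z):
    """Quantity proportional to the comoving volume of a thin shell at z."""
    return comoving_distance(z) ** 2 / E(z)

gain = shell_volume_rate(1.0) / shell_volume_rate(0.1)
print(f"volume gain between z=0.1 and z=1: ~{gain:.0f}x")
```

With these assumed parameters the gain comes out in the tens, consistent with the ~1.5 orders of magnitude quoted later in the text.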

So why has nobody realized this earlier? That is a good question, and the answer is not that people have not thought about this before. For our model to work, there is one little thing that sets our study apart from other similar attempts. As we mentioned earlier, at z~1 only one in a few thousand events could be magnified substantially by gravitational lensing. On the other hand, by observing more distant objects one is observing a larger volume, so one is observing more events. The gain in volume with respect to nearby distances is in the range of two orders of magnitude (more precisely, about 1.5 orders of magnitude between z=0.1 and z=1 for a shell of thickness dz=0.1). This gain in volume is not enough to compensate for the small probability of lensing at z~1 (1/1000 or less). A significant rate of lensed events (enough to explain the rate of observed events) can be obtained ONLY IF (and this is the little thing) one increases the intrinsic rate of mergers at z=1 with respect to the rate at z=0. Such evolution in the intrinsic rate is expected and has been considered in the past. Our study shows that in order for the lensing mechanism to work and be able to explain the LIGO observations (with the troubling masses), the rate at z=1 needs to be more extreme than previously considered. This is not necessarily a problem, since we simply don’t know what this rate is, and there are models that predict such rapid evolution in the intrinsic rate of events between z=1 and z=0. Surprisingly, however, this type of strong-evolution model was not considered in the past, so the role played by lensing was not recognized.
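The bookkeeping in this paragraph fits in a few lines. This is only a back-of-the-envelope restatement of the numbers quoted in the text (~1.5 dex of volume gain, lensing probability ~1/1000), ignoring detection thresholds and the shape of the magnification distribution:

```python
# Volume gain between z=0.1 and z=1 quoted in the text (~1.5 dex, i.e. ~30x).
volume_gain = 10 ** 1.5
# Fraction of z~1 events that are substantially magnified (text: 1/1000 or less).
lensing_probability = 1e-3
# Net suppression of the lensed-event rate if the intrinsic rate were constant:
suppression = volume_gain * lensing_probability
# Factor by which the intrinsic merger rate at z=1 must exceed the z=0 rate
# for lensed distant events to match the observed rate of "massive" events.
required_rate_boost = 1.0 / suppression
print(f"intrinsic rate at z=1 must be ~{required_rate_boost:.0f}x higher")
```

The point of the paragraph is exactly this factor: volume alone repays only ~30x of the ~1000x lensing penalty, so the intrinsic merger rate itself must evolve strongly between z=0 and z=1.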

So then. Are we right? Are we wrong? Time will tell. After all, only one (if any) of the many interpretations proposed to explain the massive LIGO events will be the correct one. An important aspect of any model is that it needs to be testable, and this one is. If lensing is the culprit, at high magnifications one would expect a pair of images with similar magnifications and a small time delay between them (hours to days, depending on the lens mass, lens distance and relative source-lens-observer position). LIGO detections don’t come in pairs (at least no such detections have been reported yet). If the time delay is several hours or days, it is possible that one of the two lensed events falls below the detection threshold of LIGO, since the visibility (determined in part by the geometric factor of LIGO, a technicality whose explanation is beyond the scope of this article) may have changed substantially. For simplicity, we can say that an event directly overhead the detector results in a significantly stronger signal-to-noise ratio than the same event near the horizon. Since Earth rotates once every 24 hours, a position in the sky (like the Sun, for instance) can move from the zenith to the horizon in six hours. Hence, two identical GW originating in the same spot in the sky may have significantly different signal-to-noise ratios if they are separated by approximately six hours. There is, however, a limit to how many times you may get the unlucky configuration that allows one of the two images to be hidden. Eventually, two events should be observed that have virtually the same observed chirp mass and a distance estimate consistent with the uncertainties introduced by the geometric factor. The ratio of signal-to-noise between the two events should be compatible with the angle rotated by Earth during the time separating the two events.
Finally, the inferred location in the sky (derived from the time difference between detections in different observatories) should also be consistent with being the same for both events. Data mining of the LIGO data may unveil some of these missing events in the near future and confirm the lensing nature of the massive LIGO events.
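The rotation argument above can be sketched numerically. The cos(zenith angle) response below is a deliberately crude stand-in for LIGO's actual antenna pattern (which depends on polarization and detector orientation); only the 15-degrees-per-hour rotation rate is exact:

```python
import math

def rotation_angle_deg(delay_hours):
    """Angle rotated by Earth in delay_hours (15 degrees per hour)."""
    return 360.0 * delay_hours / 24.0

def toy_relative_sensitivity(delay_hours):
    """Toy response to a source initially at the zenith, after Earth has
    rotated for delay_hours. NOT the real LIGO antenna pattern: we simply
    take cos(zenith angle), clipped at the horizon."""
    zenith_angle = math.radians(rotation_angle_deg(delay_hours))
    return max(math.cos(zenith_angle), 0.0)

print(rotation_angle_deg(6.0))        # 90 degrees: zenith -> horizon
print(toy_relative_sensitivity(6.0))  # ~0: the second image can be missed
```

Even in this toy picture, a lensed counterpart arriving ~6 hours after the first event can fall below threshold purely for geometric reasons, which is why the absence of observed pairs does not yet rule out the lensing interpretation.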

Link to the publication

You can download the paper with our study at this link