Did LIGO really see massive black holes?

Gravitational waves being emitted by a pair of black holes

The LIGO (and now Virgo) experiment has opened a new window onto one of the most mysterious objects in nature: black holes (BH). When two black holes merge, they create a cataclysmic event that sends waves through the fabric of space itself, waves that can travel cosmic distances, much like an earthquake shaking Earth. These waves are known as gravitational waves (GW), and until 2015 they had escaped every attempt at direct detection. Despite the tremendous amount of energy released when two BH merge (a binary BH merger), these ripples in space-time are incredibly difficult to observe. The distortion that a binary BH merger in a nearby galaxy induces in space-time is minuscule by the time it reaches Earth. So minuscule that LIGO needs to measure shifts in the relative position of two mirrors that are several orders of magnitude smaller than the size of the smallest atom. This is an incredible achievement. LIGO's first detections of GW have brought a few surprises though. And they started with a bang!

The first event detected by LIGO in 2015 was interpreted as a heavier-than-expected binary BH merging in a closer-than-expected galaxy. Similar events have been observed since, raising several questions. Are these events more common than previously thought? And if their signals are stronger than expected, why don't we see them farther away? The masses of the individual black holes forming the binary were inferred to be approximately 30 solar masses each. Note that I say inferred because these masses could not be measured directly. What LIGO can measure with relatively high precision is what is known as the observed chirp mass. The intrinsic chirp mass is a particular combination of the two masses of the binary black hole. If both masses are similar, the chirp mass is similar to those masses as well. If the two masses are very different, the chirp mass takes a value in between (but closer to the mass of the lighter component). The observed chirp mass is related to the intrinsic chirp mass by the factor (1+z), where z is the redshift of the binary BH. The redshift is a measure of distance, so more distant objects have a larger redshift (our own redshift is zero). In other words, what LIGO can measure with good precision is Mo = Mc*(1+z), where Mc is the intrinsic chirp mass. Mo determines how the frequency of the GW evolves during the inspiral (the "chirp"), something LIGO can estimate quite well. For that very first event, LIGO found that Mo had to be approximately 30 solar masses and that the distance was relatively small, that is, z was close to zero. Hence, the intrinsic chirp mass (and the masses of the individual BH before they merged) also had to be close to 30 solar masses.

This came as a surprise, since many predictions made years earlier anticipated that such high values of Mo should be very rare. What was expected instead were values of Mo between 7 and 15 solar masses. This expectation was in part motivated by observations of X-ray binary stars in our Galaxy, for which it is possible to estimate the mass of the BH. An X-ray binary is a pair of closely orbiting objects where one is a star and the other is either a neutron star or a BH (in this article we consider only the BH case). Roughly speaking, by measuring the light emitted by the gas pulled from the star as it spirals towards the BH, one can estimate the mass of the BH. In our Galaxy, the masses of about a dozen BH have been measured with this technique. The results show BH masses between ~7 and ~14 solar masses. So far, no BH with a mass above 20 solar masses has been found in our Galaxy, raising another, even more fundamental, question. Is our Galaxy special, or is there something else we are missing regarding the BH masses and distances inferred by LIGO?
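As a minimal illustration of the chirp-mass bookkeeping above (this is just a sketch, not LIGO's actual analysis), the snippet below computes the intrinsic chirp mass from two component masses and its redshifted, observed value Mo = Mc*(1+z):

```python
# A minimal sketch (not LIGO's analysis pipeline) of the chirp-mass definitions
# used above. Masses are in solar masses.

def chirp_mass(m1, m2):
    """Intrinsic chirp mass: Mc = (m1*m2)**(3/5) / (m1 + m2)**(1/5)."""
    return (m1 * m2) ** 0.6 / (m1 + m2) ** 0.2

def observed_chirp_mass(m1, m2, z):
    """Redshifted ('observed') chirp mass: Mo = Mc * (1 + z)."""
    return (1.0 + z) * chirp_mass(m1, m2)

# Two ~30 Msun black holes in a nearby galaxy (z ~ 0.1):
print(chirp_mass(30, 30))                # ~26 Msun intrinsic chirp mass
print(observed_chirp_mass(30, 30, 0.1))  # ~29 Msun observed chirp mass
```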

This is the question we address in our latest work. Owing to the degeneracy with redshift described above, could the intrinsic chirp mass be smaller if the redshift were higher? If the redshift is, say, z=1 instead of z~0, the intrinsic chirp mass would be a factor of two smaller than the value inferred by LIGO (while keeping the observed chirp mass fixed), bringing it into agreement with the BH masses observed in our Galaxy. There is one caveat though. If the GW originated in a galaxy far away at redshift z=1, instead of in a nearby galaxy (z~0), its intensity would have been much smaller than what LIGO observed. The intensity of the GW is (to first order) the quantity LIGO uses to determine the distance. The observed intensity therefore dictated that the inferred distance had to be relatively small, a mere few hundred Mpc instead of the several thousand Mpc corresponding to a galaxy at redshift z~1. One would then conclude that the GW originated in a nearby galaxy and, consequently, that the intrinsic chirp mass had to be high.

But here is the funny thing. Nature has interesting ways of playing with us. One of them is gravitational lensing, thanks to which an object that is far away may appear to us as if it were much closer (that is, lensing can amplify the intensity mentioned above). Note that I used the word infer again when referring to the distance estimated by LIGO. This estimation is made under the assumption that gravitational lensing is not intervening. This is normally a good assumption since, after all, only a very small fraction of distant objects is significantly affected by gravitational lensing. To be more precise, roughly 1 in 1000 to 10000 objects at redshifts larger than z=1 is substantially magnified by lensing. So is it still possible that a significant fraction of the LIGO events are distant, lower-mass events that are being magnified by gravitational lensing? In our work we find that lensing can do just that. At large distances, the volume of the universe that is reaching us now (by which I mean the volume where the light or GW we see now originated) is much larger than the corresponding volume at much smaller distances. To visualize this, imagine the volume of a shell of radius R. This volume goes like the square of the radius, so a shell with a radius 10 times larger than a smaller shell will have 100 times the volume (if both have the same thickness). By precisely computing the gravitational lensing effect on distant gravitational waves, we show that the massive and nearby events found by LIGO can in fact be interpreted as normal but more distant events, with masses comparable to the ones found in our Galaxy. This solves the puzzle posed at the beginning of this article. Is our Galaxy special? And if it is not, where are the massive black holes that LIGO claims to be finding in nearby galaxies? The answer is that the BH masses would be the same in our Galaxy and in other galaxies; what is wrong is the interpretation of the observation, since the amplification due to lensing has been ignored (this story is very similar to that of the puzzlingly bright first galaxies detected by Herschel, which turned out to be gravitationally lensed distant galaxies).
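To make the degeneracy concrete, here is a minimal sketch, assuming a flat LambdaCDM cosmology (astropy) and a purely illustrative magnification of 100, of how ignoring lensing turns a distant, modest-mass binary into an apparently nearby, massive one:

```python
# A minimal sketch of the magnification/distance/mass degeneracy, assuming a flat
# LambdaCDM cosmology and an illustrative magnification mu = 100.
# Lensing boosts the GW amplitude by sqrt(mu); if lensing is ignored, the inferred
# luminosity distance is d_L(z_true)/sqrt(mu), the inferred redshift drops, and the
# inferred intrinsic chirp mass Mc = Mo/(1+z) goes up.
import numpy as np
from astropy.cosmology import FlatLambdaCDM, z_at_value

cosmo = FlatLambdaCDM(H0=70, Om0=0.3)

Mo = 30.0        # observed (redshifted) chirp mass in solar masses
z_true = 1.0     # true redshift of the lensed binary
mu = 100.0       # illustrative magnification near a caustic

d_true = cosmo.luminosity_distance(z_true)    # ~6600 Mpc
d_apparent = d_true / np.sqrt(mu)             # ~660 Mpc, what is inferred without lensing
z_apparent = z_at_value(cosmo.luminosity_distance, d_apparent)

print(Mo / (1 + z_true))      # true intrinsic chirp mass: ~15 Msun
print(Mo / (1 + z_apparent))  # chirp mass inferred when lensing is ignored: ~26 Msun
```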

So why hasn't anybody realized this earlier? That is a good question, and the answer is not that people have not thought about it before. For our model to work, there is one little thing that sets our study apart from other similar attempts. As mentioned earlier, at z~1 only one in a few thousand events can be magnified substantially by gravitational lensing. On the other hand, by observing more distant objects one is observing a larger volume, and therefore more events. The gain in volume with respect to nearby distances is about 1.5 orders of magnitude between z=0.1 and z=1 (for shells of thickness dz=0.1). This gain in volume is not enough to compensate for the small probability of lensing at z~1 (1/1000 or less). A significant rate of lensed events (enough to explain the rate of observed events) can be obtained ONLY IF (and this is the little thing) one increases the intrinsic merger rate at z=1 with respect to the rate at z=0. Such evolution in the intrinsic rate is expected and has been considered in the past. Our study shows that in order for the lensing mechanism to work and explain the LIGO observations (with the troubling masses), the rate at z=1 needs to be more extreme than previously considered. This is not necessarily a problem, since we simply don't know what this rate is, and there are models that predict such rapid evolution in the intrinsic rate between z=1 and z=0. Surprisingly though, this type of strongly evolving model was not considered in the past, so the role played by lensing was not recognized.
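The back-of-the-envelope version of that trade-off (illustrative numbers only, not the detailed calculation from the paper) looks like this:

```python
# Back-of-the-envelope version of the volume vs. lensing-probability trade-off
# (illustrative numbers, not the detailed calculation from the paper).
from astropy.cosmology import FlatLambdaCDM

cosmo = FlatLambdaCDM(H0=70, Om0=0.3)

# Comoving volume per unit redshift at z = 1 compared with z = 0.1 (thin shells):
volume_gain = (cosmo.differential_comoving_volume(1.0) /
               cosmo.differential_comoving_volume(0.1)).value
print(volume_gain)            # ~35-40, i.e. roughly 1.5 orders of magnitude

p_lens = 1e-3                 # rough odds that a z~1 source is strongly magnified
print(volume_gain * p_lens)   # ~0.04: too small, unless the intrinsic merger rate
                              # at z~1 is itself much higher than at z~0
```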

So then: are we right? Are we wrong? Time will tell. After all, at most one of the many interpretations proposed to explain the massive LIGO events can be the correct one. An important aspect of any model is that it needs to be testable, and this one is. If lensing is the culprit, at high magnifications one would expect a pair of images with similar magnifications and a small time delay between them (hours to days, depending on the lens mass, the lens distance and the relative source-lens-observer position). LIGO detections don't come in pairs (at least no such detections have been reported yet). If the time delay is several hours or days, it is possible that one of the two lensed events falls below LIGO's detection threshold, since the visibility (determined in part by LIGO's geometric factor, a technicality whose explanation is beyond the scope of this article) may have changed substantially in the meantime. For simplicity, we can say that an event directly overhead the detector produces a significantly stronger signal-to-noise ratio than the same event near the horizon. Since Earth rotates once every 24 hours, a position in the sky (like the Sun, for instance) can move from the zenith to the horizon in six hours. Hence, two identical GW originating from the same spot in the sky may have significantly different signal-to-noise ratios if they are separated by approximately six hours. There is, however, a limit to how many times you can get the unlucky configuration that hides one of the two images. Eventually, two events should be observed that have virtually the same observed chirp mass and distance estimates consistent within the uncertainties introduced by the geometric factor. The ratio of signal-to-noise between the two events should be compatible with the angle rotated by Earth during the time separating them. Finally, the inferred location in the sky (derived from the time difference between detections at different observatories) should also be consistent with being the same for both events. Data mining of the LIGO data may unveil some of these missing events in the near future and confirm the lensing nature of the massive LIGO events.
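As a toy illustration of such a search (the event list and thresholds below are entirely made up; a real analysis would also compare sky localizations and check the signal-to-noise ratio against Earth's rotation), one could scan a catalogue for pairs like this:

```python
# Toy illustration of the pairing test described above (hypothetical events).
from itertools import combinations

# (name, observed chirp mass [Msun], 1-sigma uncertainty [Msun], arrival time [s])
events = [
    ("evtA", 28.0, 1.5, 0.0),
    ("evtB", 27.5, 2.0, 5.2 * 3600),    # 5.2 hours after evtA
    ("evtC", 12.0, 1.0, 40.0 * 86400),  # 40 days later, different chirp mass
]

def consistent_chirp_masses(e1, e2, nsigma=2.0):
    return abs(e1[1] - e2[1]) < nsigma * (e1[2] ** 2 + e2[2] ** 2) ** 0.5

def plausible_lensing_delay(e1, e2, max_days=10.0):
    return 0 < abs(e1[3] - e2[3]) < max_days * 86400

for e1, e2 in combinations(events, 2):
    if consistent_chirp_masses(e1, e2) and plausible_lensing_delay(e1, e2):
        print("candidate lensed pair:", e1[0], e2[0])   # -> evtA, evtB
```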

Link to the publication

You can download the paper with our study at this link.


Dark Matter under the microscope

Dark matter remains one of the main unsolved problems in modern physics. Despite the growing evidence for its existence coming from astronomical observations, all efforts to detect it in a lab on Earth have failed. One possible candidate for dark matter that cannot be detected on Earth (and let's hope it stays that way) is primordial black holes (PBH). This type of black hole would have been created during the first moments of the universe and may have survived until today. PBH are invisible (for the masses considered here they emit no light, apart from a completely negligible amount of Hawking radiation) and pretty much interact with the rest of the universe only through gravity. This is basically the same behaviour as dark matter. Most types of PBH have already been ruled out, but they can still exist in certain mass ranges (also, PBH with high spin may not have been considered in detail in previous studies and may be harder to exclude). One of these possible mass ranges is around 30 solar masses (think LIGO) and another is around the mass of a brown dwarf or a planet. A new type of observation may be able to probe these masses and rule out the possibility that PBH make up a sizeable fraction of the dark matter.

This observation relies on caustic crossing events like the Icarus and Iapyx events observed in the galaxy cluster MACS1149. The interpretation of these events is that a very distant and luminous background star (at z=1.55) is moving in a region that lies very close to one of the caustics of the cluster (a caustic is the set of source positions for which a large fraction of the light emitted by the star is focused towards us by the gravitational lens). As it moves, the light of the star gets amplified by the effect known as gravitational lensing. On its path to us, this light passes near stars (microlenses) in the galaxy cluster, and the magnification changes depending on the distance to the microlenses. Caustics are normally assumed to be smooth curves. In the presence of microlenses, caustics are disrupted, as in the figure accompanying this post, which shows a caustic broken up by many PBH, each with 30 times the mass of the Sun (without the PBH, the caustic would resemble a single straight line instead of the web shown in the figure). We have studied this new type of observation and shown that, through continuous monitoring of caustic crossing events, it is possible to constrain the fraction of dark matter in the form of microlenses. So far, preliminary results do not favour a scenario where even a modest fraction of the dark matter (a few percent) is made of massive PBH (~30 solar masses).
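For intuition, here is a toy model (not the simulations used in the paper) of the light curve a point source would show when crossing a single smooth fold caustic; microlenses such as stars or PBH break this smooth caustic into a network of micro-caustics, producing many smaller peaks instead of one:

```python
# Toy model of a point source crossing a single smooth fold caustic: the
# magnification rises as 1/sqrt(distance) on one side of the caustic and drops
# to a baseline on the other. Units are arbitrary and illustrative.
import numpy as np

def fold_caustic_magnification(d, mu0=10.0, d0=1.0):
    """d: signed distance to the caustic (positive on the bright side)."""
    d = np.asarray(d, dtype=float)
    mu = np.full_like(d, mu0)
    bright = d > 0
    mu[bright] += np.sqrt(d0 / d[bright])
    return mu

track = np.linspace(-5, 5, 11)         # the star's track across the caustic
print(fold_caustic_magnification(track))
# Microlenses (cluster stars, remnants or PBH) break this single smooth caustic
# into a web of micro-caustics, replacing the single peak with many smaller ones.
```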

You can read the scientific papers in the links below.

Observation paper

Theory paper

 

Seeing stars in motion

A wise man once said: “A picture is worth a thousand words.” A wiser man replied: “A movie is worth a thousand pictures.” The movies below show a few examples of how the flux of the background star changes as it moves across the field of microcaustics, both when only stars (and remnants) in the cluster act as microlenses and when 1% of the dark matter is in the form of PBH with 30 solar masses each. For the first four movies the star is made unrealistically large (R=70000 solar radii) in order to better see the effect; as a consequence of this extreme radius, the magnification does not show large fluctuations.

Video 1) Icarus event with ICL stars

Video 2) Iapyx event with ICL stars

Video 3) Icarus event with ICL stars and 1% dark matter as PBH

Video 4) Iapyx event with ICL stars and 1% dark matter as PBH

An even higher-resolution view of the effect can be found in the two videos below, where the resolution is increased by a factor of ~30 and a more realistic star with R=1000 Rsun (a typical radius for a giant star) is considered. The first movie shows the more likely scenario where the star moves at an angle with respect to the cluster caustic; the movie uses an angle of 30 degrees, but the result would be very similar for any angle larger than a few degrees. The second movie shows the special (and unlikely) case where the motion of the star is aligned almost perfectly with the cluster caustic. In this case the star approaches the caustic through its cusps, producing a different pattern in the magnification. The caustic map is shown in the right panel of each movie, with the position of the background star marked by a cross. For these movies we only consider microlenses from the intracluster stars (i.e., no PBH), and the central microlens has a mass of one solar mass.

Video 5) Star travelling at an angle with the caustic.

Video 6) Star travelling parallel to the caustic.
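To see why the source radius matters so much, here is a toy numerical sketch (arbitrary units, and a simple top-hat source profile instead of a realistic limb-darkened disc): the observed magnification is the point-source curve averaged over the stellar disc, so a huge source washes out the sharp caustic peak that a smaller giant star would still show.

```python
# Toy sketch of the source-size effect: average the point-source magnification
# over the stellar disc and watch the caustic peak get smoothed away as the
# source radius grows. All units are arbitrary.
import numpy as np

x = np.linspace(-5, 5, 2001)    # track in the source plane
mu_point = 10.0 + np.where(x > 0, np.sqrt(1.0 / np.maximum(x, 1e-3)), 0.0)

def smooth_over_source(mu, x, radius):
    """Average the point-source magnification over a top-hat source of given radius."""
    dx = x[1] - x[0]
    half = max(1, int(radius / dx))
    kernel = np.ones(2 * half + 1) / (2 * half + 1)
    return np.convolve(mu, kernel, mode="same")

print(mu_point.max())                                # sharp peak for a point source
print(smooth_over_source(mu_point, x, 0.05).max())   # small source: peak mostly survives
print(smooth_over_source(mu_point, x, 2.00).max())   # huge source: peak washed out
```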

Similar movies, but with just one microlens, can be found in the two links below. In these movies, three nearby background stars cross the caustic of a single microlens with M=1 Msun. The movies show how the same microlens can produce very different magnification patterns depending on the trajectory of the background star.

Video 7) Zoom in on the Icarus side. Three stars travelling at an angle with the microcaustic of a single microlens with M=1 Msun

Video 8) Zoom in on the Iapyx side. Three stars travelling at an angle with the microcaustic of a single microlens with M=1 Msun

 

You can read the scientific papers in the links below.

Observation paper

Theory paper


Where did the dark matter go?

Galaxies are supposed to be made mostly of stars, gas, dust and … dark matter. Dark matter is the most mysterious substance in the Universe. We know HOW MUCH there is, we know WHERE it is, but we don't know WHAT it is. Dark matter has eluded detection by multiple experiments here on Earth, yet we keep gathering evidence of its existence on cosmological scales. Dark matter is responsible for the unexpectedly fast rotation of the outer regions of galaxies. It also shapes the distribution of matter in the Universe on very large scales, and it is largely responsible for the so-called gravitational lensing effect. In galaxies, most of the mass is in the form of dark matter, as evidenced by their rotation curves, which tell us how much mass is enclosed within a given distance from the centre of the galaxy. In some massive galaxies, the amount of dark matter is so large that space warps around them, producing the optical illusion that a galaxy lying far behind them is seen in two, three, or even more locations. This is the gravitational lensing effect predicted by the general theory of relativity, thanks to which we can study the distribution of dark matter in these galaxies.
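As a quick worked example of the rotation-curve argument (with illustrative, Milky-Way-like numbers and assuming a roughly circular orbit so that M(<r) ≈ v²r/G):

```python
# Worked example of the rotation-curve argument, assuming a roughly circular
# orbit so that M(<r) ~ v^2 * r / G. The numbers are illustrative, Milky-Way-like.
from astropy import units as u
from astropy.constants import G

v = 220 * u.km / u.s      # flat rotation speed in the outer galaxy
r = 50 * u.kpc            # distance from the centre

M_enclosed = (v ** 2 * r / G).to(u.Msun)
print(M_enclosed)         # ~6e11 Msun, far more than the visible stars and gas provide
```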

The two brightest cluster galaxies (BCGs) and the gravitationally lensed images (7.2, 7.3, 19.1 and 19.2) used in our study.

The most spectacular examples of the gravitational lensing effect can be found in galaxy clusters, where the concentration of dark matter is greatest. Among the observations of the gravitational lensing effect, the recent Hubble Frontier Fields program is producing the best data around colliding galaxy clusters, offering a unique opportunity to study dark matter (in a manner that tries to emulate particle accelerators smashing particles against each other). In a recent work we use the gravitational lensing effect around a special type of galaxy, known as a BCG (Brightest Cluster Galaxy), and find something unexpected. Our results show that these particular galaxies have no dark matter at all (or a very small amount). Our study relies on two gravitationally lensed galaxies (marked 7.2, 7.3, 19.1 and 19.2 in the figure above) that can be explained only if the two BCG galaxies in this cluster have most of their mass (if not all) in the form of stars, with no dark matter (or very little). If these galaxies were normal galaxies, that is, if they had a significant amount of dark matter, the two images shown above would be curved towards the BCG galaxies, something that can be ruled out by the observations. If confirmed, our findings would require new developments in galaxy formation theory to explain this type of galaxy. We discuss several scenarios that could result in such galaxies. One of the most promising, and one that has not been explored sufficiently in the past, is that these galaxies form as a consequence of gas cooling rather than through the merging of several smaller galaxies. Future observations could confirm or reject this hypothesis, but that will be a different story …

Link to paper: http://arxiv.org/abs/1609.04822


Gravitational waves (II)

LIGO has announced the detection of gravitational waves.

This is a remarkable achievement, made possible by steady progress in detector technology. The detection of gravitational waves matters for several reasons. First, it confirms one of the main predictions of General Relativity: that space itself can be shaped and dragged by massive objects such as black holes, and that moving objects can produce ripples in space-time that travel at the speed of light. This idea, that gravity "moves" at the speed of light and is not instantaneous as in Newtonian physics, is part of what led Einstein to develop an alternative to the classical (Newtonian) theory of gravity in the first place. If gravity travels at the speed of light, it is natural to think of it as a wave, much like light. Although the curvature of space was confirmed long ago by observations of gravitational lensing, and the influence of massive bodies on the flow of time has also been confirmed (and is even accounted for in everyday technology like GPS), the gravitational-wave prediction remained elusive from the experimental point of view. Indirect confirmation was provided decades ago by the slow decay of the orbital period of a pair of orbiting neutron stars.

Two stellar objects spinning around each other and “radiating” gravitational waves. The picture for two black holes would be similar, except that we could not see the black holes.

 

The discovery of gravitational waves is also important because it opens a new window for research, not only into cataclysmic events like the collision of two massive black holes but also into the origin of the Universe. A different type of gravitational wave (the primordial type), created right after the birth of our Universe, is expected to be detected in the near future. A year ago a claim was made about their detection, but it turned out to be a false alarm. The technology to detect these primordial gravitational waves is, however, advancing at great speed, and it is just a question of time (1-5 years) until we can see them. Once detected, they will give us valuable information about new phenomena, such as inflation, that are responsible for the structure of the Universe we see today.

Solving a long-standing mystery

Planck helps solve a long-standing mystery

The Virgo cluster (marked with a big circle) is near the centre of the image. The signal detected by Planck extends well beyond the limits of the cluster, probing part of the missing baryons.

One of the puzzles of modern astronomy is what is known as the missing baryon problem. Baryons are the ordinary matter we are familiar with: you are made of baryons, as is everything you touch, eat and see. The best-known building blocks of ordinary matter are protons and neutrons (the baryons proper), together with electrons (which, strictly speaking, are not baryons but are usually counted with them in this context). Together they form atoms, atoms form molecules and molecules form … well, everything else. Detailed observations of the distant Universe tell us how many baryons are out there, and the amount we can see agrees very well with what is expected from the standard model that describes the Universe, so there is nothing surprising there. The story changes when we look at the Universe at distances much closer to us. In principle, we should see the same proportion of baryons right here, in our neighborhood, that we see in the distant Universe, but they are nowhere to be found. So where are they?

Baryons obey a law similar to that of energy: they are neither created nor destroyed (for the most part), they only transform (the transformation of a neutron into a proton plus an electron, or vice versa, being a classic example). If there were baryons in the early universe, pretty much the same number of baryons should exist today. Instead, observations of the local Universe reveal a significant deficit of baryons when compared with the expectations and with the observed number of baryons in the most distant Universe. It is commonly believed that most of these missing baryons are in the form of a plasma that emits very little light (mostly at high energies, in the UV or X-rays) and has escaped detection so far. However, the same plasma also distorts the light that originated soon after the Big Bang (more precisely, about 380,000 years after the Big Bang). This light, known as the Cosmic Microwave Background, or CMB, has been travelling through the Universe since it was first produced and permeates the entire Universe. When the CMB light crosses a region filled with plasma, it gains a small amount of energy. This small gain of energy can be measured with current telescopes like the Planck satellite through an effect known as the Sunyaev-Zel'dovich, or SZ, effect.

The SZ effect has been studied with Planck in dense and hot plasma regions, usually found at the centres of galaxy clusters. In a recent work, we have focused our attention on one particular cluster, the Virgo galaxy cluster. This cluster is special because it is the closest cluster to us. In fact, it is so close that our galaxy is falling towards its centre due to its giant gravitational attraction. The distance from our galaxy to the centre of Virgo is only about twenty times the distance from our galaxy to our closest sister galaxy, the Andromeda galaxy. The apparent size of Virgo in the sky is about 15 times larger than the apparent size of the full Moon. This large size allowed us to do a detailed statistical analysis that maximizes the small distortion that the missing baryons around Virgo imprint on the CMB light. Our findings (summarized in the figure accompanying this post) reveal vast amounts of plasma beyond the previously established limits of the Virgo cluster. The signal around Virgo observed by Planck coincides with the signal expected from the missing baryons around galaxy clusters, suggesting that the missing baryons probably form diffuse clouds of plasma around the biggest structures in the Universe, like galaxy clusters. Although the baryons found by Planck don't account for all the missing baryons, they do reduce the amount still evading a firm detection. Future analyses based on Planck and ground-based experiments will continue the hunt for the remaining missing baryons …
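To get a feel for how tiny the signal is, here is a rough order-of-magnitude sketch of the thermal SZ distortion from a uniform slab of diffuse plasma (the density, temperature and path length below are generic illustrative values, not the ones measured for Virgo):

```python
# Order-of-magnitude sketch of the thermal SZ distortion from a uniform slab of
# plasma: y = (sigma_T / m_e c^2) * n_e * k_B * T_e * L, and dT/T ~ -2y at low
# frequencies. All input values are assumed, illustrative numbers.
from astropy import units as u
from astropy import constants as const

n_e = 1e-4 * u.cm ** -3     # electron density of the diffuse plasma (assumed)
T_e = 1e7 * u.K             # electron temperature (assumed)
L = 1 * u.Mpc               # path length through the plasma (assumed)

y = (const.sigma_T / (const.m_e * const.c ** 2) * n_e * const.k_B * T_e * L).decompose()
dT = (-2 * y * 2.725 * u.K).to(u.uK)
print(y, dT)   # y of a few times 1e-7 and a decrement of only ~2 microkelvin, which
               # is why the huge angular size of Virgo (and careful statistics) helps
```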

 

The paper with all the details and results can be found at the following link: http://arxiv.org/abs/1511.05156


Hubble Frontier Fields Program extended (and time travel)

The Hubble Frontier Fields (HFF) program has recently been extended to include two additional clusters in this spectacular data set. This is great news for science. To date, the HFF has provided the best data set to study the distribution of dark matter in galaxy clusters (the same data set is used for other exciting projects). Currently, the HFF is covering two clusters (from the original set of four). One of them is MACSJ1149.5+2223. This cluster is interesting for several reasons. One of them is the fact that a supernova at z=1.491 was observed by Hubble in one of the observing campaigns. This supernova is seen four times, in a configuration known as an Einstein cross: the multiple images observed by Hubble are distorted, multiply lensed versions of the same supernova (gravitational lensing at work). One interesting feature of gravitational lensing is that, since the paths of the photons are bent by the gravitational potential, their time of arrival at our telescopes on Earth also changes. Because of this time difference, multiply lensed images of the same background object are seen at different epochs. It is like seeing your kids simultaneously at different ages… weird.

The supernova is observed (four times) in one of the arms of its host galaxy in just one of the counterimages of that galaxy, but not in the other two counterimages. This means that the supernova either will be observed in the future in those two counterimages or has already appeared in them. Since the lifetime of a supernova is short (days or weeks) and there is no way to predict when a star will go supernova, it is nearly impossible to catch a supernova right as it goes off. The multiply lensed images of the supernova in this cluster could, in theory, allow us to predict when the supernova is going to be observed again and to study the explosion from the very beginning. Using accurate models of the gravitational potential, we were able to predict the time differences between the counterimages, that is, when the supernova would be observed next in each of them (or when it had already appeared). The figure shows the predicted time delays for the supernova observed in MACS1149, which is currently seen four times. Our model predicts that we are too late to observe one of the counterimages, which occurred about 9 years ago, but a new chance to see this supernova will come around November 1st 2015. This will be the first time we can point a telescope at a position and wait for a SN to happen (again). Talk about time travel!
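For the curious, the overall scale of these delays is set by the so-called time-delay distance; here is a minimal sketch of it, assuming a flat LambdaCDM cosmology (the cluster lens model itself, which supplies the dimensionless Fermat-potential differences, is not reproduced here):

```python
# Sketch of the quantity that sets the overall scale of the lensing time delays,
# assuming a flat LambdaCDM cosmology: the time-delay distance
# D_dt = (1 + z_l) * D_l * D_s / D_ls. A delay between two images is D_dt/c times
# a dimensionless number fixed by the cluster mass model.
from astropy.cosmology import FlatLambdaCDM
import astropy.units as u

cosmo = FlatLambdaCDM(H0=70, Om0=0.3)
z_lens, z_src = 0.544, 1.491       # MACSJ1149 and the lensed supernova

D_l = cosmo.angular_diameter_distance(z_lens)
D_s = cosmo.angular_diameter_distance(z_src)
D_ls = cosmo.angular_diameter_distance_z1z2(z_lens, z_src)

D_dt = (1 + z_lens) * D_l * D_s / D_ls
print(D_dt.to(u.Gpc))              # a few Gpc; note that D_dt, and hence every delay,
                                   # scales as 1/H0 (i.e. as 1/h)
```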

Original paper: http://arxiv.org/abs/1504.05953

Published version: http://mnras.oxfordjournals.org/content/456/1/356

SN REFSDAL UPDATE (Dec. 2015)

On December 12th 2015, news broke about the reappearance of SN Refsdal at exactly the predicted position posted in this article. The date of the reappearance is uncertain by about one month, but it must have happened between November 15th and December 10th, the dates of two consecutive Hubble observations of this location: in the November 15th observation there was no sign of the explosion, while in the December 10th observation the SN had already shown up at the predicted position. The original prediction (November 1st) was based on a value of the Hubble constant that is a bit out of date (h=0.7). Adopting a more recent estimate (h=0.67) and rescaling the time delay, the best prediction for the reappearance shifts from November 1st 2015 to November 17th, right inside the window of time in which we know the reappearance happened.
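The arithmetic behind that rescaling fits in a few lines (the reference epoch below, taken as the first Hubble detection of the Einstein cross in November 2014, is an assumption made only for this illustration):

```python
# The arithmetic behind the h-rescaling: lensing time delays scale as 1/h, so going
# from h = 0.70 to h = 0.67 stretches the predicted delay by 0.70/0.67 ~ 1.045.
# The reference epoch below is an assumption made for this illustration.
from datetime import date, timedelta

cross_detected = date(2014, 11, 11)                       # first detection of the Einstein cross (assumed)
delay_h070 = (date(2015, 11, 1) - cross_detected).days    # delay implied by the h=0.70 prediction
delay_h067 = delay_h070 * 0.70 / 0.67                     # same lens model, rescaled to h=0.67

print(cross_detected + timedelta(days=round(delay_h067)))  # ~ 2015-11-17
```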


21st century is here

When was the first time you noticed you were in the 21st century? For some of you it may have been when you owned your first smartphone, or your first electric car, or your fancy 3D TV or 3D printer, or maybe when you tried Google Glass. Those are cool little things that may impress you for a while, but they'll hardly make it into the history books. Now really, when was the first time you had a sense that things had already changed? For me that day was this week, when I saw this picture. It does not just tell a story, it shows the change that will dominate the 21st century. The Indian Space Research Organisation successfully put a small satellite in orbit around Mars. The picture shows a group of female scientists and engineers celebrating this tremendous achievement. It illustrates the game-changing rules of the 21st century. First, the definitive rise of Asia as a world superpower. Space missions are normally used by governments to send powerful messages to the world; very few things can capture the world's attention (in a good way) like a successful space mission. One of the reasons governments choose space missions to demonstrate their power (economic, technological, military and even political) is that you cannot cheat in space. You either have the will and the technology to succeed or you don't. India, together with its Asian neighbours, is destined to play a leading and fundamental role in the 21st century. But even more important is the second conclusion we can draw from the picture above. What comes to mind when you think of a rocket scientist? Well, think again. Women, too, are destined to play a fundamental and leading role in the 21st century. The 20th century was full of promises for women, some of them only partially fulfilled. Now reality is here. If one thing defines the 21st century, it won't be the definitive rise of Asia, but the definitive rise of women.